Feb 20 06:46:24 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 20 06:46:24 crc restorecon[4813]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 20 06:46:24 crc restorecon[4813]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc 
restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc 
restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 
06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 
crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 
06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 20 06:46:24 crc 
restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc 
restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 20 06:46:24 crc restorecon[4813]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc 
restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 
crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc 
restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc 
restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc 
restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc 
restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 06:46:24 crc restorecon[4813]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 20 06:46:24 crc restorecon[4813]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 20 06:46:25 crc kubenswrapper[5094]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 20 06:46:25 crc kubenswrapper[5094]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 20 06:46:25 crc kubenswrapper[5094]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 20 06:46:25 crc kubenswrapper[5094]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 20 06:46:25 crc kubenswrapper[5094]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 20 06:46:25 crc kubenswrapper[5094]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.556730 5094 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569005 5094 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569061 5094 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569070 5094 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569078 5094 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569086 5094 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569095 5094 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569103 5094 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569116 5094 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569128 5094 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569136 5094 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569147 5094 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569160 5094 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569169 5094 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569177 5094 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569186 5094 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569195 5094 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569205 5094 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569214 5094 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569223 5094 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569231 5094 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569240 5094 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569248 5094 feature_gate.go:330] unrecognized feature gate: Example Feb 
20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569257 5094 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569265 5094 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569274 5094 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569281 5094 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569291 5094 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569299 5094 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569306 5094 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569314 5094 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569322 5094 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569338 5094 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569345 5094 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569356 5094 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569366 5094 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569374 5094 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569382 5094 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569391 5094 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569402 5094 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569411 5094 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569420 5094 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569428 5094 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569436 5094 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569443 5094 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569451 5094 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569458 5094 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569466 5094 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569474 5094 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569481 5094 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569489 5094 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569497 5094 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569505 5094 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569512 5094 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569520 5094 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569529 5094 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569537 5094 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569545 5094 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569553 5094 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569562 5094 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569570 5094 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569579 5094 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569587 5094 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569595 5094 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569602 5094 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569612 5094 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569620 5094 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569628 5094 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569637 5094 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569645 5094 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569652 5094 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.569660 5094 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.569895 5094 flags.go:64] FLAG: --address="0.0.0.0"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.569917 5094 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.569936 5094 flags.go:64] FLAG: --anonymous-auth="true"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.569949 5094 flags.go:64] FLAG: --application-metrics-count-limit="100"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.569962 5094 flags.go:64] FLAG: --authentication-token-webhook="false"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.569972 5094 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.569985 5094 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.569997 5094 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570007 5094 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570016 5094 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570027 5094 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570036 5094 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570046 5094 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570055 5094 flags.go:64] FLAG: --cgroup-root=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570066 5094 flags.go:64] FLAG: --cgroups-per-qos="true"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570075 5094 flags.go:64] FLAG: --client-ca-file=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570084 5094 flags.go:64] FLAG: --cloud-config=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570093 5094 flags.go:64] FLAG: --cloud-provider=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570102 5094 flags.go:64] FLAG: --cluster-dns="[]"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570113 5094 flags.go:64] FLAG: --cluster-domain=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570125 5094 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570135 5094 flags.go:64] FLAG: --config-dir=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570144 5094 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570154 5094 flags.go:64] FLAG: --container-log-max-files="5"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570165 5094 flags.go:64] FLAG: --container-log-max-size="10Mi"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570175 5094 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570184 5094 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570194 5094 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570204 5094 flags.go:64] FLAG: --contention-profiling="false"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570214 5094 flags.go:64] FLAG: --cpu-cfs-quota="true"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570223 5094 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570233 5094 flags.go:64] FLAG: --cpu-manager-policy="none"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570242 5094 flags.go:64] FLAG: --cpu-manager-policy-options=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570256 5094 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570266 5094 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570276 5094 flags.go:64] FLAG: --enable-debugging-handlers="true"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570285 5094 flags.go:64] FLAG: --enable-load-reader="false"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570296 5094 flags.go:64] FLAG: --enable-server="true"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570305 5094 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570319 5094 flags.go:64] FLAG: --event-burst="100"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570329 5094 flags.go:64] FLAG: --event-qps="50"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570338 5094 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570348 5094 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570357 5094 flags.go:64] FLAG: --eviction-hard=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570378 5094 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570387 5094 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570397 5094 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570408 5094 flags.go:64] FLAG: --eviction-soft=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570417 5094 flags.go:64] FLAG: --eviction-soft-grace-period=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570426 5094 flags.go:64] FLAG: --exit-on-lock-contention="false"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570435 5094 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570445 5094 flags.go:64] FLAG: --experimental-mounter-path=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570454 5094 flags.go:64] FLAG: --fail-cgroupv1="false"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570463 5094 flags.go:64] FLAG: --fail-swap-on="true"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570473 5094 flags.go:64] FLAG: --feature-gates=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570485 5094 flags.go:64] FLAG: --file-check-frequency="20s"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570494 5094 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570505 5094 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570515 5094 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570525 5094 flags.go:64] FLAG: --healthz-port="10248"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570534 5094 flags.go:64] FLAG: --help="false"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570544 5094 flags.go:64] FLAG: --hostname-override=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570553 5094 flags.go:64] FLAG: --housekeeping-interval="10s"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570563 5094 flags.go:64] FLAG: --http-check-frequency="20s"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570572 5094 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570582 5094 flags.go:64] FLAG: --image-credential-provider-config=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570591 5094 flags.go:64] FLAG: --image-gc-high-threshold="85"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570601 5094 flags.go:64] FLAG: --image-gc-low-threshold="80"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570610 5094 flags.go:64] FLAG: --image-service-endpoint=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570620 5094 flags.go:64] FLAG: --kernel-memcg-notification="false"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570629 5094 flags.go:64] FLAG: --kube-api-burst="100"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570639 5094 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570649 5094 flags.go:64] FLAG: --kube-api-qps="50"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570658 5094 flags.go:64] FLAG: --kube-reserved=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570667 5094 flags.go:64] FLAG: --kube-reserved-cgroup=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570677 5094 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570686 5094 flags.go:64] FLAG: --kubelet-cgroups=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570695 5094 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570735 5094 flags.go:64] FLAG: --lock-file=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570744 5094 flags.go:64] FLAG: --log-cadvisor-usage="false"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570754 5094 flags.go:64] FLAG: --log-flush-frequency="5s"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570764 5094 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570779 5094 flags.go:64] FLAG: --log-json-split-stream="false"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570789 5094 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570799 5094 flags.go:64] FLAG: --log-text-split-stream="false"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570809 5094 flags.go:64] FLAG: --logging-format="text"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570818 5094 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570828 5094 flags.go:64] FLAG: --make-iptables-util-chains="true"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570837 5094 flags.go:64] FLAG: --manifest-url=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570847 5094 flags.go:64] FLAG: --manifest-url-header=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570859 5094 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570869 5094 flags.go:64] FLAG: --max-open-files="1000000"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570880 5094 flags.go:64] FLAG: --max-pods="110"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570889 5094 flags.go:64] FLAG: --maximum-dead-containers="-1"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570899 5094 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570908 5094 flags.go:64] FLAG: --memory-manager-policy="None"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570918 5094 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570927 5094 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570937 5094 flags.go:64] FLAG: --node-ip="192.168.126.11"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570947 5094 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570970 5094 flags.go:64] FLAG: --node-status-max-images="50"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570982 5094 flags.go:64] FLAG: --node-status-update-frequency="10s"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.570994 5094 flags.go:64] FLAG: --oom-score-adj="-999"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571006 5094 flags.go:64] FLAG: --pod-cidr=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571017 5094 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571035 5094 flags.go:64] FLAG: --pod-manifest-path=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571046 5094 flags.go:64] FLAG: --pod-max-pids="-1"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571057 5094 flags.go:64] FLAG: --pods-per-core="0"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571068 5094 flags.go:64] FLAG: --port="10250"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571080 5094 flags.go:64] FLAG: --protect-kernel-defaults="false"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571091 5094 flags.go:64] FLAG: --provider-id=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571103 5094 flags.go:64] FLAG: --qos-reserved=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571115 5094 flags.go:64] FLAG: --read-only-port="10255"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571126 5094 flags.go:64] FLAG: --register-node="true"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571138 5094 flags.go:64] FLAG: --register-schedulable="true"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571149 5094 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571181 5094 flags.go:64] FLAG: --registry-burst="10"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571192 5094 flags.go:64] FLAG: --registry-qps="5"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571204 5094 flags.go:64] FLAG: --reserved-cpus=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571216 5094 flags.go:64] FLAG: --reserved-memory=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571228 5094 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571239 5094 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571248 5094 flags.go:64] FLAG: --rotate-certificates="false"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571258 5094 flags.go:64] FLAG: --rotate-server-certificates="false"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571267 5094 flags.go:64] FLAG: --runonce="false"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571276 5094 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571286 5094 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571295 5094 flags.go:64] FLAG: --seccomp-default="false"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571304 5094 flags.go:64] FLAG: --serialize-image-pulls="true"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571314 5094 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571324 5094 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571334 5094 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571344 5094 flags.go:64] FLAG: --storage-driver-password="root"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571353 5094 flags.go:64] FLAG: --storage-driver-secure="false"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571362 5094 flags.go:64] FLAG: --storage-driver-table="stats"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571371 5094 flags.go:64] FLAG: --storage-driver-user="root"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571380 5094 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571390 5094 flags.go:64] FLAG: --sync-frequency="1m0s"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571400 5094 flags.go:64] FLAG: --system-cgroups=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571409 5094 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571424 5094 flags.go:64] FLAG: --system-reserved-cgroup=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571433 5094 flags.go:64] FLAG: --tls-cert-file=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571443 5094 flags.go:64] FLAG: --tls-cipher-suites="[]"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571455 5094 flags.go:64] FLAG: --tls-min-version=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571464 5094 flags.go:64] FLAG: --tls-private-key-file=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571477 5094 flags.go:64] FLAG: --topology-manager-policy="none"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571488 5094 flags.go:64] FLAG: --topology-manager-policy-options=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571501 5094 flags.go:64] FLAG: --topology-manager-scope="container"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571514 5094 flags.go:64] FLAG: --v="2"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571531 5094 flags.go:64] FLAG: --version="false"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571546 5094 flags.go:64] FLAG: --vmodule=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571557 5094 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.571569 5094 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.571896 5094 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.571912 5094 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.571925 5094 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.571936 5094 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.571945 5094 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.571955 5094 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.571963 5094 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.571971 5094 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.571979 5094 feature_gate.go:330] unrecognized feature gate: NewOLM
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.571989 5094 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.571998 5094 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572006 5094 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572014 5094 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572022 5094 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572031 5094 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572039 5094 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572047 5094 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572055 5094 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572064 5094 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572072 5094 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572080 5094 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572088 5094 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572096 5094 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572106 5094 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572117 5094 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572126 5094 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572135 5094 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572143 5094 feature_gate.go:330] unrecognized feature gate: SignatureStores
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572152 5094 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572160 5094 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572168 5094 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572176 5094 feature_gate.go:330] unrecognized feature gate: PinnedImages
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572184 5094 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572192 5094 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572200 5094 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572208 5094 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572216 5094 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572224 5094 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572238 5094 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572247 5094 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572255 5094 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572264 5094 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572272 5094 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572280 5094 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572289 5094 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572296 5094 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572308 5094 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572317 5094 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572325 5094 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572334 5094 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572367 5094 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572376 5094 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572385 5094 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572393 5094 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572401 5094 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572411 5094 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572420 5094 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572428 5094 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572436 5094 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572444 5094 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572452 5094 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572460 5094 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572469 5094 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572477 5094 feature_gate.go:330] unrecognized feature gate: Example
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572485 5094 feature_gate.go:330] unrecognized feature gate: OVNObservability
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572493 5094 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572500 5094 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572508 5094 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572516 5094 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572524 5094 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.572532 5094 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.572559 5094 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.588280 5094 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.588365 5094 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588523 5094 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588546 5094 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588556 5094 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588568 5094 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588578 5094 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588589 5094 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588598 5094 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588607 5094 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588615 5094 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588623 5094 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588632 5094 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588640 5094 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588648 5094 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588656 5094 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588665 5094 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588675 
5094 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588685 5094 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588694 5094 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588726 5094 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588735 5094 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588743 5094 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588751 5094 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588759 5094 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588767 5094 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588775 5094 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588783 5094 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588791 5094 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588800 5094 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588808 5094 feature_gate.go:330] unrecognized feature gate: Example Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588816 5094 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 20 06:46:25 crc 
kubenswrapper[5094]: W0220 06:46:25.588824 5094 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588832 5094 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588840 5094 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588849 5094 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588859 5094 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588868 5094 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588875 5094 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588884 5094 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588892 5094 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588901 5094 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588909 5094 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588917 5094 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588926 5094 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588933 5094 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588944 5094 
feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588954 5094 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588965 5094 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588976 5094 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588985 5094 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.588994 5094 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589002 5094 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589010 5094 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589020 5094 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589030 5094 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589039 5094 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589047 5094 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589055 5094 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589064 5094 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589073 5094 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589082 5094 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589090 5094 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589098 5094 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589106 5094 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589115 5094 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589123 5094 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589131 5094 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589139 5094 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589147 5094 
feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589155 5094 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589162 5094 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589180 5094 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.589195 5094 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589439 5094 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589451 5094 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589461 5094 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589470 5094 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589480 5094 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589488 5094 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589497 5094 feature_gate.go:330] unrecognized feature 
gate: EtcdBackendQuota Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589505 5094 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589513 5094 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589521 5094 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589529 5094 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589540 5094 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589551 5094 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589560 5094 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589569 5094 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589578 5094 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589588 5094 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589598 5094 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589607 5094 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589616 5094 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589624 5094 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589632 5094 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589641 5094 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589649 5094 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589658 5094 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589666 5094 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589674 5094 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589682 5094 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589690 5094 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589698 5094 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589727 5094 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589736 5094 
feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589747 5094 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589757 5094 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589766 5094 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589774 5094 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589782 5094 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589791 5094 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589799 5094 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589807 5094 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589815 5094 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589826 5094 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589834 5094 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589843 5094 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589854 5094 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589862 5094 feature_gate.go:330] 
unrecognized feature gate: BootcNodeManagement Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589870 5094 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589879 5094 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589888 5094 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589896 5094 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589903 5094 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589912 5094 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589920 5094 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589928 5094 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589936 5094 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589944 5094 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589954 5094 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589965 5094 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589975 5094 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589983 5094 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.589993 5094 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.590002 5094 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.590011 5094 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.590019 5094 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.590028 5094 feature_gate.go:330] unrecognized feature gate: Example Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.590037 5094 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.590044 5094 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.590052 5094 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.590061 5094 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.590070 5094 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.590078 5094 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.590092 5094 feature_gate.go:386] feature gates: 
{map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.590385 5094 server.go:940] "Client rotation is on, will bootstrap in background" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.599771 5094 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.599972 5094 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.602835 5094 server.go:997] "Starting client certificate rotation" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.602887 5094 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.603139 5094 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-13 08:14:35.525513352 +0000 UTC Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.603287 5094 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.635142 5094 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 20 06:46:25 crc kubenswrapper[5094]: E0220 06:46:25.637854 5094 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed 
while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.642097 5094 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.661758 5094 log.go:25] "Validated CRI v1 runtime API" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.700447 5094 log.go:25] "Validated CRI v1 image API" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.704471 5094 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.712472 5094 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-20-06-37-34-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.712531 5094 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:45 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}] Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.740287 5094 manager.go:217] Machine: {Timestamp:2026-02-20 06:46:25.73521531 +0000 UTC m=+0.607842051 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 
SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:d25915f7-4d55-43a4-a20b-9e6118746152 BootID:6fb44c16-1595-44a7-b2ec-4faee6098a1e Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:45 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:b0:f2:bf Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:b0:f2:bf Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:b0:bd:ac Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:2f:8b:03 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:bf:be:a5 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:31:3f:de Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:77:97:f3 Speed:-1 Mtu:1496} {Name:ens7.44 MacAddress:52:54:00:bb:56:b8 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:e6:75:03:50:85:7e Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ce:5c:c4:8d:a6:0b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 
NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction 
Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.740696 5094 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.740999 5094 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.742849 5094 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.743094 5094 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.743148 5094 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.743420 5094 topology_manager.go:138] "Creating topology manager with none policy"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.743436 5094 container_manager_linux.go:303] "Creating device plugin manager"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.743936 5094 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.743982 5094 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.744342 5094 state_mem.go:36] "Initialized new in-memory state store"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.744450 5094 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.748339 5094 kubelet.go:418] "Attempting to sync node with API server"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.748394 5094 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.748435 5094 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.748461 5094 kubelet.go:324] "Adding apiserver pod source"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.748482 5094 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.754111 5094 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.754158 5094 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused
Feb 20 06:46:25 crc kubenswrapper[5094]: E0220 06:46:25.754467 5094 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError"
Feb 20 06:46:25 crc kubenswrapper[5094]: E0220 06:46:25.754329 5094 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.757048 5094 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.758801 5094 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.764785 5094 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.766690 5094 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.766743 5094 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.766755 5094 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.766765 5094 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.766781 5094 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.766793 5094 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.766802 5094 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.766815 5094 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.766825 5094 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.766835 5094 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.766882 5094 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.766893 5094 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.767833 5094 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.768621 5094 server.go:1280] "Started kubelet"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.770108 5094 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.770409 5094 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.771357 5094 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 20 06:46:25 crc systemd[1]: Started Kubernetes Kubelet.
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.774606 5094 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.775344 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.775380 5094 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.777062 5094 volume_manager.go:287] "The desired_state_of_world populator starts"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.777111 5094 volume_manager.go:289] "Starting Kubelet Volume Manager"
Feb 20 06:46:25 crc kubenswrapper[5094]: E0220 06:46:25.777387 5094 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.777474 5094 server.go:460] "Adding debug handlers to kubelet server"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.777960 5094 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.778038 5094 factory.go:55] Registering systemd factory
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.778101 5094 factory.go:221] Registration of the systemd container factory successfully
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.778220 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 21:17:15.292909303 +0000 UTC
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.778369 5094 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Feb 20 06:46:25 crc kubenswrapper[5094]: E0220 06:46:25.778617 5094 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="200ms"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.778949 5094 factory.go:153] Registering CRI-O factory
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.779004 5094 factory.go:221] Registration of the crio container factory successfully
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.780095 5094 factory.go:103] Registering Raw factory
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.780528 5094 manager.go:1196] Started watching for new ooms in manager
Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.782891 5094 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused
Feb 20 06:46:25 crc kubenswrapper[5094]: E0220 06:46:25.783024 5094 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.783678 5094 manager.go:319] Starting recovery of all containers
Feb 20 06:46:25 crc kubenswrapper[5094]: E0220 06:46:25.782864 5094 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.188:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1895e184107aaeb4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-20 06:46:25.768566452 +0000 UTC m=+0.641193163,LastTimestamp:2026-02-20 06:46:25.768566452 +0000 UTC m=+0.641193163,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790303 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790364 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790381 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790396 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790408 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790422 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790435 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790449 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790465 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790478 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790491 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790503 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790518 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790536 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790579 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790592 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790606 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790619 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790634 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790678 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790690 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790722 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790735 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790751 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790767 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790807 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790825 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.790861 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793268 5094 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793364 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793400 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793426 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793453 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793478 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793501 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793527 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793550 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793574 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793599 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793622 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793645 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793683 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793940 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.793994 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794025 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794055 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794092 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794118 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794147 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794174 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794202 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794229 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794261 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794306 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794342 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794375 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794412 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794444 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794497 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794526 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794556 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794585 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794614 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794643 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794785 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794809 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794834 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794857 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794878 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794899 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794923 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794945 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794967 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.794992 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795014 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795035 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795057 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795079 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795100 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795120 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795141 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795164 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual
state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795188 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795207 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795226 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795248 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795269 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795291 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795314 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795337 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795360 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795379 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795400 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795421 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" 
seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795443 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795465 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795483 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795504 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795525 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795544 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: 
I0220 06:46:25.795567 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795586 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795605 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795624 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795643 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795675 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795725 5094 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795747 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795766 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795788 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795821 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795841 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795863 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795887 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795971 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.795999 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796021 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796042 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796062 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796082 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796102 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796123 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796144 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796163 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796184 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796206 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796225 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796273 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796293 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796314 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796333 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796357 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796380 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796402 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796525 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796554 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796585 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" 
seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796608 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796632 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796655 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796676 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796731 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796765 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796786 5094 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796807 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796827 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796846 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796866 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796884 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796907 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796927 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796946 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796964 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.796985 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797011 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797030 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797050 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797119 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797139 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797157 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797175 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797192 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797212 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797231 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797253 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797272 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797290 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797311 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" 
seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797330 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797349 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797374 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797453 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797474 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797496 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Feb 20 06:46:25 crc 
kubenswrapper[5094]: I0220 06:46:25.797518 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797540 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797560 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797583 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797614 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797634 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797655 5094 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797684 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797741 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797763 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797783 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797807 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797830 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797851 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797871 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797891 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797910 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797931 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797952 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797971 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.797995 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.798023 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.798047 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.798069 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.798089 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" 
seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.798108 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.798128 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.798148 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.798174 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.798196 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.798215 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: 
I0220 06:46:25.798233 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.798253 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.798270 5094 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.798288 5094 reconstruct.go:97] "Volume reconstruction finished" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.798302 5094 reconciler.go:26] "Reconciler: start to sync state" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.815436 5094 manager.go:324] Recovery completed Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.832901 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.834917 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.834975 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.834992 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:25 crc kubenswrapper[5094]: 
I0220 06:46:25.836058 5094 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.836078 5094 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.836102 5094 state_mem.go:36] "Initialized new in-memory state store" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.836398 5094 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.838762 5094 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.838837 5094 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.838893 5094 kubelet.go:2335] "Starting kubelet main sync loop" Feb 20 06:46:25 crc kubenswrapper[5094]: E0220 06:46:25.839098 5094 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 20 06:46:25 crc kubenswrapper[5094]: W0220 06:46:25.841983 5094 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Feb 20 06:46:25 crc kubenswrapper[5094]: E0220 06:46:25.842073 5094 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.859954 5094 policy_none.go:49] "None policy: Start" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 
06:46:25.861912 5094 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.861971 5094 state_mem.go:35] "Initializing new in-memory state store" Feb 20 06:46:25 crc kubenswrapper[5094]: E0220 06:46:25.877781 5094 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.929369 5094 manager.go:334] "Starting Device Plugin manager" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.929596 5094 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.929615 5094 server.go:79] "Starting device plugin registration server" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.930216 5094 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.930231 5094 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.931306 5094 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.931405 5094 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.931415 5094 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.939937 5094 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.940062 5094 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.942044 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.942168 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.942270 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.942535 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:25 crc kubenswrapper[5094]: E0220 06:46:25.943313 5094 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.943527 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.943629 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.943684 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.943780 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.943801 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.944016 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.944644 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.944781 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.944995 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.945030 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.945047 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.945502 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.945534 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.945551 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.945700 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.946081 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.946154 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.946299 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.946428 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.946506 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.946870 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.946969 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.947048 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.947168 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.947200 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.947214 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.947421 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:25 crc kubenswrapper[5094]: 
I0220 06:46:25.947517 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.947605 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.948087 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.948109 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.948120 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.948298 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.948337 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.948889 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.948984 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.949004 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.949345 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.949479 5094 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:25 crc kubenswrapper[5094]: I0220 06:46:25.949559 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:25 crc kubenswrapper[5094]: E0220 06:46:25.979932 5094 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="400ms" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.002100 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.002165 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.002196 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.002220 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.002244 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.002267 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.002349 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.002408 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.002439 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.002466 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.002504 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.002532 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.003084 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.003165 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.003799 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.030631 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.032887 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.033038 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.033061 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.033150 5094 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 20 06:46:26 crc kubenswrapper[5094]: E0220 06:46:26.034435 5094 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.188:6443: connect: connection refused" node="crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.106988 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107066 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107123 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107159 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107199 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107238 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107276 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107310 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107313 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107417 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107346 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107453 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107491 5094 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107497 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107544 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107547 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107590 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107603 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107646 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107654 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107662 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107686 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107742 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107757 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107789 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107799 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107850 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107887 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107919 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") 
" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.107995 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.234692 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.236888 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.236958 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.236980 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.237021 5094 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 20 06:46:26 crc kubenswrapper[5094]: E0220 06:46:26.237602 5094 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.188:6443: connect: connection refused" node="crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.283541 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.311488 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.320915 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: W0220 06:46:26.340193 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-523e79a3f190a5488045a232713f38143137dabed606bdd6616494ddd29a17cf WatchSource:0}: Error finding container 523e79a3f190a5488045a232713f38143137dabed606bdd6616494ddd29a17cf: Status 404 returned error can't find the container with id 523e79a3f190a5488045a232713f38143137dabed606bdd6616494ddd29a17cf Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.345855 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.351752 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 06:46:26 crc kubenswrapper[5094]: W0220 06:46:26.355975 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-43fc53989740edf3c4bf5ea5184e4fef79420685b6e4142b9cf631e2d0db3483 WatchSource:0}: Error finding container 43fc53989740edf3c4bf5ea5184e4fef79420685b6e4142b9cf631e2d0db3483: Status 404 returned error can't find the container with id 43fc53989740edf3c4bf5ea5184e4fef79420685b6e4142b9cf631e2d0db3483 Feb 20 06:46:26 crc kubenswrapper[5094]: W0220 06:46:26.377298 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-610094ff93a3d58da948bed349747975cdcde485d4b5f8735fc90bccd47d3ee7 WatchSource:0}: Error finding container 610094ff93a3d58da948bed349747975cdcde485d4b5f8735fc90bccd47d3ee7: Status 404 returned error can't find the container with id 610094ff93a3d58da948bed349747975cdcde485d4b5f8735fc90bccd47d3ee7 Feb 20 06:46:26 crc kubenswrapper[5094]: W0220 06:46:26.379445 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-66e4ae54e42cdf1afc4d8b94e8e8952546fda0d67e459fd74a28a697710d260d WatchSource:0}: Error finding container 66e4ae54e42cdf1afc4d8b94e8e8952546fda0d67e459fd74a28a697710d260d: Status 404 returned error can't find the container with id 66e4ae54e42cdf1afc4d8b94e8e8952546fda0d67e459fd74a28a697710d260d Feb 20 06:46:26 crc kubenswrapper[5094]: E0220 06:46:26.381153 5094 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection 
refused" interval="800ms" Feb 20 06:46:26 crc kubenswrapper[5094]: W0220 06:46:26.381333 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-2fd246824bfd27eb11ec034ef0d12243d315c9a3a9f91b2d701b7db3d80bae0e WatchSource:0}: Error finding container 2fd246824bfd27eb11ec034ef0d12243d315c9a3a9f91b2d701b7db3d80bae0e: Status 404 returned error can't find the container with id 2fd246824bfd27eb11ec034ef0d12243d315c9a3a9f91b2d701b7db3d80bae0e Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.638108 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.639671 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.639732 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.639747 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.639776 5094 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 20 06:46:26 crc kubenswrapper[5094]: E0220 06:46:26.640420 5094 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.188:6443: connect: connection refused" node="crc" Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.775960 5094 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 
06:46:26.779154 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 01:20:50.937149723 +0000 UTC Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.845189 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"43fc53989740edf3c4bf5ea5184e4fef79420685b6e4142b9cf631e2d0db3483"} Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.846916 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"523e79a3f190a5488045a232713f38143137dabed606bdd6616494ddd29a17cf"} Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.848356 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2fd246824bfd27eb11ec034ef0d12243d315c9a3a9f91b2d701b7db3d80bae0e"} Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.849893 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"66e4ae54e42cdf1afc4d8b94e8e8952546fda0d67e459fd74a28a697710d260d"} Feb 20 06:46:26 crc kubenswrapper[5094]: I0220 06:46:26.851541 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"610094ff93a3d58da948bed349747975cdcde485d4b5f8735fc90bccd47d3ee7"} Feb 20 06:46:26 crc kubenswrapper[5094]: W0220 06:46:26.935166 5094 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Feb 20 06:46:26 crc kubenswrapper[5094]: E0220 06:46:26.935313 5094 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Feb 20 06:46:26 crc kubenswrapper[5094]: W0220 06:46:26.936485 5094 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Feb 20 06:46:26 crc kubenswrapper[5094]: E0220 06:46:26.936586 5094 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Feb 20 06:46:26 crc kubenswrapper[5094]: W0220 06:46:26.950423 5094 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Feb 20 06:46:26 crc kubenswrapper[5094]: E0220 06:46:26.950543 5094 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: 
connection refused" logger="UnhandledError" Feb 20 06:46:27 crc kubenswrapper[5094]: W0220 06:46:27.048475 5094 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Feb 20 06:46:27 crc kubenswrapper[5094]: E0220 06:46:27.048607 5094 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Feb 20 06:46:27 crc kubenswrapper[5094]: E0220 06:46:27.182432 5094 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="1.6s" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.441175 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.443570 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.443642 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.443661 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.443732 5094 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 20 06:46:27 crc kubenswrapper[5094]: E0220 06:46:27.444760 5094 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.188:6443: connect: connection refused" node="crc" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.665903 5094 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 20 06:46:27 crc kubenswrapper[5094]: E0220 06:46:27.667410 5094 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.776373 5094 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.779886 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 03:52:46.588087905 +0000 UTC Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.857261 5094 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="b984b4cdf6ed9e2cbbeba5300d04d9aabca1c5deb777d5f2d1f92c56488baccf" exitCode=0 Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.857341 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"b984b4cdf6ed9e2cbbeba5300d04d9aabca1c5deb777d5f2d1f92c56488baccf"} Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 
06:46:27.857414 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.859423 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.859485 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.859510 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.861902 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1"} Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.861968 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38"} Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.861989 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d"} Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.863731 5094 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3" exitCode=0 Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.863835 5094 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3"} Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.863917 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.865613 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.865683 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.865740 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.866278 5094 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326" exitCode=0 Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.866373 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326"} Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.866464 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.867760 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.867800 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:27 crc 
kubenswrapper[5094]: I0220 06:46:27.867819 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.869205 5094 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155" exitCode=0 Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.869269 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155"} Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.869340 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.870623 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.870665 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.870680 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.871147 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.872362 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.872411 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:27 crc kubenswrapper[5094]: I0220 06:46:27.872429 5094 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.775991 5094 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.780365 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 13:58:58.971928842 +0000 UTC Feb 20 06:46:28 crc kubenswrapper[5094]: E0220 06:46:28.784498 5094 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="3.2s" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.876879 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1"} Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.876942 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2"} Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.876959 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781"} Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.879120 5094 generic.go:334] "Generic (PLEG): container 
finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378" exitCode=0 Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.879241 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378"} Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.879354 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.880837 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.880906 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.880933 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.883385 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"7bbeadd569a4f1d9c8ec473e4dce3e5141ef23e49e911237ea9637cb3bc0fb77"} Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.883463 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.884569 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.884634 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:28 crc kubenswrapper[5094]: 
I0220 06:46:28.884655 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.887289 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81"} Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.887406 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.889205 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.889257 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.889277 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.891811 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d40e9d51da95e9023d99a2b2cdc4aa1a6d6755d0110393e173ee57fe9bfb74ab"} Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.891838 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d9d72e26a84d5625d799baf5fcd573d245475eb954a13456bf1813c0c863dc5d"} Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.891850 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7b3f535bdf7005ade8abbf6f234ddbc9c15136792c8ebe8a3646e9dadedea986"} Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.891969 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.893121 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.893177 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:28 crc kubenswrapper[5094]: I0220 06:46:28.893189 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.045801 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.047909 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.047987 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.048014 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.048056 5094 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 20 06:46:29 crc kubenswrapper[5094]: E0220 06:46:29.048878 5094 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.188:6443: connect: connection refused" node="crc" Feb 20 06:46:29 crc kubenswrapper[5094]: W0220 06:46:29.056886 5094 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.188:6443: connect: connection refused Feb 20 06:46:29 crc kubenswrapper[5094]: E0220 06:46:29.056973 5094 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.188:6443: connect: connection refused" logger="UnhandledError" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.780586 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 19:43:03.416637411 +0000 UTC Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.901176 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1"} Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.901324 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.901326 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3"} Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.902911 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.902959 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.902979 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.905579 5094 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c" exitCode=0 Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.905828 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c"} Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.905949 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.906021 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.906021 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.906985 5094 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.907287 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.908249 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.908298 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.908314 5094 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.908319 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.908369 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.908392 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.908324 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.908587 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.908623 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.908635 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.908597 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:29 crc kubenswrapper[5094]: I0220 06:46:29.908654 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:30 crc kubenswrapper[5094]: I0220 06:46:30.781884 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 16:29:34.544350262 +0000 UTC Feb 20 06:46:30 crc kubenswrapper[5094]: I0220 06:46:30.914659 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f"} Feb 20 06:46:30 crc kubenswrapper[5094]: I0220 06:46:30.914761 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b"} Feb 20 06:46:30 crc kubenswrapper[5094]: I0220 06:46:30.914785 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa"} Feb 20 06:46:30 crc kubenswrapper[5094]: I0220 06:46:30.914812 5094 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 06:46:30 crc kubenswrapper[5094]: I0220 06:46:30.914930 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:30 crc kubenswrapper[5094]: I0220 06:46:30.916682 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:30 crc kubenswrapper[5094]: I0220 06:46:30.916809 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:30 crc kubenswrapper[5094]: I0220 06:46:30.916850 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:31 crc kubenswrapper[5094]: I0220 06:46:31.186274 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:31 crc kubenswrapper[5094]: I0220 06:46:31.493105 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:31 crc 
kubenswrapper[5094]: I0220 06:46:31.782940 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 13:58:26.935767669 +0000 UTC Feb 20 06:46:31 crc kubenswrapper[5094]: I0220 06:46:31.892580 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:31 crc kubenswrapper[5094]: I0220 06:46:31.910938 5094 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 20 06:46:31 crc kubenswrapper[5094]: I0220 06:46:31.926401 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568"} Feb 20 06:46:31 crc kubenswrapper[5094]: I0220 06:46:31.926488 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1"} Feb 20 06:46:31 crc kubenswrapper[5094]: I0220 06:46:31.926653 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:31 crc kubenswrapper[5094]: I0220 06:46:31.926651 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:31 crc kubenswrapper[5094]: I0220 06:46:31.928795 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:31 crc kubenswrapper[5094]: I0220 06:46:31.928866 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:31 crc kubenswrapper[5094]: I0220 06:46:31.928887 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 20 06:46:31 crc kubenswrapper[5094]: I0220 06:46:31.929255 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:31 crc kubenswrapper[5094]: I0220 06:46:31.929350 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:31 crc kubenswrapper[5094]: I0220 06:46:31.929372 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:32 crc kubenswrapper[5094]: I0220 06:46:32.249839 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:32 crc kubenswrapper[5094]: I0220 06:46:32.251923 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:32 crc kubenswrapper[5094]: I0220 06:46:32.252004 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:32 crc kubenswrapper[5094]: I0220 06:46:32.252026 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:32 crc kubenswrapper[5094]: I0220 06:46:32.252073 5094 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 20 06:46:32 crc kubenswrapper[5094]: I0220 06:46:32.783933 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 12:51:20.221342636 +0000 UTC Feb 20 06:46:32 crc kubenswrapper[5094]: I0220 06:46:32.930184 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:32 crc kubenswrapper[5094]: I0220 06:46:32.930248 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:32 crc kubenswrapper[5094]: I0220 06:46:32.931760 
5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:32 crc kubenswrapper[5094]: I0220 06:46:32.931849 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:32 crc kubenswrapper[5094]: I0220 06:46:32.931869 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:32 crc kubenswrapper[5094]: I0220 06:46:32.932666 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:32 crc kubenswrapper[5094]: I0220 06:46:32.932813 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:32 crc kubenswrapper[5094]: I0220 06:46:32.932842 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:33 crc kubenswrapper[5094]: I0220 06:46:33.784509 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 15:52:43.183778197 +0000 UTC Feb 20 06:46:34 crc kubenswrapper[5094]: I0220 06:46:34.079895 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:34 crc kubenswrapper[5094]: I0220 06:46:34.080152 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:34 crc kubenswrapper[5094]: I0220 06:46:34.082406 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:34 crc kubenswrapper[5094]: I0220 06:46:34.082461 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:34 crc kubenswrapper[5094]: I0220 06:46:34.082480 5094 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:34 crc kubenswrapper[5094]: I0220 06:46:34.785648 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 00:03:44.662012926 +0000 UTC Feb 20 06:46:35 crc kubenswrapper[5094]: I0220 06:46:35.786612 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 19:08:11.579236036 +0000 UTC Feb 20 06:46:35 crc kubenswrapper[5094]: E0220 06:46:35.945003 5094 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.084344 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.084807 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.087110 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.087234 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.087266 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.784907 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.785164 5094 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.786895 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 03:10:57.889639584 +0000 UTC Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.786916 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.787040 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.787066 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.793441 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.795840 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.796255 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.797927 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.798012 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.798033 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.943359 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.943598 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.944947 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.945122 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:36 crc kubenswrapper[5094]: I0220 06:46:36.945241 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:37 crc kubenswrapper[5094]: I0220 06:46:37.787277 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 04:36:52.626427321 +0000 UTC Feb 20 06:46:37 crc kubenswrapper[5094]: I0220 06:46:37.794825 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:37 crc kubenswrapper[5094]: I0220 06:46:37.891538 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 20 06:46:37 crc kubenswrapper[5094]: I0220 06:46:37.891930 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:37 crc kubenswrapper[5094]: I0220 06:46:37.893969 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:37 crc kubenswrapper[5094]: I0220 06:46:37.894159 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:37 crc kubenswrapper[5094]: I0220 06:46:37.894316 5094 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 20 06:46:37 crc kubenswrapper[5094]: I0220 06:46:37.946158 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:37 crc kubenswrapper[5094]: I0220 06:46:37.948129 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:37 crc kubenswrapper[5094]: I0220 06:46:37.948188 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:37 crc kubenswrapper[5094]: I0220 06:46:37.948211 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:37 crc kubenswrapper[5094]: I0220 06:46:37.953498 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:38 crc kubenswrapper[5094]: I0220 06:46:38.788450 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 23:41:55.406672566 +0000 UTC Feb 20 06:46:38 crc kubenswrapper[5094]: I0220 06:46:38.949199 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:38 crc kubenswrapper[5094]: I0220 06:46:38.950886 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:38 crc kubenswrapper[5094]: I0220 06:46:38.950956 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:38 crc kubenswrapper[5094]: I0220 06:46:38.950980 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:39 crc kubenswrapper[5094]: W0220 06:46:39.440847 5094 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 20 06:46:39 crc kubenswrapper[5094]: I0220 06:46:39.441030 5094 trace.go:236] Trace[801421912]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Feb-2026 06:46:29.438) (total time: 10002ms): Feb 20 06:46:39 crc kubenswrapper[5094]: Trace[801421912]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (06:46:39.440) Feb 20 06:46:39 crc kubenswrapper[5094]: Trace[801421912]: [10.002187394s] [10.002187394s] END Feb 20 06:46:39 crc kubenswrapper[5094]: E0220 06:46:39.441082 5094 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 20 06:46:39 crc kubenswrapper[5094]: W0220 06:46:39.484416 5094 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 20 06:46:39 crc kubenswrapper[5094]: I0220 06:46:39.484564 5094 trace.go:236] Trace[1063127187]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Feb-2026 06:46:29.483) (total time: 10001ms): Feb 20 06:46:39 crc kubenswrapper[5094]: Trace[1063127187]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (06:46:39.484) Feb 20 06:46:39 crc kubenswrapper[5094]: Trace[1063127187]: [10.001351453s] [10.001351453s] END Feb 
20 06:46:39 crc kubenswrapper[5094]: E0220 06:46:39.484603 5094 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 20 06:46:39 crc kubenswrapper[5094]: I0220 06:46:39.777345 5094 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Feb 20 06:46:39 crc kubenswrapper[5094]: I0220 06:46:39.788851 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 02:59:04.323079362 +0000 UTC Feb 20 06:46:39 crc kubenswrapper[5094]: I0220 06:46:39.952088 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:39 crc kubenswrapper[5094]: I0220 06:46:39.953384 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:39 crc kubenswrapper[5094]: I0220 06:46:39.953431 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:39 crc kubenswrapper[5094]: I0220 06:46:39.953447 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:39 crc kubenswrapper[5094]: W0220 06:46:39.961004 5094 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 20 06:46:39 crc kubenswrapper[5094]: I0220 06:46:39.961102 5094 trace.go:236] Trace[893958956]: "Reflector 
ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Feb-2026 06:46:29.959) (total time: 10001ms): Feb 20 06:46:39 crc kubenswrapper[5094]: Trace[893958956]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (06:46:39.960) Feb 20 06:46:39 crc kubenswrapper[5094]: Trace[893958956]: [10.001877736s] [10.001877736s] END Feb 20 06:46:39 crc kubenswrapper[5094]: E0220 06:46:39.961126 5094 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 20 06:46:40 crc kubenswrapper[5094]: I0220 06:46:40.592992 5094 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 20 06:46:40 crc kubenswrapper[5094]: I0220 06:46:40.593085 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 20 06:46:40 crc kubenswrapper[5094]: I0220 06:46:40.603485 5094 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User 
\"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 20 06:46:40 crc kubenswrapper[5094]: I0220 06:46:40.603591 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 20 06:46:40 crc kubenswrapper[5094]: I0220 06:46:40.789129 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 14:14:01.78015178 +0000 UTC Feb 20 06:46:40 crc kubenswrapper[5094]: I0220 06:46:40.795472 5094 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 20 06:46:40 crc kubenswrapper[5094]: I0220 06:46:40.795608 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 20 06:46:41 crc kubenswrapper[5094]: I0220 06:46:41.503583 5094 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]log ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]etcd ok Feb 20 06:46:41 crc kubenswrapper[5094]: 
[+]poststarthook/quota.openshift.io-clusterquotamapping ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/openshift.io-api-request-count-filter ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/openshift.io-startkubeinformers ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/start-apiserver-admission-initializer ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/generic-apiserver-start-informers ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/priority-and-fairness-config-consumer ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/priority-and-fairness-filter ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/storage-object-count-tracker-hook ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/start-apiextensions-informers ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/start-apiextensions-controllers ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/crd-informer-synced ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/start-system-namespaces-controller ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/start-cluster-authentication-info-controller ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/start-legacy-token-tracking-controller ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/start-service-ip-repair-controllers ok Feb 20 06:46:41 crc kubenswrapper[5094]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Feb 20 06:46:41 crc kubenswrapper[5094]: 
[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/priority-and-fairness-config-producer ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/bootstrap-controller ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/start-kube-aggregator-informers ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/apiservice-status-local-available-controller ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/apiservice-status-remote-available-controller ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/apiservice-registration-controller ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/apiservice-wait-for-first-sync ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/apiservice-discovery-controller ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/kube-apiserver-autoregistration ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]autoregister-completion ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/apiservice-openapi-controller ok Feb 20 06:46:41 crc kubenswrapper[5094]: [+]poststarthook/apiservice-openapiv3-controller ok Feb 20 06:46:41 crc kubenswrapper[5094]: livez check failed Feb 20 06:46:41 crc kubenswrapper[5094]: I0220 06:46:41.503689 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 06:46:41 crc kubenswrapper[5094]: I0220 06:46:41.790080 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 19:52:23.064929019 +0000 UTC Feb 20 06:46:42 crc kubenswrapper[5094]: I0220 06:46:42.790189 5094 
certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 17:28:42.066674577 +0000 UTC Feb 20 06:46:43 crc kubenswrapper[5094]: I0220 06:46:43.299303 5094 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 20 06:46:43 crc kubenswrapper[5094]: I0220 06:46:43.790868 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 04:22:01.432598002 +0000 UTC Feb 20 06:46:43 crc kubenswrapper[5094]: I0220 06:46:43.890541 5094 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 20 06:46:44 crc kubenswrapper[5094]: I0220 06:46:44.791659 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 13:58:01.484458188 +0000 UTC Feb 20 06:46:45 crc kubenswrapper[5094]: E0220 06:46:45.599474 5094 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Feb 20 06:46:45 crc kubenswrapper[5094]: E0220 06:46:45.604210 5094 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 06:46:45.604222 5094 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 06:46:45.604527 5094 trace.go:236] Trace[580338358]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Feb-2026 06:46:32.522) (total time: 13082ms): Feb 20 06:46:45 crc kubenswrapper[5094]: Trace[580338358]: ---"Objects listed" error: 
13082ms (06:46:45.604) Feb 20 06:46:45 crc kubenswrapper[5094]: Trace[580338358]: [13.082283811s] [13.082283811s] END Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 06:46:45.604576 5094 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 06:46:45.617168 5094 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 06:46:45.643065 5094 csr.go:261] certificate signing request csr-7btk2 is approved, waiting to be issued Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 06:46:45.655652 5094 csr.go:257] certificate signing request csr-7btk2 is issued Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 06:46:45.728209 5094 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54088->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 06:46:45.728278 5094 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54102->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 06:46:45.728303 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54088->192.168.126.11:17697: read: connection reset by peer" Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 
06:46:45.728361 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:54102->192.168.126.11:17697: read: connection reset by peer" Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 06:46:45.791990 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 19:55:32.592918222 +0000 UTC Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 06:46:45.792171 5094 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 06:46:45.968544 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 06:46:45.970425 5094 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1" exitCode=255 Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 06:46:45.970462 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1"} Feb 20 06:46:45 crc kubenswrapper[5094]: I0220 06:46:45.985634 5094 scope.go:117] "RemoveContainer" containerID="f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.504565 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:46 crc kubenswrapper[5094]: 
I0220 06:46:46.657781 5094 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-20 06:41:45 +0000 UTC, rotation deadline is 2026-11-21 02:59:36.658512115 +0000 UTC Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.657843 5094 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6572h12m50.000672813s for next certificate rotation Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.775322 5094 apiserver.go:52] "Watching apiserver" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.780757 5094 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.781110 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-apiserver/kube-apiserver-crc"] Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.781551 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.781678 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.781783 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.781576 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:46:46 crc kubenswrapper[5094]: E0220 06:46:46.781842 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.781942 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 06:46:46 crc kubenswrapper[5094]: E0220 06:46:46.782077 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.782194 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:46 crc kubenswrapper[5094]: E0220 06:46:46.782256 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.784691 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.784917 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.784957 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.786999 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.787123 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.787136 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.787371 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.787938 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.789749 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.792137 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline 
is 2025-11-25 03:17:21.741923958 +0000 UTC Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.821339 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.844929 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.858128 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.879216 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.879863 5094 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.895117 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.911859 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.911907 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.911971 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.911995 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912015 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912032 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912053 5094 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912072 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912091 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912109 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912130 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912149 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912169 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912187 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912206 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912226 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912261 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912289 5094 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912308 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912328 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912348 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912366 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912385 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912401 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912419 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912397 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912437 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912455 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912475 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912495 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912512 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912534 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912551 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912572 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912596 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912636 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912656 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: 
\"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912737 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912756 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912777 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912796 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912818 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912837 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" 
(UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912856 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912877 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912897 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912924 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912968 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 20 
06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.912992 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913016 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913036 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913056 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913038 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913076 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913190 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913236 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913268 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913298 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913304 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913325 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913357 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913384 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913409 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913433 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: 
\"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913459 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913490 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913513 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913515 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913541 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913577 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.914041 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.914081 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.914112 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.914147 5094 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.914179 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916297 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916341 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916382 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916417 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916459 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916501 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916675 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916732 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916769 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916800 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" 
(UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916832 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916865 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916904 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916937 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913508 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod 
"49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913550 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.913872 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.914123 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.914179 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.914188 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.914334 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.914502 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.914788 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.914839 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.914862 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.914964 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.915053 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.915166 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.915213 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.915278 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.915335 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.915358 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.915434 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.915460 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.915599 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.915614 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.915641 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.915671 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916107 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916120 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.916196 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.917099 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.917269 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.917793 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.917838 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.917878 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.917915 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.917947 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.917983 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918018 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918054 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918091 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918127 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918166 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918204 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918243 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918275 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918309 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918341 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918377 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918411 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918443 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918474 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918507 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918543 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 20 06:46:46 crc 
kubenswrapper[5094]: I0220 06:46:46.918577 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918606 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918648 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918675 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918724 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918749 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod 
\"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918776 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918802 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918846 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918872 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918896 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918922 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918948 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918971 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918995 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919020 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919136 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919175 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919201 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919226 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919251 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919274 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919301 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919325 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919348 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919372 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919398 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919423 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 
06:46:46.919450 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919475 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919501 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919528 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919554 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919579 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: 
\"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919604 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919631 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919655 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919679 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919724 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 20 06:46:46 crc kubenswrapper[5094]: 
I0220 06:46:46.919748 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919852 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919878 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919904 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919929 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919956 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919980 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920003 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920028 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920055 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920088 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " 
Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920126 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920155 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920182 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920206 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920230 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920258 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920282 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920309 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920336 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920360 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920385 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 
06:46:46.920417 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920443 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920468 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920493 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920519 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920555 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920594 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920623 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920655 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920684 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920733 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 
06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920763 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920792 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920818 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.926925 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.926975 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927018 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927054 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927106 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927142 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927175 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927201 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927228 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927254 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927281 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927306 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927331 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927398 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927433 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927468 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927502 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927533 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927561 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927591 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927621 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927648 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927672 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927730 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.929075 5094 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.929726 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.929818 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.929849 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.929941 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930080 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930111 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930137 5094 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930161 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930185 5094 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930214 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930239 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930263 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930290 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930317 5094 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930340 5094 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930362 5094 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930385 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930409 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930433 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930460 5094 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" 
DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930483 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930504 5094 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930526 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930548 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930569 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930592 5094 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930613 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930634 5094 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" 
(UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930655 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930679 5094 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930729 5094 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930753 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930774 5094 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930796 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930818 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: 
\"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930841 5094 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.917336 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.917429 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.917814 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918001 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918134 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.918750 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.919073 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920863 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.920888 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.921118 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.921189 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.921332 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.921429 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.921765 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.921845 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.922000 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.922042 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.922217 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.922450 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.922491 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.922677 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.923047 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.927476 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.928592 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.929042 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.929042 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.929299 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.929356 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.929383 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.929466 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930039 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930185 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930206 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930262 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930329 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930651 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.930871 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.931220 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.932017 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.932943 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.933191 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.933307 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.933522 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.933680 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.933997 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.933999 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.934827 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.936552 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.936919 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.937246 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.937428 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.937606 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.937777 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.938039 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.938282 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.938533 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.938789 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.940881 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.941755 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.942279 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.942397 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.942731 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.942690 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.943036 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.943182 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.943213 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.943572 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.943630 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.943920 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.944638 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.944762 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.946875 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.947033 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.948378 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.949165 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.949284 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.949594 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.949654 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.949614 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.949969 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.950048 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.950432 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.950472 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.950762 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.950802 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.950891 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.951040 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.951096 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.951232 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.951498 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.951509 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.951801 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.951970 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.951971 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.952455 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.952475 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.953955 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.954246 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.954244 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.954601 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.954654 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.956769 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.959520 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.959800 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.959814 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.960187 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.960399 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.960505 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.960601 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.961017 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.961285 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.961383 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.963166 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.963622 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.965157 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.966682 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.966835 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.966931 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.967181 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: E0220 06:46:46.967468 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:46:46 crc kubenswrapper[5094]: E0220 06:46:46.967498 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:46:46 crc kubenswrapper[5094]: E0220 06:46:46.967513 5094 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:46 crc kubenswrapper[5094]: E0220 06:46:46.967579 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:47.467557136 +0000 UTC m=+22.340183847 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.967682 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.967989 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.968215 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.968307 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.968405 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.968697 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.969461 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: E0220 06:46:46.969622 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:46:47.469581664 +0000 UTC m=+22.342208375 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.969864 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.969915 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.970157 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.970327 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.970579 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.970712 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.970965 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.971161 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.971978 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.972186 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.975242 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.975355 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.975576 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.975834 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.975855 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.976626 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.976846 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.976839 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.977024 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.977065 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.977240 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.977342 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.982592 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.982818 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: E0220 06:46:46.983312 5094 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:46:46 crc kubenswrapper[5094]: E0220 06:46:46.983373 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:47.483355666 +0000 UTC m=+22.355982377 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:46:46 crc kubenswrapper[5094]: E0220 06:46:46.983796 5094 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:46:46 crc kubenswrapper[5094]: E0220 06:46:46.983841 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:47.483832808 +0000 UTC m=+22.356459519 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.983850 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.984546 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.987871 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.987981 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.988062 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.988365 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.988668 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.988799 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.988907 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.989315 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.989964 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.992004 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.992785 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.995868 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.998087 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 06:46:46 crc kubenswrapper[5094]: I0220 06:46:46.999525 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.000423 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.001618 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.001646 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.001664 5094 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.001739 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:47.501717369 +0000 UTC m=+22.374344080 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.013499 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.019169 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.019887 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.020034 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23"} Feb 20 06:46:47 crc kubenswrapper[5094]: 
I0220 06:46:47.020876 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.031869 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.031929 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.031972 5094 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.031984 5094 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.031995 5094 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032005 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032015 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032026 5094 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032035 5094 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032045 5094 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032055 5094 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032064 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032073 5094 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 20 
06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032081 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032089 5094 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032098 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032109 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032119 5094 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032129 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032140 5094 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032150 5094 
reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032160 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032169 5094 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032178 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032186 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032195 5094 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032204 5094 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032212 5094 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032221 5094 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032229 5094 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032259 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032267 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032276 5094 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032285 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032352 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032351 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032294 5094 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032408 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032457 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032483 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032499 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: 
I0220 06:46:47.032514 5094 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032529 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032543 5094 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032558 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032573 5094 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032588 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032605 5094 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032621 5094 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032636 5094 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032649 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032661 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032672 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032684 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032696 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032757 5094 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node 
\"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032771 5094 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032784 5094 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032797 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032810 5094 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032822 5094 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032836 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032849 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032861 5094 
reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032873 5094 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032886 5094 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032897 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032911 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032923 5094 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032937 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032950 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032963 5094 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032978 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.032991 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033009 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033021 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033034 5094 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033047 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" 
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033059 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033071 5094 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033083 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033096 5094 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033108 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033122 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033134 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc 
kubenswrapper[5094]: I0220 06:46:47.033147 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033160 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033174 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033186 5094 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033200 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033213 5094 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033227 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033239 5094 reconciler_common.go:293] 
"Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033252 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033263 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033275 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033288 5094 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033299 5094 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033311 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033323 5094 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033337 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033348 5094 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033359 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033370 5094 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033383 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033394 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033407 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" 
Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033420 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033434 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033446 5094 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033462 5094 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033477 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033491 5094 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033503 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033515 5094 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033528 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033540 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033554 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033568 5094 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033583 5094 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033598 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033611 5094 reconciler_common.go:293] "Volume detached for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033624 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033638 5094 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033652 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033664 5094 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033677 5094 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033690 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033730 5094 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033746 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033759 5094 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033772 5094 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033785 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033798 5094 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033810 5094 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033822 5094 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" 
DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033835 5094 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033847 5094 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033860 5094 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033874 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033887 5094 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033902 5094 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033914 5094 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033929 5094 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033943 5094 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033956 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033968 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033981 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.033993 5094 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.034006 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.034018 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: 
\"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.034038 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.034050 5094 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.034063 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.034075 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.034089 5094 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.034101 5094 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.034113 5094 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.034125 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.034138 5094 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.034149 5094 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.034161 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.034173 5094 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.034186 5094 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.037082 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" 
(OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.039343 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.040149 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.054013 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.054399 5094 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.070646 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.087090 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.099458 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.104973 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.106039 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.112527 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.122785 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-8wch6"] Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.123117 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.123522 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-qzxk2"] Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.123904 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8wch6" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.124027 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-qzxk2" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.126518 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.127183 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.127444 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.127611 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.130512 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.130828 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.146109 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.146234 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.146269 5094 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.189053 5094 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7
462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597
126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.214811 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.240844 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.253116 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bc82500-7462-4daa-9eff-116399acb06a-host\") pod \"node-ca-8wch6\" (UID: \"3bc82500-7462-4daa-9eff-116399acb06a\") " pod="openshift-image-registry/node-ca-8wch6" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.253166 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6p6h\" (UniqueName: \"kubernetes.io/projected/3bc82500-7462-4daa-9eff-116399acb06a-kube-api-access-s6p6h\") pod \"node-ca-8wch6\" (UID: \"3bc82500-7462-4daa-9eff-116399acb06a\") " pod="openshift-image-registry/node-ca-8wch6" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.253191 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d81c0f95-7b6e-4a44-8115-f517fc8f4052-hosts-file\") pod \"node-resolver-qzxk2\" (UID: \"d81c0f95-7b6e-4a44-8115-f517fc8f4052\") " 
pod="openshift-dns/node-resolver-qzxk2" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.253211 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr5sr\" (UniqueName: \"kubernetes.io/projected/d81c0f95-7b6e-4a44-8115-f517fc8f4052-kube-api-access-mr5sr\") pod \"node-resolver-qzxk2\" (UID: \"d81c0f95-7b6e-4a44-8115-f517fc8f4052\") " pod="openshift-dns/node-resolver-qzxk2" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.253265 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3bc82500-7462-4daa-9eff-116399acb06a-serviceca\") pod \"node-ca-8wch6\" (UID: \"3bc82500-7462-4daa-9eff-116399acb06a\") " pod="openshift-image-registry/node-ca-8wch6" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.278861 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.305477 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.323598 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.338812 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.354245 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3bc82500-7462-4daa-9eff-116399acb06a-serviceca\") pod \"node-ca-8wch6\" (UID: \"3bc82500-7462-4daa-9eff-116399acb06a\") " pod="openshift-image-registry/node-ca-8wch6" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.354289 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bc82500-7462-4daa-9eff-116399acb06a-host\") pod \"node-ca-8wch6\" (UID: \"3bc82500-7462-4daa-9eff-116399acb06a\") " pod="openshift-image-registry/node-ca-8wch6" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.354306 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d81c0f95-7b6e-4a44-8115-f517fc8f4052-hosts-file\") pod \"node-resolver-qzxk2\" 
(UID: \"d81c0f95-7b6e-4a44-8115-f517fc8f4052\") " pod="openshift-dns/node-resolver-qzxk2" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.354322 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr5sr\" (UniqueName: \"kubernetes.io/projected/d81c0f95-7b6e-4a44-8115-f517fc8f4052-kube-api-access-mr5sr\") pod \"node-resolver-qzxk2\" (UID: \"d81c0f95-7b6e-4a44-8115-f517fc8f4052\") " pod="openshift-dns/node-resolver-qzxk2" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.354339 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6p6h\" (UniqueName: \"kubernetes.io/projected/3bc82500-7462-4daa-9eff-116399acb06a-kube-api-access-s6p6h\") pod \"node-ca-8wch6\" (UID: \"3bc82500-7462-4daa-9eff-116399acb06a\") " pod="openshift-image-registry/node-ca-8wch6" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.355564 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3bc82500-7462-4daa-9eff-116399acb06a-serviceca\") pod \"node-ca-8wch6\" (UID: \"3bc82500-7462-4daa-9eff-116399acb06a\") " pod="openshift-image-registry/node-ca-8wch6" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.355629 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bc82500-7462-4daa-9eff-116399acb06a-host\") pod \"node-ca-8wch6\" (UID: \"3bc82500-7462-4daa-9eff-116399acb06a\") " pod="openshift-image-registry/node-ca-8wch6" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.355680 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d81c0f95-7b6e-4a44-8115-f517fc8f4052-hosts-file\") pod \"node-resolver-qzxk2\" (UID: \"d81c0f95-7b6e-4a44-8115-f517fc8f4052\") " pod="openshift-dns/node-resolver-qzxk2" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.360901 5094 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.373770 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6p6h\" (UniqueName: \"kubernetes.io/projected/3bc82500-7462-4daa-9eff-116399acb06a-kube-api-access-s6p6h\") pod \"node-ca-8wch6\" (UID: \"3bc82500-7462-4daa-9eff-116399acb06a\") " pod="openshift-image-registry/node-ca-8wch6" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.377612 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.377724 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr5sr\" (UniqueName: \"kubernetes.io/projected/d81c0f95-7b6e-4a44-8115-f517fc8f4052-kube-api-access-mr5sr\") pod \"node-resolver-qzxk2\" (UID: \"d81c0f95-7b6e-4a44-8115-f517fc8f4052\") " pod="openshift-dns/node-resolver-qzxk2" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.388722 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.399467 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.489012 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-8wch6" Feb 20 06:46:47 crc kubenswrapper[5094]: W0220 06:46:47.501083 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bc82500_7462_4daa_9eff_116399acb06a.slice/crio-19ebcbfa6786c1ff4d74dff0e353d2abd0d6b245d46afa016754b5a228eff35e WatchSource:0}: Error finding container 19ebcbfa6786c1ff4d74dff0e353d2abd0d6b245d46afa016754b5a228eff35e: Status 404 returned error can't find the container with id 19ebcbfa6786c1ff4d74dff0e353d2abd0d6b245d46afa016754b5a228eff35e Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.529735 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qzxk2" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.554258 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-56ppq"] Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.554848 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.556624 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.556743 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.556774 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.556804 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.556831 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.556918 5094 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.556968 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:48.55695381 +0000 UTC m=+23.429580531 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.557036 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:46:48.557028302 +0000 UTC m=+23.429655023 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.557126 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.557145 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.557161 5094 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.557192 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:48.557184245 +0000 UTC m=+23.429810966 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.557243 5094 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.557270 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:48.557262537 +0000 UTC m=+23.429889258 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.557320 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.557333 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.557343 5094 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.557371 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:48.557363119 +0000 UTC m=+23.429989840 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.558261 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.558618 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.558802 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.559022 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.562015 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.580164 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.595719 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.609160 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.619069 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.630177 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.640498 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.659016 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0810cb2f-5b29-4c97-8b16-e1bb2d455a0d-rootfs\") pod \"machine-config-daemon-56ppq\" (UID: \"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\") " pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.659096 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzjnr\" (UniqueName: 
\"kubernetes.io/projected/0810cb2f-5b29-4c97-8b16-e1bb2d455a0d-kube-api-access-hzjnr\") pod \"machine-config-daemon-56ppq\" (UID: \"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\") " pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.659129 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0810cb2f-5b29-4c97-8b16-e1bb2d455a0d-proxy-tls\") pod \"machine-config-daemon-56ppq\" (UID: \"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\") " pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.659145 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0810cb2f-5b29-4c97-8b16-e1bb2d455a0d-mcd-auth-proxy-config\") pod \"machine-config-daemon-56ppq\" (UID: \"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\") " pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.666403 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.678302 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.692475 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.703378 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.760169 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0810cb2f-5b29-4c97-8b16-e1bb2d455a0d-proxy-tls\") pod \"machine-config-daemon-56ppq\" (UID: \"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\") " pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.760227 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0810cb2f-5b29-4c97-8b16-e1bb2d455a0d-mcd-auth-proxy-config\") pod 
\"machine-config-daemon-56ppq\" (UID: \"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\") " pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.760254 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0810cb2f-5b29-4c97-8b16-e1bb2d455a0d-rootfs\") pod \"machine-config-daemon-56ppq\" (UID: \"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\") " pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.760287 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzjnr\" (UniqueName: \"kubernetes.io/projected/0810cb2f-5b29-4c97-8b16-e1bb2d455a0d-kube-api-access-hzjnr\") pod \"machine-config-daemon-56ppq\" (UID: \"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\") " pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.762912 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0810cb2f-5b29-4c97-8b16-e1bb2d455a0d-rootfs\") pod \"machine-config-daemon-56ppq\" (UID: \"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\") " pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.763360 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0810cb2f-5b29-4c97-8b16-e1bb2d455a0d-mcd-auth-proxy-config\") pod \"machine-config-daemon-56ppq\" (UID: \"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\") " pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.793079 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 
03:45:26.439249222 +0000 UTC Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.805920 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.811235 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.820290 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.830249 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.839674 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.839843 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.843889 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.844570 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0810cb2f-5b29-4c97-8b16-e1bb2d455a0d-proxy-tls\") pod \"machine-config-daemon-56ppq\" (UID: \"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\") " pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.844871 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzjnr\" (UniqueName: \"kubernetes.io/projected/0810cb2f-5b29-4c97-8b16-e1bb2d455a0d-kube-api-access-hzjnr\") pod \"machine-config-daemon-56ppq\" (UID: \"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\") " pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.845546 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.846665 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.847313 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.848394 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.848958 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.849546 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.852257 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.856755 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.859168 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.859970 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.860515 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.861410 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.861976 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.862549 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.864342 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.864918 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.866277 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.866745 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.867374 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.868500 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.869018 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.870030 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.870483 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.872633 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.873234 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.874081 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.874602 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.878988 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.879506 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.880193 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.880744 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.881280 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.891084 5094 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.891219 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.894278 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.894786 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.898908 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.907308 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.915108 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.915160 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.916313 5094 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.917268 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.918529 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.919503 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.920225 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.924757 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.925444 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.926407 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.927053 5094 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.931067 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.932023 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.933017 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.933510 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.937820 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.938436 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.939086 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.940076 5094 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.940616 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.940833 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.952413 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.962433 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.962671 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-9vd4p"] Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.963378 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.969464 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 20 06:46:47 crc kubenswrapper[5094]: W0220 06:46:47.969924 5094 reflector.go:561] object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz": failed to list *v1.Secret: secrets "multus-ancillary-tools-dockercfg-vnmsz" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.969966 5094 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-vnmsz\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"multus-ancillary-tools-dockercfg-vnmsz\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 20 06:46:47 crc kubenswrapper[5094]: W0220 06:46:47.970039 5094 reflector.go:561] object-"openshift-multus"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.970051 5094 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" 
logger="UnhandledError" Feb 20 06:46:47 crc kubenswrapper[5094]: W0220 06:46:47.970093 5094 reflector.go:561] object-"openshift-multus"/"default-cni-sysctl-allowlist": failed to list *v1.ConfigMap: configmaps "default-cni-sysctl-allowlist" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Feb 20 06:46:47 crc kubenswrapper[5094]: E0220 06:46:47.970104 5094 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"default-cni-sysctl-allowlist\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.971928 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.979226 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-zr8rz"] Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.979620 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zr8rz" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.980455 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-29bjc"] Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.981328 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.983179 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.987047 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.987210 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.987322 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.987647 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.987816 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.987933 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.992749 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.993935 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 20 06:46:47 crc kubenswrapper[5094]: I0220 06:46:47.998632 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.036243 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.051197 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d"} Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.051272 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"7a14287aa061d6bafd555859cf0e70fa9a59fff7f5dbb8ebf7619d92d78b43cd"} Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.056416 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f"} Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.056457 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"1b66ffe0a5e617d0ba35eabca4823ca8accc1ec1f5cd91eb035283a65ce25291"} Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.058159 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8wch6" event={"ID":"3bc82500-7462-4daa-9eff-116399acb06a","Type":"ContainerStarted","Data":"4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500"} Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.058225 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8wch6" event={"ID":"3bc82500-7462-4daa-9eff-116399acb06a","Type":"ContainerStarted","Data":"19ebcbfa6786c1ff4d74dff0e353d2abd0d6b245d46afa016754b5a228eff35e"} Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063158 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-hostroot\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063197 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-env-overrides\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063219 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovnkube-script-lib\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063242 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-multus-cni-dir\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063260 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-multus-socket-dir-parent\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063304 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-var-lib-cni-bin\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063327 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-run-multus-certs\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063351 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-node-log\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063372 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/19cce34f-67a6-48c9-a396-621c5811b6cd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063397 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-run-netns\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063435 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-cni-bin\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063452 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc 
kubenswrapper[5094]: I0220 06:46:48.063471 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-system-cni-dir\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063491 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-kubelet\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063509 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-var-lib-openvswitch\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063525 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-run-ovn-kubernetes\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063545 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-etc-kubernetes\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc 
kubenswrapper[5094]: I0220 06:46:48.063564 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/19cce34f-67a6-48c9-a396-621c5811b6cd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063581 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-log-socket\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063599 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/19cce34f-67a6-48c9-a396-621c5811b6cd-system-cni-dir\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063618 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/19cce34f-67a6-48c9-a396-621c5811b6cd-os-release\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063644 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swnw6\" (UniqueName: \"kubernetes.io/projected/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-kube-api-access-swnw6\") pod \"ovnkube-node-29bjc\" (UID: 
\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063663 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/19cce34f-67a6-48c9-a396-621c5811b6cd-cnibin\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063680 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/19cce34f-67a6-48c9-a396-621c5811b6cd-cni-binary-copy\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063712 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-openvswitch\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063744 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovnkube-config\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063761 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-cni-binary-copy\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063777 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-run-k8s-cni-cncf-io\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063795 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-multus-daemon-config\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063821 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-cni-netd\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063839 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovn-node-metrics-cert\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063856 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-os-release\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063876 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-var-lib-cni-multus\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063891 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-multus-conf-dir\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063907 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fphl\" (UniqueName: \"kubernetes.io/projected/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-kube-api-access-8fphl\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063925 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-var-lib-kubelet\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063941 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-systemd\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063958 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-cnibin\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063976 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-systemd-units\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.063993 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-run-netns\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.064010 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-slash\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.064025 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-etc-openvswitch\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.064042 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-ovn\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.064058 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjkf4\" (UniqueName: \"kubernetes.io/projected/19cce34f-67a6-48c9-a396-621c5811b6cd-kube-api-access-cjkf4\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.064477 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a"} Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.064501 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012"} Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.064510 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7fbde06fa389195eb5092bab20c7c50624673f2d114a2da417f9632638713f3c"} Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.066632 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"768c3ecca738877395d88ae13a22c6270e589ddbb87bdd0e770f9b8fbec53733"} Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.069664 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qzxk2" event={"ID":"d81c0f95-7b6e-4a44-8115-f517fc8f4052","Type":"ContainerStarted","Data":"ae7b8fdfcbdaed00729e63d6458f951564b2a56987eb0e01cd5559425dff8cb9"} Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.069822 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers 
with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.077225 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.080628 5094 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.086370 5094 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.109317 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.123639 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.135920 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.148594 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165623 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-var-lib-kubelet\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165693 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-systemd\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165732 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-cnibin\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165750 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-systemd-units\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165774 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-run-netns\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165773 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-var-lib-kubelet\") pod \"multus-zr8rz\" (UID: 
\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165800 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-slash\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165821 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-etc-openvswitch\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165841 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-ovn\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165848 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-run-netns\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165858 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjkf4\" (UniqueName: \"kubernetes.io/projected/19cce34f-67a6-48c9-a396-621c5811b6cd-kube-api-access-cjkf4\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " 
pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165878 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-env-overrides\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165883 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-systemd-units\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165873 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-systemd\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165916 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-etc-openvswitch\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165956 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-ovn\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165941 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-cnibin\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.165896 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovnkube-script-lib\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.166093 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-slash\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.166414 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-multus-cni-dir\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.166534 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-multus-socket-dir-parent\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.166609 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-multus-socket-dir-parent\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.166645 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-multus-cni-dir\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.166660 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-var-lib-cni-bin\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.166735 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-var-lib-cni-bin\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.166739 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-hostroot\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.166773 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-run-multus-certs\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " 
pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.166835 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-hostroot\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.166841 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-node-log\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.166890 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-node-log\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.166934 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-run-multus-certs\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.166953 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-env-overrides\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.166894 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/19cce34f-67a6-48c9-a396-621c5811b6cd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167039 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovnkube-script-lib\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167110 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-cni-bin\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167140 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167165 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-system-cni-dir\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167182 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-run-netns\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167209 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-kubelet\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167228 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-var-lib-openvswitch\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167246 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-run-ovn-kubernetes\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167264 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-etc-kubernetes\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167296 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-log-socket\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167316 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/19cce34f-67a6-48c9-a396-621c5811b6cd-system-cni-dir\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167332 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/19cce34f-67a6-48c9-a396-621c5811b6cd-os-release\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167352 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/19cce34f-67a6-48c9-a396-621c5811b6cd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167375 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167386 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-etc-kubernetes\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167414 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-kubelet\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167351 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-system-cni-dir\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167390 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swnw6\" (UniqueName: \"kubernetes.io/projected/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-kube-api-access-swnw6\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167444 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-var-lib-openvswitch\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167410 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/19cce34f-67a6-48c9-a396-621c5811b6cd-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167468 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/19cce34f-67a6-48c9-a396-621c5811b6cd-cnibin\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167508 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/19cce34f-67a6-48c9-a396-621c5811b6cd-system-cni-dir\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167454 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-run-ovn-kubernetes\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167529 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-openvswitch\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167487 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/19cce34f-67a6-48c9-a396-621c5811b6cd-cnibin\") pod 
\"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167351 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-cni-bin\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167595 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-log-socket\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167489 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-run-netns\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167754 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-openvswitch\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167853 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/19cce34f-67a6-48c9-a396-621c5811b6cd-os-release\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " 
pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167905 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovnkube-config\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.168527 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovnkube-config\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.167941 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-cni-binary-copy\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.168587 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-run-k8s-cni-cncf-io\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.168608 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-cni-binary-copy\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.168610 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-multus-daemon-config\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.168655 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/19cce34f-67a6-48c9-a396-621c5811b6cd-cni-binary-copy\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.168677 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-cni-netd\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.168696 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovn-node-metrics-cert\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.168728 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-os-release\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.168746 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-var-lib-cni-multus\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.168763 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-multus-conf-dir\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.168782 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fphl\" (UniqueName: \"kubernetes.io/projected/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-kube-api-access-8fphl\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.169040 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-multus-daemon-config\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.169082 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-run-k8s-cni-cncf-io\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.169126 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-os-release\") pod \"multus-zr8rz\" (UID: 
\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.169158 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-cni-netd\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.169431 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/19cce34f-67a6-48c9-a396-621c5811b6cd-cni-binary-copy\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.169478 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-host-var-lib-cni-multus\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.169512 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-multus-conf-dir\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.173125 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovn-node-metrics-cert\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc 
kubenswrapper[5094]: I0220 06:46:48.174965 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.189327 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.191785 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swnw6\" (UniqueName: \"kubernetes.io/projected/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-kube-api-access-swnw6\") pod \"ovnkube-node-29bjc\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.204347 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.216431 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.225737 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.236785 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.251518 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.265508 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.278756 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.289655 5094 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.303206 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:48Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.308631 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:48 crc kubenswrapper[5094]: W0220 06:46:48.328195 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1c36de3_d36b_48ed_9d4d_3aa52d72add0.slice/crio-7a9fcbdbce2beb3a7269a8c12a8179221e63193036cba570355ff8c9f0adb656 WatchSource:0}: Error finding container 7a9fcbdbce2beb3a7269a8c12a8179221e63193036cba570355ff8c9f0adb656: Status 404 returned error can't find the container with id 7a9fcbdbce2beb3a7269a8c12a8179221e63193036cba570355ff8c9f0adb656 Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.573397 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.573640 5094 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:46:50.5736057 +0000 UTC m=+25.446232411 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.574189 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.574274 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.574412 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.574438 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.574481 5094 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.574500 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.574517 5094 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.574543 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:50.574531612 +0000 UTC m=+25.447158563 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.574581 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.574594 5094 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.574465 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.574610 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.574747 5094 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.574585 5094 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:50.574563573 +0000 UTC m=+25.447190294 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.574820 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:50.574778168 +0000 UTC m=+25.447405089 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.574861 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:50.57483847 +0000 UTC m=+25.447465491 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.793897 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 17:05:03.346301971 +0000 UTC Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.839235 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:46:48 crc kubenswrapper[5094]: I0220 06:46:48.839376 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.839448 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:46:48 crc kubenswrapper[5094]: E0220 06:46:48.839671 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.075552 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qzxk2" event={"ID":"d81c0f95-7b6e-4a44-8115-f517fc8f4052","Type":"ContainerStarted","Data":"5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855"} Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.090810 5094 generic.go:334] "Generic (PLEG): container finished" podID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerID="218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10" exitCode=0 Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.090934 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerDied","Data":"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10"} Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.091032 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerStarted","Data":"7a9fcbdbce2beb3a7269a8c12a8179221e63193036cba570355ff8c9f0adb656"} Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.101060 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9"} Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.116135 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.135940 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.145160 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.148611 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.161778 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjkf4\" (UniqueName: \"kubernetes.io/projected/19cce34f-67a6-48c9-a396-621c5811b6cd-kube-api-access-cjkf4\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.161822 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fphl\" (UniqueName: \"kubernetes.io/projected/c3900f6d-3035-4fc4-80a2-9e79154f4f5e-kube-api-access-8fphl\") pod \"multus-zr8rz\" (UID: \"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\") " pod="openshift-multus/multus-zr8rz" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.163124 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: E0220 06:46:49.167689 5094 configmap.go:193] Couldn't get configMap openshift-multus/default-cni-sysctl-allowlist: failed to sync configmap cache: timed out waiting for the condition Feb 20 06:46:49 crc kubenswrapper[5094]: E0220 06:46:49.167777 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/19cce34f-67a6-48c9-a396-621c5811b6cd-cni-sysctl-allowlist podName:19cce34f-67a6-48c9-a396-621c5811b6cd nodeName:}" failed. No retries permitted until 2026-02-20 06:46:49.667757228 +0000 UTC m=+24.540383959 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cni-sysctl-allowlist" (UniqueName: "kubernetes.io/configmap/19cce34f-67a6-48c9-a396-621c5811b6cd-cni-sysctl-allowlist") pod "multus-additional-cni-plugins-9vd4p" (UID: "19cce34f-67a6-48c9-a396-621c5811b6cd") : failed to sync configmap cache: timed out waiting for the condition Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.179167 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.194926 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.195335 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.201186 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zr8rz" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.210927 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.226719 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.239853 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.256429 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.258234 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.272608 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.294863 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.315200 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.332981 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.350488 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.365943 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.383317 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.400714 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.419831 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.436258 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.450807 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.468196 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.497830 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.522624 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.549512 5094 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.583229 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.631566 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.655599 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.672867 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.688755 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/19cce34f-67a6-48c9-a396-621c5811b6cd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " 
pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.689215 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:49Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.689959 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/19cce34f-67a6-48c9-a396-621c5811b6cd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9vd4p\" (UID: \"19cce34f-67a6-48c9-a396-621c5811b6cd\") " pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.776912 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" Feb 20 06:46:49 crc kubenswrapper[5094]: W0220 06:46:49.789355 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19cce34f_67a6_48c9_a396_621c5811b6cd.slice/crio-4f0fac9e016292130a4e155241d6baca44a3c442472b1ffb753f7c30e43c149f WatchSource:0}: Error finding container 4f0fac9e016292130a4e155241d6baca44a3c442472b1ffb753f7c30e43c149f: Status 404 returned error can't find the container with id 4f0fac9e016292130a4e155241d6baca44a3c442472b1ffb753f7c30e43c149f Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.795186 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 14:25:49.006747948 +0000 UTC Feb 20 06:46:49 crc kubenswrapper[5094]: I0220 06:46:49.839414 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:49 crc kubenswrapper[5094]: E0220 06:46:49.839596 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.108993 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" event={"ID":"19cce34f-67a6-48c9-a396-621c5811b6cd","Type":"ContainerStarted","Data":"4f0fac9e016292130a4e155241d6baca44a3c442472b1ffb753f7c30e43c149f"} Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.111401 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zr8rz" event={"ID":"c3900f6d-3035-4fc4-80a2-9e79154f4f5e","Type":"ContainerStarted","Data":"7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118"} Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.111444 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zr8rz" event={"ID":"c3900f6d-3035-4fc4-80a2-9e79154f4f5e","Type":"ContainerStarted","Data":"2fb06227a0e267c5944978d6f12d04355e119b5a602d6c9141c724b852fb0281"} Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.115304 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerStarted","Data":"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf"} Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.115353 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerStarted","Data":"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e"} Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.115365 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" 
event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerStarted","Data":"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba"} Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.115374 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerStarted","Data":"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60"} Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.134098 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var
/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containe
rID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a56
46fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:50Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.154155 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:50Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.168756 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:50Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.181501 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:50Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.199836 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:50Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.213995 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:50Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.229010 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:50Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.243355 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:50Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.255684 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:50Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.266613 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:50Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.282352 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:50Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.301225 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:50Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.324618 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:50Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.341556 5094 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:50Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.366174 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:50Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.598909 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.599042 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.599071 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:50 crc kubenswrapper[5094]: E0220 06:46:50.599107 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:46:54.599070531 +0000 UTC m=+29.471697252 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.599169 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:46:50 crc kubenswrapper[5094]: E0220 06:46:50.599183 5094 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.599220 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:50 crc kubenswrapper[5094]: E0220 06:46:50.599239 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:54.599225836 +0000 UTC m=+29.471852547 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:46:50 crc kubenswrapper[5094]: E0220 06:46:50.599375 5094 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:46:50 crc kubenswrapper[5094]: E0220 06:46:50.599419 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:54.59941004 +0000 UTC m=+29.472036761 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:46:50 crc kubenswrapper[5094]: E0220 06:46:50.599532 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:46:50 crc kubenswrapper[5094]: E0220 06:46:50.599548 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:46:50 crc kubenswrapper[5094]: E0220 06:46:50.599561 5094 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:50 crc kubenswrapper[5094]: E0220 06:46:50.599596 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:54.599586354 +0000 UTC m=+29.472213075 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:50 crc kubenswrapper[5094]: E0220 06:46:50.599596 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:46:50 crc kubenswrapper[5094]: E0220 06:46:50.599620 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:46:50 crc kubenswrapper[5094]: E0220 06:46:50.599631 5094 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:50 crc kubenswrapper[5094]: E0220 06:46:50.599663 5094 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 06:46:54.599655696 +0000 UTC m=+29.472282417 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.796092 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 13:15:01.919546015 +0000 UTC Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.839464 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:46:50 crc kubenswrapper[5094]: I0220 06:46:50.839558 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:46:50 crc kubenswrapper[5094]: E0220 06:46:50.839628 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:46:50 crc kubenswrapper[5094]: E0220 06:46:50.839790 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.130914 5094 generic.go:334] "Generic (PLEG): container finished" podID="19cce34f-67a6-48c9-a396-621c5811b6cd" containerID="711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d" exitCode=0 Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.131262 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" event={"ID":"19cce34f-67a6-48c9-a396-621c5811b6cd","Type":"ContainerDied","Data":"711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d"} Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.135791 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c"} Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.143043 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerStarted","Data":"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9"} Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.143123 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" 
event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerStarted","Data":"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e"} Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.171164 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-
dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723
269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exit
Code\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.197975 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.216825 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.237744 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.274814 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.292765 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.321254 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.338140 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.354335 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.367557 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.380281 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.397518 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.415463 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.429082 5094 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.443072 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.458773 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.470810 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.483517 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.503372 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.522395 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.543208 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.569878 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.595460 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.609077 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.628952 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.650722 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.666015 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.675795 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.697190 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.712282 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:51Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.796246 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 15:22:22.257918386 +0000 UTC Feb 20 06:46:51 crc kubenswrapper[5094]: I0220 06:46:51.839156 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:51 crc kubenswrapper[5094]: E0220 06:46:51.839307 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.005168 5094 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.008454 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.008502 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.008512 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.008660 5094 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.017362 5094 kubelet_node_status.go:115] "Node was previously 
registered" node="crc" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.017671 5094 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.026426 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.026481 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.026503 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.026532 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.026556 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:52Z","lastTransitionTime":"2026-02-20T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:52 crc kubenswrapper[5094]: E0220 06:46:52.050487 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.054897 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.054939 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.054957 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.054975 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.055087 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:52Z","lastTransitionTime":"2026-02-20T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:52 crc kubenswrapper[5094]: E0220 06:46:52.069755 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.074133 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.074170 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.074184 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.074201 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.074215 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:52Z","lastTransitionTime":"2026-02-20T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:52 crc kubenswrapper[5094]: E0220 06:46:52.092412 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.096568 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.096619 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.096638 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.096663 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.096683 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:52Z","lastTransitionTime":"2026-02-20T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:52 crc kubenswrapper[5094]: E0220 06:46:52.109778 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.114182 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.114247 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.114263 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.114293 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.114312 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:52Z","lastTransitionTime":"2026-02-20T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:52 crc kubenswrapper[5094]: E0220 06:46:52.131095 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: E0220 06:46:52.131291 5094 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.133416 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.133461 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.133475 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.133495 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.133509 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:52Z","lastTransitionTime":"2026-02-20T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.148650 5094 generic.go:334] "Generic (PLEG): container finished" podID="19cce34f-67a6-48c9-a396-621c5811b6cd" containerID="2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d" exitCode=0 Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.150470 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" event={"ID":"19cce34f-67a6-48c9-a396-621c5811b6cd","Type":"ContainerDied","Data":"2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d"} Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.169194 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.196132 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.215860 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.232868 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.239109 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.239175 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.239200 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.239232 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.239256 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:52Z","lastTransitionTime":"2026-02-20T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.250417 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.281529 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.332982 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.341824 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.341917 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.341934 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.341959 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.341978 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:52Z","lastTransitionTime":"2026-02-20T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.360230 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.377749 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.400261 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.419961 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.433571 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.446382 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.446433 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.446445 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.446464 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.446475 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:52Z","lastTransitionTime":"2026-02-20T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.452068 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.479453 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.502250 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:52Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.549097 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.549279 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.549538 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.549797 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.549888 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:52Z","lastTransitionTime":"2026-02-20T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.653944 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.654403 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.654530 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.654669 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.654830 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:52Z","lastTransitionTime":"2026-02-20T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.757896 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.757996 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.758013 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.758050 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.758061 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:52Z","lastTransitionTime":"2026-02-20T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.797338 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 12:19:28.192964668 +0000 UTC Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.839272 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:46:52 crc kubenswrapper[5094]: E0220 06:46:52.839477 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.840077 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:46:52 crc kubenswrapper[5094]: E0220 06:46:52.840263 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.860792 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.860913 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.860931 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.860958 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.860976 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:52Z","lastTransitionTime":"2026-02-20T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.964288 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.964354 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.964367 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.964391 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:52 crc kubenswrapper[5094]: I0220 06:46:52.964408 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:52Z","lastTransitionTime":"2026-02-20T06:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.067998 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.068073 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.068093 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.068172 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.068195 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:53Z","lastTransitionTime":"2026-02-20T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.157020 5094 generic.go:334] "Generic (PLEG): container finished" podID="19cce34f-67a6-48c9-a396-621c5811b6cd" containerID="844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776" exitCode=0 Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.157124 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" event={"ID":"19cce34f-67a6-48c9-a396-621c5811b6cd","Type":"ContainerDied","Data":"844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776"} Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.171914 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerStarted","Data":"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77"} Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.174813 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.174863 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.174883 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.174909 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.174931 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:53Z","lastTransitionTime":"2026-02-20T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.183314 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.218030 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.253755 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.277019 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.288936 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.289394 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.289600 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.289838 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.290060 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:53Z","lastTransitionTime":"2026-02-20T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.296208 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.318006 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.339611 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.354363 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.377369 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 
06:46:53.394364 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.394957 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.395047 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.395063 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.395083 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.395098 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:53Z","lastTransitionTime":"2026-02-20T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.414137 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.432908 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.451796 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.468375 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.487019 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:53Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.498637 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.498767 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.498902 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.499002 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.499111 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:53Z","lastTransitionTime":"2026-02-20T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.602535 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.602605 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.602622 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.602659 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.602681 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:53Z","lastTransitionTime":"2026-02-20T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.706269 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.706317 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.706334 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.706360 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.706379 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:53Z","lastTransitionTime":"2026-02-20T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.798384 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 05:40:11.789592844 +0000 UTC Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.811091 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.811172 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.811192 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.811221 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.811243 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:53Z","lastTransitionTime":"2026-02-20T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.839930 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:53 crc kubenswrapper[5094]: E0220 06:46:53.840204 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.914969 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.915052 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.915080 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.915115 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:53 crc kubenswrapper[5094]: I0220 06:46:53.915140 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:53Z","lastTransitionTime":"2026-02-20T06:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.019178 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.019275 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.019294 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.019329 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.019349 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:54Z","lastTransitionTime":"2026-02-20T06:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.123095 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.123187 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.123213 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.123242 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.123261 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:54Z","lastTransitionTime":"2026-02-20T06:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.182042 5094 generic.go:334] "Generic (PLEG): container finished" podID="19cce34f-67a6-48c9-a396-621c5811b6cd" containerID="c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd" exitCode=0 Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.182144 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" event={"ID":"19cce34f-67a6-48c9-a396-621c5811b6cd","Type":"ContainerDied","Data":"c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd"} Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.211418 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:54Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.227614 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.227694 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.227751 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 
06:46:54.227789 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.227876 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:54Z","lastTransitionTime":"2026-02-20T06:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.238238 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:54Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.259823 5094 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:54Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.281967 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:355
12335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:54Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.311309 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:54Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.332601 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.332656 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.332680 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.332735 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.332758 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:54Z","lastTransitionTime":"2026-02-20T06:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.336320 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:54Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.361775 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\
\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:54Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.385330 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:54Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.404377 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:54Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.426044 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:54Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.436484 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.436563 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.436574 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.436593 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.436607 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:54Z","lastTransitionTime":"2026-02-20T06:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.445340 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:54Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.466974 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:54Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.497457 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:54Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.540781 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.540849 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.540878 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.540908 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.540930 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:54Z","lastTransitionTime":"2026-02-20T06:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.582344 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:54Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.605569 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:54Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.645447 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.646056 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.646068 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:54 crc 
kubenswrapper[5094]: I0220 06:46:54.646107 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.646123 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:54Z","lastTransitionTime":"2026-02-20T06:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.653161 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.653346 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:46:54 crc kubenswrapper[5094]: E0220 06:46:54.653393 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:47:02.653352616 +0000 UTC m=+37.525979327 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.653716 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:54 crc kubenswrapper[5094]: E0220 06:46:54.653768 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:46:54 crc kubenswrapper[5094]: E0220 06:46:54.653800 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:46:54 crc kubenswrapper[5094]: E0220 06:46:54.653823 5094 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:54 crc kubenswrapper[5094]: E0220 06:46:54.653912 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:02.653889528 +0000 UTC m=+37.526516279 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.653947 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.653991 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:54 crc kubenswrapper[5094]: E0220 06:46:54.653970 5094 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:46:54 crc kubenswrapper[5094]: E0220 06:46:54.654118 5094 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:46:54 crc kubenswrapper[5094]: E0220 06:46:54.654131 5094 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:46:54 crc kubenswrapper[5094]: E0220 06:46:54.654153 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:46:54 crc kubenswrapper[5094]: E0220 06:46:54.654182 5094 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:54 crc kubenswrapper[5094]: E0220 06:46:54.654244 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:02.654155225 +0000 UTC m=+37.526781976 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:46:54 crc kubenswrapper[5094]: E0220 06:46:54.654299 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:02.654282558 +0000 UTC m=+37.526909299 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:46:54 crc kubenswrapper[5094]: E0220 06:46:54.654338 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:02.654325009 +0000 UTC m=+37.526951760 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.749073 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.749147 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.749166 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.749194 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.749212 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:54Z","lastTransitionTime":"2026-02-20T06:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.799043 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 01:45:15.327120053 +0000 UTC Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.839534 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.839561 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:46:54 crc kubenswrapper[5094]: E0220 06:46:54.839867 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:46:54 crc kubenswrapper[5094]: E0220 06:46:54.840000 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.853040 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.853097 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.853113 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.853135 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.853151 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:54Z","lastTransitionTime":"2026-02-20T06:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.956088 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.956149 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.956166 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.956196 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:54 crc kubenswrapper[5094]: I0220 06:46:54.956214 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:54Z","lastTransitionTime":"2026-02-20T06:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.059885 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.059943 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.059955 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.059975 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.059999 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:55Z","lastTransitionTime":"2026-02-20T06:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.163467 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.163530 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.163562 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.163583 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.163598 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:55Z","lastTransitionTime":"2026-02-20T06:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.194531 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerStarted","Data":"0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e"} Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.195379 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.195449 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.201885 5094 generic.go:334] "Generic (PLEG): container finished" podID="19cce34f-67a6-48c9-a396-621c5811b6cd" containerID="cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f" exitCode=0 Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.201941 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" event={"ID":"19cce34f-67a6-48c9-a396-621c5811b6cd","Type":"ContainerDied","Data":"cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f"} Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.215915 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.238328 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.238313 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"
/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.253764 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.271595 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.271652 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.271671 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.271698 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.271756 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:55Z","lastTransitionTime":"2026-02-20T06:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.272072 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.293726 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c56
25d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.314089 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.331388 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.349436 5094 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.365441 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.374275 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:55 crc 
kubenswrapper[5094]: I0220 06:46:55.374324 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.374338 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.374363 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.374377 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:55Z","lastTransitionTime":"2026-02-20T06:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.397339 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.414095 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.430442 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.444189 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.472674 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.476249 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.476275 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.476284 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.476300 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.476311 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:55Z","lastTransitionTime":"2026-02-20T06:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.489447 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.501132 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.513124 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.524495 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.538241 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.556187 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.574653 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.579314 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.579433 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.579510 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.579600 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.579672 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:55Z","lastTransitionTime":"2026-02-20T06:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.589223 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff
60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.602869 5094 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.603055 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.646889 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.659824 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.672474 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.684319 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.684799 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.684927 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.685033 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.685134 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:55Z","lastTransitionTime":"2026-02-20T06:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.686025 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.704632 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.725211 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.742515 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.788935 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.789216 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.789291 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.789374 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.789454 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:55Z","lastTransitionTime":"2026-02-20T06:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.800193 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 08:42:04.895454614 +0000 UTC Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.840040 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:55 crc kubenswrapper[5094]: E0220 06:46:55.840363 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.856700 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.873805 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.891638 5094 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.893512 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.893589 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.893613 5094 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.893645 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.893666 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:55Z","lastTransitionTime":"2026-02-20T06:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.910463 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.941454 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8
b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-c
erts\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f376894
9b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.965151 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.986639 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:46:55Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.997695 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.997752 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.997762 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.997782 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:55 crc kubenswrapper[5094]: I0220 06:46:55.997794 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:55Z","lastTransitionTime":"2026-02-20T06:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.006239 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.030457 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.045909 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.060382 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.079600 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.095097 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.102278 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.102348 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.102378 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.102409 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.102427 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:56Z","lastTransitionTime":"2026-02-20T06:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.113474 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.132277 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c56
25d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.205495 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.205869 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.205995 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.206134 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.206225 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:56Z","lastTransitionTime":"2026-02-20T06:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.211719 5094 generic.go:334] "Generic (PLEG): container finished" podID="19cce34f-67a6-48c9-a396-621c5811b6cd" containerID="817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d" exitCode=0 Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.211862 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" event={"ID":"19cce34f-67a6-48c9-a396-621c5811b6cd","Type":"ContainerDied","Data":"817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d"} Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.212499 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.231470 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.248277 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.267571 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.289723 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.297908 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.309733 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.309783 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.309794 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.309810 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.309821 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:56Z","lastTransitionTime":"2026-02-20T06:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.310798 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.333079 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.355395 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.372072 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.393382 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.413002 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.413064 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.413187 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.413198 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.413220 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.413233 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:56Z","lastTransitionTime":"2026-02-20T06:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.429207 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/d
ocker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.445808 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.465566 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.484496 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.503451 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.515329 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.515382 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.515396 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.515417 5094 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.515429 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:56Z","lastTransitionTime":"2026-02-20T06:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.521247 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.538236 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.553781 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.569599 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.582779 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.599996 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.617872 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.618198 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.618285 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.618307 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.618338 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.618358 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:56Z","lastTransitionTime":"2026-02-20T06:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.639131 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff
60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.656934 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.676681 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.706082 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.721671 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.721910 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.721994 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.722092 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.722181 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:56Z","lastTransitionTime":"2026-02-20T06:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.729268 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.752457 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.768843 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.800968 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 17:05:15.545691176 +0000 UTC Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.825329 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.825395 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.825414 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.825441 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 
06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.825461 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:56Z","lastTransitionTime":"2026-02-20T06:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.826265 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:56Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.839391 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.839416 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:46:56 crc kubenswrapper[5094]: E0220 06:46:56.839615 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:46:56 crc kubenswrapper[5094]: E0220 06:46:56.840005 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.928961 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.929283 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.929344 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.929456 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:56 crc kubenswrapper[5094]: I0220 06:46:56.929522 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:56Z","lastTransitionTime":"2026-02-20T06:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.032820 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.032898 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.032913 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.032931 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.032961 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:57Z","lastTransitionTime":"2026-02-20T06:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.136993 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.137040 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.137049 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.137071 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.137084 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:57Z","lastTransitionTime":"2026-02-20T06:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.224101 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" event={"ID":"19cce34f-67a6-48c9-a396-621c5811b6cd","Type":"ContainerStarted","Data":"a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307"} Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.240496 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.240528 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.240539 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.240555 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.240588 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:57Z","lastTransitionTime":"2026-02-20T06:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.250971 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:57Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.269898 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:57Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.290232 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:57Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.307431 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:57Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.343660 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.343731 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.343749 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.343815 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.343839 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:57Z","lastTransitionTime":"2026-02-20T06:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.359178 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:57Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.385002 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:57Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.400406 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:57Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.415081 5094 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:57Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.429297 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:57Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.446616 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:57 crc 
kubenswrapper[5094]: I0220 06:46:57.446676 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.446690 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.446736 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.446757 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:57Z","lastTransitionTime":"2026-02-20T06:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.464375 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:57Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.482005 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:57Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.504204 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:46:57Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.527022 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:57Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.549421 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.549495 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.549512 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.549539 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.549556 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:57Z","lastTransitionTime":"2026-02-20T06:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.555009 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:57Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.579618 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:57Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.652763 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.652816 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.652826 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.652843 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.652854 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:57Z","lastTransitionTime":"2026-02-20T06:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.755764 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.755817 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.755827 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.755846 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.755857 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:57Z","lastTransitionTime":"2026-02-20T06:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.801683 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 10:43:24.765455514 +0000 UTC Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.840050 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:57 crc kubenswrapper[5094]: E0220 06:46:57.840221 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.858962 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.859029 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.859046 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.859070 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.859091 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:57Z","lastTransitionTime":"2026-02-20T06:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.963640 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.963697 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.963746 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.963779 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:57 crc kubenswrapper[5094]: I0220 06:46:57.963800 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:57Z","lastTransitionTime":"2026-02-20T06:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.068109 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.068182 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.068204 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.068234 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.068253 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:58Z","lastTransitionTime":"2026-02-20T06:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.171280 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.171340 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.171358 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.171382 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.171401 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:58Z","lastTransitionTime":"2026-02-20T06:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.231855 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/0.log" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.238950 5094 generic.go:334] "Generic (PLEG): container finished" podID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerID="0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e" exitCode=1 Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.240925 5094 scope.go:117] "RemoveContainer" containerID="0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.241352 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerDied","Data":"0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e"} Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.274845 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.274889 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.274907 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.274931 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.274949 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:58Z","lastTransitionTime":"2026-02-20T06:46:58Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.279147 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066ac
f327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.310697 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.332028 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.350348 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.373606 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:46:58Z\\\",\\\"message\\\":\\\"etworkPolicy event handler 4 for removal\\\\nI0220 06:46:57.986434 6455 factory.go:656] Stopping watch factory\\\\nI0220 06:46:57.986441 6455 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 06:46:57.986527 6455 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 06:46:57.986658 6455 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 06:46:57.986382 6455 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 06:46:57.987032 6455 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 06:46:57.987247 6455 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 06:46:57.986381 6455 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 06:46:57.987273 6455 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.377960 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.377988 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.377998 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.378017 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.378027 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:58Z","lastTransitionTime":"2026-02-20T06:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.389698 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.406395 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b1
5549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.427987 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.449646 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.464737 5094 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.477001 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f
416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.481371 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.481471 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.481485 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.481534 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:58 crc 
kubenswrapper[5094]: I0220 06:46:58.481551 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:58Z","lastTransitionTime":"2026-02-20T06:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.492937 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.514837 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.534857 5094 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.551742 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:58Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.584688 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:58 crc 
kubenswrapper[5094]: I0220 06:46:58.584750 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.584763 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.584780 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.584789 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:58Z","lastTransitionTime":"2026-02-20T06:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.688533 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.688580 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.688591 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.688609 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.688620 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:58Z","lastTransitionTime":"2026-02-20T06:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.791390 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.791441 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.791454 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.791476 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.791491 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:58Z","lastTransitionTime":"2026-02-20T06:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.802053 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 08:52:38.563907799 +0000 UTC Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.839984 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.840076 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:46:58 crc kubenswrapper[5094]: E0220 06:46:58.840308 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:46:58 crc kubenswrapper[5094]: E0220 06:46:58.840495 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.894548 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.894603 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.894616 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.894638 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.894652 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:58Z","lastTransitionTime":"2026-02-20T06:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.997821 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.997873 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.997889 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.997911 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:58 crc kubenswrapper[5094]: I0220 06:46:58.997924 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:58Z","lastTransitionTime":"2026-02-20T06:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.100390 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.100425 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.100433 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.100453 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.100463 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:59Z","lastTransitionTime":"2026-02-20T06:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.202851 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.202891 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.202905 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.202924 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.202937 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:59Z","lastTransitionTime":"2026-02-20T06:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.247429 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/0.log" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.251607 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerStarted","Data":"714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9"} Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.252294 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.272200 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:59Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.290398 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:59Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.307932 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.307986 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.307998 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.308029 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.308044 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:59Z","lastTransitionTime":"2026-02-20T06:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.314345 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:59Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.330597 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:59Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.354991 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:59Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.379031 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:46:59Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.399010 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:59Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.411317 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.411521 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.411744 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.411887 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.411974 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:59Z","lastTransitionTime":"2026-02-20T06:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.425083 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:46:58Z\\\",\\\"message\\\":\\\"etworkPolicy event handler 4 for removal\\\\nI0220 06:46:57.986434 6455 factory.go:656] Stopping watch factory\\\\nI0220 06:46:57.986441 6455 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 
06:46:57.986527 6455 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 06:46:57.986658 6455 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 06:46:57.986382 6455 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 06:46:57.987032 6455 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 06:46:57.987247 6455 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 06:46:57.986381 6455 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 06:46:57.987273 6455 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:59Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.456494 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:59Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.471517 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:59Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.487525 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:59Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.503815 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:59Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.515065 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.515118 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.515134 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.515161 5094 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.515180 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:59Z","lastTransitionTime":"2026-02-20T06:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.517084 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:59Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.529070 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:59Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.552276 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a
2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:46:59Z is after 2025-08-24T17:21:41Z" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.619381 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.619462 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.619484 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.619514 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.619532 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:59Z","lastTransitionTime":"2026-02-20T06:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.723264 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.723308 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.723317 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.723332 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.723342 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:59Z","lastTransitionTime":"2026-02-20T06:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.802924 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 00:12:34.551280443 +0000 UTC Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.825561 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.825608 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.825617 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.825634 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.825649 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:59Z","lastTransitionTime":"2026-02-20T06:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.840240 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:46:59 crc kubenswrapper[5094]: E0220 06:46:59.840373 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.929015 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.929064 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.929075 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.929095 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:46:59 crc kubenswrapper[5094]: I0220 06:46:59.929107 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:46:59Z","lastTransitionTime":"2026-02-20T06:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.031630 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.031687 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.031730 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.031758 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.031777 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:00Z","lastTransitionTime":"2026-02-20T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.133977 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.134082 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.134103 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.134128 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.134147 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:00Z","lastTransitionTime":"2026-02-20T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.237856 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.237906 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.237940 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.237958 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.237970 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:00Z","lastTransitionTime":"2026-02-20T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.258424 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/1.log" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.259289 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/0.log" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.263019 5094 generic.go:334] "Generic (PLEG): container finished" podID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerID="714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9" exitCode=1 Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.263053 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerDied","Data":"714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9"} Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.263131 5094 scope.go:117] "RemoveContainer" containerID="0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.264184 5094 scope.go:117] "RemoveContainer" containerID="714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9" Feb 20 06:47:00 crc kubenswrapper[5094]: E0220 06:47:00.264434 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.282102 5094 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:00Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.303871 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://
2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://
844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt
\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:00Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.331279 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:00Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.341113 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.341216 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.341240 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.341268 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.341286 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:00Z","lastTransitionTime":"2026-02-20T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.358002 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:00Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.376579 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:00Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.399441 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:00Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.419680 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:00Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.440635 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:00Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.443826 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.443876 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.443888 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.443905 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.443919 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:00Z","lastTransitionTime":"2026-02-20T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.459192 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:00Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.491440 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:46:58Z\\\",\\\"message\\\":\\\"etworkPolicy event handler 4 for removal\\\\nI0220 06:46:57.986434 6455 factory.go:656] Stopping watch factory\\\\nI0220 06:46:57.986441 6455 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 
06:46:57.986527 6455 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 06:46:57.986658 6455 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 06:46:57.986382 6455 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 06:46:57.987032 6455 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 06:46:57.987247 6455 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 06:46:57.986381 6455 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 06:46:57.987273 6455 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:46:59Z\\\",\\\"message\\\":\\\"c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.140\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 06:46:59.340011 6610 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:46:59.340052 6610 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:46:59.340153 6610 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316
f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:00Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.521857 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:00Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.539161 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:00Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.546451 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.546532 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.546556 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:00 crc 
kubenswrapper[5094]: I0220 06:47:00.546587 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.546607 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:00Z","lastTransitionTime":"2026-02-20T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.556020 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:00Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.573392 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:00Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.593108 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:00Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.650543 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.650887 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.651069 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.651275 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.651879 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:00Z","lastTransitionTime":"2026-02-20T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.755419 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.755470 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.755482 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.755554 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.755572 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:00Z","lastTransitionTime":"2026-02-20T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.803426 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 00:55:54.27389845 +0000 UTC Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.841793 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:00 crc kubenswrapper[5094]: E0220 06:47:00.841994 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.841807 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:00 crc kubenswrapper[5094]: E0220 06:47:00.842596 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.858457 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.858494 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.858503 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.858519 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.858529 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:00Z","lastTransitionTime":"2026-02-20T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.961438 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.961489 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.961507 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.961532 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:00 crc kubenswrapper[5094]: I0220 06:47:00.961551 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:00Z","lastTransitionTime":"2026-02-20T06:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.064167 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.064226 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.064242 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.064264 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.064281 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:01Z","lastTransitionTime":"2026-02-20T06:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.132388 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f"] Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.133307 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.136200 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.136343 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.158448 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0d
d97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.
168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.168379 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.168428 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.168441 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.168461 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.168487 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:01Z","lastTransitionTime":"2026-02-20T06:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.175326 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.187760 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.210682 5094 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.231988 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fefb2c2f6cc86094f81a7b5545706f81ffcfe3c3f92d0cb521079734da9f14e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:46:58Z\\\",\\\"message\\\":\\\"etworkPolicy event handler 4 for removal\\\\nI0220 06:46:57.986434 6455 factory.go:656] Stopping watch factory\\\\nI0220 06:46:57.986441 6455 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 
06:46:57.986527 6455 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 06:46:57.986658 6455 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 06:46:57.986382 6455 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0220 06:46:57.987032 6455 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0220 06:46:57.987247 6455 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0220 06:46:57.986381 6455 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 06:46:57.987273 6455 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:46:59Z\\\",\\\"message\\\":\\\"c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.140\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 06:46:59.340011 6610 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:46:59.340052 6610 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:46:59.340153 6610 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316
f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.233118 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/60f18419-2e46-4911-bceb-d8651c9fac66-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qrs4f\" (UID: \"60f18419-2e46-4911-bceb-d8651c9fac66\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.233195 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/60f18419-2e46-4911-bceb-d8651c9fac66-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qrs4f\" (UID: \"60f18419-2e46-4911-bceb-d8651c9fac66\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.233318 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/60f18419-2e46-4911-bceb-d8651c9fac66-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qrs4f\" (UID: \"60f18419-2e46-4911-bceb-d8651c9fac66\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.233354 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr64j\" (UniqueName: \"kubernetes.io/projected/60f18419-2e46-4911-bceb-d8651c9fac66-kube-api-access-xr64j\") pod \"ovnkube-control-plane-749d76644c-qrs4f\" (UID: \"60f18419-2e46-4911-bceb-d8651c9fac66\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.255357 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.274459 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.275628 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.275752 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.275804 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:01 crc 
kubenswrapper[5094]: I0220 06:47:01.275832 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.275859 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/1.log" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.275852 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:01Z","lastTransitionTime":"2026-02-20T06:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.281697 5094 scope.go:117] "RemoveContainer" containerID="714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9" Feb 20 06:47:01 crc kubenswrapper[5094]: E0220 06:47:01.282018 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.289876 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.310173 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.332362 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.334895 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/60f18419-2e46-4911-bceb-d8651c9fac66-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qrs4f\" (UID: \"60f18419-2e46-4911-bceb-d8651c9fac66\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.335031 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/60f18419-2e46-4911-bceb-d8651c9fac66-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qrs4f\" (UID: \"60f18419-2e46-4911-bceb-d8651c9fac66\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.335075 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr64j\" (UniqueName: \"kubernetes.io/projected/60f18419-2e46-4911-bceb-d8651c9fac66-kube-api-access-xr64j\") pod \"ovnkube-control-plane-749d76644c-qrs4f\" (UID: \"60f18419-2e46-4911-bceb-d8651c9fac66\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.335203 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/60f18419-2e46-4911-bceb-d8651c9fac66-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qrs4f\" (UID: \"60f18419-2e46-4911-bceb-d8651c9fac66\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.336357 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/60f18419-2e46-4911-bceb-d8651c9fac66-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qrs4f\" (UID: \"60f18419-2e46-4911-bceb-d8651c9fac66\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.336548 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/60f18419-2e46-4911-bceb-d8651c9fac66-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qrs4f\" (UID: \"60f18419-2e46-4911-bceb-d8651c9fac66\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.347637 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/60f18419-2e46-4911-bceb-d8651c9fac66-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qrs4f\" (UID: \"60f18419-2e46-4911-bceb-d8651c9fac66\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.351612 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.358063 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr64j\" (UniqueName: \"kubernetes.io/projected/60f18419-2e46-4911-bceb-d8651c9fac66-kube-api-access-xr64j\") pod \"ovnkube-control-plane-749d76644c-qrs4f\" (UID: \"60f18419-2e46-4911-bceb-d8651c9fac66\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.368623 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.379650 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.379793 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.379880 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.379940 5094 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.379961 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:01Z","lastTransitionTime":"2026-02-20T06:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.391571 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2
fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8
44f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.414053 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.443857 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.447141 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.459101 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"ho
st\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.477480 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.482961 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.482997 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.483009 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.483028 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.483041 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:01Z","lastTransitionTime":"2026-02-20T06:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.494442 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.510646 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.532734 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.553775 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b1
5549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.572638 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.586205 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.586248 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.586259 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.586278 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.586290 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:01Z","lastTransitionTime":"2026-02-20T06:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.589243 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff
60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.609216 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.627179 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.649217 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.663411 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.679980 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.689114 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.689177 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.689193 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.689217 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.689232 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:01Z","lastTransitionTime":"2026-02-20T06:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.695449 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.723271 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:46:59Z\\\",\\\"message\\\":\\\"c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, 
AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.140\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 06:46:59.340011 6610 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:46:59.340052 6610 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:46:59.340153 6610 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c
89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.736834 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.748904 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.792361 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.792500 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.792596 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.792693 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.792792 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:01Z","lastTransitionTime":"2026-02-20T06:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.803888 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 00:07:20.193741285 +0000 UTC Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.840188 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:01 crc kubenswrapper[5094]: E0220 06:47:01.840461 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.890392 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-8ww4n"] Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.890917 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:01 crc kubenswrapper[5094]: E0220 06:47:01.890976 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.897786 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.897892 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.897948 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.897987 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.898057 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:01Z","lastTransitionTime":"2026-02-20T06:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.898991 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.910556 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.927100 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.941269 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.942333 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs\") pod \"network-metrics-daemon-8ww4n\" (UID: \"da0aa093-1adc-45f2-a942-e68d7be23ed4\") " pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.942409 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhrtm\" (UniqueName: \"kubernetes.io/projected/da0aa093-1adc-45f2-a942-e68d7be23ed4-kube-api-access-mhrtm\") pod \"network-metrics-daemon-8ww4n\" (UID: \"da0aa093-1adc-45f2-a942-e68d7be23ed4\") " 
pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.968573 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":
\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:01 crc kubenswrapper[5094]: I0220 06:47:01.985621 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc 
kubenswrapper[5094]: I0220 06:47:02.001291 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.001377 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.001422 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.001451 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.001520 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:02Z","lastTransitionTime":"2026-02-20T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.001738 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:01Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.016156 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\
\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.031200 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.043444 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs\") pod \"network-metrics-daemon-8ww4n\" (UID: \"da0aa093-1adc-45f2-a942-e68d7be23ed4\") " pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.043578 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mhrtm\" (UniqueName: \"kubernetes.io/projected/da0aa093-1adc-45f2-a942-e68d7be23ed4-kube-api-access-mhrtm\") pod \"network-metrics-daemon-8ww4n\" (UID: \"da0aa093-1adc-45f2-a942-e68d7be23ed4\") " pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.044051 5094 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.044378 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs podName:da0aa093-1adc-45f2-a942-e68d7be23ed4 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:02.54429127 +0000 UTC m=+37.416918191 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs") pod "network-metrics-daemon-8ww4n" (UID: "da0aa093-1adc-45f2-a942-e68d7be23ed4") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.050911 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.063936 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhrtm\" (UniqueName: 
\"kubernetes.io/projected/da0aa093-1adc-45f2-a942-e68d7be23ed4-kube-api-access-mhrtm\") pod \"network-metrics-daemon-8ww4n\" (UID: \"da0aa093-1adc-45f2-a942-e68d7be23ed4\") " pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.071294 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.091283 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.105819 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.106078 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.106152 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.106217 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.106273 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:02Z","lastTransitionTime":"2026-02-20T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.108225 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.138571 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:46:59Z\\\",\\\"message\\\":\\\"c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, 
AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.140\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 06:46:59.340011 6610 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:46:59.340052 6610 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:46:59.340153 6610 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c
89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.166028 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.186266 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.205672 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.211496 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.211562 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.211582 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.211612 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.211631 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:02Z","lastTransitionTime":"2026-02-20T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.223625 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.239981 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.256459 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.275005 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.289023 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" event={"ID":"60f18419-2e46-4911-bceb-d8651c9fac66","Type":"ContainerStarted","Data":"c8f6830864b9f0c23a91dff26fca798b670afe7c0316ae71ad386c027b8ce0bd"} Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.289111 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" event={"ID":"60f18419-2e46-4911-bceb-d8651c9fac66","Type":"ContainerStarted","Data":"c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63"} Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.289150 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" event={"ID":"60f18419-2e46-4911-bceb-d8651c9fac66","Type":"ContainerStarted","Data":"7c24a78f0ce2423b8e774c11a286a2e80caee250864166c38a97deed16cb5641"} Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.291076 5094 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.307086 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.314816 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.314880 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.314900 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.314928 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.314948 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:02Z","lastTransitionTime":"2026-02-20T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.322256 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.344650 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.380059 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc 
kubenswrapper[5094]: I0220 06:47:02.405863 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddb
fd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 
06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.417553 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.417798 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.417889 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.417966 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.418055 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:02Z","lastTransitionTime":"2026-02-20T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.428003 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.444932 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.444989 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.445008 5094 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.445030 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.445043 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:02Z","lastTransitionTime":"2026-02-20T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.454132 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.459211 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.463816 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.463854 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.463895 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.463915 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.463927 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:02Z","lastTransitionTime":"2026-02-20T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.470543 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z 
is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.483072 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.488135 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.488187 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.488201 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.488222 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.488237 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:02Z","lastTransitionTime":"2026-02-20T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.504805 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.506226 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.510231 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.510271 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.510285 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.510305 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.510324 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:02Z","lastTransitionTime":"2026-02-20T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.523146 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.525264 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.527978 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.528013 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.528024 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:02 crc 
kubenswrapper[5094]: I0220 06:47:02.528044 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.528057 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:02Z","lastTransitionTime":"2026-02-20T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.541352 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"kubelet has 
sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff54
7002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cd
fc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"nam
es\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.541478 5094 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.542927 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.543641 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.543680 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.543689 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.543719 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.543730 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:02Z","lastTransitionTime":"2026-02-20T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.549348 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs\") pod \"network-metrics-daemon-8ww4n\" (UID: \"da0aa093-1adc-45f2-a942-e68d7be23ed4\") " pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.549562 5094 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.549876 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs podName:da0aa093-1adc-45f2-a942-e68d7be23ed4 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:03.549617598 +0000 UTC m=+38.422244329 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs") pod "network-metrics-daemon-8ww4n" (UID: "da0aa093-1adc-45f2-a942-e68d7be23ed4") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.563341 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.582398 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:46:59Z\\\",\\\"message\\\":\\\"c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, 
AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.140\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 06:46:59.340011 6610 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:46:59.340052 6610 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:46:59.340153 6610 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c
89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.593347 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.608330 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.628030 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.643419 5094 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.652046 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.652119 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.652137 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.652170 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.652192 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:02Z","lastTransitionTime":"2026-02-20T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.657190 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.678369 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.700042 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614
a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.719418 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.738854 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.751579 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.751832 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.751923 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:47:18.751875112 +0000 UTC m=+53.624501863 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.752069 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.752085 5094 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.752140 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.752220 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:18.752185069 +0000 UTC m=+53.624811810 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.752277 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.752328 5094 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.752430 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.752473 5094 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:18.752445466 +0000 UTC m=+53.625072227 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.752490 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.752519 5094 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.752451 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.752626 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:18.752586249 +0000 UTC m=+53.625213020 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.752611 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.753116 5094 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.753164 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:18.753154943 +0000 UTC m=+53.625781654 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.755233 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.755306 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.755320 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.755350 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.755366 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:02Z","lastTransitionTime":"2026-02-20T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.756935 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z 
is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.780940 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02
-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.799009 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.804409 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 05:28:51.334857578 +0000 UTC Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.813338 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.835392 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.839251 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.839332 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.839390 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:02 crc kubenswrapper[5094]: E0220 06:47:02.839541 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.858889 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.859180 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.859245 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.859347 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.859411 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:02Z","lastTransitionTime":"2026-02-20T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.858854 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:46:59Z\\\",\\\"message\\\":\\\"c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, 
AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.140\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 06:46:59.340011 6610 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:46:59.340052 6610 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:46:59.340153 6610 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c
89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.879237 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.892856 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8f6830864b9f0c23a91dff26fca798b670af
e7c0316ae71ad386c027b8ce0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:02Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.963827 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.963893 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.963912 5094 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.963941 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:02 crc kubenswrapper[5094]: I0220 06:47:02.963961 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:02Z","lastTransitionTime":"2026-02-20T06:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.067561 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.067634 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.067657 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.067689 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.067742 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:03Z","lastTransitionTime":"2026-02-20T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.171819 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.171917 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.171944 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.171983 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.172005 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:03Z","lastTransitionTime":"2026-02-20T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.275805 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.275850 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.275862 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.275881 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.275893 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:03Z","lastTransitionTime":"2026-02-20T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.378583 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.378761 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.378797 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.378840 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.378867 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:03Z","lastTransitionTime":"2026-02-20T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.482461 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.482524 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.482543 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.482570 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.482589 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:03Z","lastTransitionTime":"2026-02-20T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.562032 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs\") pod \"network-metrics-daemon-8ww4n\" (UID: \"da0aa093-1adc-45f2-a942-e68d7be23ed4\") " pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:03 crc kubenswrapper[5094]: E0220 06:47:03.562256 5094 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 06:47:03 crc kubenswrapper[5094]: E0220 06:47:03.562379 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs podName:da0aa093-1adc-45f2-a942-e68d7be23ed4 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:05.562350194 +0000 UTC m=+40.434976935 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs") pod "network-metrics-daemon-8ww4n" (UID: "da0aa093-1adc-45f2-a942-e68d7be23ed4") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.585912 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.585983 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.586007 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.586039 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.586063 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:03Z","lastTransitionTime":"2026-02-20T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.690178 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.690424 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.690582 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.690760 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.690890 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:03Z","lastTransitionTime":"2026-02-20T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.795088 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.795147 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.795160 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.795179 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.795190 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:03Z","lastTransitionTime":"2026-02-20T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.804811 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 08:57:49.080080649 +0000 UTC Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.839633 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:03 crc kubenswrapper[5094]: E0220 06:47:03.839868 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.840479 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:03 crc kubenswrapper[5094]: E0220 06:47:03.840604 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.899132 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.899274 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.899343 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.899374 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:03 crc kubenswrapper[5094]: I0220 06:47:03.899435 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:03Z","lastTransitionTime":"2026-02-20T06:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.002352 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.002390 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.002403 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.002421 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.002434 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:04Z","lastTransitionTime":"2026-02-20T06:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.105416 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.106025 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.106258 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.106537 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.106630 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:04Z","lastTransitionTime":"2026-02-20T06:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.210523 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.210603 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.210623 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.210659 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.210679 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:04Z","lastTransitionTime":"2026-02-20T06:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.315063 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.315126 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.315148 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.315175 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.315194 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:04Z","lastTransitionTime":"2026-02-20T06:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.418774 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.418818 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.418829 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.418849 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.418861 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:04Z","lastTransitionTime":"2026-02-20T06:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.522581 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.522668 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.522695 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.522769 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.522793 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:04Z","lastTransitionTime":"2026-02-20T06:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.626093 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.626160 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.626179 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.626207 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.626226 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:04Z","lastTransitionTime":"2026-02-20T06:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.730532 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.730635 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.730654 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.730685 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.730737 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:04Z","lastTransitionTime":"2026-02-20T06:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.805610 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 09:59:03.639557747 +0000 UTC Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.834806 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.834880 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.834897 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.834930 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.834953 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:04Z","lastTransitionTime":"2026-02-20T06:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.839115 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.839174 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:04 crc kubenswrapper[5094]: E0220 06:47:04.839437 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:04 crc kubenswrapper[5094]: E0220 06:47:04.839594 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.938144 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.938223 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.938243 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.938273 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:04 crc kubenswrapper[5094]: I0220 06:47:04.938292 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:04Z","lastTransitionTime":"2026-02-20T06:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.041644 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.041882 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.041917 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.041952 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.041974 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:05Z","lastTransitionTime":"2026-02-20T06:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.146477 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.146604 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.146736 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.146768 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.146786 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:05Z","lastTransitionTime":"2026-02-20T06:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.250466 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.250560 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.250584 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.250620 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.250645 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:05Z","lastTransitionTime":"2026-02-20T06:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.354291 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.354370 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.354389 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.354420 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.354439 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:05Z","lastTransitionTime":"2026-02-20T06:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.458317 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.458392 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.458413 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.458442 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.458462 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:05Z","lastTransitionTime":"2026-02-20T06:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.561506 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.561587 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.561611 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.561644 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.561670 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:05Z","lastTransitionTime":"2026-02-20T06:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.596602 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs\") pod \"network-metrics-daemon-8ww4n\" (UID: \"da0aa093-1adc-45f2-a942-e68d7be23ed4\") " pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:05 crc kubenswrapper[5094]: E0220 06:47:05.596856 5094 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 06:47:05 crc kubenswrapper[5094]: E0220 06:47:05.596978 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs podName:da0aa093-1adc-45f2-a942-e68d7be23ed4 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:09.596944965 +0000 UTC m=+44.469571716 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs") pod "network-metrics-daemon-8ww4n" (UID: "da0aa093-1adc-45f2-a942-e68d7be23ed4") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.665238 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.665609 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.665853 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.666080 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.666232 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:05Z","lastTransitionTime":"2026-02-20T06:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.768907 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.768973 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.768993 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.769020 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.769041 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:05Z","lastTransitionTime":"2026-02-20T06:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.806298 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 19:02:48.86184556 +0000 UTC Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.839958 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:05 crc kubenswrapper[5094]: E0220 06:47:05.840186 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.840323 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:05 crc kubenswrapper[5094]: E0220 06:47:05.840496 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.874239 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.874349 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.874377 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.874410 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.874445 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:05Z","lastTransitionTime":"2026-02-20T06:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.877969 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:05Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.896969 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:05Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.918788 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:47:05Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.941870 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:05Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.976129 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:46:59Z\\\",\\\"message\\\":\\\"c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, 
AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.140\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 06:46:59.340011 6610 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:46:59.340052 6610 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:46:59.340153 6610 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c
89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:05Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.977755 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.977858 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.977884 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.977976 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:05 crc kubenswrapper[5094]: I0220 06:47:05.978040 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:05Z","lastTransitionTime":"2026-02-20T06:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.000389 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:05Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.020843 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8f6830864b9f0c23a91dff26fca798b670af
e7c0316ae71ad386c027b8ce0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:06Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.045090 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b1
5549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:06Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.062145 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:06Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:06 crc 
kubenswrapper[5094]: I0220 06:47:06.081413 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:06Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.083053 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.083116 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.083135 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.083161 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.083182 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:06Z","lastTransitionTime":"2026-02-20T06:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.103851 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:06Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.120003 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:06Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.135387 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:06Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.157245 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614
a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:06Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.178758 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:06Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.185623 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.185689 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.185741 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.185770 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.185790 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:06Z","lastTransitionTime":"2026-02-20T06:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.198873 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc
/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:06Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.219868 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:06Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.288949 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:06 crc 
kubenswrapper[5094]: I0220 06:47:06.289020 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.289039 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.289068 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.289091 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:06Z","lastTransitionTime":"2026-02-20T06:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.393139 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.393231 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.393255 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.393294 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.393317 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:06Z","lastTransitionTime":"2026-02-20T06:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.496201 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.496286 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.496313 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.496347 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.496371 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:06Z","lastTransitionTime":"2026-02-20T06:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.601012 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.601072 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.601087 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.601112 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.601129 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:06Z","lastTransitionTime":"2026-02-20T06:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.705730 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.705823 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.705848 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.705879 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.705901 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:06Z","lastTransitionTime":"2026-02-20T06:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.806775 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 09:19:36.548553188 +0000 UTC Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.811612 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.813956 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.814266 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.814483 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.814683 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:06Z","lastTransitionTime":"2026-02-20T06:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.839302 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:06 crc kubenswrapper[5094]: E0220 06:47:06.839466 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.839302 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:06 crc kubenswrapper[5094]: E0220 06:47:06.839965 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.919408 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.919477 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.919494 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.919523 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:06 crc kubenswrapper[5094]: I0220 06:47:06.919540 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:06Z","lastTransitionTime":"2026-02-20T06:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.022775 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.022837 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.022861 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.022890 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.022911 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:07Z","lastTransitionTime":"2026-02-20T06:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.125809 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.125884 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.125899 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.125920 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.125937 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:07Z","lastTransitionTime":"2026-02-20T06:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.229305 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.229389 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.229406 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.229437 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.229455 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:07Z","lastTransitionTime":"2026-02-20T06:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.332278 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.332451 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.332540 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.332589 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.332613 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:07Z","lastTransitionTime":"2026-02-20T06:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.437058 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.437137 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.437158 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.437186 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.437206 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:07Z","lastTransitionTime":"2026-02-20T06:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.539670 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.539731 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.539744 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.539759 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.539769 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:07Z","lastTransitionTime":"2026-02-20T06:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.643884 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.644027 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.644057 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.644097 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.644121 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:07Z","lastTransitionTime":"2026-02-20T06:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.747659 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.747888 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.747925 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.748014 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.748138 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:07Z","lastTransitionTime":"2026-02-20T06:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.807845 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 09:38:00.819387497 +0000 UTC Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.839687 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.839695 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:07 crc kubenswrapper[5094]: E0220 06:47:07.840010 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:07 crc kubenswrapper[5094]: E0220 06:47:07.840079 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.851568 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.851662 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.851678 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.851797 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.851814 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:07Z","lastTransitionTime":"2026-02-20T06:47:07Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.955596 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.955767 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.955793 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.955821 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:07 crc kubenswrapper[5094]: I0220 06:47:07.955844 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:07Z","lastTransitionTime":"2026-02-20T06:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.058926 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.059017 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.059039 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.059077 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.059099 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:08Z","lastTransitionTime":"2026-02-20T06:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.162326 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.162393 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.162406 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.162427 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.162440 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:08Z","lastTransitionTime":"2026-02-20T06:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.265518 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.265575 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.265587 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.265610 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.265629 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:08Z","lastTransitionTime":"2026-02-20T06:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.367697 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.367812 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.367837 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.367866 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.367887 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:08Z","lastTransitionTime":"2026-02-20T06:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.472410 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.472493 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.472512 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.472540 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.472562 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:08Z","lastTransitionTime":"2026-02-20T06:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.576401 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.576465 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.576504 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.576537 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.576560 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:08Z","lastTransitionTime":"2026-02-20T06:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.679640 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.679693 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.679736 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.679761 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.679781 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:08Z","lastTransitionTime":"2026-02-20T06:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.782485 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.782549 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.782567 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.782662 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.782685 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:08Z","lastTransitionTime":"2026-02-20T06:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.808402 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 03:17:32.287849881 +0000 UTC
Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.840036 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 20 06:47:08 crc kubenswrapper[5094]: E0220 06:47:08.840210 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.840406 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 20 06:47:08 crc kubenswrapper[5094]: E0220 06:47:08.840862 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.886458 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.886515 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.886527 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.886547 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.886562 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:08Z","lastTransitionTime":"2026-02-20T06:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.990543 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.990611 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.990626 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.990649 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:08 crc kubenswrapper[5094]: I0220 06:47:08.990665 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:08Z","lastTransitionTime":"2026-02-20T06:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.093110 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.093162 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.093177 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.093199 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.093211 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:09Z","lastTransitionTime":"2026-02-20T06:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.196944 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.197375 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.197523 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.197681 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.197843 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:09Z","lastTransitionTime":"2026-02-20T06:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.302054 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.302124 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.302143 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.302171 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.302191 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:09Z","lastTransitionTime":"2026-02-20T06:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.405849 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.405909 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.405928 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.405960 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.405981 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:09Z","lastTransitionTime":"2026-02-20T06:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.510254 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.510329 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.510347 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.510376 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.510401 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:09Z","lastTransitionTime":"2026-02-20T06:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.613776 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.614108 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.614449 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.614578 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.614801 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:09Z","lastTransitionTime":"2026-02-20T06:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.652777 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs\") pod \"network-metrics-daemon-8ww4n\" (UID: \"da0aa093-1adc-45f2-a942-e68d7be23ed4\") " pod="openshift-multus/network-metrics-daemon-8ww4n"
Feb 20 06:47:09 crc kubenswrapper[5094]: E0220 06:47:09.652944 5094 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 20 06:47:09 crc kubenswrapper[5094]: E0220 06:47:09.653345 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs podName:da0aa093-1adc-45f2-a942-e68d7be23ed4 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:17.65331649 +0000 UTC m=+52.525943241 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs") pod "network-metrics-daemon-8ww4n" (UID: "da0aa093-1adc-45f2-a942-e68d7be23ed4") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.718872 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.718925 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.718942 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.718968 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.718985 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:09Z","lastTransitionTime":"2026-02-20T06:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.808898 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 03:59:50.121048578 +0000 UTC
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.822319 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.822397 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.822418 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.822452 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.822472 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:09Z","lastTransitionTime":"2026-02-20T06:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.839487 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.839540 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 20 06:47:09 crc kubenswrapper[5094]: E0220 06:47:09.839746 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4"
Feb 20 06:47:09 crc kubenswrapper[5094]: E0220 06:47:09.839910 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.926683 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.926780 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.926797 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.926826 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:09 crc kubenswrapper[5094]: I0220 06:47:09.926844 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:09Z","lastTransitionTime":"2026-02-20T06:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.030361 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.030429 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.030447 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.030476 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.030501 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:10Z","lastTransitionTime":"2026-02-20T06:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.134451 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.134534 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.134552 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.134608 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.134628 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:10Z","lastTransitionTime":"2026-02-20T06:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.238211 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.238827 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.239021 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.239201 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.239367 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:10Z","lastTransitionTime":"2026-02-20T06:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.343194 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.343246 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.343260 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.343280 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.343293 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:10Z","lastTransitionTime":"2026-02-20T06:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.446525 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.446573 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.446591 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.446653 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.446671 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:10Z","lastTransitionTime":"2026-02-20T06:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.551011 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.551398 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.551612 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.551862 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.552100 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:10Z","lastTransitionTime":"2026-02-20T06:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.656084 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.656175 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.656199 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.656232 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.656253 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:10Z","lastTransitionTime":"2026-02-20T06:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.759810 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.760145 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.760341 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.760527 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.760773 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:10Z","lastTransitionTime":"2026-02-20T06:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.810739 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 21:12:01.892840054 +0000 UTC
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.839332 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.839372 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 20 06:47:10 crc kubenswrapper[5094]: E0220 06:47:10.839529 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 20 06:47:10 crc kubenswrapper[5094]: E0220 06:47:10.839678 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.864670 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.864820 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.864847 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.864880 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.864904 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:10Z","lastTransitionTime":"2026-02-20T06:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.968567 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.968641 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.968662 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.968697 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:10 crc kubenswrapper[5094]: I0220 06:47:10.968765 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:10Z","lastTransitionTime":"2026-02-20T06:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.072432 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.072520 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.072546 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.072583 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.072606 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:11Z","lastTransitionTime":"2026-02-20T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.176636 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.176699 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.177000 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.177038 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.177349 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:11Z","lastTransitionTime":"2026-02-20T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.282201 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.282585 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.282852 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.283056 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.283236 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:11Z","lastTransitionTime":"2026-02-20T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.387064 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.387132 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.387151 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.387179 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.387199 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:11Z","lastTransitionTime":"2026-02-20T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.490368 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.490923 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.491078 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.491258 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.491402 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:11Z","lastTransitionTime":"2026-02-20T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.595497 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.595586 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.595606 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.595642 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.595672 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:11Z","lastTransitionTime":"2026-02-20T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.700466 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.701156 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.701224 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.701273 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.701302 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:11Z","lastTransitionTime":"2026-02-20T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.804313 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.804375 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.804394 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.804421 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.804443 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:11Z","lastTransitionTime":"2026-02-20T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.811514 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 01:20:13.008437076 +0000 UTC Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.840231 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.840393 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:11 crc kubenswrapper[5094]: E0220 06:47:11.840997 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.841081 5094 scope.go:117] "RemoveContainer" containerID="714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9" Feb 20 06:47:11 crc kubenswrapper[5094]: E0220 06:47:11.841190 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.908421 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.908473 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.908490 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.908516 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:11 crc kubenswrapper[5094]: I0220 06:47:11.908535 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:11Z","lastTransitionTime":"2026-02-20T06:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.013178 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.013248 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.013273 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.013315 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.013340 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:12Z","lastTransitionTime":"2026-02-20T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.116777 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.116824 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.116841 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.116866 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.116885 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:12Z","lastTransitionTime":"2026-02-20T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.221279 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.221363 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.221381 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.221411 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.221432 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:12Z","lastTransitionTime":"2026-02-20T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.325300 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.325368 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.325388 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.325416 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.325433 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:12Z","lastTransitionTime":"2026-02-20T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.334262 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/1.log" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.338521 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerStarted","Data":"210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca"} Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.339405 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.364801 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa388
11c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.390644 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.411755 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.437360 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.437416 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.437431 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.437455 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.437470 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:12Z","lastTransitionTime":"2026-02-20T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.438887 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.463801 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.483460 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc 
kubenswrapper[5094]: I0220 06:47:12.502808 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddb
fd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 
06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.519553 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.530346 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.539805 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.539846 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.539859 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:12 crc 
kubenswrapper[5094]: I0220 06:47:12.539879 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.539891 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:12Z","lastTransitionTime":"2026-02-20T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.543568 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.557478 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.571083 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.584373 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.612670 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:46:59Z\\\",\\\"message\\\":\\\"c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, 
AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.140\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 06:46:59.340011 6610 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:46:59.340052 6610 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:46:59.340153 6610 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.642845 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.642891 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.642905 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.642925 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.642938 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:12Z","lastTransitionTime":"2026-02-20T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.645045 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.658353 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8f6830864b9f0c23a91dff26fca798b670af
e7c0316ae71ad386c027b8ce0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.669984 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.747749 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.748204 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.748247 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.748280 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.748301 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:12Z","lastTransitionTime":"2026-02-20T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.811912 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 14:58:53.881092253 +0000 UTC Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.840214 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.840274 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:12 crc kubenswrapper[5094]: E0220 06:47:12.840455 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:12 crc kubenswrapper[5094]: E0220 06:47:12.840609 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.852059 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.852114 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.852131 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.852157 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.852180 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:12Z","lastTransitionTime":"2026-02-20T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.858325 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.858361 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.858378 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.858405 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.858424 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:12Z","lastTransitionTime":"2026-02-20T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:12 crc kubenswrapper[5094]: E0220 06:47:12.880832 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.886611 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.886688 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.886748 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.886786 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.886812 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:12Z","lastTransitionTime":"2026-02-20T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:12 crc kubenswrapper[5094]: E0220 06:47:12.910269 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.917351 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.917420 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.917446 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.917480 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.917506 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:12Z","lastTransitionTime":"2026-02-20T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:12 crc kubenswrapper[5094]: E0220 06:47:12.936353 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [...] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.943202 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.943241 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.943253 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.943273 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.943285 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:12Z","lastTransitionTime":"2026-02-20T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:12 crc kubenswrapper[5094]: E0220 06:47:12.960860 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [...] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.967079 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.967141 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.967158 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.967192 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.967213 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:12Z","lastTransitionTime":"2026-02-20T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:12 crc kubenswrapper[5094]: E0220 06:47:12.983043 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:12 crc kubenswrapper[5094]: E0220 06:47:12.983406 5094 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.985617 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.985873 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.986047 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.986391 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:12 crc kubenswrapper[5094]: I0220 06:47:12.986692 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:12Z","lastTransitionTime":"2026-02-20T06:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.091455 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.091535 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.091560 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.091610 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.091634 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:13Z","lastTransitionTime":"2026-02-20T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.195418 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.195496 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.195516 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.195550 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.195578 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:13Z","lastTransitionTime":"2026-02-20T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.299955 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.300012 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.300030 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.300052 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.300071 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:13Z","lastTransitionTime":"2026-02-20T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.347556 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/2.log" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.349073 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/1.log" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.354478 5094 generic.go:334] "Generic (PLEG): container finished" podID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerID="210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca" exitCode=1 Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.354546 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerDied","Data":"210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca"} Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.354624 5094 scope.go:117] "RemoveContainer" containerID="714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.355826 5094 scope.go:117] "RemoveContainer" containerID="210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca" Feb 20 06:47:13 crc kubenswrapper[5094]: E0220 06:47:13.356159 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.379129 5094 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.401604 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.404983 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.405099 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.405178 5094 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.405249 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.405313 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:13Z","lastTransitionTime":"2026-02-20T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.418693 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.438274 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.463530 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a
2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.480769 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.504324 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.509109 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.509179 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.509198 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.509232 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 
20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.509255 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:13Z","lastTransitionTime":"2026-02-20T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.526602 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\
\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.549811 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.570792 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.600615 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.625984 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.626054 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.626071 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.626097 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.626112 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:13Z","lastTransitionTime":"2026-02-20T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.649727 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.673110 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.693829 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.717838 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://714aa4516e8be59d4b1e3755843c3866d4d35134c8a65efca5d796cdec581ba9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:46:59Z\\\",\\\"message\\\":\\\"c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, 
AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.140\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0220 06:46:59.340011 6610 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:46:59.340052 6610 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:46:59.340153 6610 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"ddLogicalPort failed for openshift-multus/network-metrics-daemon-8ww4n: failed to update pod openshift-multus/network-metrics-daemon-8ww4n: Internal error occurred: failed calling webhook \\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z\\\\nI0220 06:47:12.918393 6814 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 06:47:12.918801 6814 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0220 06:47:12.918936 6814 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 06:47:12.918990 6814 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0220 06:47:12.919228 6814 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 06:47:12.919287 6814 factory.go:656] Stopping watch factory\\\\nI0220 06:47:12.919306 6814 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:47:12.919342 6814 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 06:47:12.919361 6814 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:47:12.919443 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:47:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316
f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.729031 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.729160 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.729267 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.729379 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.729468 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:13Z","lastTransitionTime":"2026-02-20T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.734456 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.746097 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8f6830864b9f0c23a91dff26fca798b670af
e7c0316ae71ad386c027b8ce0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:13Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.813203 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 07:41:31.611712625 +0000 UTC Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.833528 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:13 crc 
kubenswrapper[5094]: I0220 06:47:13.833668 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.833837 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.833969 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.834039 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:13Z","lastTransitionTime":"2026-02-20T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.840047 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:13 crc kubenswrapper[5094]: E0220 06:47:13.840257 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.840088 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:13 crc kubenswrapper[5094]: E0220 06:47:13.840499 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.937840 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.937900 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.937920 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.937947 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:13 crc kubenswrapper[5094]: I0220 06:47:13.937966 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:13Z","lastTransitionTime":"2026-02-20T06:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.042249 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.042324 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.042345 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.042373 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.042395 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:14Z","lastTransitionTime":"2026-02-20T06:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.146263 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.146343 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.146361 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.146392 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.146412 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:14Z","lastTransitionTime":"2026-02-20T06:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.249379 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.249450 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.249468 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.249498 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.249516 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:14Z","lastTransitionTime":"2026-02-20T06:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.353276 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.353355 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.353373 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.353406 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.353426 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:14Z","lastTransitionTime":"2026-02-20T06:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.361544 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/2.log" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.368567 5094 scope.go:117] "RemoveContainer" containerID="210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca" Feb 20 06:47:14 crc kubenswrapper[5094]: E0220 06:47:14.368890 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.390862 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.408506 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8f6830864b9f0c23a91dff26fca798b670af
e7c0316ae71ad386c027b8ce0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.435081 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.455422 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a
2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.456294 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.456349 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.456366 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.456391 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.456411 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:14Z","lastTransitionTime":"2026-02-20T06:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.468449 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc 
kubenswrapper[5094]: I0220 06:47:14.485810 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.506873 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.539201 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.559938 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.560038 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.560065 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.560112 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.560131 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:14Z","lastTransitionTime":"2026-02-20T06:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.562801 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z 
is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.588549 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"
}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c98711
7ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.608847 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.629521 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.663858 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.664092 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.664197 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:14 crc 
kubenswrapper[5094]: I0220 06:47:14.663944 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"ddLogicalPort failed for openshift-multus/network-metrics-daemon-8ww4n: failed to update pod openshift-multus/network-metrics-daemon-8ww4n: Internal error occurred: failed calling webhook \\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z\\\\nI0220 06:47:12.918393 6814 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 06:47:12.918801 6814 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0220 06:47:12.918936 6814 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 06:47:12.918990 6814 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0220 06:47:12.919228 6814 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 06:47:12.919287 6814 factory.go:656] Stopping watch factory\\\\nI0220 06:47:12.919306 6814 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:47:12.919342 6814 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 06:47:12.919361 6814 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:47:12.919443 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:47:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c
89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.664296 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.664507 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:14Z","lastTransitionTime":"2026-02-20T06:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.703142 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.724411 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.746259 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.768043 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.768114 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.768136 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.768166 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.768186 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:14Z","lastTransitionTime":"2026-02-20T06:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.769617 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:14Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.813585 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 04:34:20.82517273 +0000 UTC Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.839588 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.839697 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:14 crc kubenswrapper[5094]: E0220 06:47:14.839813 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:14 crc kubenswrapper[5094]: E0220 06:47:14.839910 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.871539 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.872006 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.872187 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.872361 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.872520 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:14Z","lastTransitionTime":"2026-02-20T06:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.976559 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.976624 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.976643 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.976671 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:14 crc kubenswrapper[5094]: I0220 06:47:14.976690 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:14Z","lastTransitionTime":"2026-02-20T06:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.080378 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.080466 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.080491 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.080524 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.080545 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:15Z","lastTransitionTime":"2026-02-20T06:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.184369 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.184875 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.185060 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.185226 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.185381 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:15Z","lastTransitionTime":"2026-02-20T06:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.289558 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.289639 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.289662 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.289695 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.289753 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:15Z","lastTransitionTime":"2026-02-20T06:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.393490 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.393564 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.393584 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.393616 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.393636 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:15Z","lastTransitionTime":"2026-02-20T06:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.504783 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.505313 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.505823 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.506295 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.506656 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:15Z","lastTransitionTime":"2026-02-20T06:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.610901 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.610992 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.611017 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.611082 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.611109 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:15Z","lastTransitionTime":"2026-02-20T06:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.715295 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.715363 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.715381 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.715409 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.715428 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:15Z","lastTransitionTime":"2026-02-20T06:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.814929 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 04:53:42.292504849 +0000 UTC Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.819261 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.819327 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.819346 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.819375 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.819397 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:15Z","lastTransitionTime":"2026-02-20T06:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.839300 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.839372 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:15 crc kubenswrapper[5094]: E0220 06:47:15.839585 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:15 crc kubenswrapper[5094]: E0220 06:47:15.839791 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.868510 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:15Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.892812 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:15Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.912689 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:47:15Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.923442 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.923505 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.923526 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.923555 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.923573 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:15Z","lastTransitionTime":"2026-02-20T06:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.936619 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:15Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.971625 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"ddLogicalPort failed for openshift-multus/network-metrics-daemon-8ww4n: failed to update pod openshift-multus/network-metrics-daemon-8ww4n: Internal error occurred: failed calling webhook \\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z\\\\nI0220 06:47:12.918393 6814 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 06:47:12.918801 6814 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0220 06:47:12.918936 6814 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 06:47:12.918990 6814 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0220 06:47:12.919228 6814 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 06:47:12.919287 6814 factory.go:656] Stopping watch factory\\\\nI0220 06:47:12.919306 6814 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:47:12.919342 6814 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 06:47:12.919361 6814 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:47:12.919443 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:47:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c
89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:15Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:15 crc kubenswrapper[5094]: I0220 06:47:15.995251 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:15Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.015559 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8f6830864b9f0c23a91dff26fca798b670af
e7c0316ae71ad386c027b8ce0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.026913 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.027252 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.027477 5094 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.027696 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.027919 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:16Z","lastTransitionTime":"2026-02-20T06:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.032608 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 
20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.056195 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.079239 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.093018 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.094779 5094 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192
.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.105934 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.112192 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df51
0e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.131391 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.131462 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.131485 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:16 crc 
kubenswrapper[5094]: I0220 06:47:16.131521 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.131548 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:16Z","lastTransitionTime":"2026-02-20T06:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.134374 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81
7c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.161219 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614
a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.180657 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.199339 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.230614 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.235197 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:16 crc 
kubenswrapper[5094]: I0220 06:47:16.235235 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.235284 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.235312 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.235334 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:16Z","lastTransitionTime":"2026-02-20T06:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.254737 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.280049 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:
45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete 
has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741
084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.304334 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.325683 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.339948 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.340033 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.340056 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:16 crc 
kubenswrapper[5094]: I0220 06:47:16.340087 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.340105 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:16Z","lastTransitionTime":"2026-02-20T06:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.363417 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"ddLogicalPort failed for openshift-multus/network-metrics-daemon-8ww4n: failed to update pod openshift-multus/network-metrics-daemon-8ww4n: Internal error occurred: failed calling webhook \\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z\\\\nI0220 06:47:12.918393 6814 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 06:47:12.918801 6814 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0220 06:47:12.918936 6814 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 06:47:12.918990 6814 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0220 06:47:12.919228 6814 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 06:47:12.919287 6814 factory.go:656] Stopping watch factory\\\\nI0220 06:47:12.919306 6814 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:47:12.919342 6814 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 06:47:12.919361 6814 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:47:12.919443 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:47:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c
89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.402098 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.423068 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.443837 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.443919 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.443940 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:16 crc 
kubenswrapper[5094]: I0220 06:47:16.443974 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.443993 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:16Z","lastTransitionTime":"2026-02-20T06:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.484825 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.509550 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.531040 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.550355 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.550560 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.550731 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.550887 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.551024 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:16Z","lastTransitionTime":"2026-02-20T06:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.551328 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"
ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8f6830864b9f0c23a91dff26fca798b670afe7c0316ae71ad386c027b8ce0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.569157 5094 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.595034 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\
\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\
\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"
finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.614964 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc 
kubenswrapper[5094]: I0220 06:47:16.634426 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"089ad2b1-a8b5-4a97-ad0e-a7912d97c2b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3f535bdf7005ade8abbf6f234ddbc9c15136792c8ebe8a3646e9dadedea986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d72e26a84d5625d799baf5fcd573d245475eb954a13456bf1813c0c863dc5d\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d40e9d51da95e9023d99a2b2cdc4aa1a6d6755d0110393e173ee57fe9bfb74ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.656239 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.656449 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.656589 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.656774 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.656938 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:16Z","lastTransitionTime":"2026-02-20T06:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.657644 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.679371 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.699010 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:16Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.760377 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.760438 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.760457 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.760485 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.760505 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:16Z","lastTransitionTime":"2026-02-20T06:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.815373 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 13:38:06.980224335 +0000 UTC
Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.840075 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 20 06:47:16 crc kubenswrapper[5094]: E0220 06:47:16.840320 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.840632 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 20 06:47:16 crc kubenswrapper[5094]: E0220 06:47:16.841034 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.863984 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.864491 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.864750 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.864966 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.865118 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:16Z","lastTransitionTime":"2026-02-20T06:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.968527 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.969054 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.969244 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.969416 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:16 crc kubenswrapper[5094]: I0220 06:47:16.969615 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:16Z","lastTransitionTime":"2026-02-20T06:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.072618 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.073106 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.073277 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.073451 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.073584 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:17Z","lastTransitionTime":"2026-02-20T06:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.177697 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.177802 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.177820 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.177849 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.177867 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:17Z","lastTransitionTime":"2026-02-20T06:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.281767 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.281830 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.281848 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.281874 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.281892 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:17Z","lastTransitionTime":"2026-02-20T06:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.384847 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.384920 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.384938 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.384966 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.384984 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:17Z","lastTransitionTime":"2026-02-20T06:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.488694 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.488800 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.488829 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.488857 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.488887 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:17Z","lastTransitionTime":"2026-02-20T06:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.592423 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.592479 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.592498 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.592525 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.592548 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:17Z","lastTransitionTime":"2026-02-20T06:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.663190 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs\") pod \"network-metrics-daemon-8ww4n\" (UID: \"da0aa093-1adc-45f2-a942-e68d7be23ed4\") " pod="openshift-multus/network-metrics-daemon-8ww4n"
Feb 20 06:47:17 crc kubenswrapper[5094]: E0220 06:47:17.663453 5094 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 20 06:47:17 crc kubenswrapper[5094]: E0220 06:47:17.663588 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs podName:da0aa093-1adc-45f2-a942-e68d7be23ed4 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:33.663559149 +0000 UTC m=+68.536185870 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs") pod "network-metrics-daemon-8ww4n" (UID: "da0aa093-1adc-45f2-a942-e68d7be23ed4") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.695499 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.695553 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.695572 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.695599 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.695620 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:17Z","lastTransitionTime":"2026-02-20T06:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.799388 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.799466 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.799488 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.799524 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.799549 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:17Z","lastTransitionTime":"2026-02-20T06:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.816073 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 19:01:02.41531012 +0000 UTC
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.839479 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.839557 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n"
Feb 20 06:47:17 crc kubenswrapper[5094]: E0220 06:47:17.839681 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 20 06:47:17 crc kubenswrapper[5094]: E0220 06:47:17.839975 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.903303 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.903355 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.903368 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.903391 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:17 crc kubenswrapper[5094]: I0220 06:47:17.903408 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:17Z","lastTransitionTime":"2026-02-20T06:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.006609 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.006665 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.006684 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.006736 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.006757 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:18Z","lastTransitionTime":"2026-02-20T06:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.110504 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.110579 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.110597 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.110624 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.110644 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:18Z","lastTransitionTime":"2026-02-20T06:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.213761 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.213840 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.213858 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.213895 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.213919 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:18Z","lastTransitionTime":"2026-02-20T06:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.317119 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.317442 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.317462 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.317528 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.317550 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:18Z","lastTransitionTime":"2026-02-20T06:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.421822 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.421912 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.421940 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.421978 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.422001 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:18Z","lastTransitionTime":"2026-02-20T06:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.525943 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.526047 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.526070 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.526134 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.526248 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:18Z","lastTransitionTime":"2026-02-20T06:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.630277 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.630352 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.630372 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.630403 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.630424 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:18Z","lastTransitionTime":"2026-02-20T06:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.733965 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.734043 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.734066 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.734099 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.734122 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:18Z","lastTransitionTime":"2026-02-20T06:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.779118 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.779315 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:18 crc kubenswrapper[5094]: E0220 06:47:18.779369 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:47:50.779329178 +0000 UTC m=+85.651955929 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.779423 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:18 crc kubenswrapper[5094]: E0220 06:47:18.779488 5094 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.779494 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:18 crc kubenswrapper[5094]: E0220 06:47:18.779617 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:50.779579394 +0000 UTC m=+85.652206145 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:47:18 crc kubenswrapper[5094]: E0220 06:47:18.779683 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:47:18 crc kubenswrapper[5094]: E0220 06:47:18.779755 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:47:18 crc kubenswrapper[5094]: E0220 06:47:18.779778 5094 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:47:18 crc kubenswrapper[5094]: E0220 06:47:18.779804 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.779674 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:18 crc kubenswrapper[5094]: E0220 06:47:18.779871 5094 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:47:18 crc kubenswrapper[5094]: E0220 06:47:18.779896 5094 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:47:18 crc kubenswrapper[5094]: E0220 06:47:18.779828 5094 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:47:18 crc kubenswrapper[5094]: E0220 06:47:18.779839 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:50.779823351 +0000 UTC m=+85.652450092 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:47:18 crc kubenswrapper[5094]: E0220 06:47:18.780091 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:50.780071947 +0000 UTC m=+85.652698688 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:47:18 crc kubenswrapper[5094]: E0220 06:47:18.780138 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:47:50.780123318 +0000 UTC m=+85.652750059 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.817281 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 07:01:30.418132918 +0000 UTC Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.837645 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.837750 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.837771 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.837804 5094 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.837826 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:18Z","lastTransitionTime":"2026-02-20T06:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.839865 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.839924 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:18 crc kubenswrapper[5094]: E0220 06:47:18.840083 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:18 crc kubenswrapper[5094]: E0220 06:47:18.840289 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.942086 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.942146 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.942165 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.942194 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:18 crc kubenswrapper[5094]: I0220 06:47:18.942213 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:18Z","lastTransitionTime":"2026-02-20T06:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.045758 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.045822 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.045841 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.045869 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.045949 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:19Z","lastTransitionTime":"2026-02-20T06:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.149960 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.150035 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.150056 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.150084 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.150103 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:19Z","lastTransitionTime":"2026-02-20T06:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.253944 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.254033 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.254057 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.254092 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.254117 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:19Z","lastTransitionTime":"2026-02-20T06:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.357831 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.357903 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.357928 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.357959 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.357980 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:19Z","lastTransitionTime":"2026-02-20T06:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.461825 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.461902 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.461920 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.461952 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.461971 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:19Z","lastTransitionTime":"2026-02-20T06:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.566037 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.566212 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.566242 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.566269 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.566291 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:19Z","lastTransitionTime":"2026-02-20T06:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.670534 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.670600 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.670621 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.670664 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.670683 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:19Z","lastTransitionTime":"2026-02-20T06:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.774601 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.774666 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.774685 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.774747 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.774770 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:19Z","lastTransitionTime":"2026-02-20T06:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.818406 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 08:28:36.466273603 +0000 UTC Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.839986 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.840071 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:19 crc kubenswrapper[5094]: E0220 06:47:19.840217 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:19 crc kubenswrapper[5094]: E0220 06:47:19.840382 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.885739 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.885861 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.885889 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.886653 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.886748 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:19Z","lastTransitionTime":"2026-02-20T06:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.990619 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.990742 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.990763 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.990792 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:19 crc kubenswrapper[5094]: I0220 06:47:19.990814 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:19Z","lastTransitionTime":"2026-02-20T06:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.095376 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.095454 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.095474 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.095503 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.095531 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:20Z","lastTransitionTime":"2026-02-20T06:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.199501 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.199577 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.199597 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.199629 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.199653 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:20Z","lastTransitionTime":"2026-02-20T06:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.303285 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.303363 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.303382 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.303409 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.303430 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:20Z","lastTransitionTime":"2026-02-20T06:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.406436 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.406512 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.406531 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.406558 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.406576 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:20Z","lastTransitionTime":"2026-02-20T06:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.509369 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.509439 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.509458 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.509486 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.509506 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:20Z","lastTransitionTime":"2026-02-20T06:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.612917 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.612994 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.613012 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.613043 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.613063 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:20Z","lastTransitionTime":"2026-02-20T06:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.716559 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.716632 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.716651 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.716682 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.716739 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:20Z","lastTransitionTime":"2026-02-20T06:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.818571 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 12:35:13.106632418 +0000 UTC Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.820480 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.820556 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.820583 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.820615 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.820636 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:20Z","lastTransitionTime":"2026-02-20T06:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.839144 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.839238 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:20 crc kubenswrapper[5094]: E0220 06:47:20.839363 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:20 crc kubenswrapper[5094]: E0220 06:47:20.839488 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.924026 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.924086 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.924105 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.924136 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:20 crc kubenswrapper[5094]: I0220 06:47:20.924154 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:20Z","lastTransitionTime":"2026-02-20T06:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.027946 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.028048 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.028069 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.028131 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.028154 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:21Z","lastTransitionTime":"2026-02-20T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.131424 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.131473 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.131491 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.131517 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.131536 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:21Z","lastTransitionTime":"2026-02-20T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.233853 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.233935 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.233954 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.233985 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.234005 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:21Z","lastTransitionTime":"2026-02-20T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.337765 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.337837 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.337856 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.337887 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.337908 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:21Z","lastTransitionTime":"2026-02-20T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.441419 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.441491 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.441510 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.441588 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.441621 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:21Z","lastTransitionTime":"2026-02-20T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.545882 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.545958 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.545978 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.546008 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.546035 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:21Z","lastTransitionTime":"2026-02-20T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.649981 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.650055 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.650077 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.650104 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.650123 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:21Z","lastTransitionTime":"2026-02-20T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.754072 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.754134 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.754150 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.754174 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.754196 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:21Z","lastTransitionTime":"2026-02-20T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.818799 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 09:21:07.278432493 +0000 UTC Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.839757 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.839929 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:21 crc kubenswrapper[5094]: E0220 06:47:21.840170 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:21 crc kubenswrapper[5094]: E0220 06:47:21.840492 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.857095 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.857134 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.857146 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.857163 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.857176 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:21Z","lastTransitionTime":"2026-02-20T06:47:21Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.960063 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.960127 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.960145 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.960172 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:21 crc kubenswrapper[5094]: I0220 06:47:21.960194 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:21Z","lastTransitionTime":"2026-02-20T06:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.063156 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.063684 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.063743 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.063776 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.063798 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:22Z","lastTransitionTime":"2026-02-20T06:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.168338 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.168420 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.168440 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.168467 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.168488 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:22Z","lastTransitionTime":"2026-02-20T06:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.271469 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.271541 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.271561 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.271588 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.271610 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:22Z","lastTransitionTime":"2026-02-20T06:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.375345 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.375429 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.375455 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.375495 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.375520 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:22Z","lastTransitionTime":"2026-02-20T06:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.478506 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.478577 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.478595 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.478626 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.478646 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:22Z","lastTransitionTime":"2026-02-20T06:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.581921 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.581999 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.582022 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.582052 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.582073 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:22Z","lastTransitionTime":"2026-02-20T06:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.686251 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.686307 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.686319 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.686341 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.686356 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:22Z","lastTransitionTime":"2026-02-20T06:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.789914 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.789984 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.790004 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.790036 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.790057 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:22Z","lastTransitionTime":"2026-02-20T06:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.819312 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 04:24:52.401242861 +0000 UTC Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.839983 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.840043 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:22 crc kubenswrapper[5094]: E0220 06:47:22.840184 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:22 crc kubenswrapper[5094]: E0220 06:47:22.840384 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.893455 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.893509 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.893529 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.893559 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.893582 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:22Z","lastTransitionTime":"2026-02-20T06:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.996632 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.996680 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.996698 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.996761 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:22 crc kubenswrapper[5094]: I0220 06:47:22.996785 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:22Z","lastTransitionTime":"2026-02-20T06:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.101123 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.101176 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.101194 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.101218 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.101237 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:23Z","lastTransitionTime":"2026-02-20T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.104510 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.104569 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.104586 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.104612 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.104630 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:23Z","lastTransitionTime":"2026-02-20T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:23 crc kubenswrapper[5094]: E0220 06:47:23.127889 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:23Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.133945 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.134057 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.134122 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.134154 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.134213 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:23Z","lastTransitionTime":"2026-02-20T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:23 crc kubenswrapper[5094]: E0220 06:47:23.157447 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:23Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.163461 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.163526 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.163544 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.163574 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.163593 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:23Z","lastTransitionTime":"2026-02-20T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:23 crc kubenswrapper[5094]: E0220 06:47:23.187067 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:23Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.192683 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.192769 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.192787 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.192811 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.192828 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:23Z","lastTransitionTime":"2026-02-20T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:23 crc kubenswrapper[5094]: E0220 06:47:23.214230 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:23Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.220361 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.220440 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.220464 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.220503 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.220524 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:23Z","lastTransitionTime":"2026-02-20T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:23 crc kubenswrapper[5094]: E0220 06:47:23.242999 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:23Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:23 crc kubenswrapper[5094]: E0220 06:47:23.243274 5094 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.245569 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.245624 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.245642 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.245666 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.245685 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:23Z","lastTransitionTime":"2026-02-20T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.349527 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.349604 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.349623 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.349655 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.349678 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:23Z","lastTransitionTime":"2026-02-20T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.453818 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.453902 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.453923 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.453954 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.453973 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:23Z","lastTransitionTime":"2026-02-20T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.557435 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.557524 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.557552 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.557589 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.557637 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:23Z","lastTransitionTime":"2026-02-20T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.662456 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.662858 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.663071 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.663282 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.663442 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:23Z","lastTransitionTime":"2026-02-20T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.768799 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.768898 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.768918 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.768950 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.768974 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:23Z","lastTransitionTime":"2026-02-20T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.820342 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 01:27:12.749824603 +0000 UTC Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.841135 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.841219 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:23 crc kubenswrapper[5094]: E0220 06:47:23.841403 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:23 crc kubenswrapper[5094]: E0220 06:47:23.841660 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.872981 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.873294 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.873448 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.873593 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.873788 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:23Z","lastTransitionTime":"2026-02-20T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.976963 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.977018 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.977036 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.977063 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:23 crc kubenswrapper[5094]: I0220 06:47:23.977086 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:23Z","lastTransitionTime":"2026-02-20T06:47:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.080929 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.080998 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.081017 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.081044 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.081065 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:24Z","lastTransitionTime":"2026-02-20T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.184448 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.184537 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.184559 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.184590 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.184611 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:24Z","lastTransitionTime":"2026-02-20T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.288165 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.288457 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.288529 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.288600 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.288671 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:24Z","lastTransitionTime":"2026-02-20T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.392127 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.392481 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.392549 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.392623 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.392688 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:24Z","lastTransitionTime":"2026-02-20T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.496525 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.496610 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.496635 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.496674 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.496699 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:24Z","lastTransitionTime":"2026-02-20T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.600465 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.600867 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.601019 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.601166 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.601306 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:24Z","lastTransitionTime":"2026-02-20T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.705916 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.705985 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.706006 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.706037 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.706057 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:24Z","lastTransitionTime":"2026-02-20T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.809347 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.809432 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.809453 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.809480 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.809504 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:24Z","lastTransitionTime":"2026-02-20T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.821674 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 14:21:51.538317472 +0000 UTC Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.840186 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.840372 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:24 crc kubenswrapper[5094]: E0220 06:47:24.840542 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:24 crc kubenswrapper[5094]: E0220 06:47:24.840837 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.913741 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.914923 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.915021 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.915065 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:24 crc kubenswrapper[5094]: I0220 06:47:24.915091 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:24Z","lastTransitionTime":"2026-02-20T06:47:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.018309 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.018652 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.018881 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.018942 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.018966 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:25Z","lastTransitionTime":"2026-02-20T06:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.123917 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.124039 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.124116 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.124147 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.124223 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:25Z","lastTransitionTime":"2026-02-20T06:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.227439 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.227531 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.227551 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.227577 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.227594 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:25Z","lastTransitionTime":"2026-02-20T06:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.331017 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.331112 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.331127 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.331171 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.331184 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:25Z","lastTransitionTime":"2026-02-20T06:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.434914 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.435134 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.435169 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.435204 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.435226 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:25Z","lastTransitionTime":"2026-02-20T06:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.539016 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.539115 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.539141 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.539180 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.539201 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:25Z","lastTransitionTime":"2026-02-20T06:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.642817 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.642906 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.642925 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.642951 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.642966 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:25Z","lastTransitionTime":"2026-02-20T06:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.746482 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.746536 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.746553 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.746576 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.746594 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:25Z","lastTransitionTime":"2026-02-20T06:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.822806 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 15:22:52.998217742 +0000 UTC Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.839491 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:25 crc kubenswrapper[5094]: E0220 06:47:25.839764 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.839870 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:25 crc kubenswrapper[5094]: E0220 06:47:25.840122 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.852327 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.854472 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.854665 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.854943 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.855153 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:25Z","lastTransitionTime":"2026-02-20T06:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.859879 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:25Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:25 crc 
kubenswrapper[5094]: I0220 06:47:25.880477 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"089ad2b1-a8b5-4a97-ad0e-a7912d97c2b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3f535bdf7005ade8abbf6f234ddbc9c15136792c8ebe8a3646e9dadedea986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d72e26a84d5625d799baf5fcd573d245475eb954a13456bf1813c0c863dc5d\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d40e9d51da95e9023d99a2b2cdc4aa1a6d6755d0110393e173ee57fe9bfb74ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:25Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.902383 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:25Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.924087 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:25Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.940232 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:25Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.955877 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:25Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.959380 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.959423 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.959442 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.959467 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.959488 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:25Z","lastTransitionTime":"2026-02-20T06:47:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:25 crc kubenswrapper[5094]: I0220 06:47:25.981928 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:25Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.006174 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614
a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:26Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.028069 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:26Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.046800 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:26Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.063888 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.063991 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.064052 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:26 crc 
kubenswrapper[5094]: I0220 06:47:26.064132 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.064239 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:26Z","lastTransitionTime":"2026-02-20T06:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.070305 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:26Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.103954 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\
\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:26Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.126831 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:26Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.148483 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:47:26Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.172164 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.172237 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.172261 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.172300 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.172331 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:26Z","lastTransitionTime":"2026-02-20T06:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.176985 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:26Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.213646 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"ddLogicalPort failed for openshift-multus/network-metrics-daemon-8ww4n: failed to update pod openshift-multus/network-metrics-daemon-8ww4n: Internal error occurred: failed calling webhook \\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z\\\\nI0220 06:47:12.918393 6814 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 06:47:12.918801 6814 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0220 06:47:12.918936 6814 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 06:47:12.918990 6814 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0220 06:47:12.919228 6814 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 06:47:12.919287 6814 factory.go:656] Stopping watch factory\\\\nI0220 06:47:12.919306 6814 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:47:12.919342 6814 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 06:47:12.919361 6814 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:47:12.919443 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:47:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c
89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:26Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.236441 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:26Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.255540 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8f6830864b9f0c23a91dff26fca798b670af
e7c0316ae71ad386c027b8ce0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:26Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.275903 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.275971 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.275991 5094 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.276025 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.276048 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:26Z","lastTransitionTime":"2026-02-20T06:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.380341 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.380420 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.380441 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.380517 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.380543 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:26Z","lastTransitionTime":"2026-02-20T06:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.484117 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.484175 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.484189 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.484210 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.484225 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:26Z","lastTransitionTime":"2026-02-20T06:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.587078 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.587157 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.587179 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.587209 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.587230 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:26Z","lastTransitionTime":"2026-02-20T06:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.691366 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.691439 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.691457 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.691485 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.691507 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:26Z","lastTransitionTime":"2026-02-20T06:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.795666 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.796265 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.796605 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.796817 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.796991 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:26Z","lastTransitionTime":"2026-02-20T06:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.823513 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 04:32:30.676341913 +0000 UTC Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.839565 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.839569 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:26 crc kubenswrapper[5094]: E0220 06:47:26.839757 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:26 crc kubenswrapper[5094]: E0220 06:47:26.839933 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.904031 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.904113 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.904132 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.904164 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:26 crc kubenswrapper[5094]: I0220 06:47:26.904184 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:26Z","lastTransitionTime":"2026-02-20T06:47:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.008037 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.008121 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.008141 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.008170 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.008193 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:27Z","lastTransitionTime":"2026-02-20T06:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.112118 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.112187 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.112198 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.112222 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.112236 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:27Z","lastTransitionTime":"2026-02-20T06:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.215209 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.215284 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.215307 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.215334 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.215353 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:27Z","lastTransitionTime":"2026-02-20T06:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.318597 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.318663 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.318682 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.318755 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.318784 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:27Z","lastTransitionTime":"2026-02-20T06:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.422394 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.422467 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.422483 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.422510 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.422536 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:27Z","lastTransitionTime":"2026-02-20T06:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.526084 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.526191 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.526219 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.526257 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.526283 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:27Z","lastTransitionTime":"2026-02-20T06:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.629968 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.630042 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.630062 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.630093 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.630112 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:27Z","lastTransitionTime":"2026-02-20T06:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.733414 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.733484 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.733503 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.733529 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.733550 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:27Z","lastTransitionTime":"2026-02-20T06:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.823914 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 10:46:39.093311238 +0000 UTC Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.836757 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.836826 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.836848 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.836879 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.836898 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:27Z","lastTransitionTime":"2026-02-20T06:47:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.840329 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.840557 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:27 crc kubenswrapper[5094]: E0220 06:47:27.840846 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:27 crc kubenswrapper[5094]: E0220 06:47:27.840988 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.941289 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.941357 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.941375 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.941404 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:27 crc kubenswrapper[5094]: I0220 06:47:27.941423 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:27Z","lastTransitionTime":"2026-02-20T06:47:27Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.044939 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.045025 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.045052 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.045093 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.045127 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:28Z","lastTransitionTime":"2026-02-20T06:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.148464 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.148546 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.148573 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.148610 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.148638 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:28Z","lastTransitionTime":"2026-02-20T06:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.252009 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.252056 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.252066 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.252083 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.252098 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:28Z","lastTransitionTime":"2026-02-20T06:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.355462 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.355544 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.355565 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.355595 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.355615 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:28Z","lastTransitionTime":"2026-02-20T06:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.459204 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.459290 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.459352 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.459389 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.459410 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:28Z","lastTransitionTime":"2026-02-20T06:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.563145 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.563221 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.563239 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.563268 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.563286 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:28Z","lastTransitionTime":"2026-02-20T06:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.666973 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.667085 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.667111 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.667151 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.667175 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:28Z","lastTransitionTime":"2026-02-20T06:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.770655 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.770754 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.770775 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.770808 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.770828 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:28Z","lastTransitionTime":"2026-02-20T06:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.825108 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 19:24:25.213094099 +0000 UTC Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.839573 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.839694 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:28 crc kubenswrapper[5094]: E0220 06:47:28.840518 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:28 crc kubenswrapper[5094]: E0220 06:47:28.840626 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.841282 5094 scope.go:117] "RemoveContainer" containerID="210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca" Feb 20 06:47:28 crc kubenswrapper[5094]: E0220 06:47:28.841884 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.875230 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.875301 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.875319 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.875346 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.875364 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:28Z","lastTransitionTime":"2026-02-20T06:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.977915 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.977979 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.977993 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.978014 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:28 crc kubenswrapper[5094]: I0220 06:47:28.978028 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:28Z","lastTransitionTime":"2026-02-20T06:47:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.081392 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.081460 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.081479 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.081505 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.081525 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:29Z","lastTransitionTime":"2026-02-20T06:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.184845 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.184909 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.184926 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.184949 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.184962 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:29Z","lastTransitionTime":"2026-02-20T06:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.287718 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.287802 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.287813 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.287827 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.287838 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:29Z","lastTransitionTime":"2026-02-20T06:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.391195 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.391247 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.391269 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.391297 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.391317 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:29Z","lastTransitionTime":"2026-02-20T06:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.495302 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.495391 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.495413 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.495455 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.495478 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:29Z","lastTransitionTime":"2026-02-20T06:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.599375 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.599471 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.599500 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.599536 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.599563 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:29Z","lastTransitionTime":"2026-02-20T06:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.703925 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.704032 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.704055 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.704084 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.704104 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:29Z","lastTransitionTime":"2026-02-20T06:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.808129 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.808176 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.808189 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.808205 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.808217 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:29Z","lastTransitionTime":"2026-02-20T06:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.825362 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 01:26:33.106320594 +0000 UTC Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.839808 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.839850 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:29 crc kubenswrapper[5094]: E0220 06:47:29.840078 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:29 crc kubenswrapper[5094]: E0220 06:47:29.840283 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.911334 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.911404 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.911426 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.911457 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:29 crc kubenswrapper[5094]: I0220 06:47:29.911475 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:29Z","lastTransitionTime":"2026-02-20T06:47:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.014552 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.014629 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.014655 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.014689 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.014766 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:30Z","lastTransitionTime":"2026-02-20T06:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.118057 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.118123 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.118139 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.118162 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.118187 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:30Z","lastTransitionTime":"2026-02-20T06:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.221352 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.221425 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.221445 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.221477 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.221498 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:30Z","lastTransitionTime":"2026-02-20T06:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.327339 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.327416 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.327438 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.327486 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.327506 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:30Z","lastTransitionTime":"2026-02-20T06:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.431221 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.431334 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.431354 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.431386 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.431413 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:30Z","lastTransitionTime":"2026-02-20T06:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.534908 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.535153 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.535221 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.535288 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.535348 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:30Z","lastTransitionTime":"2026-02-20T06:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.637956 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.638006 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.638024 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.638049 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.638068 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:30Z","lastTransitionTime":"2026-02-20T06:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.741093 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.741138 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.741155 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.741180 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.741197 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:30Z","lastTransitionTime":"2026-02-20T06:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.825468 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 14:48:16.592112178 +0000 UTC Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.839476 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:30 crc kubenswrapper[5094]: E0220 06:47:30.839637 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.839652 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:30 crc kubenswrapper[5094]: E0220 06:47:30.839790 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.844663 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.844791 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.844857 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.844871 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.844971 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:30Z","lastTransitionTime":"2026-02-20T06:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.948069 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.948335 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.948415 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.948535 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:30 crc kubenswrapper[5094]: I0220 06:47:30.948658 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:30Z","lastTransitionTime":"2026-02-20T06:47:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.051139 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.051380 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.051520 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.051604 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.051734 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:31Z","lastTransitionTime":"2026-02-20T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.154855 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.155135 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.155207 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.155272 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.155337 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:31Z","lastTransitionTime":"2026-02-20T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.257574 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.257627 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.257638 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.257652 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.257661 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:31Z","lastTransitionTime":"2026-02-20T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.360482 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.360561 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.360579 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.360608 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.360625 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:31Z","lastTransitionTime":"2026-02-20T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.463033 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.463354 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.463456 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.463578 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.463868 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:31Z","lastTransitionTime":"2026-02-20T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.566523 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.566583 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.566598 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.566622 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.566636 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:31Z","lastTransitionTime":"2026-02-20T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.669884 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.669954 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.669973 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.669999 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.670015 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:31Z","lastTransitionTime":"2026-02-20T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.772932 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.772986 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.773001 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.773020 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.773032 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:31Z","lastTransitionTime":"2026-02-20T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.826254 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 16:04:39.538665497 +0000 UTC Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.839632 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.839883 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:31 crc kubenswrapper[5094]: E0220 06:47:31.839894 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:31 crc kubenswrapper[5094]: E0220 06:47:31.840417 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.874951 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.874991 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.875002 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.875021 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.875037 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:31Z","lastTransitionTime":"2026-02-20T06:47:31Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.977993 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.978070 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.978085 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.978112 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:31 crc kubenswrapper[5094]: I0220 06:47:31.978131 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:31Z","lastTransitionTime":"2026-02-20T06:47:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.081229 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.081282 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.081294 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.081308 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.081318 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:32Z","lastTransitionTime":"2026-02-20T06:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.184131 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.184246 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.184260 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.184284 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.184296 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:32Z","lastTransitionTime":"2026-02-20T06:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.286575 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.286803 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.286880 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.286957 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.287016 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:32Z","lastTransitionTime":"2026-02-20T06:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.391617 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.391766 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.391795 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.391829 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.391851 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:32Z","lastTransitionTime":"2026-02-20T06:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.494909 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.494966 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.494986 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.495012 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.495028 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:32Z","lastTransitionTime":"2026-02-20T06:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.598670 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.598938 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.598948 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.598969 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.598979 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:32Z","lastTransitionTime":"2026-02-20T06:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.702628 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.702671 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.702681 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.702713 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.702725 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:32Z","lastTransitionTime":"2026-02-20T06:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.805734 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.805762 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.805772 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.805785 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.805793 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:32Z","lastTransitionTime":"2026-02-20T06:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.827206 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 16:21:05.073485004 +0000 UTC Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.839551 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:32 crc kubenswrapper[5094]: E0220 06:47:32.839744 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.839859 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:32 crc kubenswrapper[5094]: E0220 06:47:32.839990 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.910178 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.910259 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.910282 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.910312 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:32 crc kubenswrapper[5094]: I0220 06:47:32.910334 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:32Z","lastTransitionTime":"2026-02-20T06:47:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.013220 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.013320 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.013342 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.013375 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.013397 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:33Z","lastTransitionTime":"2026-02-20T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.116790 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.116867 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.116887 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.116917 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.116934 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:33Z","lastTransitionTime":"2026-02-20T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.220442 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.220512 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.220536 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.220572 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.220598 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:33Z","lastTransitionTime":"2026-02-20T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.321942 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.322032 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.322051 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.322081 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.322102 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:33Z","lastTransitionTime":"2026-02-20T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:33 crc kubenswrapper[5094]: E0220 06:47:33.344487 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:33Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.351825 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.351878 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.351895 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.351923 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.351941 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:33Z","lastTransitionTime":"2026-02-20T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:33 crc kubenswrapper[5094]: E0220 06:47:33.374642 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:33Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.380235 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.380306 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.380330 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.380359 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.380380 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:33Z","lastTransitionTime":"2026-02-20T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:33 crc kubenswrapper[5094]: E0220 06:47:33.400838 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:33Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.407491 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.407557 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.407576 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.407602 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.407621 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:33Z","lastTransitionTime":"2026-02-20T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:33 crc kubenswrapper[5094]: E0220 06:47:33.428120 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:33Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.432647 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.432747 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.432768 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.432796 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.432814 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:33Z","lastTransitionTime":"2026-02-20T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:33 crc kubenswrapper[5094]: E0220 06:47:33.455944 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:33Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:33 crc kubenswrapper[5094]: E0220 06:47:33.456168 5094 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.458318 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.458371 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.458389 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.458412 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.458429 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:33Z","lastTransitionTime":"2026-02-20T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.561389 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.561434 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.561444 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.561464 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.561476 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:33Z","lastTransitionTime":"2026-02-20T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.664824 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.664889 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.664909 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.664937 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.664959 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:33Z","lastTransitionTime":"2026-02-20T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.679760 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs\") pod \"network-metrics-daemon-8ww4n\" (UID: \"da0aa093-1adc-45f2-a942-e68d7be23ed4\") " pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:33 crc kubenswrapper[5094]: E0220 06:47:33.679967 5094 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 06:47:33 crc kubenswrapper[5094]: E0220 06:47:33.680042 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs podName:da0aa093-1adc-45f2-a942-e68d7be23ed4 nodeName:}" failed. No retries permitted until 2026-02-20 06:48:05.680024617 +0000 UTC m=+100.552651328 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs") pod "network-metrics-daemon-8ww4n" (UID: "da0aa093-1adc-45f2-a942-e68d7be23ed4") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.768055 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.768092 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.768109 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.768131 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.768146 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:33Z","lastTransitionTime":"2026-02-20T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.828282 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 06:03:06.919654293 +0000 UTC Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.839849 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.839939 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:33 crc kubenswrapper[5094]: E0220 06:47:33.840112 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:33 crc kubenswrapper[5094]: E0220 06:47:33.840369 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.870872 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.870908 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.870921 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.870937 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.870952 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:33Z","lastTransitionTime":"2026-02-20T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.972587 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.972619 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.972630 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.972644 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:33 crc kubenswrapper[5094]: I0220 06:47:33.972655 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:33Z","lastTransitionTime":"2026-02-20T06:47:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.075245 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.075330 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.075348 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.075377 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.075393 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:34Z","lastTransitionTime":"2026-02-20T06:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.178232 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.178304 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.178323 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.178357 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.178385 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:34Z","lastTransitionTime":"2026-02-20T06:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.281397 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.281462 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.281481 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.281509 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.281527 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:34Z","lastTransitionTime":"2026-02-20T06:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.384971 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.385058 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.385077 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.385105 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.385124 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:34Z","lastTransitionTime":"2026-02-20T06:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.488393 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.488495 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.488522 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.488558 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.488581 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:34Z","lastTransitionTime":"2026-02-20T06:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.591453 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.591513 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.591531 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.591556 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.591576 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:34Z","lastTransitionTime":"2026-02-20T06:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.694235 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.694293 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.694305 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.694324 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.694339 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:34Z","lastTransitionTime":"2026-02-20T06:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.797562 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.797641 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.797659 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.797686 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.797730 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:34Z","lastTransitionTime":"2026-02-20T06:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.829190 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 04:20:05.561705888 +0000 UTC Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.839947 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:34 crc kubenswrapper[5094]: E0220 06:47:34.840229 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.840755 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:34 crc kubenswrapper[5094]: E0220 06:47:34.840912 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.903599 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.903687 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.903751 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.903972 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:34 crc kubenswrapper[5094]: I0220 06:47:34.904003 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:34Z","lastTransitionTime":"2026-02-20T06:47:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.008066 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.008127 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.008148 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.008186 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.008210 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:35Z","lastTransitionTime":"2026-02-20T06:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.111330 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.111382 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.111392 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.111416 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.111430 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:35Z","lastTransitionTime":"2026-02-20T06:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.215278 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.215367 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.215390 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.215421 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.215442 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:35Z","lastTransitionTime":"2026-02-20T06:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.318499 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.318548 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.318558 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.318574 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.318585 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:35Z","lastTransitionTime":"2026-02-20T06:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.422169 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.422216 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.422227 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.422246 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.422257 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:35Z","lastTransitionTime":"2026-02-20T06:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.524581 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.524616 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.524629 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.524649 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.524661 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:35Z","lastTransitionTime":"2026-02-20T06:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.627997 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.628050 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.628061 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.628077 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.628088 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:35Z","lastTransitionTime":"2026-02-20T06:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.731274 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.731326 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.731344 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.731367 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.731386 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:35Z","lastTransitionTime":"2026-02-20T06:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.829689 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 12:04:55.607346032 +0000 UTC Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.834046 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.834119 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.834135 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.834152 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.834163 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:35Z","lastTransitionTime":"2026-02-20T06:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.839654 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:35 crc kubenswrapper[5094]: E0220 06:47:35.839829 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.840177 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:35 crc kubenswrapper[5094]: E0220 06:47:35.840291 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.853933 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\
"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:35Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.866243 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:35Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.884922 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a
2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:35Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.900517 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:35Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.919094 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"089ad2b1-a8b5-4a97-ad0e-a7912d97c2b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3f535bdf7005ade8abbf6f234ddbc9c15136792c8ebe8a3646e9dadedea986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d72e26a84d5625d799baf5fcd573d245475eb954a13456bf1813c0c863dc5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d40e9d51da95e9023d99a2b2cdc4aa1a6d6755d0110393e173ee57fe9bfb74ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a
933bb56cb28cef9dd8dc9d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:35Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.937046 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.937115 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.937139 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.937172 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:35 crc 
kubenswrapper[5094]: I0220 06:47:35.937195 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:35Z","lastTransitionTime":"2026-02-20T06:47:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.940930 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:35Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.972344 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:35Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:35 crc kubenswrapper[5094]: I0220 06:47:35.989167 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:35Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.013167 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.035676 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:
45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete 
has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741
084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.039223 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.039354 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.039416 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.039478 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.039532 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:36Z","lastTransitionTime":"2026-02-20T06:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.063525 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff
60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.080873 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.109516 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"ddLogicalPort failed for openshift-multus/network-metrics-daemon-8ww4n: failed to update pod openshift-multus/network-metrics-daemon-8ww4n: Internal error occurred: failed calling webhook \\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z\\\\nI0220 06:47:12.918393 6814 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 06:47:12.918801 6814 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0220 06:47:12.918936 6814 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 06:47:12.918990 6814 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0220 06:47:12.919228 6814 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 06:47:12.919287 6814 factory.go:656] Stopping watch factory\\\\nI0220 06:47:12.919306 6814 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:47:12.919342 6814 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 06:47:12.919361 6814 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:47:12.919443 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:47:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c
89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.132641 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.143623 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.143987 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.144112 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.144589 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.144695 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:36Z","lastTransitionTime":"2026-02-20T06:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.150945 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.164842 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.182255 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.199456 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8f6830864b9f0c23a91dff26fca798b670af
e7c0316ae71ad386c027b8ce0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.249935 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.249984 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.249998 5094 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.250019 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.250033 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:36Z","lastTransitionTime":"2026-02-20T06:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.353611 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.353661 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.353674 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.353693 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.353724 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:36Z","lastTransitionTime":"2026-02-20T06:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.458950 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.459046 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.459072 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.459102 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.459123 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:36Z","lastTransitionTime":"2026-02-20T06:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.462398 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr8rz_c3900f6d-3035-4fc4-80a2-9e79154f4f5e/kube-multus/0.log" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.462448 5094 generic.go:334] "Generic (PLEG): container finished" podID="c3900f6d-3035-4fc4-80a2-9e79154f4f5e" containerID="7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118" exitCode=1 Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.462479 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zr8rz" event={"ID":"c3900f6d-3035-4fc4-80a2-9e79154f4f5e","Type":"ContainerDied","Data":"7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118"} Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.462928 5094 scope.go:117] "RemoveContainer" containerID="7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.491594 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.508484 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8f6830864b9f0c23a91dff26fca798b670af
e7c0316ae71ad386c027b8ce0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.524407 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.547932 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a
2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.559394 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.561834 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.561903 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.561917 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.561938 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.561973 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:36Z","lastTransitionTime":"2026-02-20T06:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.571953 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"089ad2b1-a8b5-4a97-ad0e-a7912d97c2b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3f535bdf7005ade8abbf6f234ddbc9c15136792c8ebe8a3646e9dadedea986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d72e26a84d5625d799baf5fcd573
d245475eb954a13456bf1813c0c863dc5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d40e9d51da95e9023d99a2b2cdc4aa1a6d6755d0110393e173ee57fe9bfb74ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.587985 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.604083 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.618487 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.634081 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:36Z\\\",\\\"message\\\":\\\"2026-02-20T06:46:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f43e851d-1502-4b62-bdc2-ba27d8b6a1eb\\\\n2026-02-20T06:46:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f43e851d-1502-4b62-bdc2-ba27d8b6a1eb to /host/opt/cni/bin/\\\\n2026-02-20T06:46:51Z [verbose] multus-daemon started\\\\n2026-02-20T06:46:51Z [verbose] Readiness Indicator file check\\\\n2026-02-20T06:47:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.654722 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses
\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.664481 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.664530 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.664542 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:36 crc 
kubenswrapper[5094]: I0220 06:47:36.664561 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.664572 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:36Z","lastTransitionTime":"2026-02-20T06:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.674178 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.686134 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.704589 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"ddLogicalPort failed for openshift-multus/network-metrics-daemon-8ww4n: failed to update pod openshift-multus/network-metrics-daemon-8ww4n: Internal error occurred: failed calling webhook \\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z\\\\nI0220 06:47:12.918393 6814 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 06:47:12.918801 6814 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0220 06:47:12.918936 6814 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 06:47:12.918990 6814 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0220 06:47:12.919228 6814 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 06:47:12.919287 6814 factory.go:656] Stopping watch factory\\\\nI0220 06:47:12.919306 6814 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:47:12.919342 6814 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 06:47:12.919361 6814 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:47:12.919443 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:47:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c
89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.737446 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.756761 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.770463 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.770508 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.770520 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:36 crc 
kubenswrapper[5094]: I0220 06:47:36.770727 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.770742 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:36Z","lastTransitionTime":"2026-02-20T06:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.782468 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.799270 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:36Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.830443 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 14:42:49.465754773 +0000 UTC Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.839868 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:36 crc kubenswrapper[5094]: E0220 06:47:36.839997 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.840072 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:36 crc kubenswrapper[5094]: E0220 06:47:36.840118 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.872948 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.872977 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.872986 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.873004 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.873016 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:36Z","lastTransitionTime":"2026-02-20T06:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.975154 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.975208 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.975216 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.975230 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:36 crc kubenswrapper[5094]: I0220 06:47:36.975239 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:36Z","lastTransitionTime":"2026-02-20T06:47:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.077876 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.077939 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.077948 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.077964 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.077976 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:37Z","lastTransitionTime":"2026-02-20T06:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.181039 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.181091 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.181101 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.181121 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.181136 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:37Z","lastTransitionTime":"2026-02-20T06:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.284859 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.284921 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.284938 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.284968 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.284987 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:37Z","lastTransitionTime":"2026-02-20T06:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.388653 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.388757 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.388770 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.388790 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.388804 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:37Z","lastTransitionTime":"2026-02-20T06:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.469144 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr8rz_c3900f6d-3035-4fc4-80a2-9e79154f4f5e/kube-multus/0.log" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.469534 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zr8rz" event={"ID":"c3900f6d-3035-4fc4-80a2-9e79154f4f5e","Type":"ContainerStarted","Data":"aba2be0e4774b30500be76b546e3ffff5c136a2e26675822931a400ca3090e79"} Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.488637 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.491945 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.492035 5094 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.492056 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.492084 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.492144 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:37Z","lastTransitionTime":"2026-02-20T06:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.506131 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.520983 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba2be0e4774b30500be76b546e3ffff5c136a2e26675822931a400ca3090e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:36Z\\\",\\\"message\\\":\\\"2026-02-20T06:46:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f43e851d-1502-4b62-bdc2-ba27d8b6a1eb\\\\n2026-02-20T06:46:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f43e851d-1502-4b62-bdc2-ba27d8b6a1eb to /host/opt/cni/bin/\\\\n2026-02-20T06:46:51Z [verbose] multus-daemon started\\\\n2026-02-20T06:46:51Z [verbose] 
Readiness Indicator file check\\\\n2026-02-20T06:47:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.537450 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.550445 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.569840 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.594829 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.594876 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.594886 5094 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.594902 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.594913 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:37Z","lastTransitionTime":"2026-02-20T06:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.595411 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"ddLogicalPort failed for openshift-multus/network-metrics-daemon-8ww4n: failed to update pod openshift-multus/network-metrics-daemon-8ww4n: Internal error occurred: failed calling webhook \\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z\\\\nI0220 06:47:12.918393 6814 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 06:47:12.918801 6814 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0220 06:47:12.918936 6814 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 06:47:12.918990 6814 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0220 06:47:12.919228 6814 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 06:47:12.919287 6814 factory.go:656] Stopping watch factory\\\\nI0220 06:47:12.919306 6814 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:47:12.919342 6814 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 06:47:12.919361 6814 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:47:12.919443 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:47:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c
89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.629391 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.646475 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.664515 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.678200 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8f6830864b9f0c23a91dff26fca798b670af
e7c0316ae71ad386c027b8ce0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.692252 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.700357 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.700426 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.700440 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.700491 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.700507 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:37Z","lastTransitionTime":"2026-02-20T06:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.706313 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.719183 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.739060 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a
2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.753589 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.769528 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"089ad2b1-a8b5-4a97-ad0e-a7912d97c2b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3f535bdf7005ade8abbf6f234ddbc9c15136792c8ebe8a3646e9dadedea986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d72e26a84d5625d799baf5fcd573d245475eb954a13456bf1813c0c863dc5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d40e9d51da95e9023d99a2b2cdc4aa1a6d6755d0110393e173ee57fe9bfb74ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a
933bb56cb28cef9dd8dc9d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.786563 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:37Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.804042 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.804098 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.804112 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.804135 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.804149 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:37Z","lastTransitionTime":"2026-02-20T06:47:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.830656 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 04:01:48.293373748 +0000 UTC Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.839977 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.840041 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:37 crc kubenswrapper[5094]: E0220 06:47:37.840130 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:37 crc kubenswrapper[5094]: E0220 06:47:37.840391 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.907758 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.907858 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.907879 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.907939 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:37 crc kubenswrapper[5094]: I0220 06:47:37.907960 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:37Z","lastTransitionTime":"2026-02-20T06:47:37Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.010931 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.010994 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.011011 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.011034 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.011050 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:38Z","lastTransitionTime":"2026-02-20T06:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.114659 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.114722 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.114736 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.114759 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.114775 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:38Z","lastTransitionTime":"2026-02-20T06:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.218180 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.218237 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.218248 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.218265 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.218277 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:38Z","lastTransitionTime":"2026-02-20T06:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.320303 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.320374 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.320390 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.320416 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.320433 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:38Z","lastTransitionTime":"2026-02-20T06:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.423910 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.423969 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.423983 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.424005 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.424024 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:38Z","lastTransitionTime":"2026-02-20T06:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.527148 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.527227 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.527247 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.527277 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.527299 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:38Z","lastTransitionTime":"2026-02-20T06:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.630517 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.630594 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.630614 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.630652 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.630678 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:38Z","lastTransitionTime":"2026-02-20T06:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.734249 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.734319 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.734340 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.734372 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.734393 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:38Z","lastTransitionTime":"2026-02-20T06:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.831183 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 14:39:37.948913495 +0000 UTC Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.838039 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.838099 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.838119 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.838148 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.838172 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:38Z","lastTransitionTime":"2026-02-20T06:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.839137 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.839245 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:38 crc kubenswrapper[5094]: E0220 06:47:38.839496 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:38 crc kubenswrapper[5094]: E0220 06:47:38.839663 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.942113 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.942185 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.942208 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.942239 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:38 crc kubenswrapper[5094]: I0220 06:47:38.942262 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:38Z","lastTransitionTime":"2026-02-20T06:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.044973 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.045056 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.045082 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.045112 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.045134 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:39Z","lastTransitionTime":"2026-02-20T06:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.148764 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.148838 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.148859 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.148887 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.148907 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:39Z","lastTransitionTime":"2026-02-20T06:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.252677 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.252771 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.252792 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.252824 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.252844 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:39Z","lastTransitionTime":"2026-02-20T06:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.355868 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.355923 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.355942 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.355970 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.355990 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:39Z","lastTransitionTime":"2026-02-20T06:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.459055 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.459120 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.459139 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.459170 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.459189 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:39Z","lastTransitionTime":"2026-02-20T06:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.562440 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.562926 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.563143 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.563314 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.563449 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:39Z","lastTransitionTime":"2026-02-20T06:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.667612 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.667694 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.667759 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.667798 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.667821 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:39Z","lastTransitionTime":"2026-02-20T06:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.772447 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.772509 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.772526 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.772554 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.772578 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:39Z","lastTransitionTime":"2026-02-20T06:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.831449 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 13:00:41.577906335 +0000 UTC Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.839891 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.839929 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:39 crc kubenswrapper[5094]: E0220 06:47:39.840117 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:39 crc kubenswrapper[5094]: E0220 06:47:39.840272 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.875216 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.875288 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.875307 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.875337 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.875359 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:39Z","lastTransitionTime":"2026-02-20T06:47:39Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.978943 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.978993 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.979010 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.979035 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:39 crc kubenswrapper[5094]: I0220 06:47:39.979054 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:39Z","lastTransitionTime":"2026-02-20T06:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.082785 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.083308 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.083481 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.083621 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.083785 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:40Z","lastTransitionTime":"2026-02-20T06:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.188213 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.188632 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.188875 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.188947 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.188973 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:40Z","lastTransitionTime":"2026-02-20T06:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.292941 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.293018 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.293036 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.293067 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.293087 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:40Z","lastTransitionTime":"2026-02-20T06:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.396149 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.396201 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.396211 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.396232 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.396244 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:40Z","lastTransitionTime":"2026-02-20T06:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.507771 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.507844 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.507860 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.507906 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.507923 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:40Z","lastTransitionTime":"2026-02-20T06:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.611567 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.611662 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.611688 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.611764 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.611793 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:40Z","lastTransitionTime":"2026-02-20T06:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.715551 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.715639 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.715666 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.715737 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.715765 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:40Z","lastTransitionTime":"2026-02-20T06:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.819295 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.819375 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.819395 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.819442 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.819464 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:40Z","lastTransitionTime":"2026-02-20T06:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.833121 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 02:29:48.224222968 +0000 UTC Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.839478 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.839568 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:40 crc kubenswrapper[5094]: E0220 06:47:40.839688 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:40 crc kubenswrapper[5094]: E0220 06:47:40.839828 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.923474 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.923535 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.923553 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.923579 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:40 crc kubenswrapper[5094]: I0220 06:47:40.923601 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:40Z","lastTransitionTime":"2026-02-20T06:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.027408 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.027478 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.027497 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.027524 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.027547 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:41Z","lastTransitionTime":"2026-02-20T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.131173 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.131252 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.131270 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.131302 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.131322 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:41Z","lastTransitionTime":"2026-02-20T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.235359 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.235420 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.235438 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.235466 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.235486 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:41Z","lastTransitionTime":"2026-02-20T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.340446 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.340518 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.340539 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.340569 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.340590 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:41Z","lastTransitionTime":"2026-02-20T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.446393 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.446480 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.446506 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.446541 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.446582 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:41Z","lastTransitionTime":"2026-02-20T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.549691 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.549808 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.549830 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.549863 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.549887 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:41Z","lastTransitionTime":"2026-02-20T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.654104 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.654174 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.654197 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.654229 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.654253 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:41Z","lastTransitionTime":"2026-02-20T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.757394 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.757464 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.757487 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.757550 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.757576 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:41Z","lastTransitionTime":"2026-02-20T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.833473 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 01:14:39.615440174 +0000 UTC Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.839963 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.839960 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:41 crc kubenswrapper[5094]: E0220 06:47:41.840160 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:41 crc kubenswrapper[5094]: E0220 06:47:41.840392 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.858746 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.860248 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.860329 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.860354 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.860381 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.860400 5094 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:41Z","lastTransitionTime":"2026-02-20T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.963010 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.963068 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.963086 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.963113 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:41 crc kubenswrapper[5094]: I0220 06:47:41.963133 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:41Z","lastTransitionTime":"2026-02-20T06:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.067010 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.067101 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.067121 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.067153 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.067172 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:42Z","lastTransitionTime":"2026-02-20T06:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.170294 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.170355 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.170372 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.170397 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.170415 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:42Z","lastTransitionTime":"2026-02-20T06:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.274014 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.274087 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.274107 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.274134 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.274156 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:42Z","lastTransitionTime":"2026-02-20T06:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.378110 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.378172 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.378189 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.378215 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.378233 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:42Z","lastTransitionTime":"2026-02-20T06:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.481219 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.481268 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.481286 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.481311 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.481332 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:42Z","lastTransitionTime":"2026-02-20T06:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.584869 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.584927 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.584946 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.584970 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.584989 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:42Z","lastTransitionTime":"2026-02-20T06:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.688055 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.688122 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.688143 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.688173 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.688191 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:42Z","lastTransitionTime":"2026-02-20T06:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.791669 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.791792 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.791812 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.791841 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.791861 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:42Z","lastTransitionTime":"2026-02-20T06:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.834225 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 14:10:08.856586607 +0000 UTC Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.839695 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:42 crc kubenswrapper[5094]: E0220 06:47:42.839916 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.839695 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:42 crc kubenswrapper[5094]: E0220 06:47:42.840132 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.895335 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.895393 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.895421 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.895454 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.895478 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:42Z","lastTransitionTime":"2026-02-20T06:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.999255 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.999345 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.999368 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:42 crc kubenswrapper[5094]: I0220 06:47:42.999400 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:42.999445 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:42Z","lastTransitionTime":"2026-02-20T06:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.103092 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.103153 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.103170 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.103194 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.103211 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:43Z","lastTransitionTime":"2026-02-20T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.206181 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.206257 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.206280 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.206312 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.206335 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:43Z","lastTransitionTime":"2026-02-20T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.310278 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.310357 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.310375 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.310402 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.310428 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:43Z","lastTransitionTime":"2026-02-20T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.414562 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.414667 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.414686 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.414747 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.414767 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:43Z","lastTransitionTime":"2026-02-20T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.518372 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.518476 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.518496 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.518520 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.518538 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:43Z","lastTransitionTime":"2026-02-20T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.621903 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.621972 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.621990 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.622015 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.622036 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:43Z","lastTransitionTime":"2026-02-20T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.725787 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.725870 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.725890 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.725916 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.725936 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:43Z","lastTransitionTime":"2026-02-20T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.733871 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.733966 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.733996 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.734029 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.734067 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:43Z","lastTransitionTime":"2026-02-20T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:43 crc kubenswrapper[5094]: E0220 06:47:43.758386 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:43Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.764394 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.764456 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.764474 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.764499 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.764517 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:43Z","lastTransitionTime":"2026-02-20T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:43 crc kubenswrapper[5094]: E0220 06:47:43.784437 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:43Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.790184 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.790292 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.790321 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.790358 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.790381 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:43Z","lastTransitionTime":"2026-02-20T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:43 crc kubenswrapper[5094]: E0220 06:47:43.812321 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:43Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.816835 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.816928 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.816958 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.816995 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.817037 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:43Z","lastTransitionTime":"2026-02-20T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.834668 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 07:51:08.72345156 +0000 UTC Feb 20 06:47:43 crc kubenswrapper[5094]: E0220 06:47:43.838424 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177
c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c3
7e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeByt
es\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",
\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:43Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.839474 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.839792 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:43 crc kubenswrapper[5094]: E0220 06:47:43.840006 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:43 crc kubenswrapper[5094]: E0220 06:47:43.840179 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.840456 5094 scope.go:117] "RemoveContainer" containerID="210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.843790 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.843866 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.843892 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.843928 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.843953 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:43Z","lastTransitionTime":"2026-02-20T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:43 crc kubenswrapper[5094]: E0220 06:47:43.867229 5094 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"6fb44c16-1595-44a7-b2ec-4faee6098a1e\\\",\\\"systemUUID\\\":\\\"d25915f7-4d55-43a4-a20b-9e6118746152\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:43Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:43 crc kubenswrapper[5094]: E0220 06:47:43.867493 5094 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.871054 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.871103 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.871122 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.871151 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.871169 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:43Z","lastTransitionTime":"2026-02-20T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.974579 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.974656 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.974676 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.974739 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:43 crc kubenswrapper[5094]: I0220 06:47:43.974760 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:43Z","lastTransitionTime":"2026-02-20T06:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.078737 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.078779 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.078790 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.078810 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.078823 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:44Z","lastTransitionTime":"2026-02-20T06:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.182798 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.182871 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.182889 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.182919 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.182938 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:44Z","lastTransitionTime":"2026-02-20T06:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.289946 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.290029 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.290055 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.290088 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.290120 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:44Z","lastTransitionTime":"2026-02-20T06:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.392939 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.393001 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.393020 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.393055 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.393077 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:44Z","lastTransitionTime":"2026-02-20T06:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.496141 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.496198 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.496211 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.496230 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.496244 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:44Z","lastTransitionTime":"2026-02-20T06:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.511926 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/2.log" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.517189 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerStarted","Data":"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd"} Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.518138 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.545318 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55
b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.563681 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.598840 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"ddLogicalPort failed for openshift-multus/network-metrics-daemon-8ww4n: failed to update pod openshift-multus/network-metrics-daemon-8ww4n: Internal error occurred: failed calling webhook \\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z\\\\nI0220 06:47:12.918393 6814 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 06:47:12.918801 6814 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0220 06:47:12.918936 6814 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 06:47:12.918990 6814 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0220 06:47:12.919228 6814 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 06:47:12.919287 6814 factory.go:656] Stopping watch factory\\\\nI0220 06:47:12.919306 6814 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:47:12.919342 6814 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 06:47:12.919361 6814 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:47:12.919443 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:47:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.602000 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.602145 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.602164 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.602192 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.602214 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:44Z","lastTransitionTime":"2026-02-20T06:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.633010 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.668427 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.687951 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.701034 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8f6830864b9f0c23a91dff26fca798b670af
e7c0316ae71ad386c027b8ce0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.705241 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.705290 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.705308 5094 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.705334 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.705353 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:44Z","lastTransitionTime":"2026-02-20T06:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.715389 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b15
4edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is 
after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.727917 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.742264 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d
6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.757153 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b1
5549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.770507 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc 
kubenswrapper[5094]: I0220 06:47:44.785595 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"089ad2b1-a8b5-4a97-ad0e-a7912d97c2b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3f535bdf7005ade8abbf6f234ddbc9c15136792c8ebe8a3646e9dadedea986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d72e26a84d5625d799baf5fcd573d245475eb954a13456bf1813c0c863dc5d\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d40e9d51da95e9023d99a2b2cdc4aa1a6d6755d0110393e173ee57fe9bfb74ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.802584 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.807918 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.807997 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.808016 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.808087 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.808113 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:44Z","lastTransitionTime":"2026-02-20T06:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.819384 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff
60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.833520 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.835616 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 00:49:16.600586907 +0000 UTC Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.839742 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.839786 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:44 crc kubenswrapper[5094]: E0220 06:47:44.839945 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:44 crc kubenswrapper[5094]: E0220 06:47:44.840078 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.853241 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba2be0e4774b30500be76b546e3ffff5c136a2e26675822931a400ca3090e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:36Z\\\",\\\"message\\\":\\\"2026-02-20T06:46:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_f43e851d-1502-4b62-bdc2-ba27d8b6a1eb\\\\n2026-02-20T06:46:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f43e851d-1502-4b62-bdc2-ba27d8b6a1eb to /host/opt/cni/bin/\\\\n2026-02-20T06:46:51Z [verbose] multus-daemon started\\\\n2026-02-20T06:46:51Z [verbose] Readiness Indicator file check\\\\n2026-02-20T06:47:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.868860 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bc40232-e29d-4f13-a2f2-f8be2e97b789\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bbeadd569a4f1d9c8ec473e4dce3e5141ef23e49e911237ea9637cb3bc0fb77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b984b4cdf6ed9e2cbbeba5300d04d9aabca1c5deb777d5f2d1f92c56488baccf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b984b4cdf6ed9e2cbbeba5300d04d9aabca1c5deb777d5f2d1f92c56488baccf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.889480 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614
a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.912059 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.912157 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.912177 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.912242 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:44 crc kubenswrapper[5094]: I0220 06:47:44.912266 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:44Z","lastTransitionTime":"2026-02-20T06:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.015338 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.015401 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.015419 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.015446 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.015465 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:45Z","lastTransitionTime":"2026-02-20T06:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.119527 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.119640 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.119671 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.119746 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.119776 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:45Z","lastTransitionTime":"2026-02-20T06:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.223227 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.223300 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.223318 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.223348 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.223369 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:45Z","lastTransitionTime":"2026-02-20T06:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.327624 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.327732 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.327754 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.327780 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.327799 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:45Z","lastTransitionTime":"2026-02-20T06:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.431024 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.431110 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.431123 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.431144 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.431156 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:45Z","lastTransitionTime":"2026-02-20T06:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.525497 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/3.log" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.526888 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/2.log" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.531584 5094 generic.go:334] "Generic (PLEG): container finished" podID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerID="0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd" exitCode=1 Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.531648 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerDied","Data":"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd"} Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.531755 5094 scope.go:117] "RemoveContainer" containerID="210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.533467 5094 scope.go:117] "RemoveContainer" containerID="0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd" Feb 20 06:47:45 crc kubenswrapper[5094]: E0220 06:47:45.533766 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.534066 5094 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.534098 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.534112 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.534128 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.534141 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:45Z","lastTransitionTime":"2026-02-20T06:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.553080 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bc40232-e29d-4f13-a2f2-f8be2e97b789\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bbeadd569a4f1d9c8ec473e4dce3e5141ef23e49e911237ea9637cb3bc0fb77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b984b4cdf6ed9e2cbbeba5300d04d9aabca1c5deb777d5f2d1f92c56488baccf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b984b4cdf6ed9e2cbbeba5300d04d9aabca1c5deb777d5f2d1f92c56488baccf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.577148 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614
a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.599956 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.620369 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.637943 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.637997 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.638010 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:45 crc 
kubenswrapper[5094]: I0220 06:47:45.638037 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.638058 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:45Z","lastTransitionTime":"2026-02-20T06:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.643007 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba2be0e4774b30500be76b546e3ffff5c136a2e26675822931a400ca3090e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:36Z\\\",\\\"message\\\":\\\"2026-02-20T06:46:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f43e851d-1502-4b62-bdc2-ba27d8b6a1eb\\\\n2026-02-20T06:46:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f43e851d-1502-4b62-bdc2-ba27d8b6a1eb to /host/opt/cni/bin/\\\\n2026-02-20T06:46:51Z [verbose] multus-daemon started\\\\n2026-02-20T06:46:51Z [verbose] Readiness Indicator file check\\\\n2026-02-20T06:47:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.688051 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009
2272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.1
1\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731
4731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.708397 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.729604 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.741830 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.741936 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.741965 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.742003 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.742029 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:45Z","lastTransitionTime":"2026-02-20T06:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.752756 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.786574 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"ddLogicalPort failed for openshift-multus/network-metrics-daemon-8ww4n: failed to update pod openshift-multus/network-metrics-daemon-8ww4n: Internal error occurred: failed calling webhook \\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z\\\\nI0220 06:47:12.918393 6814 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 06:47:12.918801 6814 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0220 06:47:12.918936 6814 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 06:47:12.918990 6814 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0220 06:47:12.919228 6814 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 06:47:12.919287 6814 factory.go:656] Stopping watch factory\\\\nI0220 06:47:12.919306 6814 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:47:12.919342 6814 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 06:47:12.919361 6814 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:47:12.919443 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:47:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:44Z\\\",\\\"message\\\":\\\"work controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z]\\\\nI0220 06:47:44.923774 7201 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f\\\\nI0220 06:47:44.923771 7201 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-9vd4p\\\\nI0220 06:47:44.923661 7201 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0220 06:47:44.923695 7201 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_opensh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/
net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.810047 5094 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.829553 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8f6830864b9f0c23a91dff26fca798b670af
e7c0316ae71ad386c027b8ce0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.836666 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 04:55:16.376232155 +0000 UTC Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.839649 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.839779 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:45 crc kubenswrapper[5094]: E0220 06:47:45.839881 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:45 crc kubenswrapper[5094]: E0220 06:47:45.839987 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.846190 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.846257 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.846287 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.846320 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.846343 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:45Z","lastTransitionTime":"2026-02-20T06:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.855665 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"089ad2b1-a8b5-4a97-ad0e-a7912d97c2b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3f535bdf7005ade8abbf6f234ddbc9c15136792c8ebe8a3646e9dadedea986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d72e26a84d5625d799baf5fcd573
d245475eb954a13456bf1813c0c863dc5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d40e9d51da95e9023d99a2b2cdc4aa1a6d6755d0110393e173ee57fe9bfb74ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.879064 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.907329 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.929875 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.948174 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.950446 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.950506 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.950530 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.950564 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.950586 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:45Z","lastTransitionTime":"2026-02-20T06:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.973893 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:45 crc kubenswrapper[5094]: I0220 06:47:45.995270 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:45Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.028159 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b938
9340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.046356 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.055045 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.055134 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.055164 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.055198 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.055222 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:46Z","lastTransitionTime":"2026-02-20T06:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.066852 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.087444 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.117178 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://210333557a16bd86d13dc95fd945e83d39c340eb81d6da9e8526864f3987c2ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:12Z\\\",\\\"message\\\":\\\"ddLogicalPort failed for openshift-multus/network-metrics-daemon-8ww4n: failed to update pod openshift-multus/network-metrics-daemon-8ww4n: Internal error occurred: failed calling webhook \\\\\\\"pod.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/pod?timeout=10s\\\\\\\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:12Z is after 2025-08-24T17:21:41Z\\\\nI0220 06:47:12.918393 6814 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0220 06:47:12.918801 6814 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0220 06:47:12.918936 6814 handler.go:208] Removed *v1.Node event handler 2\\\\nI0220 06:47:12.918990 6814 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0220 06:47:12.919228 6814 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0220 06:47:12.919287 6814 factory.go:656] Stopping watch factory\\\\nI0220 06:47:12.919306 6814 ovnkube.go:599] Stopped ovnkube\\\\nI0220 06:47:12.919342 6814 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0220 06:47:12.919361 6814 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0220 06:47:12.919443 6814 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:47:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:44Z\\\",\\\"message\\\":\\\"work controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z]\\\\nI0220 06:47:44.923774 7201 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f\\\\nI0220 06:47:44.923771 7201 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-9vd4p\\\\nI0220 06:47:44.923661 7201 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0220 06:47:44.923695 7201 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_opensh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/
net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.137504 5094 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.152904 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8f6830864b9f0c23a91dff26fca798b670af
e7c0316ae71ad386c027b8ce0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.159222 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.159295 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.159319 5094 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.159356 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.159379 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:46Z","lastTransitionTime":"2026-02-20T06:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.172357 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 
20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.195645 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"089ad2b1-a8b5-4a97-ad0e-a7912d97c2b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3f535bdf7005ade8abbf6f234ddbc9c15136792c8ebe8a3646e9dadedea986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d72e26a84d5625d799baf5fcd573d245475eb954a13456bf1813c0c863dc5d\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d40e9d51da95e9023d99a2b2cdc4aa1a6d6755d0110393e173ee57fe9bfb74ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b
89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.220550 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.241834 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.260199 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.262023 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.262105 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.262125 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.262162 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.262182 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:46Z","lastTransitionTime":"2026-02-20T06:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.278565 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.304655 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.325618 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bc40232-e29d-4f13-a2f2-f8be2e97b789\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bbeadd569a4f1d9c8ec473e4dce3e5141ef23e49e911237ea9637cb3bc0fb77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b984b4cdf6ed9e2cbbeba5300d04d9aabca1c5deb777d5f2d1f92c56488baccf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b984b4cdf6ed9e2cbbeba5300d04d9aabca1c5deb777d5f2d1f92c56488baccf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.350317 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614
a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.372303 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.372354 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.372371 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.372397 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.372422 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:46Z","lastTransitionTime":"2026-02-20T06:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.372589 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff
60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.389365 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.409547 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba2be0e4774b30500be76b546e3ffff5c136a2e26675822931a400ca3090e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:36Z\\\",\\\"message\\\":\\\"2026-02-20T06:46:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f43e851d-1502-4b62-bdc2-ba27d8b6a1eb\\\\n2026-02-20T06:46:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f43e851d-1502-4b62-bdc2-ba27d8b6a1eb to /host/opt/cni/bin/\\\\n2026-02-20T06:46:51Z [verbose] multus-daemon started\\\\n2026-02-20T06:46:51Z [verbose] 
Readiness Indicator file check\\\\n2026-02-20T06:47:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.475345 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.475400 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.475417 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.475442 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.475459 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:46Z","lastTransitionTime":"2026-02-20T06:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.538514 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/3.log" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.545080 5094 scope.go:117] "RemoveContainer" containerID="0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd" Feb 20 06:47:46 crc kubenswrapper[5094]: E0220 06:47:46.545357 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.567777 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef5d58f2f3cf38551a2512bf04ca053bf33bb03c574e368d0578cb55970fee5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985701235b8cdb1d8ea577d23dd152401e924fac8d48a3987bc3a57b60b4e012\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.579112 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.579322 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.579505 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.579737 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.580153 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:46Z","lastTransitionTime":"2026-02-20T06:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.584383 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-8wch6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bc82500-7462-4daa-9eff-116399acb06a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4efe81ccb64780c6eefa058ce0f9926ba2cf555443c372bec62b7fabdc485500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s6p6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-8wch6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.601306 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-qzxk2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81c0f95-7b6e-4a44-8115-f517fc8f4052\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5bf075f70a136b61df510e41730b9f5ce5b303f19801d65d6b5a5b6633e1e855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr5sr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-qzxk2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.622124 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"19cce34f-67a6-48c9-a396-621c5811b6cd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a1cae8ebad449279afbb8df554186c08166b7b0426884d14a166a7899e88c307\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://711c69b88d39f3974629b281ff3f5e5cb15aa4c3d5368e758574f7b031f3242d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2fcd4912e92ef50323b24820dae0f6be7b6e93b175bd067d1ffdbb061017249d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://844f756cfa239357db249bf5e1f58e5de5c5625d9d0789e78d0287f8b40d6776\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c30b15549ca7aeae09f141a22851ce3ebfc4670f00b9b0cd48332ef35aa49ddd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a
2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc766f3f85db8533aea4bc7e071236a2bea9ea9f03b9ac0bb8022167aade42f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://817c3f814b3dca55f8b0d84f27776351f05326870cf3ba2510fcb7c972d5608d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-02-20T06:46:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cjkf4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-9vd4p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.638463 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da0aa093-1adc-45f2-a942-e68d7be23ed4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mhrtm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8ww4n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.658061 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"089ad2b1-a8b5-4a97-ad0e-a7912d97c2b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b3f535bdf7005ade8abbf6f234ddbc9c15136792c8ebe8a3646e9dadedea986\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d72e26a84d5625d799baf5fcd573d245475eb954a13456bf1813c0c863dc5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d40e9d51da95e9023d99a2b2cdc4aa1a6d6755d0110393e173ee57fe9bfb74ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a
933bb56cb28cef9dd8dc9d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b846ca972489d3cf59f60d233d36eef949caf193a933bb56cb28cef9dd8dc9d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.677211 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aedf01bc2d73d8aa5ed311d59452e7da91c0700587c8672d9903327dbb1c4e5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.682900 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.682948 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.682969 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.682996 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.683015 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:46Z","lastTransitionTime":"2026-02-20T06:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.697541 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be358cb7-2257-4d5b-82e5-797ca3be5957\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cf6db0110f37db947a8c8f6671c667c015081e33773d22fdf23b5842fc11b38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://124bba28cff
60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df67a13b4586be6d4227c3a5e32d5b8ff3e2a1f1b7fe57f68f8b4f2cf38752a1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b64559435a0d02d01852a3e8cd809e11e33b8077f43fcdd038b9624675dd2d81\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.716337 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a22d6bbb7e4f9f7112e086a7a164067edb411398aa08955d815327988c5112b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe46
7828ae79872d174364fabd3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hzjnr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-56ppq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.739441 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-zr8rz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3900f6d-3035-4fc4-80a2-9e79154f4f5e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aba2be0e4774b30500be76b546e3ffff5c136a2e26675822931a400ca3090e79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:36Z\\\",\\\"message\\\":\\\"2026-02-20T06:46:50+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f43e851d-1502-4b62-bdc2-ba27d8b6a1eb\\\\n2026-02-20T06:46:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f43e851d-1502-4b62-bdc2-ba27d8b6a1eb to /host/opt/cni/bin/\\\\n2026-02-20T06:46:51Z [verbose] multus-daemon started\\\\n2026-02-20T06:46:51Z [verbose] 
Readiness Indicator file check\\\\n2026-02-20T06:47:36Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8fphl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-zr8rz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.757020 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1bc40232-e29d-4f13-a2f2-f8be2e97b789\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7bbeadd569a4f1d9c8ec473e4dce3e5141ef23e49e911237ea9637cb3bc0fb77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b984b4cdf6ed9e2cbbeba5300d04d9aabca1c5deb777d5f2d1f92c56488baccf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b984b4cdf6ed9e2cbbeba5300d04d9aabca1c5deb777d5f2d1f92c56488baccf\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.779690 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"069b1776-8adf-4339-bde2-43375d702571\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-20T06:46:45Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0220 06:46:39.803386 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0220 06:46:39.804142 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3469391808/tls.crt::/tmp/serving-cert-3469391808/tls.key\\\\\\\"\\\\nI0220 06:46:45.699605 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0220 06:46:45.707137 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0220 06:46:45.707252 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0220 06:46:45.707391 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0220 06:46:45.707452 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0220 06:46:45.717533 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0220 06:46:45.717591 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717602 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0220 06:46:45.717646 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0220 06:46:45.717656 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0220 06:46:45.717663 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0220 06:46:45.717670 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0220 06:46:45.717748 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0220 06:46:45.719578 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c205f71a46e915aea88675741084a7614
a4f464e51430e483dd51791c7fa6326\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.786406 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.786500 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.786524 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.786558 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.786582 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:46Z","lastTransitionTime":"2026-02-20T06:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.800419 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:51Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c17099a4476f4191682e7c93c0283d12a1357f5c2bbd04aae37fb2f197cb576c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.819524 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.837291 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 20:44:45.665010916 +0000 UTC Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.839746 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.839787 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:46 crc kubenswrapper[5094]: E0220 06:47:46.839986 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:46 crc kubenswrapper[5094]: E0220 06:47:46.840100 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.855619 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-20T06:47:44Z\\\",\\\"message\\\":\\\"work controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network 
controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:44Z is after 2025-08-24T17:21:41Z]\\\\nI0220 06:47:44.923774 7201 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f\\\\nI0220 06:47:44.923771 7201 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-9vd4p\\\\nI0220 06:47:44.923661 7201 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0220 06:47:44.923695 7201 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-console/console]} name:Service_opensh\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-20T06:47:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://218407d57c8deb793c
89a17289e4e7305d96201902d79ba9ef57422630316f10\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-swnw6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-29bjc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.895425 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5b89f0b9-352e-4f11-aec0-c0fef754cf64\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://824c5dd8d86d92c1af4980cba91caf7f85ce197f403a03977b0d90dcbef6645b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa17eb7de956beeccbd19e641c15d2f6dcc02121abfa5882b26fc9632e04996f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f89c5b0fb99427e41b6d023957810dd43121cade5cfafc9ad892540cf019c4d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe4cabeefa316cc217f93584dfb56b9389340ef8645f70cf99e5234122fe8568\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ee2be1f8df266e23fda65e44fa0542897f4fbde5b348083acf230fc054915aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:46:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8195a9219256255882fd4c5729a097b51e764209efef7c0e2d80bb72c0362155\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-20T06:46:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://457420514ec33e5efc80b93e92f29748f198e0c010621486276b577aec682378\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35769c5d23689b9c9066acf327e145d48e82b92912a2f3768949b3b223dc673c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-20T06:46:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-20T06:46:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:46:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.897967 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.898052 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.898081 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.898116 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.898140 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:46Z","lastTransitionTime":"2026-02-20T06:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.920505 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.942364 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-20T06:46:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:46 crc kubenswrapper[5094]: I0220 06:47:46.959901 5094 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"60f18419-2e46-4911-bceb-d8651c9fac66\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-20T06:47:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c561efe0bdf3fc9b35bebb6f356c9ef56add69a863637766aab3d3748acaaa63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8f6830864b9f0c23a91dff26fca798b670af
e7c0316ae71ad386c027b8ce0bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-20T06:47:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xr64j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-20T06:47:01Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qrs4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-20T06:47:46Z is after 2025-08-24T17:21:41Z" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.002619 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.002689 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.002740 5094 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.002770 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.002791 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:47Z","lastTransitionTime":"2026-02-20T06:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.106830 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.106931 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.106959 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.106999 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.107027 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:47Z","lastTransitionTime":"2026-02-20T06:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.211160 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.211237 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.211259 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.211288 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.211306 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:47Z","lastTransitionTime":"2026-02-20T06:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.315274 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.315350 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.315372 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.315400 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.315423 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:47Z","lastTransitionTime":"2026-02-20T06:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.420555 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.420665 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.420688 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.420744 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.420766 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:47Z","lastTransitionTime":"2026-02-20T06:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.524508 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.524651 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.524681 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.524772 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.524805 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:47Z","lastTransitionTime":"2026-02-20T06:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.628665 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.628783 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.628804 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.628851 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.628870 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:47Z","lastTransitionTime":"2026-02-20T06:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.731198 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.731246 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.731254 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.731271 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.731284 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:47Z","lastTransitionTime":"2026-02-20T06:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.834486 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.834532 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.834541 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.834559 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.834570 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:47Z","lastTransitionTime":"2026-02-20T06:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.837696 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 20:26:40.72872953 +0000 UTC Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.840009 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:47 crc kubenswrapper[5094]: E0220 06:47:47.840181 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.840937 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:47 crc kubenswrapper[5094]: E0220 06:47:47.841049 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.937897 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.937942 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.937957 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.937980 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:47 crc kubenswrapper[5094]: I0220 06:47:47.937995 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:47Z","lastTransitionTime":"2026-02-20T06:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.041592 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.041991 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.042120 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.042255 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.042376 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:48Z","lastTransitionTime":"2026-02-20T06:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.145504 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.145588 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.145605 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.145633 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.145654 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:48Z","lastTransitionTime":"2026-02-20T06:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.248594 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.248657 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.248671 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.248693 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.248728 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:48Z","lastTransitionTime":"2026-02-20T06:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.352424 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.352503 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.352519 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.352540 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.352555 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:48Z","lastTransitionTime":"2026-02-20T06:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.456479 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.456565 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.456587 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.456616 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.456637 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:48Z","lastTransitionTime":"2026-02-20T06:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.560227 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.560298 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.560318 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.560347 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.560367 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:48Z","lastTransitionTime":"2026-02-20T06:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.664071 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.664142 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.664163 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.664218 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.664237 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:48Z","lastTransitionTime":"2026-02-20T06:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.767672 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.768250 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.768458 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.768630 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.768820 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:48Z","lastTransitionTime":"2026-02-20T06:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.838292 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 01:01:22.379918536 +0000 UTC Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.839257 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.839257 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:48 crc kubenswrapper[5094]: E0220 06:47:48.839475 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:48 crc kubenswrapper[5094]: E0220 06:47:48.839582 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.873154 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.873221 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.873240 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.873266 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.873288 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:48Z","lastTransitionTime":"2026-02-20T06:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.977224 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.977288 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.977315 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.977370 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:48 crc kubenswrapper[5094]: I0220 06:47:48.977395 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:48Z","lastTransitionTime":"2026-02-20T06:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.088283 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.088414 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.088441 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.088479 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.088517 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:49Z","lastTransitionTime":"2026-02-20T06:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.191811 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.191898 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.191918 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.191948 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.191968 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:49Z","lastTransitionTime":"2026-02-20T06:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.295038 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.295111 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.295130 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.295158 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.295181 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:49Z","lastTransitionTime":"2026-02-20T06:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.399530 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.399598 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.399617 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.399643 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.399663 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:49Z","lastTransitionTime":"2026-02-20T06:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.503392 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.503463 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.503484 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.503511 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.503531 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:49Z","lastTransitionTime":"2026-02-20T06:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.607071 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.607157 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.607177 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.607207 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.607230 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:49Z","lastTransitionTime":"2026-02-20T06:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.710598 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.711111 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.711306 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.711658 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.711867 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:49Z","lastTransitionTime":"2026-02-20T06:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.814856 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.814926 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.814946 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.814982 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.815006 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:49Z","lastTransitionTime":"2026-02-20T06:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.839530 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 08:26:42.681831623 +0000 UTC Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.839856 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.839850 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:49 crc kubenswrapper[5094]: E0220 06:47:49.840051 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:49 crc kubenswrapper[5094]: E0220 06:47:49.840268 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.922778 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.922878 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.922903 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.922941 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:49 crc kubenswrapper[5094]: I0220 06:47:49.922972 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:49Z","lastTransitionTime":"2026-02-20T06:47:49Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.027003 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.027076 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.027098 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.027128 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.027151 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:50Z","lastTransitionTime":"2026-02-20T06:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.130679 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.130782 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.130802 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.130833 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.130853 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:50Z","lastTransitionTime":"2026-02-20T06:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.234230 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.234323 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.234345 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.234373 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.234392 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:50Z","lastTransitionTime":"2026-02-20T06:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.337908 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.337955 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.337965 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.337985 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.337995 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:50Z","lastTransitionTime":"2026-02-20T06:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.440411 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.440478 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.440500 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.440527 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.440546 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:50Z","lastTransitionTime":"2026-02-20T06:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.544292 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.544366 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.544393 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.544425 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.544453 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:50Z","lastTransitionTime":"2026-02-20T06:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.647157 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.647210 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.647219 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.647236 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.647247 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:50Z","lastTransitionTime":"2026-02-20T06:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.751130 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.751191 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.751209 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.751234 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.751253 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:50Z","lastTransitionTime":"2026-02-20T06:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.839737 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 10:28:24.201652965 +0000 UTC Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.839868 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.839921 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:50 crc kubenswrapper[5094]: E0220 06:47:50.840153 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:50 crc kubenswrapper[5094]: E0220 06:47:50.840288 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.855258 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.855311 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.855333 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.855360 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.855380 5094 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:50Z","lastTransitionTime":"2026-02-20T06:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.880047 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:47:50 crc kubenswrapper[5094]: E0220 06:47:50.880204 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:54.880176045 +0000 UTC m=+149.752802796 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.880255 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.880314 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.880391 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.880431 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:50 crc kubenswrapper[5094]: E0220 06:47:50.880518 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:47:50 crc kubenswrapper[5094]: E0220 06:47:50.880542 5094 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:47:50 crc kubenswrapper[5094]: E0220 06:47:50.880553 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:47:50 crc kubenswrapper[5094]: E0220 06:47:50.880573 5094 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:47:50 crc kubenswrapper[5094]: E0220 06:47:50.880616 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:48:54.880597495 +0000 UTC m=+149.753224246 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 20 06:47:50 crc kubenswrapper[5094]: E0220 06:47:50.880618 5094 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:47:50 crc kubenswrapper[5094]: E0220 06:47:50.880642 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-20 06:48:54.880629676 +0000 UTC m=+149.753256427 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:47:50 crc kubenswrapper[5094]: E0220 06:47:50.880688 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 20 06:47:50 crc kubenswrapper[5094]: E0220 06:47:50.880769 5094 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 20 06:47:50 crc kubenswrapper[5094]: E0220 06:47:50.880791 5094 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:47:50 crc kubenswrapper[5094]: E0220 06:47:50.880812 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-20 06:48:54.880769949 +0000 UTC m=+149.753396690 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 20 06:47:50 crc kubenswrapper[5094]: E0220 06:47:50.880880 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-20 06:48:54.880845962 +0000 UTC m=+149.753472873 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.958509 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.958579 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.958598 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.958626 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:50 crc kubenswrapper[5094]: I0220 06:47:50.958645 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:50Z","lastTransitionTime":"2026-02-20T06:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.062501 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.062587 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.062608 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.062641 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.062659 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:51Z","lastTransitionTime":"2026-02-20T06:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.166442 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.166504 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.166523 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.166550 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.166569 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:51Z","lastTransitionTime":"2026-02-20T06:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.270442 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.270508 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.270528 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.270560 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.270577 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:51Z","lastTransitionTime":"2026-02-20T06:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.374442 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.374527 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.374550 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.374591 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.374614 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:51Z","lastTransitionTime":"2026-02-20T06:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.478947 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.479027 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.479046 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.479072 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.479090 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:51Z","lastTransitionTime":"2026-02-20T06:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.587233 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.587307 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.587324 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.587348 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.587367 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:51Z","lastTransitionTime":"2026-02-20T06:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.690240 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.690304 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.690323 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.690349 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.690369 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:51Z","lastTransitionTime":"2026-02-20T06:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.793673 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.794176 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.794193 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.794216 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.794233 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:51Z","lastTransitionTime":"2026-02-20T06:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.840008 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 19:30:00.332389212 +0000 UTC Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.840213 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:51 crc kubenswrapper[5094]: E0220 06:47:51.840448 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.840579 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:51 crc kubenswrapper[5094]: E0220 06:47:51.840960 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.896950 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.897020 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.897038 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.897069 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:51 crc kubenswrapper[5094]: I0220 06:47:51.897090 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:51Z","lastTransitionTime":"2026-02-20T06:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.000632 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.000742 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.000770 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.000806 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.000831 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:52Z","lastTransitionTime":"2026-02-20T06:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.105477 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.105536 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.105553 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.105578 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.105595 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:52Z","lastTransitionTime":"2026-02-20T06:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.208793 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.208836 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.208845 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.208860 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.208870 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:52Z","lastTransitionTime":"2026-02-20T06:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.313222 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.313286 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.313303 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.313335 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.313355 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:52Z","lastTransitionTime":"2026-02-20T06:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.417051 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.417116 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.417139 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.417172 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.417196 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:52Z","lastTransitionTime":"2026-02-20T06:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.520916 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.520995 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.521013 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.521049 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.521073 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:52Z","lastTransitionTime":"2026-02-20T06:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.623889 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.623952 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.623969 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.623995 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.624017 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:52Z","lastTransitionTime":"2026-02-20T06:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.727407 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.727472 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.727493 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.727521 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.727539 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:52Z","lastTransitionTime":"2026-02-20T06:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.830695 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.830795 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.830812 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.830841 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.830860 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:52Z","lastTransitionTime":"2026-02-20T06:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.839297 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.839348 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:52 crc kubenswrapper[5094]: E0220 06:47:52.839649 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:52 crc kubenswrapper[5094]: E0220 06:47:52.839787 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.840234 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 23:28:45.883416792 +0000 UTC Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.934785 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.934840 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.934856 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.934882 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:52 crc kubenswrapper[5094]: I0220 06:47:52.934900 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:52Z","lastTransitionTime":"2026-02-20T06:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.038575 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.038646 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.038668 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.038742 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.038762 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:53Z","lastTransitionTime":"2026-02-20T06:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.141544 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.141612 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.141631 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.141658 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.141677 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:53Z","lastTransitionTime":"2026-02-20T06:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.245288 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.245351 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.245362 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.245390 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.245402 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:53Z","lastTransitionTime":"2026-02-20T06:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.349869 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.349955 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.349979 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.350011 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.350035 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:53Z","lastTransitionTime":"2026-02-20T06:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.453276 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.453364 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.453383 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.453432 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.453457 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:53Z","lastTransitionTime":"2026-02-20T06:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.556431 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.556483 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.556493 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.556513 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.556525 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:53Z","lastTransitionTime":"2026-02-20T06:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.660400 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.660478 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.660502 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.660534 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.660557 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:53Z","lastTransitionTime":"2026-02-20T06:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.763691 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.763870 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.763904 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.763938 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.763969 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:53Z","lastTransitionTime":"2026-02-20T06:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.839796 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:53 crc kubenswrapper[5094]: E0220 06:47:53.839991 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.840115 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:53 crc kubenswrapper[5094]: E0220 06:47:53.840290 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.840362 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 14:57:48.951556214 +0000 UTC Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.867084 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.867141 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.867159 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.867190 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.867213 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:53Z","lastTransitionTime":"2026-02-20T06:47:53Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.969892 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.969925 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.969934 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.969949 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:53 crc kubenswrapper[5094]: I0220 06:47:53.969959 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:53Z","lastTransitionTime":"2026-02-20T06:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.030199 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.030232 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.030241 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.030255 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.030264 5094 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-20T06:47:54Z","lastTransitionTime":"2026-02-20T06:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.096919 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx"] Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.097370 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.099803 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.100717 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.101366 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.102784 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.133318 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-8wch6" podStartSLOduration=68.133286233 podStartE2EDuration="1m8.133286233s" podCreationTimestamp="2026-02-20 06:46:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:47:54.118670336 +0000 UTC m=+88.991297087" watchObservedRunningTime="2026-02-20 06:47:54.133286233 +0000 UTC m=+89.005912974" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.133537 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qzxk2" podStartSLOduration=68.13352935 podStartE2EDuration="1m8.13352935s" podCreationTimestamp="2026-02-20 06:46:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:47:54.133090298 +0000 UTC m=+89.005717009" watchObservedRunningTime="2026-02-20 06:47:54.13352935 +0000 UTC m=+89.006156091" Feb 20 06:47:54 crc 
kubenswrapper[5094]: I0220 06:47:54.162195 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-9vd4p" podStartSLOduration=67.162173001 podStartE2EDuration="1m7.162173001s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:47:54.161863034 +0000 UTC m=+89.034489755" watchObservedRunningTime="2026-02-20 06:47:54.162173001 +0000 UTC m=+89.034799752" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.200472 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=38.200443572 podStartE2EDuration="38.200443572s" podCreationTimestamp="2026-02-20 06:47:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:47:54.200227507 +0000 UTC m=+89.072854278" watchObservedRunningTime="2026-02-20 06:47:54.200443572 +0000 UTC m=+89.073070293" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.219969 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e83799ff-1272-441c-87ec-74034bf3622c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-sd2hx\" (UID: \"e83799ff-1272-441c-87ec-74034bf3622c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.220034 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e83799ff-1272-441c-87ec-74034bf3622c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-sd2hx\" (UID: \"e83799ff-1272-441c-87ec-74034bf3622c\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.220337 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e83799ff-1272-441c-87ec-74034bf3622c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-sd2hx\" (UID: \"e83799ff-1272-441c-87ec-74034bf3622c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.220425 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e83799ff-1272-441c-87ec-74034bf3622c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-sd2hx\" (UID: \"e83799ff-1272-441c-87ec-74034bf3622c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.220476 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e83799ff-1272-441c-87ec-74034bf3622c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-sd2hx\" (UID: \"e83799ff-1272-441c-87ec-74034bf3622c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.257967 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podStartSLOduration=67.257930301 podStartE2EDuration="1m7.257930301s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:47:54.25707986 +0000 UTC m=+89.129706611" watchObservedRunningTime="2026-02-20 
06:47:54.257930301 +0000 UTC m=+89.130557052" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.294626 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-zr8rz" podStartSLOduration=67.294598754 podStartE2EDuration="1m7.294598754s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:47:54.294102901 +0000 UTC m=+89.166729632" watchObservedRunningTime="2026-02-20 06:47:54.294598754 +0000 UTC m=+89.167225475" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.321800 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e83799ff-1272-441c-87ec-74034bf3622c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-sd2hx\" (UID: \"e83799ff-1272-441c-87ec-74034bf3622c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.322146 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e83799ff-1272-441c-87ec-74034bf3622c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-sd2hx\" (UID: \"e83799ff-1272-441c-87ec-74034bf3622c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.322280 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e83799ff-1272-441c-87ec-74034bf3622c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-sd2hx\" (UID: \"e83799ff-1272-441c-87ec-74034bf3622c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.322401 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e83799ff-1272-441c-87ec-74034bf3622c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-sd2hx\" (UID: \"e83799ff-1272-441c-87ec-74034bf3622c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.322551 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e83799ff-1272-441c-87ec-74034bf3622c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-sd2hx\" (UID: \"e83799ff-1272-441c-87ec-74034bf3622c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.322290 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e83799ff-1272-441c-87ec-74034bf3622c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-sd2hx\" (UID: \"e83799ff-1272-441c-87ec-74034bf3622c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.322677 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e83799ff-1272-441c-87ec-74034bf3622c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-sd2hx\" (UID: \"e83799ff-1272-441c-87ec-74034bf3622c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.323517 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e83799ff-1272-441c-87ec-74034bf3622c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-sd2hx\" (UID: \"e83799ff-1272-441c-87ec-74034bf3622c\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.329890 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e83799ff-1272-441c-87ec-74034bf3622c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-sd2hx\" (UID: \"e83799ff-1272-441c-87ec-74034bf3622c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.330190 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=13.33016565 podStartE2EDuration="13.33016565s" podCreationTimestamp="2026-02-20 06:47:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:47:54.329214697 +0000 UTC m=+89.201841418" watchObservedRunningTime="2026-02-20 06:47:54.33016565 +0000 UTC m=+89.202792371" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.347063 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e83799ff-1272-441c-87ec-74034bf3622c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-sd2hx\" (UID: \"e83799ff-1272-441c-87ec-74034bf3622c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.362974 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=69.362950531 podStartE2EDuration="1m9.362950531s" podCreationTimestamp="2026-02-20 06:46:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:47:54.362832377 +0000 UTC m=+89.235459088" 
watchObservedRunningTime="2026-02-20 06:47:54.362950531 +0000 UTC m=+89.235577242" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.404399 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=67.404375487 podStartE2EDuration="1m7.404375487s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:47:54.383053359 +0000 UTC m=+89.255680070" watchObservedRunningTime="2026-02-20 06:47:54.404375487 +0000 UTC m=+89.277002198" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.419061 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.466097 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=67.466071315 podStartE2EDuration="1m7.466071315s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:47:54.464816695 +0000 UTC m=+89.337443406" watchObservedRunningTime="2026-02-20 06:47:54.466071315 +0000 UTC m=+89.338698026" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.542247 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qrs4f" podStartSLOduration=67.542224688 podStartE2EDuration="1m7.542224688s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:47:54.527123119 +0000 UTC m=+89.399749830" watchObservedRunningTime="2026-02-20 
06:47:54.542224688 +0000 UTC m=+89.414851399" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.579505 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" event={"ID":"e83799ff-1272-441c-87ec-74034bf3622c","Type":"ContainerStarted","Data":"d051043d04fa6ae5988a43bf0012be5faab49c0b8d99c50d97f7d08c282a75d6"} Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.579577 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" event={"ID":"e83799ff-1272-441c-87ec-74034bf3622c","Type":"ContainerStarted","Data":"9dbcd7bdd8f844492425ae1e0ca03888fde22dfbcb776a0a71b53da50e068ccc"} Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.595887 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-sd2hx" podStartSLOduration=67.595860275 podStartE2EDuration="1m7.595860275s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:47:54.595685851 +0000 UTC m=+89.468312602" watchObservedRunningTime="2026-02-20 06:47:54.595860275 +0000 UTC m=+89.468487006" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.839671 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.839671 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:54 crc kubenswrapper[5094]: E0220 06:47:54.839930 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:54 crc kubenswrapper[5094]: E0220 06:47:54.840108 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.840786 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 06:19:34.876721117 +0000 UTC Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.840879 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 20 06:47:54 crc kubenswrapper[5094]: I0220 06:47:54.853118 5094 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 20 06:47:55 crc kubenswrapper[5094]: I0220 06:47:55.839243 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:55 crc kubenswrapper[5094]: I0220 06:47:55.839469 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:55 crc kubenswrapper[5094]: E0220 06:47:55.841296 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:55 crc kubenswrapper[5094]: E0220 06:47:55.841686 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:56 crc kubenswrapper[5094]: I0220 06:47:56.839430 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:56 crc kubenswrapper[5094]: I0220 06:47:56.839482 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:56 crc kubenswrapper[5094]: E0220 06:47:56.839658 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:56 crc kubenswrapper[5094]: E0220 06:47:56.839836 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:57 crc kubenswrapper[5094]: I0220 06:47:57.843081 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:57 crc kubenswrapper[5094]: E0220 06:47:57.843331 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:57 crc kubenswrapper[5094]: I0220 06:47:57.843602 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:57 crc kubenswrapper[5094]: E0220 06:47:57.843801 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:58 crc kubenswrapper[5094]: I0220 06:47:58.839319 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:47:58 crc kubenswrapper[5094]: I0220 06:47:58.839496 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:47:58 crc kubenswrapper[5094]: E0220 06:47:58.840084 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:47:58 crc kubenswrapper[5094]: E0220 06:47:58.840401 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:47:59 crc kubenswrapper[5094]: I0220 06:47:59.840118 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:47:59 crc kubenswrapper[5094]: I0220 06:47:59.840248 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:47:59 crc kubenswrapper[5094]: E0220 06:47:59.840357 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:47:59 crc kubenswrapper[5094]: E0220 06:47:59.841145 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:47:59 crc kubenswrapper[5094]: I0220 06:47:59.841762 5094 scope.go:117] "RemoveContainer" containerID="0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd" Feb 20 06:47:59 crc kubenswrapper[5094]: E0220 06:47:59.842046 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" Feb 20 06:48:00 crc kubenswrapper[5094]: I0220 06:48:00.839326 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:00 crc kubenswrapper[5094]: I0220 06:48:00.839342 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:00 crc kubenswrapper[5094]: E0220 06:48:00.840032 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:00 crc kubenswrapper[5094]: E0220 06:48:00.840171 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:01 crc kubenswrapper[5094]: I0220 06:48:01.839968 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:01 crc kubenswrapper[5094]: E0220 06:48:01.840138 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:01 crc kubenswrapper[5094]: I0220 06:48:01.842507 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:01 crc kubenswrapper[5094]: E0220 06:48:01.842602 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:02 crc kubenswrapper[5094]: I0220 06:48:02.839566 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:02 crc kubenswrapper[5094]: I0220 06:48:02.839671 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:02 crc kubenswrapper[5094]: E0220 06:48:02.839977 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:02 crc kubenswrapper[5094]: E0220 06:48:02.840151 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:03 crc kubenswrapper[5094]: I0220 06:48:03.839845 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:03 crc kubenswrapper[5094]: I0220 06:48:03.839873 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:03 crc kubenswrapper[5094]: E0220 06:48:03.840138 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:03 crc kubenswrapper[5094]: E0220 06:48:03.840290 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:04 crc kubenswrapper[5094]: I0220 06:48:04.839761 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:04 crc kubenswrapper[5094]: I0220 06:48:04.839791 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:04 crc kubenswrapper[5094]: E0220 06:48:04.839967 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:04 crc kubenswrapper[5094]: E0220 06:48:04.840057 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:05 crc kubenswrapper[5094]: I0220 06:48:05.775163 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs\") pod \"network-metrics-daemon-8ww4n\" (UID: \"da0aa093-1adc-45f2-a942-e68d7be23ed4\") " pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:05 crc kubenswrapper[5094]: E0220 06:48:05.775461 5094 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 06:48:05 crc kubenswrapper[5094]: E0220 06:48:05.775593 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs podName:da0aa093-1adc-45f2-a942-e68d7be23ed4 nodeName:}" failed. 
No retries permitted until 2026-02-20 06:49:09.775559098 +0000 UTC m=+164.648185839 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs") pod "network-metrics-daemon-8ww4n" (UID: "da0aa093-1adc-45f2-a942-e68d7be23ed4") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 20 06:48:05 crc kubenswrapper[5094]: I0220 06:48:05.839185 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:05 crc kubenswrapper[5094]: I0220 06:48:05.839290 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:05 crc kubenswrapper[5094]: E0220 06:48:05.841149 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:05 crc kubenswrapper[5094]: E0220 06:48:05.841423 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:06 crc kubenswrapper[5094]: I0220 06:48:06.839811 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:06 crc kubenswrapper[5094]: I0220 06:48:06.839856 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:06 crc kubenswrapper[5094]: E0220 06:48:06.840023 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:06 crc kubenswrapper[5094]: E0220 06:48:06.840187 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:07 crc kubenswrapper[5094]: I0220 06:48:07.840100 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:07 crc kubenswrapper[5094]: E0220 06:48:07.840468 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:07 crc kubenswrapper[5094]: I0220 06:48:07.840942 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:07 crc kubenswrapper[5094]: E0220 06:48:07.841077 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:08 crc kubenswrapper[5094]: I0220 06:48:08.839322 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:08 crc kubenswrapper[5094]: I0220 06:48:08.839322 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:08 crc kubenswrapper[5094]: E0220 06:48:08.839596 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:08 crc kubenswrapper[5094]: E0220 06:48:08.839787 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:09 crc kubenswrapper[5094]: I0220 06:48:09.839373 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:09 crc kubenswrapper[5094]: I0220 06:48:09.839589 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:09 crc kubenswrapper[5094]: E0220 06:48:09.839850 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:09 crc kubenswrapper[5094]: E0220 06:48:09.840102 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:10 crc kubenswrapper[5094]: I0220 06:48:10.839284 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:10 crc kubenswrapper[5094]: I0220 06:48:10.839588 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:10 crc kubenswrapper[5094]: E0220 06:48:10.840191 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:10 crc kubenswrapper[5094]: E0220 06:48:10.840333 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:11 crc kubenswrapper[5094]: I0220 06:48:11.840228 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:11 crc kubenswrapper[5094]: I0220 06:48:11.840294 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:11 crc kubenswrapper[5094]: E0220 06:48:11.840454 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:11 crc kubenswrapper[5094]: E0220 06:48:11.840557 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:12 crc kubenswrapper[5094]: I0220 06:48:12.840002 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:12 crc kubenswrapper[5094]: I0220 06:48:12.840271 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:12 crc kubenswrapper[5094]: E0220 06:48:12.840444 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:12 crc kubenswrapper[5094]: E0220 06:48:12.840634 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:13 crc kubenswrapper[5094]: I0220 06:48:13.839652 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:13 crc kubenswrapper[5094]: I0220 06:48:13.839655 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:13 crc kubenswrapper[5094]: E0220 06:48:13.839914 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:13 crc kubenswrapper[5094]: E0220 06:48:13.840160 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:14 crc kubenswrapper[5094]: I0220 06:48:14.840027 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:14 crc kubenswrapper[5094]: I0220 06:48:14.840099 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:14 crc kubenswrapper[5094]: E0220 06:48:14.840323 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:14 crc kubenswrapper[5094]: E0220 06:48:14.840982 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:14 crc kubenswrapper[5094]: I0220 06:48:14.841416 5094 scope.go:117] "RemoveContainer" containerID="0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd" Feb 20 06:48:14 crc kubenswrapper[5094]: E0220 06:48:14.841678 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-29bjc_openshift-ovn-kubernetes(d1c36de3-d36b-48ed-9d4d-3aa52d72add0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" Feb 20 06:48:15 crc kubenswrapper[5094]: I0220 06:48:15.839925 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:15 crc kubenswrapper[5094]: I0220 06:48:15.842973 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:15 crc kubenswrapper[5094]: E0220 06:48:15.842940 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:15 crc kubenswrapper[5094]: E0220 06:48:15.843411 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:16 crc kubenswrapper[5094]: I0220 06:48:16.840271 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:16 crc kubenswrapper[5094]: I0220 06:48:16.840485 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:16 crc kubenswrapper[5094]: E0220 06:48:16.840637 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:16 crc kubenswrapper[5094]: E0220 06:48:16.841007 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:17 crc kubenswrapper[5094]: I0220 06:48:17.839235 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:17 crc kubenswrapper[5094]: I0220 06:48:17.839298 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:17 crc kubenswrapper[5094]: E0220 06:48:17.839494 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:17 crc kubenswrapper[5094]: E0220 06:48:17.839636 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:18 crc kubenswrapper[5094]: I0220 06:48:18.839510 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:18 crc kubenswrapper[5094]: I0220 06:48:18.839510 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:18 crc kubenswrapper[5094]: E0220 06:48:18.839784 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:18 crc kubenswrapper[5094]: E0220 06:48:18.839851 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:19 crc kubenswrapper[5094]: I0220 06:48:19.839409 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:19 crc kubenswrapper[5094]: I0220 06:48:19.839451 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:19 crc kubenswrapper[5094]: E0220 06:48:19.839792 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:19 crc kubenswrapper[5094]: E0220 06:48:19.840825 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:20 crc kubenswrapper[5094]: I0220 06:48:20.840181 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:20 crc kubenswrapper[5094]: I0220 06:48:20.840446 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:20 crc kubenswrapper[5094]: E0220 06:48:20.840642 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:20 crc kubenswrapper[5094]: E0220 06:48:20.840840 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:21 crc kubenswrapper[5094]: I0220 06:48:21.839247 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:21 crc kubenswrapper[5094]: E0220 06:48:21.839457 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:21 crc kubenswrapper[5094]: I0220 06:48:21.839916 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:21 crc kubenswrapper[5094]: E0220 06:48:21.840167 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:22 crc kubenswrapper[5094]: I0220 06:48:22.700168 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr8rz_c3900f6d-3035-4fc4-80a2-9e79154f4f5e/kube-multus/1.log" Feb 20 06:48:22 crc kubenswrapper[5094]: I0220 06:48:22.701492 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr8rz_c3900f6d-3035-4fc4-80a2-9e79154f4f5e/kube-multus/0.log" Feb 20 06:48:22 crc kubenswrapper[5094]: I0220 06:48:22.701570 5094 generic.go:334] "Generic (PLEG): container finished" podID="c3900f6d-3035-4fc4-80a2-9e79154f4f5e" containerID="aba2be0e4774b30500be76b546e3ffff5c136a2e26675822931a400ca3090e79" exitCode=1 Feb 20 06:48:22 crc kubenswrapper[5094]: I0220 06:48:22.701636 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zr8rz" event={"ID":"c3900f6d-3035-4fc4-80a2-9e79154f4f5e","Type":"ContainerDied","Data":"aba2be0e4774b30500be76b546e3ffff5c136a2e26675822931a400ca3090e79"} Feb 20 06:48:22 crc kubenswrapper[5094]: I0220 06:48:22.701755 5094 scope.go:117] "RemoveContainer" containerID="7fbf710d381ddeb30941ad14158b7b81924e487b5179cde562f01058c7549118" Feb 20 06:48:22 crc kubenswrapper[5094]: I0220 06:48:22.702584 5094 scope.go:117] "RemoveContainer" containerID="aba2be0e4774b30500be76b546e3ffff5c136a2e26675822931a400ca3090e79" Feb 20 06:48:22 crc kubenswrapper[5094]: E0220 06:48:22.703047 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-zr8rz_openshift-multus(c3900f6d-3035-4fc4-80a2-9e79154f4f5e)\"" pod="openshift-multus/multus-zr8rz" podUID="c3900f6d-3035-4fc4-80a2-9e79154f4f5e" Feb 20 06:48:22 crc kubenswrapper[5094]: I0220 06:48:22.839907 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:22 crc kubenswrapper[5094]: I0220 06:48:22.840208 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:22 crc kubenswrapper[5094]: E0220 06:48:22.840293 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:22 crc kubenswrapper[5094]: E0220 06:48:22.840827 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:23 crc kubenswrapper[5094]: I0220 06:48:23.708940 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr8rz_c3900f6d-3035-4fc4-80a2-9e79154f4f5e/kube-multus/1.log" Feb 20 06:48:23 crc kubenswrapper[5094]: I0220 06:48:23.840108 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:23 crc kubenswrapper[5094]: I0220 06:48:23.840239 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:23 crc kubenswrapper[5094]: E0220 06:48:23.840504 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:23 crc kubenswrapper[5094]: E0220 06:48:23.840764 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:24 crc kubenswrapper[5094]: I0220 06:48:24.839824 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:24 crc kubenswrapper[5094]: I0220 06:48:24.839989 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:24 crc kubenswrapper[5094]: E0220 06:48:24.840009 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:24 crc kubenswrapper[5094]: E0220 06:48:24.840307 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:25 crc kubenswrapper[5094]: I0220 06:48:25.842806 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:25 crc kubenswrapper[5094]: I0220 06:48:25.842858 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:25 crc kubenswrapper[5094]: E0220 06:48:25.843101 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:25 crc kubenswrapper[5094]: E0220 06:48:25.843347 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:25 crc kubenswrapper[5094]: I0220 06:48:25.843410 5094 scope.go:117] "RemoveContainer" containerID="0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd" Feb 20 06:48:25 crc kubenswrapper[5094]: E0220 06:48:25.865774 5094 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 20 06:48:25 crc kubenswrapper[5094]: E0220 06:48:25.975771 5094 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 20 06:48:26 crc kubenswrapper[5094]: I0220 06:48:26.727041 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/3.log" Feb 20 06:48:26 crc kubenswrapper[5094]: I0220 06:48:26.731396 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerStarted","Data":"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9"} Feb 20 06:48:26 crc kubenswrapper[5094]: I0220 06:48:26.732021 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:48:26 crc kubenswrapper[5094]: I0220 06:48:26.783780 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podStartSLOduration=99.783753336 podStartE2EDuration="1m39.783753336s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:26.782740122 +0000 UTC m=+121.655366833" 
watchObservedRunningTime="2026-02-20 06:48:26.783753336 +0000 UTC m=+121.656380057" Feb 20 06:48:26 crc kubenswrapper[5094]: I0220 06:48:26.840366 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:26 crc kubenswrapper[5094]: I0220 06:48:26.840397 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:26 crc kubenswrapper[5094]: E0220 06:48:26.840594 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:26 crc kubenswrapper[5094]: E0220 06:48:26.840909 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:27 crc kubenswrapper[5094]: I0220 06:48:27.040690 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8ww4n"] Feb 20 06:48:27 crc kubenswrapper[5094]: I0220 06:48:27.040966 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:27 crc kubenswrapper[5094]: E0220 06:48:27.041132 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:27 crc kubenswrapper[5094]: I0220 06:48:27.839118 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:27 crc kubenswrapper[5094]: E0220 06:48:27.839259 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:28 crc kubenswrapper[5094]: I0220 06:48:28.839241 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:28 crc kubenswrapper[5094]: I0220 06:48:28.839244 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:28 crc kubenswrapper[5094]: I0220 06:48:28.839347 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:28 crc kubenswrapper[5094]: E0220 06:48:28.839935 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:28 crc kubenswrapper[5094]: E0220 06:48:28.840118 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:28 crc kubenswrapper[5094]: E0220 06:48:28.840453 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:29 crc kubenswrapper[5094]: I0220 06:48:29.839845 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:29 crc kubenswrapper[5094]: E0220 06:48:29.840061 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:30 crc kubenswrapper[5094]: I0220 06:48:30.840099 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:30 crc kubenswrapper[5094]: I0220 06:48:30.840170 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:30 crc kubenswrapper[5094]: I0220 06:48:30.840234 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:30 crc kubenswrapper[5094]: E0220 06:48:30.840310 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:30 crc kubenswrapper[5094]: E0220 06:48:30.840477 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:30 crc kubenswrapper[5094]: E0220 06:48:30.840579 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:30 crc kubenswrapper[5094]: E0220 06:48:30.977625 5094 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 20 06:48:31 crc kubenswrapper[5094]: I0220 06:48:31.839305 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:31 crc kubenswrapper[5094]: E0220 06:48:31.839630 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:32 crc kubenswrapper[5094]: I0220 06:48:32.839409 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:32 crc kubenswrapper[5094]: I0220 06:48:32.839521 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:32 crc kubenswrapper[5094]: I0220 06:48:32.839779 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:32 crc kubenswrapper[5094]: E0220 06:48:32.839958 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:32 crc kubenswrapper[5094]: E0220 06:48:32.840191 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:32 crc kubenswrapper[5094]: E0220 06:48:32.840450 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:33 crc kubenswrapper[5094]: I0220 06:48:33.840019 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:33 crc kubenswrapper[5094]: E0220 06:48:33.840254 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:34 crc kubenswrapper[5094]: I0220 06:48:34.839582 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:34 crc kubenswrapper[5094]: I0220 06:48:34.839655 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:34 crc kubenswrapper[5094]: I0220 06:48:34.839938 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:34 crc kubenswrapper[5094]: E0220 06:48:34.840111 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:34 crc kubenswrapper[5094]: E0220 06:48:34.839944 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:34 crc kubenswrapper[5094]: E0220 06:48:34.840292 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:35 crc kubenswrapper[5094]: I0220 06:48:35.840504 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:35 crc kubenswrapper[5094]: I0220 06:48:35.840616 5094 scope.go:117] "RemoveContainer" containerID="aba2be0e4774b30500be76b546e3ffff5c136a2e26675822931a400ca3090e79" Feb 20 06:48:35 crc kubenswrapper[5094]: E0220 06:48:35.840758 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:35 crc kubenswrapper[5094]: E0220 06:48:35.978281 5094 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 20 06:48:36 crc kubenswrapper[5094]: I0220 06:48:36.773988 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr8rz_c3900f6d-3035-4fc4-80a2-9e79154f4f5e/kube-multus/1.log" Feb 20 06:48:36 crc kubenswrapper[5094]: I0220 06:48:36.774090 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zr8rz" event={"ID":"c3900f6d-3035-4fc4-80a2-9e79154f4f5e","Type":"ContainerStarted","Data":"0e16f340af41f0cc3bffd1d98c9695dc8ad9491384da855c5637478b18c6f793"} Feb 20 06:48:36 crc kubenswrapper[5094]: I0220 06:48:36.839473 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:36 crc kubenswrapper[5094]: I0220 06:48:36.839529 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:36 crc kubenswrapper[5094]: I0220 06:48:36.839494 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:36 crc kubenswrapper[5094]: E0220 06:48:36.839667 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:36 crc kubenswrapper[5094]: E0220 06:48:36.839747 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:36 crc kubenswrapper[5094]: E0220 06:48:36.839850 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:37 crc kubenswrapper[5094]: I0220 06:48:37.839920 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:37 crc kubenswrapper[5094]: E0220 06:48:37.841306 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:38 crc kubenswrapper[5094]: I0220 06:48:38.839648 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:38 crc kubenswrapper[5094]: I0220 06:48:38.839760 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:38 crc kubenswrapper[5094]: E0220 06:48:38.839875 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:38 crc kubenswrapper[5094]: I0220 06:48:38.839795 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:38 crc kubenswrapper[5094]: E0220 06:48:38.839976 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:38 crc kubenswrapper[5094]: E0220 06:48:38.840087 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:39 crc kubenswrapper[5094]: I0220 06:48:39.840977 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:39 crc kubenswrapper[5094]: E0220 06:48:39.841185 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 20 06:48:40 crc kubenswrapper[5094]: I0220 06:48:40.839531 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:40 crc kubenswrapper[5094]: I0220 06:48:40.839632 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:40 crc kubenswrapper[5094]: I0220 06:48:40.839642 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:40 crc kubenswrapper[5094]: E0220 06:48:40.839694 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 20 06:48:40 crc kubenswrapper[5094]: E0220 06:48:40.839834 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 20 06:48:40 crc kubenswrapper[5094]: E0220 06:48:40.839989 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8ww4n" podUID="da0aa093-1adc-45f2-a942-e68d7be23ed4" Feb 20 06:48:41 crc kubenswrapper[5094]: I0220 06:48:41.839464 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:41 crc kubenswrapper[5094]: I0220 06:48:41.866167 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 20 06:48:41 crc kubenswrapper[5094]: I0220 06:48:41.866244 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 20 06:48:42 crc kubenswrapper[5094]: I0220 06:48:42.839685 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:48:42 crc kubenswrapper[5094]: I0220 06:48:42.839985 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:42 crc kubenswrapper[5094]: I0220 06:48:42.840164 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:42 crc kubenswrapper[5094]: I0220 06:48:42.843208 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 20 06:48:42 crc kubenswrapper[5094]: I0220 06:48:42.843914 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 20 06:48:42 crc kubenswrapper[5094]: I0220 06:48:42.844168 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 20 06:48:42 crc kubenswrapper[5094]: I0220 06:48:42.844427 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.486577 5094 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.536060 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-xj788"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.536777 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fnbl8"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.537239 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.540901 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.541118 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.542798 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.549253 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-p4blk"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.549625 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.550647 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.572274 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.573199 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bslb9"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.573647 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bwvdt"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.574232 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.574257 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j2mmj"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.574978 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.575045 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j2mmj" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.575579 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.576680 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.576754 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.576958 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.576683 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.577183 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.577320 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.577465 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.577591 5094 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.577794 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.577939 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.578278 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.578431 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.578596 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.580341 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.580981 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.590522 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ps6pv"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.590969 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.591250 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.591519 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-p9fck"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.592037 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.592343 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.592523 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.581179 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.593321 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.593578 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.581633 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.594850 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.586303 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.586878 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.586961 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.587048 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.587281 5094 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"kube-root-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.587316 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.587376 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.588245 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.588305 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.589198 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.595843 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.600999 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601055 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601105 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601148 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601182 5094 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601195 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601222 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601254 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601286 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601307 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601319 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601360 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601377 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601399 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601427 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601507 5094 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.601537 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.605825 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.606828 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.607337 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.608373 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.609090 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.609230 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.609331 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.609884 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.610826 5094 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"audit" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.610917 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.611035 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.611398 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-znrdm"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.611761 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-znrdm" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.611870 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.612085 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.612280 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.612408 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.612563 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.612687 5094 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.612957 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.613116 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.613158 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.613254 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.613333 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.613403 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.613472 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.613546 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.613590 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.613689 5094 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.613865 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.613993 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.614046 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.614006 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.614098 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.614134 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.614161 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.614192 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.614225 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.614226 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 20 06:48:45 crc 
kubenswrapper[5094]: I0220 06:48:45.614278 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.614313 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.614586 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.616923 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7hljp"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.626563 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.643478 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.644835 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.644864 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.646606 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.646890 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7hljp" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.647086 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.647123 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-w7rf2"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.647615 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.647860 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.647928 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.650211 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.651104 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.657271 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.657865 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-w7rf2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.660884 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.663762 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.663961 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.664064 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.666860 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.668005 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.668425 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.668837 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-d2l2r"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.669692 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-q596q"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.670196 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-d2l2r" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.670829 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.671759 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.672654 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.674799 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-7wgh7"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.676271 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.679611 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.680915 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-q596q" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.681089 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.682511 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.684247 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-9sztr"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.685234 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-shq4j"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.685447 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.685766 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.686216 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.687006 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.687202 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mfphn"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.706360 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.708632 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709009 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-x72n5"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709211 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e594f26b-0fd6-44a1-93eb-84593591389f-serving-cert\") pod \"openshift-config-operator-7777fb866f-p9fck\" (UID: \"e594f26b-0fd6-44a1-93eb-84593591389f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709251 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrllp\" (UniqueName: \"kubernetes.io/projected/e594f26b-0fd6-44a1-93eb-84593591389f-kube-api-access-jrllp\") pod \"openshift-config-operator-7777fb866f-p9fck\" (UID: \"e594f26b-0fd6-44a1-93eb-84593591389f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709280 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2f348b60-0d81-490e-bfb4-ea32546c995a-config\") pod \"machine-api-operator-5694c8668f-ps6pv\" (UID: \"2f348b60-0d81-490e-bfb4-ea32546c995a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709302 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-client-ca\") pod \"route-controller-manager-6576b87f9c-qrtpl\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709320 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709342 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38e3be97-7374-4b8b-9565-4d60baa02401-service-ca-bundle\") pod \"authentication-operator-69f744f599-bwvdt\" (UID: \"38e3be97-7374-4b8b-9565-4d60baa02401\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709357 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b5c64ae-5f80-4e35-91dc-48163991b63d-audit-dir\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " 
pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709377 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-config\") pod \"controller-manager-879f6c89f-fnbl8\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709391 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18cc290d-78be-42c6-af5b-3b8b86941eb2-serving-cert\") pod \"controller-manager-879f6c89f-fnbl8\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709409 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/38d9642e-3788-4e70-8232-138cd84e02dc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-znrdm\" (UID: \"38d9642e-3788-4e70-8232-138cd84e02dc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-znrdm" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709427 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8d0112e-e85e-42f1-b28b-c0c996f36fe0-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-wzftj\" (UID: \"f8d0112e-e85e-42f1-b28b-c0c996f36fe0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709450 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1974d27-b923-4a9b-9874-d400df5bd29a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ddvpd\" (UID: \"d1974d27-b923-4a9b-9874-d400df5bd29a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709465 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-serving-cert\") pod \"route-controller-manager-6576b87f9c-qrtpl\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709486 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709507 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2-auth-proxy-config\") pod \"machine-approver-56656f9798-xj788\" (UID: \"d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709535 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5f20c574-b730-4bd8-97d1-7751eb7968d4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vs599\" (UID: \"5f20c574-b730-4bd8-97d1-7751eb7968d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709550 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7b5c64ae-5f80-4e35-91dc-48163991b63d-encryption-config\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709566 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1974d27-b923-4a9b-9874-d400df5bd29a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ddvpd\" (UID: \"d1974d27-b923-4a9b-9874-d400df5bd29a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709582 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdjs5\" (UniqueName: \"kubernetes.io/projected/2f348b60-0d81-490e-bfb4-ea32546c995a-kube-api-access-sdjs5\") pod \"machine-api-operator-5694c8668f-ps6pv\" (UID: \"2f348b60-0d81-490e-bfb4-ea32546c995a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709601 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlgjz\" (UniqueName: \"kubernetes.io/projected/f8d0112e-e85e-42f1-b28b-c0c996f36fe0-kube-api-access-rlgjz\") pod \"cluster-image-registry-operator-dc59b4c8b-wzftj\" (UID: 
\"f8d0112e-e85e-42f1-b28b-c0c996f36fe0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709621 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-audit-dir\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709639 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b5c64ae-5f80-4e35-91dc-48163991b63d-config\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709657 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f8d0112e-e85e-42f1-b28b-c0c996f36fe0-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-wzftj\" (UID: \"f8d0112e-e85e-42f1-b28b-c0c996f36fe0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709679 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709696 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/7b5c64ae-5f80-4e35-91dc-48163991b63d-etcd-client\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709724 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b5c64ae-5f80-4e35-91dc-48163991b63d-serving-cert\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709742 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2f348b60-0d81-490e-bfb4-ea32546c995a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ps6pv\" (UID: \"2f348b60-0d81-490e-bfb4-ea32546c995a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709759 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-audit-dir\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709774 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-client-ca\") pod \"controller-manager-879f6c89f-fnbl8\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709790 
5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e78b7a6b-91b7-4753-bd82-df9d3ea97291-config\") pod \"kube-apiserver-operator-766d6c64bb-c2mfb\" (UID: \"e78b7a6b-91b7-4753-bd82-df9d3ea97291\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709809 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2mff\" (UniqueName: \"kubernetes.io/projected/bac53d01-ed38-46a8-ae9e-bfb72e5565a1-kube-api-access-g2mff\") pod \"migrator-59844c95c7-7hljp\" (UID: \"bac53d01-ed38-46a8-ae9e-bfb72e5565a1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7hljp" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709826 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709845 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-etcd-client\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709861 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k8tl\" (UniqueName: \"kubernetes.io/projected/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-kube-api-access-6k8tl\") pod 
\"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709888 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-encryption-config\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709906 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f20c574-b730-4bd8-97d1-7751eb7968d4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vs599\" (UID: \"5f20c574-b730-4bd8-97d1-7751eb7968d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709925 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7jjp\" (UniqueName: \"kubernetes.io/projected/d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2-kube-api-access-d7jjp\") pod \"machine-approver-56656f9798-xj788\" (UID: \"d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709943 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pw6d\" (UniqueName: \"kubernetes.io/projected/38e3be97-7374-4b8b-9565-4d60baa02401-kube-api-access-8pw6d\") pod \"authentication-operator-69f744f599-bwvdt\" (UID: \"38e3be97-7374-4b8b-9565-4d60baa02401\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" Feb 20 06:48:45 crc 
kubenswrapper[5094]: I0220 06:48:45.709961 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8d0112e-e85e-42f1-b28b-c0c996f36fe0-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-wzftj\" (UID: \"f8d0112e-e85e-42f1-b28b-c0c996f36fe0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709977 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2-config\") pod \"machine-approver-56656f9798-xj788\" (UID: \"d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.709993 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7b5c64ae-5f80-4e35-91dc-48163991b63d-node-pullsecrets\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.710007 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b5c64ae-5f80-4e35-91dc-48163991b63d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.710022 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38e3be97-7374-4b8b-9565-4d60baa02401-trusted-ca-bundle\") pod 
\"authentication-operator-69f744f599-bwvdt\" (UID: \"38e3be97-7374-4b8b-9565-4d60baa02401\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.710040 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.710056 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e78b7a6b-91b7-4753-bd82-df9d3ea97291-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-c2mfb\" (UID: \"e78b7a6b-91b7-4753-bd82-df9d3ea97291\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.710961 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.711499 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30a7db23-c18c-4bc6-b1b7-97b32a419fbe-config\") pod \"openshift-apiserver-operator-796bbdcf4f-7k4rg\" (UID: \"30a7db23-c18c-4bc6-b1b7-97b32a419fbe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.711565 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.711600 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.711630 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7b5c64ae-5f80-4e35-91dc-48163991b63d-audit\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.711656 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e78b7a6b-91b7-4753-bd82-df9d3ea97291-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-c2mfb\" (UID: \"e78b7a6b-91b7-4753-bd82-df9d3ea97291\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.711679 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30a7db23-c18c-4bc6-b1b7-97b32a419fbe-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-7k4rg\" (UID: \"30a7db23-c18c-4bc6-b1b7-97b32a419fbe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.711718 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmphr\" (UniqueName: \"kubernetes.io/projected/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-kube-api-access-fmphr\") pod \"route-controller-manager-6576b87f9c-qrtpl\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.711738 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.711763 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-serving-cert\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.711788 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vnhn\" (UniqueName: \"kubernetes.io/projected/18cc290d-78be-42c6-af5b-3b8b86941eb2-kube-api-access-6vnhn\") pod \"controller-manager-879f6c89f-fnbl8\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.711809 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e594f26b-0fd6-44a1-93eb-84593591389f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-p9fck\" (UID: \"e594f26b-0fd6-44a1-93eb-84593591389f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.711832 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4mlc\" (UniqueName: \"kubernetes.io/projected/30a7db23-c18c-4bc6-b1b7-97b32a419fbe-kube-api-access-d4mlc\") pod \"openshift-apiserver-operator-796bbdcf4f-7k4rg\" (UID: \"30a7db23-c18c-4bc6-b1b7-97b32a419fbe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.711856 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f88s\" (UniqueName: \"kubernetes.io/projected/5021cb92-f82d-47ee-9978-58e897c354b1-kube-api-access-9f88s\") pod \"cluster-samples-operator-665b6dd947-j2mmj\" (UID: 
\"5021cb92-f82d-47ee-9978-58e897c354b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j2mmj" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712002 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1974d27-b923-4a9b-9874-d400df5bd29a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ddvpd\" (UID: \"d1974d27-b923-4a9b-9874-d400df5bd29a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712026 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2f348b60-0d81-490e-bfb4-ea32546c995a-images\") pod \"machine-api-operator-5694c8668f-ps6pv\" (UID: \"2f348b60-0d81-490e-bfb4-ea32546c995a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712051 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2-machine-approver-tls\") pod \"machine-approver-56656f9798-xj788\" (UID: \"d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712077 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7b5c64ae-5f80-4e35-91dc-48163991b63d-image-import-ca\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712121 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-audit-policies\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712166 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-config\") pod \"route-controller-manager-6576b87f9c-qrtpl\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712194 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38e3be97-7374-4b8b-9565-4d60baa02401-serving-cert\") pod \"authentication-operator-69f744f599-bwvdt\" (UID: \"38e3be97-7374-4b8b-9565-4d60baa02401\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712254 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-audit-policies\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712303 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: 
\"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712325 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712372 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712404 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfrvq\" (UniqueName: \"kubernetes.io/projected/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-kube-api-access-zfrvq\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712449 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7b5c64ae-5f80-4e35-91dc-48163991b63d-etcd-serving-ca\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712470 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc5qm\" (UniqueName: \"kubernetes.io/projected/7b5c64ae-5f80-4e35-91dc-48163991b63d-kube-api-access-mc5qm\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712486 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712509 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712526 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmb5r\" (UniqueName: \"kubernetes.io/projected/5f20c574-b730-4bd8-97d1-7751eb7968d4-kube-api-access-cmb5r\") pod \"openshift-controller-manager-operator-756b6f6bc6-vs599\" (UID: \"5f20c574-b730-4bd8-97d1-7751eb7968d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712562 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fnbl8\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712580 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5021cb92-f82d-47ee-9978-58e897c354b1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-j2mmj\" (UID: \"5021cb92-f82d-47ee-9978-58e897c354b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j2mmj" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712618 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38e3be97-7374-4b8b-9565-4d60baa02401-config\") pod \"authentication-operator-69f744f599-bwvdt\" (UID: \"38e3be97-7374-4b8b-9565-4d60baa02401\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712638 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvplx\" (UniqueName: \"kubernetes.io/projected/38d9642e-3788-4e70-8232-138cd84e02dc-kube-api-access-lvplx\") pod \"control-plane-machine-set-operator-78cbb6b69f-znrdm\" (UID: \"38d9642e-3788-4e70-8232-138cd84e02dc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-znrdm" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.712913 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.713005 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-x72n5" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.726179 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.728192 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-45dt8"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.728653 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.730035 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.730555 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.731120 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vhztc"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.731732 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.732529 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.732964 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.733185 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vhztc" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.733578 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.733617 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.733730 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.742849 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.742934 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.743780 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fnbl8"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.743828 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nlpvl"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.744202 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.744299 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.744650 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-nlpvl" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.744851 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.744894 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.745356 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.746880 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.747835 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ps6pv"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.748814 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bslb9"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.750452 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.751560 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication-operator/authentication-operator-69f744f599-bwvdt"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.752234 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j2mmj"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.753243 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.754855 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tjkwm"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.756342 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.756632 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-znrdm"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.758214 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7hljp"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.759470 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.763766 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.764638 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.767457 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.769080 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.771136 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-d2l2r"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.773495 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-p9fck"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.775051 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-w7rf2"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.776494 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-q596q"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.778984 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.780793 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-x72n5"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.781898 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.782497 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.784214 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nlpvl"] 
Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.785032 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-p4blk"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.785939 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-7wgh7"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.787083 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-l2hxn"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.788153 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-shq4j"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.788309 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-l2hxn" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.789182 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-v68px"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.789624 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-v68px" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.790267 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.791349 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.792405 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.793486 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.794730 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tjkwm"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.795582 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-45dt8"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.797110 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vhztc"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.797976 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mfphn"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.799483 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.800159 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.801458 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.802302 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.803051 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-l2hxn"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.803546 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-wcwdv"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.805112 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wcwdv"] Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.805228 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wcwdv" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.814818 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.814875 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.814922 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/806ba791-714c-4d13-b595-d4f6ccf06aea-config-volume\") pod \"collect-profiles-29526165-tdww4\" (UID: \"806ba791-714c-4d13-b595-d4f6ccf06aea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.814947 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7b5c64ae-5f80-4e35-91dc-48163991b63d-audit\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.814981 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e78b7a6b-91b7-4753-bd82-df9d3ea97291-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-c2mfb\" (UID: \"e78b7a6b-91b7-4753-bd82-df9d3ea97291\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815011 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30a7db23-c18c-4bc6-b1b7-97b32a419fbe-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-7k4rg\" (UID: \"30a7db23-c18c-4bc6-b1b7-97b32a419fbe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815039 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmphr\" (UniqueName: \"kubernetes.io/projected/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-kube-api-access-fmphr\") pod \"route-controller-manager-6576b87f9c-qrtpl\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815062 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815094 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-serving-cert\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 
06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815122 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vnhn\" (UniqueName: \"kubernetes.io/projected/18cc290d-78be-42c6-af5b-3b8b86941eb2-kube-api-access-6vnhn\") pod \"controller-manager-879f6c89f-fnbl8\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815153 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e594f26b-0fd6-44a1-93eb-84593591389f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-p9fck\" (UID: \"e594f26b-0fd6-44a1-93eb-84593591389f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815179 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4mlc\" (UniqueName: \"kubernetes.io/projected/30a7db23-c18c-4bc6-b1b7-97b32a419fbe-kube-api-access-d4mlc\") pod \"openshift-apiserver-operator-796bbdcf4f-7k4rg\" (UID: \"30a7db23-c18c-4bc6-b1b7-97b32a419fbe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815213 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a6908ab3-3d33-4e31-b226-b6607f34ee8b-default-certificate\") pod \"router-default-5444994796-9sztr\" (UID: \"a6908ab3-3d33-4e31-b226-b6607f34ee8b\") " pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815244 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f88s\" (UniqueName: 
\"kubernetes.io/projected/5021cb92-f82d-47ee-9978-58e897c354b1-kube-api-access-9f88s\") pod \"cluster-samples-operator-665b6dd947-j2mmj\" (UID: \"5021cb92-f82d-47ee-9978-58e897c354b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j2mmj" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815307 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1974d27-b923-4a9b-9874-d400df5bd29a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ddvpd\" (UID: \"d1974d27-b923-4a9b-9874-d400df5bd29a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815334 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2f348b60-0d81-490e-bfb4-ea32546c995a-images\") pod \"machine-api-operator-5694c8668f-ps6pv\" (UID: \"2f348b60-0d81-490e-bfb4-ea32546c995a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815360 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2-machine-approver-tls\") pod \"machine-approver-56656f9798-xj788\" (UID: \"d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815381 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7b5c64ae-5f80-4e35-91dc-48163991b63d-image-import-ca\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 
06:48:45.815409 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-audit-policies\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815438 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-config\") pod \"route-controller-manager-6576b87f9c-qrtpl\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815463 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38e3be97-7374-4b8b-9565-4d60baa02401-serving-cert\") pod \"authentication-operator-69f744f599-bwvdt\" (UID: \"38e3be97-7374-4b8b-9565-4d60baa02401\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815487 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-audit-policies\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815511 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815540 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815573 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn6jl\" (UniqueName: \"kubernetes.io/projected/f1faaf31-f0b6-4828-90cc-51de060dc826-kube-api-access-bn6jl\") pod \"downloads-7954f5f757-d2l2r\" (UID: \"f1faaf31-f0b6-4828-90cc-51de060dc826\") " pod="openshift-console/downloads-7954f5f757-d2l2r" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815600 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815626 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfrvq\" (UniqueName: \"kubernetes.io/projected/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-kube-api-access-zfrvq\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815666 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/7b5c64ae-5f80-4e35-91dc-48163991b63d-etcd-serving-ca\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815694 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc5qm\" (UniqueName: \"kubernetes.io/projected/7b5c64ae-5f80-4e35-91dc-48163991b63d-kube-api-access-mc5qm\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815738 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815766 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815802 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmb5r\" (UniqueName: \"kubernetes.io/projected/5f20c574-b730-4bd8-97d1-7751eb7968d4-kube-api-access-cmb5r\") pod \"openshift-controller-manager-operator-756b6f6bc6-vs599\" (UID: \"5f20c574-b730-4bd8-97d1-7751eb7968d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599" Feb 20 06:48:45 crc 
kubenswrapper[5094]: I0220 06:48:45.815832 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fnbl8\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815862 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5021cb92-f82d-47ee-9978-58e897c354b1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-j2mmj\" (UID: \"5021cb92-f82d-47ee-9978-58e897c354b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j2mmj" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815891 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a6908ab3-3d33-4e31-b226-b6607f34ee8b-stats-auth\") pod \"router-default-5444994796-9sztr\" (UID: \"a6908ab3-3d33-4e31-b226-b6607f34ee8b\") " pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815925 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38e3be97-7374-4b8b-9565-4d60baa02401-config\") pod \"authentication-operator-69f744f599-bwvdt\" (UID: \"38e3be97-7374-4b8b-9565-4d60baa02401\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815956 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvplx\" (UniqueName: \"kubernetes.io/projected/38d9642e-3788-4e70-8232-138cd84e02dc-kube-api-access-lvplx\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-znrdm\" (UID: \"38d9642e-3788-4e70-8232-138cd84e02dc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-znrdm" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.815981 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/806ba791-714c-4d13-b595-d4f6ccf06aea-secret-volume\") pod \"collect-profiles-29526165-tdww4\" (UID: \"806ba791-714c-4d13-b595-d4f6ccf06aea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816038 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e594f26b-0fd6-44a1-93eb-84593591389f-serving-cert\") pod \"openshift-config-operator-7777fb866f-p9fck\" (UID: \"e594f26b-0fd6-44a1-93eb-84593591389f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816069 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrllp\" (UniqueName: \"kubernetes.io/projected/e594f26b-0fd6-44a1-93eb-84593591389f-kube-api-access-jrllp\") pod \"openshift-config-operator-7777fb866f-p9fck\" (UID: \"e594f26b-0fd6-44a1-93eb-84593591389f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816100 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f348b60-0d81-490e-bfb4-ea32546c995a-config\") pod \"machine-api-operator-5694c8668f-ps6pv\" (UID: \"2f348b60-0d81-490e-bfb4-ea32546c995a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816127 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-client-ca\") pod \"route-controller-manager-6576b87f9c-qrtpl\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816155 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816182 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38e3be97-7374-4b8b-9565-4d60baa02401-service-ca-bundle\") pod \"authentication-operator-69f744f599-bwvdt\" (UID: \"38e3be97-7374-4b8b-9565-4d60baa02401\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816184 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/7b5c64ae-5f80-4e35-91dc-48163991b63d-audit\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816210 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6908ab3-3d33-4e31-b226-b6607f34ee8b-service-ca-bundle\") pod \"router-default-5444994796-9sztr\" (UID: \"a6908ab3-3d33-4e31-b226-b6607f34ee8b\") " 
pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816327 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b5c64ae-5f80-4e35-91dc-48163991b63d-audit-dir\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816388 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-config\") pod \"controller-manager-879f6c89f-fnbl8\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816423 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18cc290d-78be-42c6-af5b-3b8b86941eb2-serving-cert\") pod \"controller-manager-879f6c89f-fnbl8\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816453 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/38d9642e-3788-4e70-8232-138cd84e02dc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-znrdm\" (UID: \"38d9642e-3788-4e70-8232-138cd84e02dc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-znrdm" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816500 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/f8d0112e-e85e-42f1-b28b-c0c996f36fe0-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-wzftj\" (UID: \"f8d0112e-e85e-42f1-b28b-c0c996f36fe0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816552 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1974d27-b923-4a9b-9874-d400df5bd29a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ddvpd\" (UID: \"d1974d27-b923-4a9b-9874-d400df5bd29a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816595 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-serving-cert\") pod \"route-controller-manager-6576b87f9c-qrtpl\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816628 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816666 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2-auth-proxy-config\") pod \"machine-approver-56656f9798-xj788\" (UID: \"d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816763 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f20c574-b730-4bd8-97d1-7751eb7968d4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vs599\" (UID: \"5f20c574-b730-4bd8-97d1-7751eb7968d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816796 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7b5c64ae-5f80-4e35-91dc-48163991b63d-encryption-config\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816835 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1974d27-b923-4a9b-9874-d400df5bd29a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ddvpd\" (UID: \"d1974d27-b923-4a9b-9874-d400df5bd29a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816873 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdjs5\" (UniqueName: \"kubernetes.io/projected/2f348b60-0d81-490e-bfb4-ea32546c995a-kube-api-access-sdjs5\") pod \"machine-api-operator-5694c8668f-ps6pv\" (UID: \"2f348b60-0d81-490e-bfb4-ea32546c995a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816913 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlgjz\" (UniqueName: 
\"kubernetes.io/projected/f8d0112e-e85e-42f1-b28b-c0c996f36fe0-kube-api-access-rlgjz\") pod \"cluster-image-registry-operator-dc59b4c8b-wzftj\" (UID: \"f8d0112e-e85e-42f1-b28b-c0c996f36fe0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816956 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-audit-dir\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.816991 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b5c64ae-5f80-4e35-91dc-48163991b63d-config\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817035 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfv25\" (UniqueName: \"kubernetes.io/projected/a6908ab3-3d33-4e31-b226-b6607f34ee8b-kube-api-access-dfv25\") pod \"router-default-5444994796-9sztr\" (UID: \"a6908ab3-3d33-4e31-b226-b6607f34ee8b\") " pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817072 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f8d0112e-e85e-42f1-b28b-c0c996f36fe0-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-wzftj\" (UID: \"f8d0112e-e85e-42f1-b28b-c0c996f36fe0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817104 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817136 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7b5c64ae-5f80-4e35-91dc-48163991b63d-etcd-client\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817172 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b5c64ae-5f80-4e35-91dc-48163991b63d-serving-cert\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817215 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2f348b60-0d81-490e-bfb4-ea32546c995a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ps6pv\" (UID: \"2f348b60-0d81-490e-bfb4-ea32546c995a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817250 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-audit-dir\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 
06:48:45.817283 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-client-ca\") pod \"controller-manager-879f6c89f-fnbl8\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817321 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e78b7a6b-91b7-4753-bd82-df9d3ea97291-config\") pod \"kube-apiserver-operator-766d6c64bb-c2mfb\" (UID: \"e78b7a6b-91b7-4753-bd82-df9d3ea97291\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817364 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2mff\" (UniqueName: \"kubernetes.io/projected/bac53d01-ed38-46a8-ae9e-bfb72e5565a1-kube-api-access-g2mff\") pod \"migrator-59844c95c7-7hljp\" (UID: \"bac53d01-ed38-46a8-ae9e-bfb72e5565a1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7hljp" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817403 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817436 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-etcd-client\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817473 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6k8tl\" (UniqueName: \"kubernetes.io/projected/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-kube-api-access-6k8tl\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817534 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-encryption-config\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817564 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f20c574-b730-4bd8-97d1-7751eb7968d4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vs599\" (UID: \"5f20c574-b730-4bd8-97d1-7751eb7968d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817620 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7jjp\" (UniqueName: \"kubernetes.io/projected/d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2-kube-api-access-d7jjp\") pod \"machine-approver-56656f9798-xj788\" (UID: \"d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817666 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j498r\" (UniqueName: 
\"kubernetes.io/projected/806ba791-714c-4d13-b595-d4f6ccf06aea-kube-api-access-j498r\") pod \"collect-profiles-29526165-tdww4\" (UID: \"806ba791-714c-4d13-b595-d4f6ccf06aea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817734 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pw6d\" (UniqueName: \"kubernetes.io/projected/38e3be97-7374-4b8b-9565-4d60baa02401-kube-api-access-8pw6d\") pod \"authentication-operator-69f744f599-bwvdt\" (UID: \"38e3be97-7374-4b8b-9565-4d60baa02401\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817773 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8d0112e-e85e-42f1-b28b-c0c996f36fe0-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-wzftj\" (UID: \"f8d0112e-e85e-42f1-b28b-c0c996f36fe0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817810 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2-config\") pod \"machine-approver-56656f9798-xj788\" (UID: \"d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817854 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7b5c64ae-5f80-4e35-91dc-48163991b63d-node-pullsecrets\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 
06:48:45.817894 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b5c64ae-5f80-4e35-91dc-48163991b63d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817932 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38e3be97-7374-4b8b-9565-4d60baa02401-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bwvdt\" (UID: \"38e3be97-7374-4b8b-9565-4d60baa02401\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.817963 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6908ab3-3d33-4e31-b226-b6607f34ee8b-metrics-certs\") pod \"router-default-5444994796-9sztr\" (UID: \"a6908ab3-3d33-4e31-b226-b6607f34ee8b\") " pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.818996 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.819070 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: 
\"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.819120 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e78b7a6b-91b7-4753-bd82-df9d3ea97291-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-c2mfb\" (UID: \"e78b7a6b-91b7-4753-bd82-df9d3ea97291\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.819164 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30a7db23-c18c-4bc6-b1b7-97b32a419fbe-config\") pod \"openshift-apiserver-operator-796bbdcf4f-7k4rg\" (UID: \"30a7db23-c18c-4bc6-b1b7-97b32a419fbe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.819988 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/30a7db23-c18c-4bc6-b1b7-97b32a419fbe-config\") pod \"openshift-apiserver-operator-796bbdcf4f-7k4rg\" (UID: \"30a7db23-c18c-4bc6-b1b7-97b32a419fbe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.820128 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.820411 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/2f348b60-0d81-490e-bfb4-ea32546c995a-images\") pod \"machine-api-operator-5694c8668f-ps6pv\" (UID: \"2f348b60-0d81-490e-bfb4-ea32546c995a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.820963 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.821171 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1974d27-b923-4a9b-9874-d400df5bd29a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ddvpd\" (UID: \"d1974d27-b923-4a9b-9874-d400df5bd29a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.822793 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2-auth-proxy-config\") pod \"machine-approver-56656f9798-xj788\" (UID: \"d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.823423 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f20c574-b730-4bd8-97d1-7751eb7968d4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vs599\" (UID: \"5f20c574-b730-4bd8-97d1-7751eb7968d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599" Feb 20 06:48:45 
crc kubenswrapper[5094]: I0220 06:48:45.823620 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e594f26b-0fd6-44a1-93eb-84593591389f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-p9fck\" (UID: \"e594f26b-0fd6-44a1-93eb-84593591389f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.824096 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7b5c64ae-5f80-4e35-91dc-48163991b63d-audit-dir\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.824155 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-audit-dir\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.824726 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b5c64ae-5f80-4e35-91dc-48163991b63d-config\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.824838 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-encryption-config\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 
06:48:45.825428 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.825575 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-config\") pod \"controller-manager-879f6c89f-fnbl8\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.825614 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7b5c64ae-5f80-4e35-91dc-48163991b63d-etcd-serving-ca\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.825979 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7b5c64ae-5f80-4e35-91dc-48163991b63d-encryption-config\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.826443 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7b5c64ae-5f80-4e35-91dc-48163991b63d-node-pullsecrets\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc 
kubenswrapper[5094]: I0220 06:48:45.826611 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-audit-dir\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.827152 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-audit-policies\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.827166 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/7b5c64ae-5f80-4e35-91dc-48163991b63d-image-import-ca\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.827315 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-config\") pod \"route-controller-manager-6576b87f9c-qrtpl\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.827424 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-client-ca\") pod \"controller-manager-879f6c89f-fnbl8\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:48:45 
crc kubenswrapper[5094]: I0220 06:48:45.827898 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b5c64ae-5f80-4e35-91dc-48163991b63d-trusted-ca-bundle\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.828182 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e78b7a6b-91b7-4753-bd82-df9d3ea97291-config\") pod \"kube-apiserver-operator-766d6c64bb-c2mfb\" (UID: \"e78b7a6b-91b7-4753-bd82-df9d3ea97291\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.828863 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38e3be97-7374-4b8b-9565-4d60baa02401-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bwvdt\" (UID: \"38e3be97-7374-4b8b-9565-4d60baa02401\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.828921 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f348b60-0d81-490e-bfb4-ea32546c995a-config\") pod \"machine-api-operator-5694c8668f-ps6pv\" (UID: \"2f348b60-0d81-490e-bfb4-ea32546c995a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.829963 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18cc290d-78be-42c6-af5b-3b8b86941eb2-serving-cert\") pod \"controller-manager-879f6c89f-fnbl8\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.830075 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.830107 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-client-ca\") pod \"route-controller-manager-6576b87f9c-qrtpl\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.830128 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/30a7db23-c18c-4bc6-b1b7-97b32a419fbe-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-7k4rg\" (UID: \"30a7db23-c18c-4bc6-b1b7-97b32a419fbe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.830282 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.830490 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/f8d0112e-e85e-42f1-b28b-c0c996f36fe0-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-wzftj\" (UID: \"f8d0112e-e85e-42f1-b28b-c0c996f36fe0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.830592 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.830781 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.831130 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.831301 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38e3be97-7374-4b8b-9565-4d60baa02401-config\") pod \"authentication-operator-69f744f599-bwvdt\" (UID: \"38e3be97-7374-4b8b-9565-4d60baa02401\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.831509 
5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f20c574-b730-4bd8-97d1-7751eb7968d4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vs599\" (UID: \"5f20c574-b730-4bd8-97d1-7751eb7968d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.831882 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2f348b60-0d81-490e-bfb4-ea32546c995a-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-ps6pv\" (UID: \"2f348b60-0d81-490e-bfb4-ea32546c995a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.831973 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-fnbl8\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.832038 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/38d9642e-3788-4e70-8232-138cd84e02dc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-znrdm\" (UID: \"38d9642e-3788-4e70-8232-138cd84e02dc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-znrdm" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.826361 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2-config\") pod \"machine-approver-56656f9798-xj788\" (UID: 
\"d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.832792 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2-machine-approver-tls\") pod \"machine-approver-56656f9798-xj788\" (UID: \"d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.832863 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1974d27-b923-4a9b-9874-d400df5bd29a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ddvpd\" (UID: \"d1974d27-b923-4a9b-9874-d400df5bd29a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.832907 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38e3be97-7374-4b8b-9565-4d60baa02401-service-ca-bundle\") pod \"authentication-operator-69f744f599-bwvdt\" (UID: \"38e3be97-7374-4b8b-9565-4d60baa02401\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.833399 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-audit-policies\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.834536 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.835440 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e594f26b-0fd6-44a1-93eb-84593591389f-serving-cert\") pod \"openshift-config-operator-7777fb866f-p9fck\" (UID: \"e594f26b-0fd6-44a1-93eb-84593591389f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.837063 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5021cb92-f82d-47ee-9978-58e897c354b1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-j2mmj\" (UID: \"5021cb92-f82d-47ee-9978-58e897c354b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j2mmj" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.837225 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.837302 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.837397 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-etcd-client\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: 
\"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.837596 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-serving-cert\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.837836 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-serving-cert\") pod \"route-controller-manager-6576b87f9c-qrtpl\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.837964 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.838271 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b5c64ae-5f80-4e35-91dc-48163991b63d-serving-cert\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.838444 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-session\") 
pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.840379 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7b5c64ae-5f80-4e35-91dc-48163991b63d-etcd-client\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.840533 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38e3be97-7374-4b8b-9565-4d60baa02401-serving-cert\") pod \"authentication-operator-69f744f599-bwvdt\" (UID: \"38e3be97-7374-4b8b-9565-4d60baa02401\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.840558 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e78b7a6b-91b7-4753-bd82-df9d3ea97291-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-c2mfb\" (UID: \"e78b7a6b-91b7-4753-bd82-df9d3ea97291\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.841091 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8d0112e-e85e-42f1-b28b-c0c996f36fe0-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-wzftj\" (UID: \"f8d0112e-e85e-42f1-b28b-c0c996f36fe0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.863477 5094 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.882874 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.903497 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.920361 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a6908ab3-3d33-4e31-b226-b6607f34ee8b-default-certificate\") pod \"router-default-5444994796-9sztr\" (UID: \"a6908ab3-3d33-4e31-b226-b6607f34ee8b\") " pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.920458 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn6jl\" (UniqueName: \"kubernetes.io/projected/f1faaf31-f0b6-4828-90cc-51de060dc826-kube-api-access-bn6jl\") pod \"downloads-7954f5f757-d2l2r\" (UID: \"f1faaf31-f0b6-4828-90cc-51de060dc826\") " pod="openshift-console/downloads-7954f5f757-d2l2r" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.920529 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/806ba791-714c-4d13-b595-d4f6ccf06aea-secret-volume\") pod \"collect-profiles-29526165-tdww4\" (UID: \"806ba791-714c-4d13-b595-d4f6ccf06aea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.920560 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a6908ab3-3d33-4e31-b226-b6607f34ee8b-stats-auth\") pod \"router-default-5444994796-9sztr\" (UID: 
\"a6908ab3-3d33-4e31-b226-b6607f34ee8b\") " pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.920597 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6908ab3-3d33-4e31-b226-b6607f34ee8b-service-ca-bundle\") pod \"router-default-5444994796-9sztr\" (UID: \"a6908ab3-3d33-4e31-b226-b6607f34ee8b\") " pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.920659 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfv25\" (UniqueName: \"kubernetes.io/projected/a6908ab3-3d33-4e31-b226-b6607f34ee8b-kube-api-access-dfv25\") pod \"router-default-5444994796-9sztr\" (UID: \"a6908ab3-3d33-4e31-b226-b6607f34ee8b\") " pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.921774 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j498r\" (UniqueName: \"kubernetes.io/projected/806ba791-714c-4d13-b595-d4f6ccf06aea-kube-api-access-j498r\") pod \"collect-profiles-29526165-tdww4\" (UID: \"806ba791-714c-4d13-b595-d4f6ccf06aea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.921820 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6908ab3-3d33-4e31-b226-b6607f34ee8b-metrics-certs\") pod \"router-default-5444994796-9sztr\" (UID: \"a6908ab3-3d33-4e31-b226-b6607f34ee8b\") " pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.921845 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/806ba791-714c-4d13-b595-d4f6ccf06aea-config-volume\") pod \"collect-profiles-29526165-tdww4\" (UID: \"806ba791-714c-4d13-b595-d4f6ccf06aea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.921882 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.942617 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.961822 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 20 06:48:45 crc kubenswrapper[5094]: I0220 06:48:45.983493 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.002166 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.022586 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.041583 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.061762 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.082670 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.102883 5094 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.123336 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.142721 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.163165 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.182022 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.204010 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.212388 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6908ab3-3d33-4e31-b226-b6607f34ee8b-service-ca-bundle\") pod \"router-default-5444994796-9sztr\" (UID: \"a6908ab3-3d33-4e31-b226-b6607f34ee8b\") " pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.224208 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.243655 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.260772 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a6908ab3-3d33-4e31-b226-b6607f34ee8b-metrics-certs\") pod 
\"router-default-5444994796-9sztr\" (UID: \"a6908ab3-3d33-4e31-b226-b6607f34ee8b\") " pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.262220 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.276774 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a6908ab3-3d33-4e31-b226-b6607f34ee8b-stats-auth\") pod \"router-default-5444994796-9sztr\" (UID: \"a6908ab3-3d33-4e31-b226-b6607f34ee8b\") " pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.283920 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.304016 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.324133 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.342559 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.363597 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.383128 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.402532 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.435456 5094 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.442675 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.464685 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.476218 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a6908ab3-3d33-4e31-b226-b6607f34ee8b-default-certificate\") pod \"router-default-5444994796-9sztr\" (UID: \"a6908ab3-3d33-4e31-b226-b6607f34ee8b\") " pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.482238 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.503295 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.523343 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.542685 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.562417 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.582862 5094 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.603142 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.623242 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.642926 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.664007 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.682443 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.726022 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.728878 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.741394 5094 request.go:700] Waited for 1.011359077s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-operator/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.743611 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.763120 5094 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.783558 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.804558 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.824979 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.856303 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.862533 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.882750 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.902123 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.916534 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/806ba791-714c-4d13-b595-d4f6ccf06aea-secret-volume\") pod \"collect-profiles-29526165-tdww4\" (UID: \"806ba791-714c-4d13-b595-d4f6ccf06aea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" Feb 20 06:48:46 crc kubenswrapper[5094]: E0220 06:48:46.922966 5094 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out 
waiting for the condition Feb 20 06:48:46 crc kubenswrapper[5094]: E0220 06:48:46.923128 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/806ba791-714c-4d13-b595-d4f6ccf06aea-config-volume podName:806ba791-714c-4d13-b595-d4f6ccf06aea nodeName:}" failed. No retries permitted until 2026-02-20 06:48:47.423092592 +0000 UTC m=+142.295719343 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/806ba791-714c-4d13-b595-d4f6ccf06aea-config-volume") pod "collect-profiles-29526165-tdww4" (UID: "806ba791-714c-4d13-b595-d4f6ccf06aea") : failed to sync configmap cache: timed out waiting for the condition Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.928264 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.943221 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.963922 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 20 06:48:46 crc kubenswrapper[5094]: I0220 06:48:46.983395 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.003581 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.023832 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.043795 5094 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.062427 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.082993 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.103305 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.142543 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.164848 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.183471 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.203781 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.223871 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.243629 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.262794 5094 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.282234 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.302765 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.322971 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.343242 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.363524 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.383467 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.403225 5094 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.424183 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.442942 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.446054 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/806ba791-714c-4d13-b595-d4f6ccf06aea-config-volume\") pod \"collect-profiles-29526165-tdww4\" (UID: 
\"806ba791-714c-4d13-b595-d4f6ccf06aea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.447657 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/806ba791-714c-4d13-b595-d4f6ccf06aea-config-volume\") pod \"collect-profiles-29526165-tdww4\" (UID: \"806ba791-714c-4d13-b595-d4f6ccf06aea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.464010 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.482684 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.503514 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.523470 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.543331 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.563114 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.582825 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.603312 5094 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.651627 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e78b7a6b-91b7-4753-bd82-df9d3ea97291-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-c2mfb\" (UID: \"e78b7a6b-91b7-4753-bd82-df9d3ea97291\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.673204 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmphr\" (UniqueName: \"kubernetes.io/projected/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-kube-api-access-fmphr\") pod \"route-controller-manager-6576b87f9c-qrtpl\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.692126 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vnhn\" (UniqueName: \"kubernetes.io/projected/18cc290d-78be-42c6-af5b-3b8b86941eb2-kube-api-access-6vnhn\") pod \"controller-manager-879f6c89f-fnbl8\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.707046 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.711345 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4mlc\" (UniqueName: \"kubernetes.io/projected/30a7db23-c18c-4bc6-b1b7-97b32a419fbe-kube-api-access-d4mlc\") pod \"openshift-apiserver-operator-796bbdcf4f-7k4rg\" (UID: \"30a7db23-c18c-4bc6-b1b7-97b32a419fbe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.728091 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f88s\" (UniqueName: \"kubernetes.io/projected/5021cb92-f82d-47ee-9978-58e897c354b1-kube-api-access-9f88s\") pod \"cluster-samples-operator-665b6dd947-j2mmj\" (UID: \"5021cb92-f82d-47ee-9978-58e897c354b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j2mmj" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.739398 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdjs5\" (UniqueName: \"kubernetes.io/projected/2f348b60-0d81-490e-bfb4-ea32546c995a-kube-api-access-sdjs5\") pod \"machine-api-operator-5694c8668f-ps6pv\" (UID: \"2f348b60-0d81-490e-bfb4-ea32546c995a\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.758743 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlgjz\" (UniqueName: \"kubernetes.io/projected/f8d0112e-e85e-42f1-b28b-c0c996f36fe0-kube-api-access-rlgjz\") pod \"cluster-image-registry-operator-dc59b4c8b-wzftj\" (UID: \"f8d0112e-e85e-42f1-b28b-c0c996f36fe0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.760522 5094 request.go:700] Waited for 1.935615114s due to client-side 
throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/serviceaccounts/cluster-image-registry-operator/token Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.760576 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.793000 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f8d0112e-e85e-42f1-b28b-c0c996f36fe0-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-wzftj\" (UID: \"f8d0112e-e85e-42f1-b28b-c0c996f36fe0\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.817177 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrllp\" (UniqueName: \"kubernetes.io/projected/e594f26b-0fd6-44a1-93eb-84593591389f-kube-api-access-jrllp\") pod \"openshift-config-operator-7777fb866f-p9fck\" (UID: \"e594f26b-0fd6-44a1-93eb-84593591389f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.827549 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j2mmj" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.828016 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfrvq\" (UniqueName: \"kubernetes.io/projected/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-kube-api-access-zfrvq\") pod \"oauth-openshift-558db77b4-bslb9\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.843481 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7jjp\" (UniqueName: \"kubernetes.io/projected/d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2-kube-api-access-d7jjp\") pod \"machine-approver-56656f9798-xj788\" (UID: \"d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.851649 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.859427 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.863746 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pw6d\" (UniqueName: \"kubernetes.io/projected/38e3be97-7374-4b8b-9565-4d60baa02401-kube-api-access-8pw6d\") pod \"authentication-operator-69f744f599-bwvdt\" (UID: \"38e3be97-7374-4b8b-9565-4d60baa02401\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.868942 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.887819 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.895506 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc5qm\" (UniqueName: \"kubernetes.io/projected/7b5c64ae-5f80-4e35-91dc-48163991b63d-kube-api-access-mc5qm\") pod \"apiserver-76f77b778f-p4blk\" (UID: \"7b5c64ae-5f80-4e35-91dc-48163991b63d\") " pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.897858 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.904856 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2mff\" (UniqueName: \"kubernetes.io/projected/bac53d01-ed38-46a8-ae9e-bfb72e5565a1-kube-api-access-g2mff\") pod \"migrator-59844c95c7-7hljp\" (UID: \"bac53d01-ed38-46a8-ae9e-bfb72e5565a1\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7hljp" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.909022 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.920540 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1974d27-b923-4a9b-9874-d400df5bd29a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-ddvpd\" (UID: \"d1974d27-b923-4a9b-9874-d400df5bd29a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.940550 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6k8tl\" (UniqueName: \"kubernetes.io/projected/8f8623a7-b3d4-49ad-86c5-40f19adf7b09-kube-api-access-6k8tl\") pod \"apiserver-7bbb656c7d-bhls2\" (UID: \"8f8623a7-b3d4-49ad-86c5-40f19adf7b09\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.966399 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmb5r\" (UniqueName: \"kubernetes.io/projected/5f20c574-b730-4bd8-97d1-7751eb7968d4-kube-api-access-cmb5r\") pod \"openshift-controller-manager-operator-756b6f6bc6-vs599\" (UID: \"5f20c574-b730-4bd8-97d1-7751eb7968d4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.970116 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.977613 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvplx\" (UniqueName: \"kubernetes.io/projected/38d9642e-3788-4e70-8232-138cd84e02dc-kube-api-access-lvplx\") pod \"control-plane-machine-set-operator-78cbb6b69f-znrdm\" (UID: \"38d9642e-3788-4e70-8232-138cd84e02dc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-znrdm" Feb 20 06:48:47 crc kubenswrapper[5094]: I0220 06:48:47.988720 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7hljp" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.023077 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.024577 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl"] Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.026757 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfv25\" (UniqueName: \"kubernetes.io/projected/a6908ab3-3d33-4e31-b226-b6607f34ee8b-kube-api-access-dfv25\") pod \"router-default-5444994796-9sztr\" (UID: \"a6908ab3-3d33-4e31-b226-b6607f34ee8b\") " pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.038723 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.048337 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j498r\" (UniqueName: \"kubernetes.io/projected/806ba791-714c-4d13-b595-d4f6ccf06aea-kube-api-access-j498r\") pod \"collect-profiles-29526165-tdww4\" (UID: \"806ba791-714c-4d13-b595-d4f6ccf06aea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.061528 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn6jl\" (UniqueName: \"kubernetes.io/projected/f1faaf31-f0b6-4828-90cc-51de060dc826-kube-api-access-bn6jl\") pod \"downloads-7954f5f757-d2l2r\" (UID: \"f1faaf31-f0b6-4828-90cc-51de060dc826\") " pod="openshift-console/downloads-7954f5f757-d2l2r" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.070352 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fnbl8"] Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.079902 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.098783 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.103343 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" Feb 20 06:48:48 crc kubenswrapper[5094]: W0220 06:48:48.106143 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd63cfc9b_e8ee_4b2a_8f36_f335dc660ca5.slice/crio-94f008c3fadb5b12936b7080560c6d49126d29712d46d44cae8050bc4be3dbe7 WatchSource:0}: Error finding container 94f008c3fadb5b12936b7080560c6d49126d29712d46d44cae8050bc4be3dbe7: Status 404 returned error can't find the container with id 94f008c3fadb5b12936b7080560c6d49126d29712d46d44cae8050bc4be3dbe7 Feb 20 06:48:48 crc kubenswrapper[5094]: W0220 06:48:48.129749 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18cc290d_78be_42c6_af5b_3b8b86941eb2.slice/crio-a8a623d9daef3ed6dfb32889d5aba7e415d276108ff677b5411cb61659224d60 WatchSource:0}: Error finding container a8a623d9daef3ed6dfb32889d5aba7e415d276108ff677b5411cb61659224d60: Status 404 returned error can't find the container with id a8a623d9daef3ed6dfb32889d5aba7e415d276108ff677b5411cb61659224d60 Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.143619 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160393 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b32eb7f4-e823-4b71-9606-d3dee9f247fd-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-jjblc\" (UID: \"b32eb7f4-e823-4b71-9606-d3dee9f247fd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160430 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e130287f-996d-4ab0-8c12-351bf8d21df5-console-oauth-config\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160524 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7a75a178-3fa3-4be7-b29f-e1f01dc859a4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-x72n5\" (UID: \"7a75a178-3fa3-4be7-b29f-e1f01dc859a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x72n5" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160551 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-trusted-ca-bundle\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160566 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/95700d83-436d-43c5-9eb1-381654f43928-config\") pod \"console-operator-58897d9998-w7rf2\" (UID: \"95700d83-436d-43c5-9eb1-381654f43928\") " pod="openshift-console-operator/console-operator-58897d9998-w7rf2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160585 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4mjl\" (UniqueName: \"kubernetes.io/projected/7a75a178-3fa3-4be7-b29f-e1f01dc859a4-kube-api-access-n4mjl\") pod \"multus-admission-controller-857f4d67dd-x72n5\" (UID: \"7a75a178-3fa3-4be7-b29f-e1f01dc859a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x72n5" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160608 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4797e67f-42c7-4106-998a-f3555218e77d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-9z5jc\" (UID: \"4797e67f-42c7-4106-998a-f3555218e77d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160648 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-registry-tls\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160665 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cbae269e-22bc-484c-ad96-ad61d462a28d-srv-cert\") pod \"olm-operator-6b444d44fb-57pxl\" (UID: \"cbae269e-22bc-484c-ad96-ad61d462a28d\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160681 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d74a0c7b-3ad6-4f59-b4ff-33e1209c3116-auth-proxy-config\") pod \"machine-config-operator-74547568cd-s2xnj\" (UID: \"d74a0c7b-3ad6-4f59-b4ff-33e1209c3116\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160727 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fa6b00ff-07fb-4e9a-80da-780c22acbe69-registry-certificates\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160745 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a072f264-8eef-49ff-804c-fc584b41175c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-jqgq2\" (UID: \"a072f264-8eef-49ff-804c-fc584b41175c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160763 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e130287f-996d-4ab0-8c12-351bf8d21df5-console-serving-cert\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160779 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c3bc896-4d90-42f0-92e9-77a7b285e504-serving-cert\") pod \"service-ca-operator-777779d784-vhztc\" (UID: \"3c3bc896-4d90-42f0-92e9-77a7b285e504\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vhztc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160796 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4js2k\" (UniqueName: \"kubernetes.io/projected/95700d83-436d-43c5-9eb1-381654f43928-kube-api-access-4js2k\") pod \"console-operator-58897d9998-w7rf2\" (UID: \"95700d83-436d-43c5-9eb1-381654f43928\") " pod="openshift-console-operator/console-operator-58897d9998-w7rf2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160815 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a072f264-8eef-49ff-804c-fc584b41175c-metrics-tls\") pod \"ingress-operator-5b745b69d9-jqgq2\" (UID: \"a072f264-8eef-49ff-804c-fc584b41175c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160831 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa6b00ff-07fb-4e9a-80da-780c22acbe69-trusted-ca\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160850 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-bound-sa-token\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" 
Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160891 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9d4a320a-daa4-4bce-9782-5e9880aea226-metrics-tls\") pod \"dns-operator-744455d44c-q596q\" (UID: \"9d4a320a-daa4-4bce-9782-5e9880aea226\") " pod="openshift-dns-operator/dns-operator-744455d44c-q596q" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160940 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-serving-cert\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.160997 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/04b9035c-78ce-4d54-859d-48f7853f3f16-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rwdb7\" (UID: \"04b9035c-78ce-4d54-859d-48f7853f3f16\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161108 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-console-config\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161194 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-45dt8\" (UID: \"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60\") " pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161222 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fqv8\" (UniqueName: \"kubernetes.io/projected/04b9035c-78ce-4d54-859d-48f7853f3f16-kube-api-access-9fqv8\") pod \"machine-config-controller-84d6567774-rwdb7\" (UID: \"04b9035c-78ce-4d54-859d-48f7853f3f16\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161255 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/95700d83-436d-43c5-9eb1-381654f43928-trusted-ca\") pod \"console-operator-58897d9998-w7rf2\" (UID: \"95700d83-436d-43c5-9eb1-381654f43928\") " pod="openshift-console-operator/console-operator-58897d9998-w7rf2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161277 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d74a0c7b-3ad6-4f59-b4ff-33e1209c3116-proxy-tls\") pod \"machine-config-operator-74547568cd-s2xnj\" (UID: \"d74a0c7b-3ad6-4f59-b4ff-33e1209c3116\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161342 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj5hg\" (UniqueName: \"kubernetes.io/projected/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-kube-api-access-kj5hg\") pod \"marketplace-operator-79b997595-45dt8\" (UID: \"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161417 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-45dt8\" (UID: \"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60\") " pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161521 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b32eb7f4-e823-4b71-9606-d3dee9f247fd-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-jjblc\" (UID: \"b32eb7f4-e823-4b71-9606-d3dee9f247fd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161542 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4797e67f-42c7-4106-998a-f3555218e77d-config\") pod \"kube-controller-manager-operator-78b949d7b-9z5jc\" (UID: \"4797e67f-42c7-4106-998a-f3555218e77d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161588 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42pbk\" (UniqueName: \"kubernetes.io/projected/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-kube-api-access-42pbk\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161606 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-etcd-service-ca\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161642 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-oauth-serving-cert\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161659 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4797e67f-42c7-4106-998a-f3555218e77d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-9z5jc\" (UID: \"4797e67f-42c7-4106-998a-f3555218e77d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161732 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161763 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a072f264-8eef-49ff-804c-fc584b41175c-trusted-ca\") pod 
\"ingress-operator-5b745b69d9-jqgq2\" (UID: \"a072f264-8eef-49ff-804c-fc584b41175c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161785 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d74a0c7b-3ad6-4f59-b4ff-33e1209c3116-images\") pod \"machine-config-operator-74547568cd-s2xnj\" (UID: \"d74a0c7b-3ad6-4f59-b4ff-33e1209c3116\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161805 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n54s8\" (UniqueName: \"kubernetes.io/projected/d74a0c7b-3ad6-4f59-b4ff-33e1209c3116-kube-api-access-n54s8\") pod \"machine-config-operator-74547568cd-s2xnj\" (UID: \"d74a0c7b-3ad6-4f59-b4ff-33e1209c3116\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161896 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4cl2\" (UniqueName: \"kubernetes.io/projected/a072f264-8eef-49ff-804c-fc584b41175c-kube-api-access-b4cl2\") pod \"ingress-operator-5b745b69d9-jqgq2\" (UID: \"a072f264-8eef-49ff-804c-fc584b41175c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161923 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5spz\" (UniqueName: \"kubernetes.io/projected/b32eb7f4-e823-4b71-9606-d3dee9f247fd-kube-api-access-s5spz\") pod \"kube-storage-version-migrator-operator-b67b599dd-jjblc\" (UID: \"b32eb7f4-e823-4b71-9606-d3dee9f247fd\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161967 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c3bc896-4d90-42f0-92e9-77a7b285e504-config\") pod \"service-ca-operator-777779d784-vhztc\" (UID: \"3c3bc896-4d90-42f0-92e9-77a7b285e504\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vhztc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.161991 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56s2x\" (UniqueName: \"kubernetes.io/projected/9d4a320a-daa4-4bce-9782-5e9880aea226-kube-api-access-56s2x\") pod \"dns-operator-744455d44c-q596q\" (UID: \"9d4a320a-daa4-4bce-9782-5e9880aea226\") " pod="openshift-dns-operator/dns-operator-744455d44c-q596q" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.162059 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-etcd-client\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.162090 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cbae269e-22bc-484c-ad96-ad61d462a28d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-57pxl\" (UID: \"cbae269e-22bc-484c-ad96-ad61d462a28d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.162126 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-f5g52\" (UniqueName: \"kubernetes.io/projected/3c3bc896-4d90-42f0-92e9-77a7b285e504-kube-api-access-f5g52\") pod \"service-ca-operator-777779d784-vhztc\" (UID: \"3c3bc896-4d90-42f0-92e9-77a7b285e504\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vhztc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.162151 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95700d83-436d-43c5-9eb1-381654f43928-serving-cert\") pod \"console-operator-58897d9998-w7rf2\" (UID: \"95700d83-436d-43c5-9eb1-381654f43928\") " pod="openshift-console-operator/console-operator-58897d9998-w7rf2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.162201 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8jnn\" (UniqueName: \"kubernetes.io/projected/e130287f-996d-4ab0-8c12-351bf8d21df5-kube-api-access-x8jnn\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.162260 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-etcd-ca\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.162316 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvkrp\" (UniqueName: \"kubernetes.io/projected/cbae269e-22bc-484c-ad96-ad61d462a28d-kube-api-access-tvkrp\") pod \"olm-operator-6b444d44fb-57pxl\" (UID: \"cbae269e-22bc-484c-ad96-ad61d462a28d\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.162379 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa6b00ff-07fb-4e9a-80da-780c22acbe69-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.162444 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-config\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.162484 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fa6b00ff-07fb-4e9a-80da-780c22acbe69-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.162503 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-service-ca\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.162520 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/04b9035c-78ce-4d54-859d-48f7853f3f16-proxy-tls\") pod \"machine-config-controller-84d6567774-rwdb7\" (UID: \"04b9035c-78ce-4d54-859d-48f7853f3f16\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.162560 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76rz6\" (UniqueName: \"kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-kube-api-access-76rz6\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: E0220 06:48:48.164195 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:48.664178819 +0000 UTC m=+143.536805530 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.180331 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.215442 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-znrdm" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.265528 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.265742 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48jfb\" (UniqueName: \"kubernetes.io/projected/4a51eb16-597c-47dc-bd54-c16c33bde071-kube-api-access-48jfb\") pod \"packageserver-d55dfcdfc-gn7fn\" (UID: \"4a51eb16-597c-47dc-bd54-c16c33bde071\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.265773 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/04b9035c-78ce-4d54-859d-48f7853f3f16-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rwdb7\" (UID: \"04b9035c-78ce-4d54-859d-48f7853f3f16\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7" Feb 20 06:48:48 crc kubenswrapper[5094]: E0220 06:48:48.265832 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:48.765797176 +0000 UTC m=+143.638423887 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.265922 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm8lg\" (UniqueName: \"kubernetes.io/projected/55946b30-00e1-4bd4-bd8e-3f5761537a0b-kube-api-access-lm8lg\") pod \"machine-config-server-v68px\" (UID: \"55946b30-00e1-4bd4-bd8e-3f5761537a0b\") " pod="openshift-machine-config-operator/machine-config-server-v68px" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.265952 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4a51eb16-597c-47dc-bd54-c16c33bde071-apiservice-cert\") pod \"packageserver-d55dfcdfc-gn7fn\" (UID: \"4a51eb16-597c-47dc-bd54-c16c33bde071\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266002 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-console-config\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266036 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8csvq\" (UniqueName: 
\"kubernetes.io/projected/6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c-kube-api-access-8csvq\") pod \"service-ca-9c57cc56f-nlpvl\" (UID: \"6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c\") " pod="openshift-service-ca/service-ca-9c57cc56f-nlpvl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266094 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-45dt8\" (UID: \"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60\") " pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266178 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/95700d83-436d-43c5-9eb1-381654f43928-trusted-ca\") pod \"console-operator-58897d9998-w7rf2\" (UID: \"95700d83-436d-43c5-9eb1-381654f43928\") " pod="openshift-console-operator/console-operator-58897d9998-w7rf2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266212 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fqv8\" (UniqueName: \"kubernetes.io/projected/04b9035c-78ce-4d54-859d-48f7853f3f16-kube-api-access-9fqv8\") pod \"machine-config-controller-84d6567774-rwdb7\" (UID: \"04b9035c-78ce-4d54-859d-48f7853f3f16\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266259 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d74a0c7b-3ad6-4f59-b4ff-33e1209c3116-proxy-tls\") pod \"machine-config-operator-74547568cd-s2xnj\" (UID: \"d74a0c7b-3ad6-4f59-b4ff-33e1209c3116\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" Feb 20 06:48:48 crc kubenswrapper[5094]: 
I0220 06:48:48.266288 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj5hg\" (UniqueName: \"kubernetes.io/projected/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-kube-api-access-kj5hg\") pod \"marketplace-operator-79b997595-45dt8\" (UID: \"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60\") " pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266347 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5ljv\" (UniqueName: \"kubernetes.io/projected/368766ec-f562-4296-bcdb-4bcda1db6c45-kube-api-access-n5ljv\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266405 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-45dt8\" (UID: \"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60\") " pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266434 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/55946b30-00e1-4bd4-bd8e-3f5761537a0b-certs\") pod \"machine-config-server-v68px\" (UID: \"55946b30-00e1-4bd4-bd8e-3f5761537a0b\") " pod="openshift-machine-config-operator/machine-config-server-v68px" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266457 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4a51eb16-597c-47dc-bd54-c16c33bde071-tmpfs\") pod \"packageserver-d55dfcdfc-gn7fn\" (UID: 
\"4a51eb16-597c-47dc-bd54-c16c33bde071\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266489 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/810d7855-cab6-4e33-9a5f-6d7bac9f66eb-config-volume\") pod \"dns-default-l2hxn\" (UID: \"810d7855-cab6-4e33-9a5f-6d7bac9f66eb\") " pod="openshift-dns/dns-default-l2hxn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266575 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/368766ec-f562-4296-bcdb-4bcda1db6c45-plugins-dir\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266611 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/55946b30-00e1-4bd4-bd8e-3f5761537a0b-node-bootstrap-token\") pod \"machine-config-server-v68px\" (UID: \"55946b30-00e1-4bd4-bd8e-3f5761537a0b\") " pod="openshift-machine-config-operator/machine-config-server-v68px" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266652 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42pbk\" (UniqueName: \"kubernetes.io/projected/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-kube-api-access-42pbk\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266680 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b32eb7f4-e823-4b71-9606-d3dee9f247fd-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-jjblc\" (UID: \"b32eb7f4-e823-4b71-9606-d3dee9f247fd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266724 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4797e67f-42c7-4106-998a-f3555218e77d-config\") pod \"kube-controller-manager-operator-78b949d7b-9z5jc\" (UID: \"4797e67f-42c7-4106-998a-f3555218e77d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266750 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qksp7\" (UniqueName: \"kubernetes.io/projected/a4c7d510-2730-46e1-b157-6e890e8868e9-kube-api-access-qksp7\") pod \"package-server-manager-789f6589d5-rm75q\" (UID: \"a4c7d510-2730-46e1-b157-6e890e8868e9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266796 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-etcd-service-ca\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266820 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1177f137-190b-4563-8a6f-51d7b0d5ca9c-srv-cert\") pod \"catalog-operator-68c6474976-s6brj\" (UID: \"1177f137-190b-4563-8a6f-51d7b0d5ca9c\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266862 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-oauth-serving-cert\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266901 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4797e67f-42c7-4106-998a-f3555218e77d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-9z5jc\" (UID: \"4797e67f-42c7-4106-998a-f3555218e77d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266930 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266959 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n54s8\" (UniqueName: \"kubernetes.io/projected/d74a0c7b-3ad6-4f59-b4ff-33e1209c3116-kube-api-access-n54s8\") pod \"machine-config-operator-74547568cd-s2xnj\" (UID: \"d74a0c7b-3ad6-4f59-b4ff-33e1209c3116\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.266992 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/a072f264-8eef-49ff-804c-fc584b41175c-trusted-ca\") pod \"ingress-operator-5b745b69d9-jqgq2\" (UID: \"a072f264-8eef-49ff-804c-fc584b41175c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267020 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d74a0c7b-3ad6-4f59-b4ff-33e1209c3116-images\") pod \"machine-config-operator-74547568cd-s2xnj\" (UID: \"d74a0c7b-3ad6-4f59-b4ff-33e1209c3116\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267065 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/368766ec-f562-4296-bcdb-4bcda1db6c45-registration-dir\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267088 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndpg9\" (UniqueName: \"kubernetes.io/projected/fc07e658-5bc5-469e-b793-230b7be58f12-kube-api-access-ndpg9\") pod \"ingress-canary-wcwdv\" (UID: \"fc07e658-5bc5-469e-b793-230b7be58f12\") " pod="openshift-ingress-canary/ingress-canary-wcwdv" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267116 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4cl2\" (UniqueName: \"kubernetes.io/projected/a072f264-8eef-49ff-804c-fc584b41175c-kube-api-access-b4cl2\") pod \"ingress-operator-5b745b69d9-jqgq2\" (UID: \"a072f264-8eef-49ff-804c-fc584b41175c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267170 
5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/368766ec-f562-4296-bcdb-4bcda1db6c45-mountpoint-dir\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267199 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5spz\" (UniqueName: \"kubernetes.io/projected/b32eb7f4-e823-4b71-9606-d3dee9f247fd-kube-api-access-s5spz\") pod \"kube-storage-version-migrator-operator-b67b599dd-jjblc\" (UID: \"b32eb7f4-e823-4b71-9606-d3dee9f247fd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267233 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1177f137-190b-4563-8a6f-51d7b0d5ca9c-profile-collector-cert\") pod \"catalog-operator-68c6474976-s6brj\" (UID: \"1177f137-190b-4563-8a6f-51d7b0d5ca9c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267264 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c3bc896-4d90-42f0-92e9-77a7b285e504-config\") pod \"service-ca-operator-777779d784-vhztc\" (UID: \"3c3bc896-4d90-42f0-92e9-77a7b285e504\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vhztc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267291 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56s2x\" (UniqueName: \"kubernetes.io/projected/9d4a320a-daa4-4bce-9782-5e9880aea226-kube-api-access-56s2x\") pod 
\"dns-operator-744455d44c-q596q\" (UID: \"9d4a320a-daa4-4bce-9782-5e9880aea226\") " pod="openshift-dns-operator/dns-operator-744455d44c-q596q" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267318 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-etcd-client\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267346 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cbae269e-22bc-484c-ad96-ad61d462a28d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-57pxl\" (UID: \"cbae269e-22bc-484c-ad96-ad61d462a28d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267412 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a4c7d510-2730-46e1-b157-6e890e8868e9-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rm75q\" (UID: \"a4c7d510-2730-46e1-b157-6e890e8868e9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267448 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5g52\" (UniqueName: \"kubernetes.io/projected/3c3bc896-4d90-42f0-92e9-77a7b285e504-kube-api-access-f5g52\") pod \"service-ca-operator-777779d784-vhztc\" (UID: \"3c3bc896-4d90-42f0-92e9-77a7b285e504\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vhztc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267484 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95700d83-436d-43c5-9eb1-381654f43928-serving-cert\") pod \"console-operator-58897d9998-w7rf2\" (UID: \"95700d83-436d-43c5-9eb1-381654f43928\") " pod="openshift-console-operator/console-operator-58897d9998-w7rf2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267521 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8jnn\" (UniqueName: \"kubernetes.io/projected/e130287f-996d-4ab0-8c12-351bf8d21df5-kube-api-access-x8jnn\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267553 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-etcd-ca\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267577 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvkrp\" (UniqueName: \"kubernetes.io/projected/cbae269e-22bc-484c-ad96-ad61d462a28d-kube-api-access-tvkrp\") pod \"olm-operator-6b444d44fb-57pxl\" (UID: \"cbae269e-22bc-484c-ad96-ad61d462a28d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267602 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vwzw\" (UniqueName: \"kubernetes.io/projected/810d7855-cab6-4e33-9a5f-6d7bac9f66eb-kube-api-access-2vwzw\") pod \"dns-default-l2hxn\" (UID: \"810d7855-cab6-4e33-9a5f-6d7bac9f66eb\") " pod="openshift-dns/dns-default-l2hxn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 
06:48:48.267649 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa6b00ff-07fb-4e9a-80da-780c22acbe69-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267694 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-config\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267775 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fa6b00ff-07fb-4e9a-80da-780c22acbe69-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267806 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-service-ca\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267833 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04b9035c-78ce-4d54-859d-48f7853f3f16-proxy-tls\") pod \"machine-config-controller-84d6567774-rwdb7\" (UID: \"04b9035c-78ce-4d54-859d-48f7853f3f16\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7" Feb 
20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267865 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76rz6\" (UniqueName: \"kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-kube-api-access-76rz6\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.267888 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b32eb7f4-e823-4b71-9606-d3dee9f247fd-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-jjblc\" (UID: \"b32eb7f4-e823-4b71-9606-d3dee9f247fd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.268077 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e130287f-996d-4ab0-8c12-351bf8d21df5-console-oauth-config\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.273843 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7a75a178-3fa3-4be7-b29f-e1f01dc859a4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-x72n5\" (UID: \"7a75a178-3fa3-4be7-b29f-e1f01dc859a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x72n5" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.273901 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/368766ec-f562-4296-bcdb-4bcda1db6c45-csi-data-dir\") pod 
\"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.273966 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4mjl\" (UniqueName: \"kubernetes.io/projected/7a75a178-3fa3-4be7-b29f-e1f01dc859a4-kube-api-access-n4mjl\") pod \"multus-admission-controller-857f4d67dd-x72n5\" (UID: \"7a75a178-3fa3-4be7-b29f-e1f01dc859a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x72n5" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274003 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4797e67f-42c7-4106-998a-f3555218e77d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-9z5jc\" (UID: \"4797e67f-42c7-4106-998a-f3555218e77d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274032 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-trusted-ca-bundle\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274061 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95700d83-436d-43c5-9eb1-381654f43928-config\") pod \"console-operator-58897d9998-w7rf2\" (UID: \"95700d83-436d-43c5-9eb1-381654f43928\") " pod="openshift-console-operator/console-operator-58897d9998-w7rf2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274092 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/d74a0c7b-3ad6-4f59-b4ff-33e1209c3116-auth-proxy-config\") pod \"machine-config-operator-74547568cd-s2xnj\" (UID: \"d74a0c7b-3ad6-4f59-b4ff-33e1209c3116\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274124 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c-signing-key\") pod \"service-ca-9c57cc56f-nlpvl\" (UID: \"6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c\") " pod="openshift-service-ca/service-ca-9c57cc56f-nlpvl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274152 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-registry-tls\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274178 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cbae269e-22bc-484c-ad96-ad61d462a28d-srv-cert\") pod \"olm-operator-6b444d44fb-57pxl\" (UID: \"cbae269e-22bc-484c-ad96-ad61d462a28d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274208 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc07e658-5bc5-469e-b793-230b7be58f12-cert\") pod \"ingress-canary-wcwdv\" (UID: \"fc07e658-5bc5-469e-b793-230b7be58f12\") " pod="openshift-ingress-canary/ingress-canary-wcwdv" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274260 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fa6b00ff-07fb-4e9a-80da-780c22acbe69-registry-certificates\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274288 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a072f264-8eef-49ff-804c-fc584b41175c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-jqgq2\" (UID: \"a072f264-8eef-49ff-804c-fc584b41175c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274318 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4js2k\" (UniqueName: \"kubernetes.io/projected/95700d83-436d-43c5-9eb1-381654f43928-kube-api-access-4js2k\") pod \"console-operator-58897d9998-w7rf2\" (UID: \"95700d83-436d-43c5-9eb1-381654f43928\") " pod="openshift-console-operator/console-operator-58897d9998-w7rf2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274350 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/810d7855-cab6-4e33-9a5f-6d7bac9f66eb-metrics-tls\") pod \"dns-default-l2hxn\" (UID: \"810d7855-cab6-4e33-9a5f-6d7bac9f66eb\") " pod="openshift-dns/dns-default-l2hxn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274381 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e130287f-996d-4ab0-8c12-351bf8d21df5-console-serving-cert\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274402 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c3bc896-4d90-42f0-92e9-77a7b285e504-serving-cert\") pod \"service-ca-operator-777779d784-vhztc\" (UID: \"3c3bc896-4d90-42f0-92e9-77a7b285e504\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vhztc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274436 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a072f264-8eef-49ff-804c-fc584b41175c-metrics-tls\") pod \"ingress-operator-5b745b69d9-jqgq2\" (UID: \"a072f264-8eef-49ff-804c-fc584b41175c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274461 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c-signing-cabundle\") pod \"service-ca-9c57cc56f-nlpvl\" (UID: \"6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c\") " pod="openshift-service-ca/service-ca-9c57cc56f-nlpvl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274494 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa6b00ff-07fb-4e9a-80da-780c22acbe69-trusted-ca\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274520 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4a51eb16-597c-47dc-bd54-c16c33bde071-webhook-cert\") pod \"packageserver-d55dfcdfc-gn7fn\" (UID: \"4a51eb16-597c-47dc-bd54-c16c33bde071\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274549 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-bound-sa-token\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274578 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb99w\" (UniqueName: \"kubernetes.io/projected/1177f137-190b-4563-8a6f-51d7b0d5ca9c-kube-api-access-mb99w\") pod \"catalog-operator-68c6474976-s6brj\" (UID: \"1177f137-190b-4563-8a6f-51d7b0d5ca9c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274607 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/368766ec-f562-4296-bcdb-4bcda1db6c45-socket-dir\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274654 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9d4a320a-daa4-4bce-9782-5e9880aea226-metrics-tls\") pod \"dns-operator-744455d44c-q596q\" (UID: \"9d4a320a-daa4-4bce-9782-5e9880aea226\") " pod="openshift-dns-operator/dns-operator-744455d44c-q596q" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274688 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-serving-cert\") pod 
\"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.274676 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/04b9035c-78ce-4d54-859d-48f7853f3f16-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-rwdb7\" (UID: \"04b9035c-78ce-4d54-859d-48f7853f3f16\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.283677 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-console-config\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.284658 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c3bc896-4d90-42f0-92e9-77a7b285e504-config\") pod \"service-ca-operator-777779d784-vhztc\" (UID: \"3c3bc896-4d90-42f0-92e9-77a7b285e504\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vhztc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.285691 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-etcd-ca\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.291635 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa6b00ff-07fb-4e9a-80da-780c22acbe69-trusted-ca\") pod 
\"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.293575 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a072f264-8eef-49ff-804c-fc584b41175c-trusted-ca\") pod \"ingress-operator-5b745b69d9-jqgq2\" (UID: \"a072f264-8eef-49ff-804c-fc584b41175c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.295657 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/95700d83-436d-43c5-9eb1-381654f43928-trusted-ca\") pod \"console-operator-58897d9998-w7rf2\" (UID: \"95700d83-436d-43c5-9eb1-381654f43928\") " pod="openshift-console-operator/console-operator-58897d9998-w7rf2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.296089 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-trusted-ca-bundle\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.296572 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d74a0c7b-3ad6-4f59-b4ff-33e1209c3116-images\") pod \"machine-config-operator-74547568cd-s2xnj\" (UID: \"d74a0c7b-3ad6-4f59-b4ff-33e1209c3116\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.302812 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4797e67f-42c7-4106-998a-f3555218e77d-config\") pod 
\"kube-controller-manager-operator-78b949d7b-9z5jc\" (UID: \"4797e67f-42c7-4106-998a-f3555218e77d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.305112 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-45dt8\" (UID: \"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60\") " pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.305833 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-config\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.309342 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d74a0c7b-3ad6-4f59-b4ff-33e1209c3116-auth-proxy-config\") pod \"machine-config-operator-74547568cd-s2xnj\" (UID: \"d74a0c7b-3ad6-4f59-b4ff-33e1209c3116\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.310033 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-oauth-serving-cert\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: E0220 06:48:48.317768 5094 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:48.817740406 +0000 UTC m=+143.690367117 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.320503 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b32eb7f4-e823-4b71-9606-d3dee9f247fd-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-jjblc\" (UID: \"b32eb7f4-e823-4b71-9606-d3dee9f247fd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.322450 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d74a0c7b-3ad6-4f59-b4ff-33e1209c3116-proxy-tls\") pod \"machine-config-operator-74547568cd-s2xnj\" (UID: \"d74a0c7b-3ad6-4f59-b4ff-33e1209c3116\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.323886 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95700d83-436d-43c5-9eb1-381654f43928-config\") pod \"console-operator-58897d9998-w7rf2\" (UID: \"95700d83-436d-43c5-9eb1-381654f43928\") " pod="openshift-console-operator/console-operator-58897d9998-w7rf2" Feb 20 06:48:48 crc 
kubenswrapper[5094]: I0220 06:48:48.324996 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-d2l2r" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.325387 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fa6b00ff-07fb-4e9a-80da-780c22acbe69-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.328642 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-service-ca\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.328954 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-etcd-service-ca\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.342245 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a072f264-8eef-49ff-804c-fc584b41175c-metrics-tls\") pod \"ingress-operator-5b745b69d9-jqgq2\" (UID: \"a072f264-8eef-49ff-804c-fc584b41175c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.342855 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e130287f-996d-4ab0-8c12-351bf8d21df5-console-serving-cert\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.343339 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e130287f-996d-4ab0-8c12-351bf8d21df5-console-oauth-config\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.343531 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/cbae269e-22bc-484c-ad96-ad61d462a28d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-57pxl\" (UID: \"cbae269e-22bc-484c-ad96-ad61d462a28d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.344031 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7a75a178-3fa3-4be7-b29f-e1f01dc859a4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-x72n5\" (UID: \"7a75a178-3fa3-4be7-b29f-e1f01dc859a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x72n5" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.347851 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fa6b00ff-07fb-4e9a-80da-780c22acbe69-registry-certificates\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.351028 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/04b9035c-78ce-4d54-859d-48f7853f3f16-proxy-tls\") pod \"machine-config-controller-84d6567774-rwdb7\" (UID: \"04b9035c-78ce-4d54-859d-48f7853f3f16\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.351092 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-serving-cert\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.351137 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa6b00ff-07fb-4e9a-80da-780c22acbe69-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.351146 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/cbae269e-22bc-484c-ad96-ad61d462a28d-srv-cert\") pod \"olm-operator-6b444d44fb-57pxl\" (UID: \"cbae269e-22bc-484c-ad96-ad61d462a28d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.351390 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9d4a320a-daa4-4bce-9782-5e9880aea226-metrics-tls\") pod \"dns-operator-744455d44c-q596q\" (UID: \"9d4a320a-daa4-4bce-9782-5e9880aea226\") " pod="openshift-dns-operator/dns-operator-744455d44c-q596q" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.351444 5094 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95700d83-436d-43c5-9eb1-381654f43928-serving-cert\") pod \"console-operator-58897d9998-w7rf2\" (UID: \"95700d83-436d-43c5-9eb1-381654f43928\") " pod="openshift-console-operator/console-operator-58897d9998-w7rf2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.351652 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-etcd-client\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.351871 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a072f264-8eef-49ff-804c-fc584b41175c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-jqgq2\" (UID: \"a072f264-8eef-49ff-804c-fc584b41175c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.351942 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3c3bc896-4d90-42f0-92e9-77a7b285e504-serving-cert\") pod \"service-ca-operator-777779d784-vhztc\" (UID: \"3c3bc896-4d90-42f0-92e9-77a7b285e504\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vhztc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.352391 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-45dt8\" (UID: \"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60\") " pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.358116 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4js2k\" (UniqueName: \"kubernetes.io/projected/95700d83-436d-43c5-9eb1-381654f43928-kube-api-access-4js2k\") pod \"console-operator-58897d9998-w7rf2\" (UID: \"95700d83-436d-43c5-9eb1-381654f43928\") " pod="openshift-console-operator/console-operator-58897d9998-w7rf2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.358971 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b32eb7f4-e823-4b71-9606-d3dee9f247fd-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-jjblc\" (UID: \"b32eb7f4-e823-4b71-9606-d3dee9f247fd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.361129 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8jnn\" (UniqueName: \"kubernetes.io/projected/e130287f-996d-4ab0-8c12-351bf8d21df5-kube-api-access-x8jnn\") pod \"console-f9d7485db-shq4j\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.364264 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-registry-tls\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375233 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:48 crc kubenswrapper[5094]: E0220 06:48:48.375434 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:48.875394581 +0000 UTC m=+143.748021292 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375480 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/368766ec-f562-4296-bcdb-4bcda1db6c45-socket-dir\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375516 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48jfb\" (UniqueName: \"kubernetes.io/projected/4a51eb16-597c-47dc-bd54-c16c33bde071-kube-api-access-48jfb\") pod \"packageserver-d55dfcdfc-gn7fn\" (UID: \"4a51eb16-597c-47dc-bd54-c16c33bde071\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375565 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm8lg\" (UniqueName: \"kubernetes.io/projected/55946b30-00e1-4bd4-bd8e-3f5761537a0b-kube-api-access-lm8lg\") pod 
\"machine-config-server-v68px\" (UID: \"55946b30-00e1-4bd4-bd8e-3f5761537a0b\") " pod="openshift-machine-config-operator/machine-config-server-v68px" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375588 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4a51eb16-597c-47dc-bd54-c16c33bde071-apiservice-cert\") pod \"packageserver-d55dfcdfc-gn7fn\" (UID: \"4a51eb16-597c-47dc-bd54-c16c33bde071\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375614 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8csvq\" (UniqueName: \"kubernetes.io/projected/6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c-kube-api-access-8csvq\") pod \"service-ca-9c57cc56f-nlpvl\" (UID: \"6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c\") " pod="openshift-service-ca/service-ca-9c57cc56f-nlpvl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375658 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5ljv\" (UniqueName: \"kubernetes.io/projected/368766ec-f562-4296-bcdb-4bcda1db6c45-kube-api-access-n5ljv\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375684 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/55946b30-00e1-4bd4-bd8e-3f5761537a0b-certs\") pod \"machine-config-server-v68px\" (UID: \"55946b30-00e1-4bd4-bd8e-3f5761537a0b\") " pod="openshift-machine-config-operator/machine-config-server-v68px" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375722 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/4a51eb16-597c-47dc-bd54-c16c33bde071-tmpfs\") pod \"packageserver-d55dfcdfc-gn7fn\" (UID: \"4a51eb16-597c-47dc-bd54-c16c33bde071\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375741 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/810d7855-cab6-4e33-9a5f-6d7bac9f66eb-config-volume\") pod \"dns-default-l2hxn\" (UID: \"810d7855-cab6-4e33-9a5f-6d7bac9f66eb\") " pod="openshift-dns/dns-default-l2hxn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375763 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/368766ec-f562-4296-bcdb-4bcda1db6c45-plugins-dir\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375787 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/55946b30-00e1-4bd4-bd8e-3f5761537a0b-node-bootstrap-token\") pod \"machine-config-server-v68px\" (UID: \"55946b30-00e1-4bd4-bd8e-3f5761537a0b\") " pod="openshift-machine-config-operator/machine-config-server-v68px" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375829 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/368766ec-f562-4296-bcdb-4bcda1db6c45-socket-dir\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375830 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qksp7\" (UniqueName: 
\"kubernetes.io/projected/a4c7d510-2730-46e1-b157-6e890e8868e9-kube-api-access-qksp7\") pod \"package-server-manager-789f6589d5-rm75q\" (UID: \"a4c7d510-2730-46e1-b157-6e890e8868e9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375919 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1177f137-190b-4563-8a6f-51d7b0d5ca9c-srv-cert\") pod \"catalog-operator-68c6474976-s6brj\" (UID: \"1177f137-190b-4563-8a6f-51d7b0d5ca9c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.375988 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.376042 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/368766ec-f562-4296-bcdb-4bcda1db6c45-registration-dir\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.376068 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndpg9\" (UniqueName: \"kubernetes.io/projected/fc07e658-5bc5-469e-b793-230b7be58f12-kube-api-access-ndpg9\") pod \"ingress-canary-wcwdv\" (UID: \"fc07e658-5bc5-469e-b793-230b7be58f12\") " pod="openshift-ingress-canary/ingress-canary-wcwdv" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.376098 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/368766ec-f562-4296-bcdb-4bcda1db6c45-mountpoint-dir\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.376087 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/368766ec-f562-4296-bcdb-4bcda1db6c45-plugins-dir\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.376148 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1177f137-190b-4563-8a6f-51d7b0d5ca9c-profile-collector-cert\") pod \"catalog-operator-68c6474976-s6brj\" (UID: \"1177f137-190b-4563-8a6f-51d7b0d5ca9c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.376188 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a4c7d510-2730-46e1-b157-6e890e8868e9-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rm75q\" (UID: \"a4c7d510-2730-46e1-b157-6e890e8868e9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.376256 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vwzw\" (UniqueName: \"kubernetes.io/projected/810d7855-cab6-4e33-9a5f-6d7bac9f66eb-kube-api-access-2vwzw\") pod \"dns-default-l2hxn\" (UID: \"810d7855-cab6-4e33-9a5f-6d7bac9f66eb\") " 
pod="openshift-dns/dns-default-l2hxn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.376352 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/368766ec-f562-4296-bcdb-4bcda1db6c45-registration-dir\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.376381 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/368766ec-f562-4296-bcdb-4bcda1db6c45-csi-data-dir\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.376466 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/368766ec-f562-4296-bcdb-4bcda1db6c45-mountpoint-dir\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.377176 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c-signing-key\") pod \"service-ca-9c57cc56f-nlpvl\" (UID: \"6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c\") " pod="openshift-service-ca/service-ca-9c57cc56f-nlpvl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.377212 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc07e658-5bc5-469e-b793-230b7be58f12-cert\") pod \"ingress-canary-wcwdv\" (UID: \"fc07e658-5bc5-469e-b793-230b7be58f12\") " pod="openshift-ingress-canary/ingress-canary-wcwdv" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 
06:48:48.377247 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/810d7855-cab6-4e33-9a5f-6d7bac9f66eb-metrics-tls\") pod \"dns-default-l2hxn\" (UID: \"810d7855-cab6-4e33-9a5f-6d7bac9f66eb\") " pod="openshift-dns/dns-default-l2hxn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.377257 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/368766ec-f562-4296-bcdb-4bcda1db6c45-csi-data-dir\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.377284 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4a51eb16-597c-47dc-bd54-c16c33bde071-webhook-cert\") pod \"packageserver-d55dfcdfc-gn7fn\" (UID: \"4a51eb16-597c-47dc-bd54-c16c33bde071\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.377503 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c-signing-cabundle\") pod \"service-ca-9c57cc56f-nlpvl\" (UID: \"6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c\") " pod="openshift-service-ca/service-ca-9c57cc56f-nlpvl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.377572 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb99w\" (UniqueName: \"kubernetes.io/projected/1177f137-190b-4563-8a6f-51d7b0d5ca9c-kube-api-access-mb99w\") pod \"catalog-operator-68c6474976-s6brj\" (UID: \"1177f137-190b-4563-8a6f-51d7b0d5ca9c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 
06:48:48.377774 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/810d7855-cab6-4e33-9a5f-6d7bac9f66eb-config-volume\") pod \"dns-default-l2hxn\" (UID: \"810d7855-cab6-4e33-9a5f-6d7bac9f66eb\") " pod="openshift-dns/dns-default-l2hxn" Feb 20 06:48:48 crc kubenswrapper[5094]: E0220 06:48:48.377802 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:48.877779227 +0000 UTC m=+143.750405938 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.378674 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c-signing-cabundle\") pod \"service-ca-9c57cc56f-nlpvl\" (UID: \"6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c\") " pod="openshift-service-ca/service-ca-9c57cc56f-nlpvl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.379145 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4a51eb16-597c-47dc-bd54-c16c33bde071-tmpfs\") pod \"packageserver-d55dfcdfc-gn7fn\" (UID: \"4a51eb16-597c-47dc-bd54-c16c33bde071\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.381414 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a4c7d510-2730-46e1-b157-6e890e8868e9-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rm75q\" (UID: \"a4c7d510-2730-46e1-b157-6e890e8868e9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.381781 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1177f137-190b-4563-8a6f-51d7b0d5ca9c-srv-cert\") pod \"catalog-operator-68c6474976-s6brj\" (UID: \"1177f137-190b-4563-8a6f-51d7b0d5ca9c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.382510 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4a51eb16-597c-47dc-bd54-c16c33bde071-webhook-cert\") pod \"packageserver-d55dfcdfc-gn7fn\" (UID: \"4a51eb16-597c-47dc-bd54-c16c33bde071\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.384338 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/810d7855-cab6-4e33-9a5f-6d7bac9f66eb-metrics-tls\") pod \"dns-default-l2hxn\" (UID: \"810d7855-cab6-4e33-9a5f-6d7bac9f66eb\") " pod="openshift-dns/dns-default-l2hxn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.384514 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c-signing-key\") pod \"service-ca-9c57cc56f-nlpvl\" (UID: \"6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c\") " pod="openshift-service-ca/service-ca-9c57cc56f-nlpvl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 
06:48:48.385788 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56s2x\" (UniqueName: \"kubernetes.io/projected/9d4a320a-daa4-4bce-9782-5e9880aea226-kube-api-access-56s2x\") pod \"dns-operator-744455d44c-q596q\" (UID: \"9d4a320a-daa4-4bce-9782-5e9880aea226\") " pod="openshift-dns-operator/dns-operator-744455d44c-q596q" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.385916 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4797e67f-42c7-4106-998a-f3555218e77d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-9z5jc\" (UID: \"4797e67f-42c7-4106-998a-f3555218e77d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.386743 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/55946b30-00e1-4bd4-bd8e-3f5761537a0b-node-bootstrap-token\") pod \"machine-config-server-v68px\" (UID: \"55946b30-00e1-4bd4-bd8e-3f5761537a0b\") " pod="openshift-machine-config-operator/machine-config-server-v68px" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.390988 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fc07e658-5bc5-469e-b793-230b7be58f12-cert\") pod \"ingress-canary-wcwdv\" (UID: \"fc07e658-5bc5-469e-b793-230b7be58f12\") " pod="openshift-ingress-canary/ingress-canary-wcwdv" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.392749 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1177f137-190b-4563-8a6f-51d7b0d5ca9c-profile-collector-cert\") pod \"catalog-operator-68c6474976-s6brj\" (UID: \"1177f137-190b-4563-8a6f-51d7b0d5ca9c\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.393472 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4a51eb16-597c-47dc-bd54-c16c33bde071-apiservice-cert\") pod \"packageserver-d55dfcdfc-gn7fn\" (UID: \"4a51eb16-597c-47dc-bd54-c16c33bde071\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.394021 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/55946b30-00e1-4bd4-bd8e-3f5761537a0b-certs\") pod \"machine-config-server-v68px\" (UID: \"55946b30-00e1-4bd4-bd8e-3f5761537a0b\") " pod="openshift-machine-config-operator/machine-config-server-v68px" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.410343 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4mjl\" (UniqueName: \"kubernetes.io/projected/7a75a178-3fa3-4be7-b29f-e1f01dc859a4-kube-api-access-n4mjl\") pod \"multus-admission-controller-857f4d67dd-x72n5\" (UID: \"7a75a178-3fa3-4be7-b29f-e1f01dc859a4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x72n5" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.432664 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5g52\" (UniqueName: \"kubernetes.io/projected/3c3bc896-4d90-42f0-92e9-77a7b285e504-kube-api-access-f5g52\") pod \"service-ca-operator-777779d784-vhztc\" (UID: \"3c3bc896-4d90-42f0-92e9-77a7b285e504\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vhztc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.453128 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j2mmj"] Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.454712 
5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bslb9"] Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.457301 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fqv8\" (UniqueName: \"kubernetes.io/projected/04b9035c-78ce-4d54-859d-48f7853f3f16-kube-api-access-9fqv8\") pod \"machine-config-controller-84d6567774-rwdb7\" (UID: \"04b9035c-78ce-4d54-859d-48f7853f3f16\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.458820 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-p9fck"] Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.462067 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5spz\" (UniqueName: \"kubernetes.io/projected/b32eb7f4-e823-4b71-9606-d3dee9f247fd-kube-api-access-s5spz\") pod \"kube-storage-version-migrator-operator-b67b599dd-jjblc\" (UID: \"b32eb7f4-e823-4b71-9606-d3dee9f247fd\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.480532 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:48 crc kubenswrapper[5094]: E0220 06:48:48.481407 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-20 06:48:48.98135758 +0000 UTC m=+143.853984291 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.484874 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4cl2\" (UniqueName: \"kubernetes.io/projected/a072f264-8eef-49ff-804c-fc584b41175c-kube-api-access-b4cl2\") pod \"ingress-operator-5b745b69d9-jqgq2\" (UID: \"a072f264-8eef-49ff-804c-fc584b41175c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.508142 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-bound-sa-token\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: W0220 06:48:48.516111 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode594f26b_0fd6_44a1_93eb_84593591389f.slice/crio-fde96eaf411b08b987ea2a5695c1132631ea62a21b4b7d86b396dda3b9327c24 WatchSource:0}: Error finding container fde96eaf411b08b987ea2a5695c1132631ea62a21b4b7d86b396dda3b9327c24: Status 404 returned error can't find the container with id fde96eaf411b08b987ea2a5695c1132631ea62a21b4b7d86b396dda3b9327c24 Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.529970 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvkrp\" (UniqueName: \"kubernetes.io/projected/cbae269e-22bc-484c-ad96-ad61d462a28d-kube-api-access-tvkrp\") pod \"olm-operator-6b444d44fb-57pxl\" (UID: \"cbae269e-22bc-484c-ad96-ad61d462a28d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.544244 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj5hg\" (UniqueName: \"kubernetes.io/projected/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-kube-api-access-kj5hg\") pod \"marketplace-operator-79b997595-45dt8\" (UID: \"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60\") " pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.561151 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n54s8\" (UniqueName: \"kubernetes.io/projected/d74a0c7b-3ad6-4f59-b4ff-33e1209c3116-kube-api-access-n54s8\") pod \"machine-config-operator-74547568cd-s2xnj\" (UID: \"d74a0c7b-3ad6-4f59-b4ff-33e1209c3116\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.580148 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4797e67f-42c7-4106-998a-f3555218e77d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-9z5jc\" (UID: \"4797e67f-42c7-4106-998a-f3555218e77d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.582231 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: E0220 06:48:48.582658 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:49.082646258 +0000 UTC m=+143.955272969 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.589442 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj"] Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.592938 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd"] Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.595441 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-w7rf2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.595494 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-ps6pv"] Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.600386 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76rz6\" (UniqueName: \"kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-kube-api-access-76rz6\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.602137 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.615867 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb"] Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.615922 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg"] Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.619355 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-q596q" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.625800 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bwvdt"] Feb 20 06:48:48 crc kubenswrapper[5094]: W0220 06:48:48.625993 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1974d27_b923_4a9b_9874_d400df5bd29a.slice/crio-d91c25a804ee025a2aaae437050594ececf533c312e1237c92db87a755fb8c61 WatchSource:0}: Error finding container d91c25a804ee025a2aaae437050594ececf533c312e1237c92db87a755fb8c61: Status 404 returned error can't find the container with id d91c25a804ee025a2aaae437050594ececf533c312e1237c92db87a755fb8c61 Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.647383 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.651604 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42pbk\" (UniqueName: \"kubernetes.io/projected/6bba9a34-7bbd-44be-b82d-0a35f8ef288f-kube-api-access-42pbk\") pod \"etcd-operator-b45778765-7wgh7\" (UID: \"6bba9a34-7bbd-44be-b82d-0a35f8ef288f\") " pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.652763 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.666290 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-x72n5" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.677117 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.680887 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.683038 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm8lg\" (UniqueName: \"kubernetes.io/projected/55946b30-00e1-4bd4-bd8e-3f5761537a0b-kube-api-access-lm8lg\") pod \"machine-config-server-v68px\" (UID: \"55946b30-00e1-4bd4-bd8e-3f5761537a0b\") " pod="openshift-machine-config-operator/machine-config-server-v68px" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.683784 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:48 crc kubenswrapper[5094]: E0220 06:48:48.683952 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:49.183924036 +0000 UTC m=+144.056550747 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.684823 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: E0220 06:48:48.685275 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:49.185258428 +0000 UTC m=+144.057885139 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.689416 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.690996 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48jfb\" (UniqueName: \"kubernetes.io/projected/4a51eb16-597c-47dc-bd54-c16c33bde071-kube-api-access-48jfb\") pod \"packageserver-d55dfcdfc-gn7fn\" (UID: \"4a51eb16-597c-47dc-bd54-c16c33bde071\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.695222 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vhztc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.703225 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5ljv\" (UniqueName: \"kubernetes.io/projected/368766ec-f562-4296-bcdb-4bcda1db6c45-kube-api-access-n5ljv\") pod \"csi-hostpathplugin-tjkwm\" (UID: \"368766ec-f562-4296-bcdb-4bcda1db6c45\") " pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.720448 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.721692 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qksp7\" (UniqueName: \"kubernetes.io/projected/a4c7d510-2730-46e1-b157-6e890e8868e9-kube-api-access-qksp7\") pod \"package-server-manager-789f6589d5-rm75q\" (UID: \"a4c7d510-2730-46e1-b157-6e890e8868e9\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.729165 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.748803 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8csvq\" (UniqueName: \"kubernetes.io/projected/6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c-kube-api-access-8csvq\") pod \"service-ca-9c57cc56f-nlpvl\" (UID: \"6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c\") " pod="openshift-service-ca/service-ca-9c57cc56f-nlpvl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.761148 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.771975 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vwzw\" (UniqueName: \"kubernetes.io/projected/810d7855-cab6-4e33-9a5f-6d7bac9f66eb-kube-api-access-2vwzw\") pod \"dns-default-l2hxn\" (UID: \"810d7855-cab6-4e33-9a5f-6d7bac9f66eb\") " pod="openshift-dns/dns-default-l2hxn" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.775669 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-v68px" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.778244 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndpg9\" (UniqueName: \"kubernetes.io/projected/fc07e658-5bc5-469e-b793-230b7be58f12-kube-api-access-ndpg9\") pod \"ingress-canary-wcwdv\" (UID: \"fc07e658-5bc5-469e-b793-230b7be58f12\") " pod="openshift-ingress-canary/ingress-canary-wcwdv" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.794695 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:48 crc kubenswrapper[5094]: E0220 06:48:48.795165 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:49.29513747 +0000 UTC m=+144.167764181 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.795588 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wcwdv" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.798887 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7hljp"] Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.822468 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb99w\" (UniqueName: \"kubernetes.io/projected/1177f137-190b-4563-8a6f-51d7b0d5ca9c-kube-api-access-mb99w\") pod \"catalog-operator-68c6474976-s6brj\" (UID: \"1177f137-190b-4563-8a6f-51d7b0d5ca9c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.829430 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-p4blk"] Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.869354 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4"] Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.870655 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" event={"ID":"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c","Type":"ContainerStarted","Data":"48ac16689b00193d6e154a981653d9fe7dd39018c0acc1a7610d05cb116747a3"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.871014 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-w7rf2"] Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.875517 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" event={"ID":"18cc290d-78be-42c6-af5b-3b8b86941eb2","Type":"ContainerStarted","Data":"0f59159cf70471ec88327f095916949d208d3b4159ba3e8002ae3f04cc1b0b02"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.875545 5094 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" event={"ID":"18cc290d-78be-42c6-af5b-3b8b86941eb2","Type":"ContainerStarted","Data":"a8a623d9daef3ed6dfb32889d5aba7e415d276108ff677b5411cb61659224d60"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.876497 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.879451 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-9sztr" event={"ID":"a6908ab3-3d33-4e31-b226-b6607f34ee8b","Type":"ContainerStarted","Data":"85eb772686dce2301c9b652a94f5a2c3a5c8e4d2f6ccf3474d25f433f3d05709"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.879478 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-9sztr" event={"ID":"a6908ab3-3d33-4e31-b226-b6607f34ee8b","Type":"ContainerStarted","Data":"754f8f0aa0397403892aaf1179a93493ac564f3e42342a60f2bc6e650fffabec"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.885143 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.885616 5094 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-fnbl8 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.885753 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" podUID="18cc290d-78be-42c6-af5b-3b8b86941eb2" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.894355 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck" event={"ID":"e594f26b-0fd6-44a1-93eb-84593591389f","Type":"ContainerStarted","Data":"cb47109c94bbf8a2aacd53b887c83a904e8dcda61778f8c16e4aec4144261c85"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.895483 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck" event={"ID":"e594f26b-0fd6-44a1-93eb-84593591389f","Type":"ContainerStarted","Data":"fde96eaf411b08b987ea2a5695c1132631ea62a21b4b7d86b396dda3b9327c24"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.899867 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv" event={"ID":"2f348b60-0d81-490e-bfb4-ea32546c995a","Type":"ContainerStarted","Data":"a12ca65cfcd04a2d8d82fbe2a093e225deb8a5bcd4e29b2c59791942f7c1fb4c"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.900219 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:48 crc kubenswrapper[5094]: E0220 06:48:48.900624 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:49.400578726 +0000 UTC m=+144.273205437 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.902410 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" event={"ID":"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5","Type":"ContainerStarted","Data":"bf96c34879e8b3b8b43d54fe3cde7b504e57f4682c37550a83f12cccfaf7d07c"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.902737 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" event={"ID":"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5","Type":"ContainerStarted","Data":"94f008c3fadb5b12936b7080560c6d49126d29712d46d44cae8050bc4be3dbe7"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.903573 5094 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.903889 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" event={"ID":"38e3be97-7374-4b8b-9565-4d60baa02401","Type":"ContainerStarted","Data":"6ede8375995c4d584a09dc3b8461d2484e83e419a1acd3902384b06a59071516"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.906220 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j2mmj" event={"ID":"5021cb92-f82d-47ee-9978-58e897c354b1","Type":"ContainerStarted","Data":"227cf5962f112730dac2de0eff0ceafd6faf36426c36f96312ee9d29cbf9c1f4"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.911749 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd" event={"ID":"d1974d27-b923-4a9b-9874-d400df5bd29a","Type":"ContainerStarted","Data":"d91c25a804ee025a2aaae437050594ececf533c312e1237c92db87a755fb8c61"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.916656 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb" event={"ID":"e78b7a6b-91b7-4753-bd82-df9d3ea97291","Type":"ContainerStarted","Data":"bb435a4e0823b71a3c560876c02f7222976e38e69c2fa7d633e72adb567c7fd9"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.921818 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj" event={"ID":"f8d0112e-e85e-42f1-b28b-c0c996f36fe0","Type":"ContainerStarted","Data":"2e6ed90365089f3cede1e0c536507ed5619f4a57e64f9a16434e2833a90a9772"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.922058 5094 patch_prober.go:28] interesting 
pod/route-controller-manager-6576b87f9c-qrtpl container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.922240 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" podUID="d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.933299 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788" event={"ID":"d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2","Type":"ContainerStarted","Data":"e7cdc79c1efb693626d3e88d6ac7b76397826febf3c66d451684341422248fd7"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.933353 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788" event={"ID":"d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2","Type":"ContainerStarted","Data":"ed76240b0720c9e933d3f95294f5481e871153c1b7425f7867bf46042ce3c096"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.933667 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.992857 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-znrdm"] Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.992936 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7hljp" event={"ID":"bac53d01-ed38-46a8-ae9e-bfb72e5565a1","Type":"ContainerStarted","Data":"da2c0b1a5ac1bfb2118456e86a925f576435046f08e44bbd1d22ff45a3a68482"} Feb 20 06:48:48 crc kubenswrapper[5094]: I0220 06:48:48.997964 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg" event={"ID":"30a7db23-c18c-4bc6-b1b7-97b32a419fbe","Type":"ContainerStarted","Data":"8d2d652617742cefd28875c8fe01f8304463e4e86ed0e44373501a81794caa80"} Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.002493 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:49 crc kubenswrapper[5094]: E0220 06:48:49.002659 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:49.502637083 +0000 UTC m=+144.375263794 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.002775 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:49 crc kubenswrapper[5094]: E0220 06:48:49.005211 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:49.505202824 +0000 UTC m=+144.377829535 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.014038 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q" Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.026053 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-d2l2r"] Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.033907 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-nlpvl" Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.038414 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599"] Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.039867 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.042921 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj" Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.052006 5094 patch_prober.go:28] interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 06:48:49 crc kubenswrapper[5094]: [-]has-synced failed: reason withheld Feb 20 06:48:49 crc kubenswrapper[5094]: [+]process-running ok Feb 20 06:48:49 crc kubenswrapper[5094]: healthz check failed Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.052299 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.068492 5094 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-l2hxn" Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.077905 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2"] Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.123974 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:49 crc kubenswrapper[5094]: E0220 06:48:49.124417 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:49.624385966 +0000 UTC m=+144.497012667 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.232806 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:49 crc kubenswrapper[5094]: E0220 06:48:49.233722 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:49.733692174 +0000 UTC m=+144.606318885 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.334759 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:49 crc kubenswrapper[5094]: E0220 06:48:49.335204 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:49.835186887 +0000 UTC m=+144.707813598 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.418216 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" podStartSLOduration=122.418196113 podStartE2EDuration="2m2.418196113s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:49.416118564 +0000 UTC m=+144.288745275" watchObservedRunningTime="2026-02-20 06:48:49.418196113 +0000 UTC m=+144.290822824" Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.440092 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:49 crc kubenswrapper[5094]: E0220 06:48:49.440491 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:49.940471381 +0000 UTC m=+144.813098092 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.541787 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:49 crc kubenswrapper[5094]: E0220 06:48:49.542232 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:50.042204719 +0000 UTC m=+144.914831430 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.582778 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl"] Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.586107 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7"] Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.630574 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vhztc"] Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.644111 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:49 crc kubenswrapper[5094]: E0220 06:48:49.644560 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:50.144540482 +0000 UTC m=+145.017167193 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.703881 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-q596q"] Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.742517 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-45dt8"] Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.746720 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:49 crc kubenswrapper[5094]: E0220 06:48:49.747194 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:50.247171973 +0000 UTC m=+145.119798684 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.778191 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" podStartSLOduration=122.778165657 podStartE2EDuration="2m2.778165657s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:49.776950708 +0000 UTC m=+144.649577419" watchObservedRunningTime="2026-02-20 06:48:49.778165657 +0000 UTC m=+144.650792368" Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.849078 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:49 crc kubenswrapper[5094]: E0220 06:48:49.849544 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:50.349526917 +0000 UTC m=+145.222153628 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.950536 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:49 crc kubenswrapper[5094]: E0220 06:48:49.951286 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:50.451261125 +0000 UTC m=+145.323887836 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:49 crc kubenswrapper[5094]: I0220 06:48:49.952163 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:49 crc kubenswrapper[5094]: E0220 06:48:49.952537 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:50.452529325 +0000 UTC m=+145.325156036 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:50 crc kubenswrapper[5094]: W0220 06:48:50.002136 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca3c622d_9c0e_43f5_a5ce_de2dbbab5f60.slice/crio-c96c01104aff3baab51a0a0d07e5aec2067827a35dc850409192571365bb82c1 WatchSource:0}: Error finding container c96c01104aff3baab51a0a0d07e5aec2067827a35dc850409192571365bb82c1: Status 404 returned error can't find the container with id c96c01104aff3baab51a0a0d07e5aec2067827a35dc850409192571365bb82c1 Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.035833 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-9sztr" podStartSLOduration=123.035811668 podStartE2EDuration="2m3.035811668s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:50.035300906 +0000 UTC m=+144.907927617" watchObservedRunningTime="2026-02-20 06:48:50.035811668 +0000 UTC m=+144.908438379" Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.045280 5094 patch_prober.go:28] interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 06:48:50 crc kubenswrapper[5094]: [-]has-synced failed: reason withheld Feb 20 06:48:50 crc 
kubenswrapper[5094]: [+]process-running ok Feb 20 06:48:50 crc kubenswrapper[5094]: healthz check failed Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.045325 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.047311 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-d2l2r" event={"ID":"f1faaf31-f0b6-4828-90cc-51de060dc826","Type":"ContainerStarted","Data":"39c2e7ca833f7ef7d89846da7787d3050a9057ef816f78e796884a8eb050d1b8"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.054567 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:50 crc kubenswrapper[5094]: E0220 06:48:50.055075 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:50.555060424 +0000 UTC m=+145.427687135 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.071556 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7hljp" event={"ID":"bac53d01-ed38-46a8-ae9e-bfb72e5565a1","Type":"ContainerStarted","Data":"d1cb1bbeb617bd587b25db9333fc7fb27df8d78e2c95ab8f83863ba53dcbe5ed"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.074814 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" event={"ID":"8f8623a7-b3d4-49ad-86c5-40f19adf7b09","Type":"ContainerStarted","Data":"e4130178047d6af3e0ab4ff09453de029b92e5516ad70ed26661873561710385"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.077482 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7" event={"ID":"04b9035c-78ce-4d54-859d-48f7853f3f16","Type":"ContainerStarted","Data":"d9b7bdc00087de0003b0a5043af6ab498b1d9d30621c71ee03e559ff8b27fd22"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.084089 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-q596q" event={"ID":"9d4a320a-daa4-4bce-9782-5e9880aea226","Type":"ContainerStarted","Data":"6a844a7f76f541126efc8c47b592165b18584966dd0a56577240a4f095455b25"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.114642 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vhztc" 
event={"ID":"3c3bc896-4d90-42f0-92e9-77a7b285e504","Type":"ContainerStarted","Data":"13cdf00e66336afea1889b2ebf66c8dc720a6e204ebd2d05b9680bb58da74e01"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.121802 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-znrdm" event={"ID":"38d9642e-3788-4e70-8232-138cd84e02dc","Type":"ContainerStarted","Data":"edfcdda3c0dd05b189c70af2eefb8ceb774a6586f9bbfe4bf0bf8eba5d0437ec"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.158559 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:50 crc kubenswrapper[5094]: E0220 06:48:50.158915 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:50.658888792 +0000 UTC m=+145.531515503 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.221112 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" event={"ID":"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60","Type":"ContainerStarted","Data":"c96c01104aff3baab51a0a0d07e5aec2067827a35dc850409192571365bb82c1"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.252847 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg" event={"ID":"30a7db23-c18c-4bc6-b1b7-97b32a419fbe","Type":"ContainerStarted","Data":"bf1b86b477ef7d9b7bcdd448e8e0069748f9325d93b5f710501c516d3a5a1eaf"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.276806 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:50 crc kubenswrapper[5094]: E0220 06:48:50.277584 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:50.777567162 +0000 UTC m=+145.650193873 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.290562 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2"] Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.304758 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tjkwm"] Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.304899 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-w7rf2" event={"ID":"95700d83-436d-43c5-9eb1-381654f43928","Type":"ContainerStarted","Data":"335c5974d95e909d3c1bf3645c6101a019d706448889a79cc5d66153648d38b6"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.318266 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-7k4rg" podStartSLOduration=124.318246506 podStartE2EDuration="2m4.318246506s" podCreationTimestamp="2026-02-20 06:46:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:50.304348126 +0000 UTC m=+145.176974837" watchObservedRunningTime="2026-02-20 06:48:50.318246506 +0000 UTC m=+145.190873217" Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.318672 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn"] Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 
06:48:50.319793 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl" event={"ID":"cbae269e-22bc-484c-ad96-ad61d462a28d","Type":"ContainerStarted","Data":"e8b5a8b902e7d3e3817c89caef4d0547445c42deb98a5819f0326602607bb7ff"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.326877 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" event={"ID":"38e3be97-7374-4b8b-9565-4d60baa02401","Type":"ContainerStarted","Data":"e4b5fdc4272d56740a6fa5f9ad11cf7fbbde3b0835b268c3ac507b39debb72ca"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.333498 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788" event={"ID":"d8409ac6-b3d2-4af7-b901-f1f7f3dce3b2","Type":"ContainerStarted","Data":"49373e13ecb57c8fe048d20a0d22ca99da0653a9549885af317cf26b90eec2f0"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.347423 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-x72n5"] Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.356956 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-v68px" event={"ID":"55946b30-00e1-4bd4-bd8e-3f5761537a0b","Type":"ContainerStarted","Data":"197208f93ee4a516cae15fc41066c22afb9af5f517a03e1efa9853a344e20e77"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.357402 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc"] Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.370288 5094 generic.go:334] "Generic (PLEG): container finished" podID="e594f26b-0fd6-44a1-93eb-84593591389f" containerID="cb47109c94bbf8a2aacd53b887c83a904e8dcda61778f8c16e4aec4144261c85" 
exitCode=0 Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.372073 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck" event={"ID":"e594f26b-0fd6-44a1-93eb-84593591389f","Type":"ContainerDied","Data":"cb47109c94bbf8a2aacd53b887c83a904e8dcda61778f8c16e4aec4144261c85"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.377627 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.379296 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" event={"ID":"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c","Type":"ContainerStarted","Data":"756c75b3fc1e466feb787f451786b0673b156a7f8bc22ab6ff9765d0ad70b734"} Feb 20 06:48:50 crc kubenswrapper[5094]: E0220 06:48:50.379744 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:50.87972656 +0000 UTC m=+145.752353271 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.380024 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.396469 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj" event={"ID":"f8d0112e-e85e-42f1-b28b-c0c996f36fe0","Type":"ContainerStarted","Data":"4a25127fcda14cf3e269b7884f247e76e7958e092869779a7319a4da122552f1"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.400972 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" event={"ID":"806ba791-714c-4d13-b595-d4f6ccf06aea","Type":"ContainerStarted","Data":"2cba69ee98eb39e06e5e23c24335f67d934d7ddd307c2f258f0da6b72887c796"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.414093 5094 csr.go:261] certificate signing request csr-hw6gz is approved, waiting to be issued Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.414125 5094 csr.go:257] certificate signing request csr-hw6gz is issued Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.450995 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599" event={"ID":"5f20c574-b730-4bd8-97d1-7751eb7968d4","Type":"ContainerStarted","Data":"082f0f7066d0aab2b6362a9588af4682178c61915c83e74167c3503683d40eba"} Feb 
20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.454014 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv" event={"ID":"2f348b60-0d81-490e-bfb4-ea32546c995a","Type":"ContainerStarted","Data":"fad35881a9e75d02e1ebd99e539d4791ed3e4cc6d7bb713a9c9ef957e003c1b4"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.478061 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-p4blk" event={"ID":"7b5c64ae-5f80-4e35-91dc-48163991b63d","Type":"ContainerStarted","Data":"39214b864a40c936cc1d08c8d3c2e33aaa966fe4b56fdd3a03d19e9802e84ad6"} Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.478478 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:50 crc kubenswrapper[5094]: E0220 06:48:50.478891 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:50.978872388 +0000 UTC m=+145.851499099 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.479291 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:50 crc kubenswrapper[5094]: E0220 06:48:50.486089 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:50.986073819 +0000 UTC m=+145.858700530 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.486775 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j2mmj" event={"ID":"5021cb92-f82d-47ee-9978-58e897c354b1","Type":"ContainerStarted","Data":"96d9f4a3e51cdc054f5fbb2e7cc16f9f5e938c18d9e211245e56c56c47e1c814"}
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.493669 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl"
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.500538 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8"
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.580436 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 06:48:50 crc kubenswrapper[5094]: E0220 06:48:50.580982 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:51.080967116 +0000 UTC m=+145.953593827 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.648446 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc"]
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.722143 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:48:50 crc kubenswrapper[5094]: E0220 06:48:50.727385 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:51.227365753 +0000 UTC m=+146.099992464 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:50 crc kubenswrapper[5094]: W0220 06:48:50.752093 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4797e67f_42c7_4106_998a_f3555218e77d.slice/crio-bb1740f53cf5bbe968f073c190551df76652e5c1f042b89eed337d39ef2f652b WatchSource:0}: Error finding container bb1740f53cf5bbe968f073c190551df76652e5c1f042b89eed337d39ef2f652b: Status 404 returned error can't find the container with id bb1740f53cf5bbe968f073c190551df76652e5c1f042b89eed337d39ef2f652b
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.775765 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj"]
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.775853 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-shq4j"]
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.810693 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wcwdv"]
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.830544 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 06:48:50 crc kubenswrapper[5094]: E0220 06:48:50.830971 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:51.330949686 +0000 UTC m=+146.203576397 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.858108 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj788" podStartSLOduration=124.858086858 podStartE2EDuration="2m4.858086858s" podCreationTimestamp="2026-02-20 06:46:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:50.817975729 +0000 UTC m=+145.690602430" watchObservedRunningTime="2026-02-20 06:48:50.858086858 +0000 UTC m=+145.730713569"
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.858311 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-bwvdt" podStartSLOduration=124.858307053 podStartE2EDuration="2m4.858307053s" podCreationTimestamp="2026-02-20 06:46:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:50.854971954 +0000 UTC m=+145.727598665" watchObservedRunningTime="2026-02-20 06:48:50.858307053 +0000 UTC m=+145.730933764"
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.862630 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-7wgh7"]
Feb 20 06:48:50 crc kubenswrapper[5094]: W0220 06:48:50.878505 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd74a0c7b_3ad6_4f59_b4ff_33e1209c3116.slice/crio-a7f20fdbdab0cc5f6c68e27c33a34037b6042872ed0e49f45e04190788dc4434 WatchSource:0}: Error finding container a7f20fdbdab0cc5f6c68e27c33a34037b6042872ed0e49f45e04190788dc4434: Status 404 returned error can't find the container with id a7f20fdbdab0cc5f6c68e27c33a34037b6042872ed0e49f45e04190788dc4434
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.895902 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-wzftj" podStartSLOduration=123.895882283 podStartE2EDuration="2m3.895882283s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:50.894354967 +0000 UTC m=+145.766981678" watchObservedRunningTime="2026-02-20 06:48:50.895882283 +0000 UTC m=+145.768508994"
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.916690 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-bslb9"
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.916932 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-nlpvl"]
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.935804 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:48:50 crc kubenswrapper[5094]: E0220 06:48:50.936295 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:51.436277779 +0000 UTC m=+146.308904490 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.938256 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj"]
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.955241 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q"]
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.959173 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" podStartSLOduration=124.959149231 podStartE2EDuration="2m4.959149231s" podCreationTimestamp="2026-02-20 06:46:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:50.958515096 +0000 UTC m=+145.831141807" watchObservedRunningTime="2026-02-20 06:48:50.959149231 +0000 UTC m=+145.831775942"
Feb 20 06:48:50 crc kubenswrapper[5094]: I0220 06:48:50.972353 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-l2hxn"]
Feb 20 06:48:51 crc kubenswrapper[5094]: W0220 06:48:51.034009 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bba9a34_7bbd_44be_b82d_0a35f8ef288f.slice/crio-2803b37c86bc2d9dbd6227a403c4934c113a8ced7926e75c399d606457b5de0b WatchSource:0}: Error finding container 2803b37c86bc2d9dbd6227a403c4934c113a8ced7926e75c399d606457b5de0b: Status 404 returned error can't find the container with id 2803b37c86bc2d9dbd6227a403c4934c113a8ced7926e75c399d606457b5de0b
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.039981 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 06:48:51 crc kubenswrapper[5094]: E0220 06:48:51.040479 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:51.540462776 +0000 UTC m=+146.413089487 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.058938 5094 patch_prober.go:28] interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 20 06:48:51 crc kubenswrapper[5094]: [-]has-synced failed: reason withheld
Feb 20 06:48:51 crc kubenswrapper[5094]: [+]process-running ok
Feb 20 06:48:51 crc kubenswrapper[5094]: healthz check failed
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.059000 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.141372 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:48:51 crc kubenswrapper[5094]: E0220 06:48:51.141723 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:51.641695704 +0000 UTC m=+146.514322415 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.243333 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 06:48:51 crc kubenswrapper[5094]: E0220 06:48:51.245146 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:51.745118933 +0000 UTC m=+146.617745644 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.349922 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:48:51 crc kubenswrapper[5094]: E0220 06:48:51.350450 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:51.850437666 +0000 UTC m=+146.723064377 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.416081 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-20 06:43:50 +0000 UTC, rotation deadline is 2027-01-10 02:55:08.296523138 +0000 UTC
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.416133 5094 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7772h6m16.880393726s for next certificate rotation
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.457645 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 06:48:51 crc kubenswrapper[5094]: E0220 06:48:51.458074 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:51.958040485 +0000 UTC m=+146.830667196 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.509375 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj" event={"ID":"1177f137-190b-4563-8a6f-51d7b0d5ca9c","Type":"ContainerStarted","Data":"bf02145176698932aa6290346dc826656b111284edfa77724b46c1b2ded7ed25"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.537235 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn" event={"ID":"4a51eb16-597c-47dc-bd54-c16c33bde071","Type":"ContainerStarted","Data":"23a6d21bc9f670c58daac935b4cf34dca1e21d756f0ef871577799c139f8fb5e"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.548064 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-v68px" event={"ID":"55946b30-00e1-4bd4-bd8e-3f5761537a0b","Type":"ContainerStarted","Data":"840aef56862c43e15d0ba9394df0bc3dd1070841c75042790202d3f243016924"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.560129 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:48:51 crc kubenswrapper[5094]: E0220 06:48:51.560534 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:52.060510191 +0000 UTC m=+146.933136902 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.566377 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j2mmj" event={"ID":"5021cb92-f82d-47ee-9978-58e897c354b1","Type":"ContainerStarted","Data":"2b2ded0bd00b9b022717edbca5dcc4a046119966453331ebc23620ffd9ed7974"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.572600 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-v68px" podStartSLOduration=6.572559325 podStartE2EDuration="6.572559325s" podCreationTimestamp="2026-02-20 06:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:51.569074033 +0000 UTC m=+146.441700734" watchObservedRunningTime="2026-02-20 06:48:51.572559325 +0000 UTC m=+146.445186036"
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.621045 5094 generic.go:334] "Generic (PLEG): container finished" podID="7b5c64ae-5f80-4e35-91dc-48163991b63d" containerID="719cfe3070710391ae7ff475b31d0d0d3486957b9eff677db33007a93967ce37" exitCode=0
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.621150 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-p4blk" event={"ID":"7b5c64ae-5f80-4e35-91dc-48163991b63d","Type":"ContainerDied","Data":"719cfe3070710391ae7ff475b31d0d0d3486957b9eff677db33007a93967ce37"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.639019 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-j2mmj" podStartSLOduration=124.638987799 podStartE2EDuration="2m4.638987799s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:51.61538476 +0000 UTC m=+146.488011481" watchObservedRunningTime="2026-02-20 06:48:51.638987799 +0000 UTC m=+146.511614510"
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.656807 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" event={"ID":"a072f264-8eef-49ff-804c-fc584b41175c","Type":"ContainerStarted","Data":"8df1751deaf4b25423b8ffb2da16475453a0ad90ed8820c56ff1d590f14fbf0f"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.664101 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 06:48:51 crc kubenswrapper[5094]: E0220 06:48:51.664377 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:52.164344609 +0000 UTC m=+147.036971320 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.664659 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:48:51 crc kubenswrapper[5094]: E0220 06:48:51.668778 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:52.168757274 +0000 UTC m=+147.041383985 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.673804 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-d2l2r" event={"ID":"f1faaf31-f0b6-4828-90cc-51de060dc826","Type":"ContainerStarted","Data":"2a68290fa1c4ca30a26485e5b789d657f8d3dc6f5892aa3b43625cdefd1e9e27"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.674859 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-d2l2r"
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.679909 5094 patch_prober.go:28] interesting pod/downloads-7954f5f757-d2l2r container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body=
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.679949 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d2l2r" podUID="f1faaf31-f0b6-4828-90cc-51de060dc826" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused"
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.734025 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-d2l2r" podStartSLOduration=124.733998869 podStartE2EDuration="2m4.733998869s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:51.727323241 +0000 UTC m=+146.599949962" watchObservedRunningTime="2026-02-20 06:48:51.733998869 +0000 UTC m=+146.606625580"
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.767673 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vhztc" event={"ID":"3c3bc896-4d90-42f0-92e9-77a7b285e504","Type":"ContainerStarted","Data":"4dad789791bce622abb9ee22f3228dec0cc564cb39baf6ba060ecc65e94f54c7"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.768430 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 06:48:51 crc kubenswrapper[5094]: E0220 06:48:51.769280 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:52.269255904 +0000 UTC m=+147.141882615 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.784338 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7" event={"ID":"04b9035c-78ce-4d54-859d-48f7853f3f16","Type":"ContainerStarted","Data":"83565444373c2a0fa672cd6ee113621d4b694e3092ba567c877c9e22ef97aa72"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.790899 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-q596q" event={"ID":"9d4a320a-daa4-4bce-9782-5e9880aea226","Type":"ContainerStarted","Data":"ee8420d5ace56cc69ef966f5d6519926a22add67977f2bfc4d794cc0072b4163"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.807324 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vhztc" podStartSLOduration=124.807304314 podStartE2EDuration="2m4.807304314s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:51.806445774 +0000 UTC m=+146.679072485" watchObservedRunningTime="2026-02-20 06:48:51.807304314 +0000 UTC m=+146.679931025"
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.815008 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" event={"ID":"368766ec-f562-4296-bcdb-4bcda1db6c45","Type":"ContainerStarted","Data":"661750a3cc39a7eb8cd7dd9ec0f8265614f2d44bceeb6672bb22df6e24aa3240"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.870058 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn"
Feb 20 06:48:51 crc kubenswrapper[5094]: E0220 06:48:51.870627 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:52.370601933 +0000 UTC m=+147.243228644 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.874934 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl" event={"ID":"cbae269e-22bc-484c-ad96-ad61d462a28d","Type":"ContainerStarted","Data":"81a34f90029b69885d8fb9a456a644431c73a92d9c187ca4a2f3ebd2ae84c509"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.874995 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl"
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.885035 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc" event={"ID":"4797e67f-42c7-4106-998a-f3555218e77d","Type":"ContainerStarted","Data":"bb1740f53cf5bbe968f073c190551df76652e5c1f042b89eed337d39ef2f652b"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.890255 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc" event={"ID":"b32eb7f4-e823-4b71-9606-d3dee9f247fd","Type":"ContainerStarted","Data":"265402085e76d0cbb939ebfe62c7c80946892bfea0b80dadf2850d32502ab2c2"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.890317 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc" event={"ID":"b32eb7f4-e823-4b71-9606-d3dee9f247fd","Type":"ContainerStarted","Data":"1015b645cb5c31550a6ff0c30d96eab4a332985e6e6de7855f9033e0fb7ad816"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.896086 5094 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-57pxl container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body=
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.896143 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl" podUID="cbae269e-22bc-484c-ad96-ad61d462a28d" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused"
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.914443 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl" podStartSLOduration=124.914422971 podStartE2EDuration="2m4.914422971s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:51.912662049 +0000 UTC m=+146.785288760" watchObservedRunningTime="2026-02-20 06:48:51.914422971 +0000 UTC m=+146.787049682"
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.953670 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb" event={"ID":"e78b7a6b-91b7-4753-bd82-df9d3ea97291","Type":"ContainerStarted","Data":"88c64d8bec3f92cdba2164ac30b7327552174624512758460a6aded0595cbf34"}
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.970759 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 20 06:48:51 crc kubenswrapper[5094]: E0220 06:48:51.972893 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:52.472875535 +0000 UTC m=+147.345502246 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 20 06:48:51 crc kubenswrapper[5094]: I0220 06:48:51.980466 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-jjblc" podStartSLOduration=124.980441064 podStartE2EDuration="2m4.980441064s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:51.967872007 +0000 UTC m=+146.840498718" watchObservedRunningTime="2026-02-20 06:48:51.980441064 +0000 UTC m=+146.853067775"
Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.024195 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c2mfb" podStartSLOduration=125.02417096 podStartE2EDuration="2m5.02417096s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:52.020668986 +0000 UTC m=+146.893295697" watchObservedRunningTime="2026-02-20 06:48:52.02417096 +0000 UTC m=+146.896797691"
Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.026141 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7hljp"
event={"ID":"bac53d01-ed38-46a8-ae9e-bfb72e5565a1","Type":"ContainerStarted","Data":"8321c815f13d3667834d96eec3965c5f2787df004384a0ce9a6ec23d4b4869e4"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.029181 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv" event={"ID":"2f348b60-0d81-490e-bfb4-ea32546c995a","Type":"ContainerStarted","Data":"5cdc5822740b7095d1974c19dbccda507767ab0c1fa72d48fbc6d25078c0b34d"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.040748 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599" event={"ID":"5f20c574-b730-4bd8-97d1-7751eb7968d4","Type":"ContainerStarted","Data":"b0a2acc52e341f79f63d9a1995205da3048e7b090d068e86900dee98e8bd9b48"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.045867 5094 patch_prober.go:28] interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 06:48:52 crc kubenswrapper[5094]: [-]has-synced failed: reason withheld Feb 20 06:48:52 crc kubenswrapper[5094]: [+]process-running ok Feb 20 06:48:52 crc kubenswrapper[5094]: healthz check failed Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.045912 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.065183 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-x72n5" 
event={"ID":"7a75a178-3fa3-4be7-b29f-e1f01dc859a4","Type":"ContainerStarted","Data":"6d302e0e8c1edc759fea6176bad4f60407b1ec7aa0ae6161b7a98608854ada3d"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.072095 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:52 crc kubenswrapper[5094]: E0220 06:48:52.072402 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:52.572391121 +0000 UTC m=+147.445017832 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.097275 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-znrdm" event={"ID":"38d9642e-3788-4e70-8232-138cd84e02dc","Type":"ContainerStarted","Data":"34d6c7aa37fc8eb5eeda4b0cea58bd93dc213b767611900685a4c35eeeddad03"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.129722 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wcwdv" 
event={"ID":"fc07e658-5bc5-469e-b793-230b7be58f12","Type":"ContainerStarted","Data":"3139a68b48fcef2d120274aed4963d2f72e77ba6fc1d7a2cb360251ad2f8742a"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.129842 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-ps6pv" podStartSLOduration=125.129824641 podStartE2EDuration="2m5.129824641s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:52.124882214 +0000 UTC m=+146.997508925" watchObservedRunningTime="2026-02-20 06:48:52.129824641 +0000 UTC m=+147.002451352" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.134096 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7hljp" podStartSLOduration=125.134080102 podStartE2EDuration="2m5.134080102s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:52.071061959 +0000 UTC m=+146.943688670" watchObservedRunningTime="2026-02-20 06:48:52.134080102 +0000 UTC m=+147.006706813" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.173471 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:52 crc kubenswrapper[5094]: E0220 06:48:52.175475 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:52.675435511 +0000 UTC m=+147.548062222 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.208911 5094 generic.go:334] "Generic (PLEG): container finished" podID="8f8623a7-b3d4-49ad-86c5-40f19adf7b09" containerID="ee3a87c537aa4fca93d8f998dea856908205a9a56132c65c958d2298537ef172" exitCode=0 Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.209010 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" event={"ID":"8f8623a7-b3d4-49ad-86c5-40f19adf7b09","Type":"ContainerDied","Data":"ee3a87c537aa4fca93d8f998dea856908205a9a56132c65c958d2298537ef172"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.215449 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vs599" podStartSLOduration=125.215422929 podStartE2EDuration="2m5.215422929s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:52.192213259 +0000 UTC m=+147.064839970" watchObservedRunningTime="2026-02-20 06:48:52.215422929 +0000 UTC m=+147.088049640" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.246573 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd" event={"ID":"d1974d27-b923-4a9b-9874-d400df5bd29a","Type":"ContainerStarted","Data":"b51efc26bb16dfcde63b61af8f545647be6a0ba0478045391f6cf4ebfce8d61f"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.259915 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wcwdv" podStartSLOduration=7.259879761 podStartE2EDuration="7.259879761s" podCreationTimestamp="2026-02-20 06:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:52.246231318 +0000 UTC m=+147.118858019" watchObservedRunningTime="2026-02-20 06:48:52.259879761 +0000 UTC m=+147.132506472" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.269388 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-w7rf2" event={"ID":"95700d83-436d-43c5-9eb1-381654f43928","Type":"ContainerStarted","Data":"acbb92fa881d8208be98d779d340579e572e5c0e423b335c0c79f6ab35419cc5"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.270652 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-w7rf2" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.271830 5094 patch_prober.go:28] interesting pod/console-operator-58897d9998-w7rf2 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.271877 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-w7rf2" podUID="95700d83-436d-43c5-9eb1-381654f43928" containerName="console-operator" probeResult="failure" 
output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.282003 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:52 crc kubenswrapper[5094]: E0220 06:48:52.282748 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:52.782734972 +0000 UTC m=+147.655361683 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.348727 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-shq4j" event={"ID":"e130287f-996d-4ab0-8c12-351bf8d21df5","Type":"ContainerStarted","Data":"9e9ebe837e9f43c76e6f912fde9f9a4d76af0096fe554a67909ac3cf138a323a"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.366453 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-znrdm" podStartSLOduration=125.366432884 podStartE2EDuration="2m5.366432884s" 
podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:52.301296882 +0000 UTC m=+147.173923593" watchObservedRunningTime="2026-02-20 06:48:52.366432884 +0000 UTC m=+147.239059595" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.387395 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:52 crc kubenswrapper[5094]: E0220 06:48:52.392423 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:52.892381908 +0000 UTC m=+147.765008619 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.394235 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l2hxn" event={"ID":"810d7855-cab6-4e33-9a5f-6d7bac9f66eb","Type":"ContainerStarted","Data":"26b003bdf0aedeeb11371c691d87fa14db0e6378b4677fe005cfb4a29e342bd9"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.399850 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-nlpvl" event={"ID":"6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c","Type":"ContainerStarted","Data":"1e7a66379829ca3facd3965ad9642612d26bed5b2aa8d5bd79cbfffc9fb31129"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.441912 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-w7rf2" podStartSLOduration=125.44187731 podStartE2EDuration="2m5.44187731s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:52.433451451 +0000 UTC m=+147.306078162" watchObservedRunningTime="2026-02-20 06:48:52.44187731 +0000 UTC m=+147.314504021" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.460083 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q" 
event={"ID":"a4c7d510-2730-46e1-b157-6e890e8868e9","Type":"ContainerStarted","Data":"8c84370dce58890b84bc2db3bb249922847dd52c6f58e7de76c62408a999789c"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.490927 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" event={"ID":"806ba791-714c-4d13-b595-d4f6ccf06aea","Type":"ContainerStarted","Data":"97bfbe799eec96272d8e9a6b7afb335847d35846686c6b7385ed8fc78aa4aec5"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.494038 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:52 crc kubenswrapper[5094]: E0220 06:48:52.502264 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:53.001780659 +0000 UTC m=+147.874407370 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.522222 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" event={"ID":"6bba9a34-7bbd-44be-b82d-0a35f8ef288f","Type":"ContainerStarted","Data":"2803b37c86bc2d9dbd6227a403c4934c113a8ced7926e75c399d606457b5de0b"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.530620 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-ddvpd" podStartSLOduration=125.530588961 podStartE2EDuration="2m5.530588961s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:52.47228851 +0000 UTC m=+147.344915221" watchObservedRunningTime="2026-02-20 06:48:52.530588961 +0000 UTC m=+147.403215672" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.545313 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" event={"ID":"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60","Type":"ContainerStarted","Data":"ffaff9fdb2f03b0e17c5d470bccb1479bdc3ed514c6b3c3a9637e3af949a185c"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.555695 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.578170 
5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" event={"ID":"d74a0c7b-3ad6-4f59-b4ff-33e1209c3116","Type":"ContainerStarted","Data":"a7f20fdbdab0cc5f6c68e27c33a34037b6042872ed0e49f45e04190788dc4434"} Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.580619 5094 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-45dt8 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.580651 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" podUID="ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.595811 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-shq4j" podStartSLOduration=125.595792895 podStartE2EDuration="2m5.595792895s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:52.521308151 +0000 UTC m=+147.393934872" watchObservedRunningTime="2026-02-20 06:48:52.595792895 +0000 UTC m=+147.468419606" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.604302 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:52 crc kubenswrapper[5094]: E0220 06:48:52.605910 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:53.105892364 +0000 UTC m=+147.978519075 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.622398 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-nlpvl" podStartSLOduration=125.622374714 podStartE2EDuration="2m5.622374714s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:52.580898683 +0000 UTC m=+147.453525394" watchObservedRunningTime="2026-02-20 06:48:52.622374714 +0000 UTC m=+147.495001425" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.623087 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" podStartSLOduration=125.623082331 podStartE2EDuration="2m5.623082331s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:52.622102128 +0000 UTC m=+147.494728829" 
watchObservedRunningTime="2026-02-20 06:48:52.623082331 +0000 UTC m=+147.495709042" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.663469 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" podStartSLOduration=125.663446077 podStartE2EDuration="2m5.663446077s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:52.662407352 +0000 UTC m=+147.535034063" watchObservedRunningTime="2026-02-20 06:48:52.663446077 +0000 UTC m=+147.536072788" Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.713807 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:52 crc kubenswrapper[5094]: E0220 06:48:52.714321 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:53.214304921 +0000 UTC m=+148.086931632 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.814508 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:52 crc kubenswrapper[5094]: E0220 06:48:52.815008 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:53.314993766 +0000 UTC m=+148.187620477 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.916006 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:52 crc kubenswrapper[5094]: E0220 06:48:52.916439 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:53.416419137 +0000 UTC m=+148.289045848 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:52 crc kubenswrapper[5094]: I0220 06:48:52.961992 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.019487 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:53 crc kubenswrapper[5094]: E0220 06:48:53.020411 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:53.520387308 +0000 UTC m=+148.393014009 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.043859 5094 patch_prober.go:28] interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 06:48:53 crc kubenswrapper[5094]: [-]has-synced failed: reason withheld Feb 20 06:48:53 crc kubenswrapper[5094]: [+]process-running ok Feb 20 06:48:53 crc kubenswrapper[5094]: healthz check failed Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.043926 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.122717 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:53 crc kubenswrapper[5094]: E0220 06:48:53.123305 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-20 06:48:53.623289395 +0000 UTC m=+148.495916106 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.224312 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:53 crc kubenswrapper[5094]: E0220 06:48:53.224538 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:53.724506222 +0000 UTC m=+148.597132933 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.224844 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:53 crc kubenswrapper[5094]: E0220 06:48:53.225246 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:53.72523341 +0000 UTC m=+148.597860121 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.326636 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:53 crc kubenswrapper[5094]: E0220 06:48:53.326827 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:53.826775383 +0000 UTC m=+148.699402094 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.327041 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:53 crc kubenswrapper[5094]: E0220 06:48:53.327507 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:53.827486991 +0000 UTC m=+148.700113702 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.428406 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:53 crc kubenswrapper[5094]: E0220 06:48:53.428609 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:53.928530533 +0000 UTC m=+148.801157244 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.429141 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:53 crc kubenswrapper[5094]: E0220 06:48:53.429504 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:53.929491126 +0000 UTC m=+148.802117837 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.530825 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:53 crc kubenswrapper[5094]: E0220 06:48:53.531054 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:54.03102137 +0000 UTC m=+148.903648081 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.531377 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:53 crc kubenswrapper[5094]: E0220 06:48:53.531783 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:54.031775118 +0000 UTC m=+148.904401829 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.585752 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-nlpvl" event={"ID":"6c0d9e8f-609b-42d6-a9b7-9b8dcfa3655c","Type":"ContainerStarted","Data":"8d9bec351c0fc0710acb33b8eb536c118b9ecfcdb3f10ae2fa68086a118dd1b8"} Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.587659 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn" event={"ID":"4a51eb16-597c-47dc-bd54-c16c33bde071","Type":"ContainerStarted","Data":"c22aae595dca26c79cc3480d16d8c1f950ef601d9152c428bb76b7bbe872ddb7"} Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.587930 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn" Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.589648 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7" event={"ID":"04b9035c-78ce-4d54-859d-48f7853f3f16","Type":"ContainerStarted","Data":"a9083153b32dccca3a1a51f54832e7b5d9b64685ff7b9c07c5a26dcd3375bff8"} Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.591123 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" 
event={"ID":"6bba9a34-7bbd-44be-b82d-0a35f8ef288f","Type":"ContainerStarted","Data":"ad945fcd7b5fc2231897504821dcad743519120a4408fb921971af86ef379d72"} Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.592346 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" event={"ID":"368766ec-f562-4296-bcdb-4bcda1db6c45","Type":"ContainerStarted","Data":"a7aa8f24b3d5c02da7d0d9ade1d6c71e80edc6d5d094e75e7faf8acb9f27b369"} Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.594762 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-shq4j" event={"ID":"e130287f-996d-4ab0-8c12-351bf8d21df5","Type":"ContainerStarted","Data":"d6b6a4d5efc598b2291d4820a26080dd11c2b20b96ec9c20aea95f5a7b87fc1a"} Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.597722 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-x72n5" event={"ID":"7a75a178-3fa3-4be7-b29f-e1f01dc859a4","Type":"ContainerStarted","Data":"c1b9527c261d3d942362984ecc003014543b74961a1ad7b377199b42cfe481fd"} Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.597835 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-x72n5" event={"ID":"7a75a178-3fa3-4be7-b29f-e1f01dc859a4","Type":"ContainerStarted","Data":"405a2793231e7c6a02ddcec6356e5472cf96043d144d133078ccfa674f0b6e7d"} Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.601288 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck" event={"ID":"e594f26b-0fd6-44a1-93eb-84593591389f","Type":"ContainerStarted","Data":"e07f8d0b3df405b8ca7123df2b4d330a242cea3e2391d205143c1736e74eed3f"} Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.601817 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck" Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.603636 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" event={"ID":"8f8623a7-b3d4-49ad-86c5-40f19adf7b09","Type":"ContainerStarted","Data":"1a59ccd6bbe9f1a540d0c1a933e3878040fc5896cf3d2a910619406b037a63b4"} Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.606451 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-p4blk" event={"ID":"7b5c64ae-5f80-4e35-91dc-48163991b63d","Type":"ContainerStarted","Data":"fdae0bc7ef05a432f7a20ccaf763474980635bd783e9a84abb8a1efd22c2e19a"} Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.606781 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-p4blk" event={"ID":"7b5c64ae-5f80-4e35-91dc-48163991b63d","Type":"ContainerStarted","Data":"8f62c0f78e34ee5f42c8290c19850b1d8530b909852489e916c02461c7141c0e"} Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.608538 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q" event={"ID":"a4c7d510-2730-46e1-b157-6e890e8868e9","Type":"ContainerStarted","Data":"25e8f1f02a279669e9900fa8c25d826db833f42e74e3a27e72fe5628a8c009ca"} Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.608639 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q" event={"ID":"a4c7d510-2730-46e1-b157-6e890e8868e9","Type":"ContainerStarted","Data":"5554fd8851fba78578cd37c32041bc8b94449f93717dfa1fafad97b26ea558b0"} Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.609156 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q" Feb 20 06:48:53 crc 
kubenswrapper[5094]: I0220 06:48:53.611035 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l2hxn" event={"ID":"810d7855-cab6-4e33-9a5f-6d7bac9f66eb","Type":"ContainerStarted","Data":"54e1f7eebaa262f720d08801968d75fa654a376365c50075aea8306d86adaca2"} Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.611155 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l2hxn" event={"ID":"810d7855-cab6-4e33-9a5f-6d7bac9f66eb","Type":"ContainerStarted","Data":"a4a15bfab14b5ab990972032c11c59591a4046aedf93d0d5979bb12442b17255"} Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.611659 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-l2hxn" Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.615763 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-q596q" event={"ID":"9d4a320a-daa4-4bce-9782-5e9880aea226","Type":"ContainerStarted","Data":"b106a2e1c14cd7ee8c25551b0a146cc90e8abcc581533a5f380abd323e3b3ffc"} Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.617736 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc" event={"ID":"4797e67f-42c7-4106-998a-f3555218e77d","Type":"ContainerStarted","Data":"a5febfdf6ff13b73885ae68771975d75b338ef6c32185a51f9e6678e4521e008"} Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.619350 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn" podStartSLOduration=126.619335421 podStartE2EDuration="2m6.619335421s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:53.612604461 +0000 UTC m=+148.485231172" 
watchObservedRunningTime="2026-02-20 06:48:53.619335421 +0000 UTC m=+148.491962142" Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.619529 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" event={"ID":"a072f264-8eef-49ff-804c-fc584b41175c","Type":"ContainerStarted","Data":"b4f6631b74da908d11b454e73b916bf4df1f856d6688d805ff2095c0e2fb34d3"} Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.619585 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" event={"ID":"a072f264-8eef-49ff-804c-fc584b41175c","Type":"ContainerStarted","Data":"539612e997caaec2d06de1fe954cd9b6cb5d14bfdcb52922ce688d299da84cf3"} Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.625737 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj" event={"ID":"1177f137-190b-4563-8a6f-51d7b0d5ca9c","Type":"ContainerStarted","Data":"3fc35bcf407511bee7a9e8b7075c2f9ea4133bcc7d328b3cdc64dcc8dbbd4ad4"} Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.625794 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj" Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.628479 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" event={"ID":"d74a0c7b-3ad6-4f59-b4ff-33e1209c3116","Type":"ContainerStarted","Data":"36f38721fd0a0d7d7219e3717cfb74f975fa115ab06ab7f2670028cdc038f21f"} Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.628660 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" 
event={"ID":"d74a0c7b-3ad6-4f59-b4ff-33e1209c3116","Type":"ContainerStarted","Data":"564bd1116ccb908b0682633fbde86bceed7a68881dac681bc15ae456085179cf"} Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.632537 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:53 crc kubenswrapper[5094]: E0220 06:48:53.634621 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:54.134587582 +0000 UTC m=+149.007214293 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.636176 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wcwdv" event={"ID":"fc07e658-5bc5-469e-b793-230b7be58f12","Type":"ContainerStarted","Data":"595d2440e384fdef9f38c9b5f5a4f8d8b79d0f87121891fbe8926f8e447d7447"} Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.643859 5094 patch_prober.go:28] interesting pod/downloads-7954f5f757-d2l2r container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 
10.217.0.22:8080: connect: connection refused" start-of-body= Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.643938 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d2l2r" podUID="f1faaf31-f0b6-4828-90cc-51de060dc826" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.657318 5094 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-45dt8 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.657407 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" podUID="ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.662754 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-57pxl" Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.695881 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" podStartSLOduration=126.695858473 podStartE2EDuration="2m6.695858473s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:53.69487533 +0000 UTC m=+148.567502041" watchObservedRunningTime="2026-02-20 06:48:53.695858473 +0000 UTC m=+148.568485184" Feb 20 06:48:53 crc 
kubenswrapper[5094]: I0220 06:48:53.700135 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-w7rf2" Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.711960 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj" Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.738049 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.751894 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck" podStartSLOduration=126.751867849 podStartE2EDuration="2m6.751867849s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:53.738731679 +0000 UTC m=+148.611358390" watchObservedRunningTime="2026-02-20 06:48:53.751867849 +0000 UTC m=+148.624494560" Feb 20 06:48:53 crc kubenswrapper[5094]: E0220 06:48:53.753227 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:54.253204951 +0000 UTC m=+149.125831652 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.794637 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-7wgh7" podStartSLOduration=126.794614021 podStartE2EDuration="2m6.794614021s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:53.791072468 +0000 UTC m=+148.663699179" watchObservedRunningTime="2026-02-20 06:48:53.794614021 +0000 UTC m=+148.667240732" Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.822511 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-x72n5" podStartSLOduration=126.822489471 podStartE2EDuration="2m6.822489471s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:53.815676041 +0000 UTC m=+148.688302752" watchObservedRunningTime="2026-02-20 06:48:53.822489471 +0000 UTC m=+148.695116182" Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.843997 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:53 crc kubenswrapper[5094]: E0220 06:48:53.844814 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:54.344789249 +0000 UTC m=+149.217415960 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.849008 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-l2hxn" podStartSLOduration=8.848996029 podStartE2EDuration="8.848996029s" podCreationTimestamp="2026-02-20 06:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:53.847387311 +0000 UTC m=+148.720014022" watchObservedRunningTime="2026-02-20 06:48:53.848996029 +0000 UTC m=+148.721622730" Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.946738 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:53 crc kubenswrapper[5094]: E0220 06:48:53.947135 5094 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:54.447122582 +0000 UTC m=+149.319749293 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.967048 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q" podStartSLOduration=126.967023924 podStartE2EDuration="2m6.967023924s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:53.898086391 +0000 UTC m=+148.770713102" watchObservedRunningTime="2026-02-20 06:48:53.967023924 +0000 UTC m=+148.839650635" Feb 20 06:48:53 crc kubenswrapper[5094]: I0220 06:48:53.967160 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-p4blk" podStartSLOduration=127.967157338 podStartE2EDuration="2m7.967157338s" podCreationTimestamp="2026-02-20 06:46:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:53.960766806 +0000 UTC m=+148.833393517" watchObservedRunningTime="2026-02-20 06:48:53.967157338 +0000 UTC m=+148.839784039" Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 
06:48:54.021058 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-rwdb7" podStartSLOduration=127.021035953 podStartE2EDuration="2m7.021035953s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:54.019303092 +0000 UTC m=+148.891929803" watchObservedRunningTime="2026-02-20 06:48:54.021035953 +0000 UTC m=+148.893662664" Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.048390 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:54 crc kubenswrapper[5094]: E0220 06:48:54.048836 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:54.548819781 +0000 UTC m=+149.421446492 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.048981 5094 patch_prober.go:28] interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 06:48:54 crc kubenswrapper[5094]: [-]has-synced failed: reason withheld Feb 20 06:48:54 crc kubenswrapper[5094]: [+]process-running ok Feb 20 06:48:54 crc kubenswrapper[5094]: healthz check failed Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.049047 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.093159 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-9z5jc" podStartSLOduration=127.093142981 podStartE2EDuration="2m7.093142981s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:54.091159953 +0000 UTC m=+148.963786664" watchObservedRunningTime="2026-02-20 06:48:54.093142981 +0000 UTC m=+148.965769692" Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.155483 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:54 crc kubenswrapper[5094]: E0220 06:48:54.155863 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:54.655850055 +0000 UTC m=+149.528476766 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.252521 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-q596q" podStartSLOduration=127.252496573 podStartE2EDuration="2m7.252496573s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:54.250405744 +0000 UTC m=+149.123032455" watchObservedRunningTime="2026-02-20 06:48:54.252496573 +0000 UTC m=+149.125123284" Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.257044 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:54 crc kubenswrapper[5094]: E0220 06:48:54.257591 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:54.757562224 +0000 UTC m=+149.630188935 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.358744 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:54 crc kubenswrapper[5094]: E0220 06:48:54.359689 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:54.859669301 +0000 UTC m=+149.732296012 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.363493 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-gn7fn" Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.409615 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-jqgq2" podStartSLOduration=127.409588564 podStartE2EDuration="2m7.409588564s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:54.320870252 +0000 UTC m=+149.193496963" watchObservedRunningTime="2026-02-20 06:48:54.409588564 +0000 UTC m=+149.282215275" Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.460468 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:54 crc kubenswrapper[5094]: E0220 06:48:54.461100 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-20 06:48:54.961078992 +0000 UTC m=+149.833705703 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.483604 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-s2xnj" podStartSLOduration=127.483581826 podStartE2EDuration="2m7.483581826s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:54.411096619 +0000 UTC m=+149.283723330" watchObservedRunningTime="2026-02-20 06:48:54.483581826 +0000 UTC m=+149.356208537" Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.553301 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-s6brj" podStartSLOduration=127.553281576 podStartE2EDuration="2m7.553281576s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:54.485538512 +0000 UTC m=+149.358165223" watchObservedRunningTime="2026-02-20 06:48:54.553281576 +0000 UTC m=+149.425908277" Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.563917 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:54 crc kubenswrapper[5094]: E0220 06:48:54.564319 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:55.064307497 +0000 UTC m=+149.936934208 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.666977 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:54 crc kubenswrapper[5094]: E0220 06:48:54.668688 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:55.168664438 +0000 UTC m=+150.041291149 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.677870 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" event={"ID":"368766ec-f562-4296-bcdb-4bcda1db6c45","Type":"ContainerStarted","Data":"50733f8ae42d0ce2ee3b276ab62846f77e5e69e3728251894751a5554c4017e3"} Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.679406 5094 patch_prober.go:28] interesting pod/downloads-7954f5f757-d2l2r container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.679443 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d2l2r" podUID="f1faaf31-f0b6-4828-90cc-51de060dc826" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.695257 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.698971 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-p9fck" Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.770973 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:54 crc kubenswrapper[5094]: E0220 06:48:54.776175 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:55.276158623 +0000 UTC m=+150.148785564 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.877788 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:54 crc kubenswrapper[5094]: E0220 06:48:54.878574 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:55.378555478 +0000 UTC m=+150.251182189 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.980340 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.980718 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.980848 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:54 crc kubenswrapper[5094]: E0220 06:48:54.981211 5094 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:55.481187738 +0000 UTC m=+150.353814449 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.981986 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.982353 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.985354 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:54 crc 
kubenswrapper[5094]: I0220 06:48:54.992784 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:54 crc kubenswrapper[5094]: I0220 06:48:54.993407 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.004893 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7zj2v"] Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.006353 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.011010 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.023752 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.026017 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7zj2v"] Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.051173 5094 patch_prober.go:28] interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 06:48:55 crc kubenswrapper[5094]: [-]has-synced failed: reason withheld Feb 20 06:48:55 crc kubenswrapper[5094]: [+]process-running ok Feb 20 06:48:55 crc kubenswrapper[5094]: healthz check failed Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.051243 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 06:48:55 crc kubenswrapper[5094]: E0220 06:48:55.084002 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-02-20 06:48:55.583979522 +0000 UTC m=+150.456606233 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.083901 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.084458 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.084477 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66c35d10-d5cc-468f-95a1-b56fde3961b3-utilities\") pod \"certified-operators-7zj2v\" (UID: \"66c35d10-d5cc-468f-95a1-b56fde3961b3\") " pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.084867 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxwcp\" (UniqueName: \"kubernetes.io/projected/66c35d10-d5cc-468f-95a1-b56fde3961b3-kube-api-access-bxwcp\") pod \"certified-operators-7zj2v\" (UID: \"66c35d10-d5cc-468f-95a1-b56fde3961b3\") " pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.084982 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66c35d10-d5cc-468f-95a1-b56fde3961b3-catalog-content\") pod \"certified-operators-7zj2v\" (UID: \"66c35d10-d5cc-468f-95a1-b56fde3961b3\") " pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.085164 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:55 crc kubenswrapper[5094]: E0220 06:48:55.085520 5094 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:55.585503478 +0000 UTC m=+150.458130189 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.179166 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.186487 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.186945 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66c35d10-d5cc-468f-95a1-b56fde3961b3-utilities\") pod \"certified-operators-7zj2v\" (UID: \"66c35d10-d5cc-468f-95a1-b56fde3961b3\") " pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.187055 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66c35d10-d5cc-468f-95a1-b56fde3961b3-catalog-content\") pod \"certified-operators-7zj2v\" (UID: 
\"66c35d10-d5cc-468f-95a1-b56fde3961b3\") " pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.187134 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxwcp\" (UniqueName: \"kubernetes.io/projected/66c35d10-d5cc-468f-95a1-b56fde3961b3-kube-api-access-bxwcp\") pod \"certified-operators-7zj2v\" (UID: \"66c35d10-d5cc-468f-95a1-b56fde3961b3\") " pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:48:55 crc kubenswrapper[5094]: E0220 06:48:55.187734 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:55.687713248 +0000 UTC m=+150.560339949 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.188322 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66c35d10-d5cc-468f-95a1-b56fde3961b3-utilities\") pod \"certified-operators-7zj2v\" (UID: \"66c35d10-d5cc-468f-95a1-b56fde3961b3\") " pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.188366 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66c35d10-d5cc-468f-95a1-b56fde3961b3-catalog-content\") pod 
\"certified-operators-7zj2v\" (UID: \"66c35d10-d5cc-468f-95a1-b56fde3961b3\") " pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.191191 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-c94fj"] Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.191900 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.192278 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c94fj" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.197231 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.234775 5094 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.250735 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c94fj"] Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.256531 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxwcp\" (UniqueName: \"kubernetes.io/projected/66c35d10-d5cc-468f-95a1-b56fde3961b3-kube-api-access-bxwcp\") pod \"certified-operators-7zj2v\" (UID: \"66c35d10-d5cc-468f-95a1-b56fde3961b3\") " pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.288256 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f7t5\" (UniqueName: \"kubernetes.io/projected/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-kube-api-access-2f7t5\") pod 
\"community-operators-c94fj\" (UID: \"88e94523-c126-4ce8-a6c7-2f83eb91d3fc\") " pod="openshift-marketplace/community-operators-c94fj" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.288902 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-catalog-content\") pod \"community-operators-c94fj\" (UID: \"88e94523-c126-4ce8-a6c7-2f83eb91d3fc\") " pod="openshift-marketplace/community-operators-c94fj" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.289011 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.289099 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-utilities\") pod \"community-operators-c94fj\" (UID: \"88e94523-c126-4ce8-a6c7-2f83eb91d3fc\") " pod="openshift-marketplace/community-operators-c94fj" Feb 20 06:48:55 crc kubenswrapper[5094]: E0220 06:48:55.289430 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:55.789408657 +0000 UTC m=+150.662035368 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.338958 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.391120 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.391450 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f7t5\" (UniqueName: \"kubernetes.io/projected/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-kube-api-access-2f7t5\") pod \"community-operators-c94fj\" (UID: \"88e94523-c126-4ce8-a6c7-2f83eb91d3fc\") " pod="openshift-marketplace/community-operators-c94fj" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.391542 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-catalog-content\") pod \"community-operators-c94fj\" (UID: \"88e94523-c126-4ce8-a6c7-2f83eb91d3fc\") " pod="openshift-marketplace/community-operators-c94fj" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.391584 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-utilities\") pod \"community-operators-c94fj\" (UID: \"88e94523-c126-4ce8-a6c7-2f83eb91d3fc\") " pod="openshift-marketplace/community-operators-c94fj" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.392467 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-utilities\") pod \"community-operators-c94fj\" (UID: \"88e94523-c126-4ce8-a6c7-2f83eb91d3fc\") " pod="openshift-marketplace/community-operators-c94fj" Feb 20 06:48:55 crc kubenswrapper[5094]: E0220 06:48:55.392557 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:55.892530299 +0000 UTC m=+150.765157010 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.393298 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-catalog-content\") pod \"community-operators-c94fj\" (UID: \"88e94523-c126-4ce8-a6c7-2f83eb91d3fc\") " pod="openshift-marketplace/community-operators-c94fj" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.394889 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d5dnl"] Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.396291 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.415273 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d5dnl"] Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.484822 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f7t5\" (UniqueName: \"kubernetes.io/projected/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-kube-api-access-2f7t5\") pod \"community-operators-c94fj\" (UID: \"88e94523-c126-4ce8-a6c7-2f83eb91d3fc\") " pod="openshift-marketplace/community-operators-c94fj" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.497347 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94d239c4-71e0-42e2-a2e7-69acd87b5986-utilities\") pod \"certified-operators-d5dnl\" (UID: \"94d239c4-71e0-42e2-a2e7-69acd87b5986\") " pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.497433 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94d239c4-71e0-42e2-a2e7-69acd87b5986-catalog-content\") pod \"certified-operators-d5dnl\" (UID: \"94d239c4-71e0-42e2-a2e7-69acd87b5986\") " pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.497476 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.497508 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk49c\" (UniqueName: \"kubernetes.io/projected/94d239c4-71e0-42e2-a2e7-69acd87b5986-kube-api-access-dk49c\") pod \"certified-operators-d5dnl\" (UID: \"94d239c4-71e0-42e2-a2e7-69acd87b5986\") " pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:48:55 crc kubenswrapper[5094]: E0220 06:48:55.497834 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:55.997821862 +0000 UTC m=+150.870448573 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.514327 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c94fj" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.596985 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-85kbx"] Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.598064 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.598783 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.599077 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94d239c4-71e0-42e2-a2e7-69acd87b5986-catalog-content\") pod \"certified-operators-d5dnl\" (UID: \"94d239c4-71e0-42e2-a2e7-69acd87b5986\") " pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.599130 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk49c\" (UniqueName: \"kubernetes.io/projected/94d239c4-71e0-42e2-a2e7-69acd87b5986-kube-api-access-dk49c\") pod \"certified-operators-d5dnl\" (UID: \"94d239c4-71e0-42e2-a2e7-69acd87b5986\") " pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.599151 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94d239c4-71e0-42e2-a2e7-69acd87b5986-utilities\") pod \"certified-operators-d5dnl\" (UID: \"94d239c4-71e0-42e2-a2e7-69acd87b5986\") " pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.599660 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94d239c4-71e0-42e2-a2e7-69acd87b5986-utilities\") pod \"certified-operators-d5dnl\" (UID: \"94d239c4-71e0-42e2-a2e7-69acd87b5986\") " 
pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:48:55 crc kubenswrapper[5094]: E0220 06:48:55.599739 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:56.099726115 +0000 UTC m=+150.972352826 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.599939 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94d239c4-71e0-42e2-a2e7-69acd87b5986-catalog-content\") pod \"certified-operators-d5dnl\" (UID: \"94d239c4-71e0-42e2-a2e7-69acd87b5986\") " pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.624505 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk49c\" (UniqueName: \"kubernetes.io/projected/94d239c4-71e0-42e2-a2e7-69acd87b5986-kube-api-access-dk49c\") pod \"certified-operators-d5dnl\" (UID: \"94d239c4-71e0-42e2-a2e7-69acd87b5986\") " pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.633807 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-85kbx"] Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.666366 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.702586 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-utilities\") pod \"community-operators-85kbx\" (UID: \"4aa46aec-af59-49c6-9ff5-a08df3d68e5b\") " pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.702631 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmmb7\" (UniqueName: \"kubernetes.io/projected/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-kube-api-access-gmmb7\") pod \"community-operators-85kbx\" (UID: \"4aa46aec-af59-49c6-9ff5-a08df3d68e5b\") " pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.702693 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.702739 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-catalog-content\") pod \"community-operators-85kbx\" (UID: \"4aa46aec-af59-49c6-9ff5-a08df3d68e5b\") " pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:48:55 crc kubenswrapper[5094]: E0220 06:48:55.703389 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:56.203368689 +0000 UTC m=+151.075995400 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.764534 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" event={"ID":"368766ec-f562-4296-bcdb-4bcda1db6c45","Type":"ContainerStarted","Data":"3c2807259a1e3547abf1555a7d4f410742bfe18a3992ceee40a375b7ab31491a"} Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.764584 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" event={"ID":"368766ec-f562-4296-bcdb-4bcda1db6c45","Type":"ContainerStarted","Data":"1eced95afed87dd49973c9a7b8fecabb40b5b049ad20de866ba6fc4bc08b0fc0"} Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.810514 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.810805 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-catalog-content\") pod \"community-operators-85kbx\" (UID: \"4aa46aec-af59-49c6-9ff5-a08df3d68e5b\") " 
pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.810970 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-utilities\") pod \"community-operators-85kbx\" (UID: \"4aa46aec-af59-49c6-9ff5-a08df3d68e5b\") " pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.811022 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmmb7\" (UniqueName: \"kubernetes.io/projected/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-kube-api-access-gmmb7\") pod \"community-operators-85kbx\" (UID: \"4aa46aec-af59-49c6-9ff5-a08df3d68e5b\") " pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:48:55 crc kubenswrapper[5094]: E0220 06:48:55.812072 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:56.312056542 +0000 UTC m=+151.184683253 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.812721 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-catalog-content\") pod \"community-operators-85kbx\" (UID: \"4aa46aec-af59-49c6-9ff5-a08df3d68e5b\") " pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.813276 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-tjkwm" podStartSLOduration=10.813257201 podStartE2EDuration="10.813257201s" podCreationTimestamp="2026-02-20 06:48:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:55.810739381 +0000 UTC m=+150.683366092" watchObservedRunningTime="2026-02-20 06:48:55.813257201 +0000 UTC m=+150.685883912" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.815019 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-utilities\") pod \"community-operators-85kbx\" (UID: \"4aa46aec-af59-49c6-9ff5-a08df3d68e5b\") " pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.851492 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmmb7\" (UniqueName: 
\"kubernetes.io/projected/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-kube-api-access-gmmb7\") pod \"community-operators-85kbx\" (UID: \"4aa46aec-af59-49c6-9ff5-a08df3d68e5b\") " pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.913234 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:55 crc kubenswrapper[5094]: E0220 06:48:55.913592 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-20 06:48:56.413580046 +0000 UTC m=+151.286206757 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mfphn" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:55 crc kubenswrapper[5094]: I0220 06:48:55.996078 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.014537 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:56 crc kubenswrapper[5094]: E0220 06:48:56.015133 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-20 06:48:56.51511709 +0000 UTC m=+151.387743801 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.040195 5094 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-20T06:48:55.234801403Z","Handler":null,"Name":""} Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.050120 5094 patch_prober.go:28] interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 06:48:56 crc kubenswrapper[5094]: [-]has-synced failed: 
reason withheld Feb 20 06:48:56 crc kubenswrapper[5094]: [+]process-running ok Feb 20 06:48:56 crc kubenswrapper[5094]: healthz check failed Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.050167 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.050711 5094 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.050745 5094 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.117471 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.119979 5094 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.120011 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:56 crc kubenswrapper[5094]: W0220 06:48:56.127352 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-bbfb98feecd2f0850dad905fc0957730e7d8b0af9c2e5ded6695aebe2f28369f WatchSource:0}: Error finding container bbfb98feecd2f0850dad905fc0957730e7d8b0af9c2e5ded6695aebe2f28369f: Status 404 returned error can't find the container with id bbfb98feecd2f0850dad905fc0957730e7d8b0af9c2e5ded6695aebe2f28369f Feb 20 06:48:56 crc kubenswrapper[5094]: W0220 06:48:56.205206 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-38ed73c2ce96afb9435212081fee98c3fa6dc707df7faad283e894a5481c32f9 WatchSource:0}: Error finding container 38ed73c2ce96afb9435212081fee98c3fa6dc707df7faad283e894a5481c32f9: Status 404 returned error can't find the container with id 38ed73c2ce96afb9435212081fee98c3fa6dc707df7faad283e894a5481c32f9 Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.213073 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-mfphn\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.358807 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.386941 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d5dnl"] Feb 20 06:48:56 crc kubenswrapper[5094]: W0220 06:48:56.409840 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94d239c4_71e0_42e2_a2e7_69acd87b5986.slice/crio-d62690bad3f55dfb0952f61bd6ad5dc34171ea69ec48d54c64252d6633321266 WatchSource:0}: Error finding container d62690bad3f55dfb0952f61bd6ad5dc34171ea69ec48d54c64252d6633321266: Status 404 returned error can't find the container with id d62690bad3f55dfb0952f61bd6ad5dc34171ea69ec48d54c64252d6633321266 Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.418965 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.435355 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-85kbx"] Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.460073 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.492815 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7zj2v"] Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.495129 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c94fj"] Feb 20 06:48:56 crc kubenswrapper[5094]: W0220 06:48:56.510685 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66c35d10_d5cc_468f_95a1_b56fde3961b3.slice/crio-e17dd6ed8a9cb1673d9dc74ffad9c9a16fa5fcf970993588d1e5f6f1eb4c4133 WatchSource:0}: Error finding container e17dd6ed8a9cb1673d9dc74ffad9c9a16fa5fcf970993588d1e5f6f1eb4c4133: Status 404 returned error can't find the container with id e17dd6ed8a9cb1673d9dc74ffad9c9a16fa5fcf970993588d1e5f6f1eb4c4133 Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.773928 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zj2v" event={"ID":"66c35d10-d5cc-468f-95a1-b56fde3961b3","Type":"ContainerStarted","Data":"e17dd6ed8a9cb1673d9dc74ffad9c9a16fa5fcf970993588d1e5f6f1eb4c4133"} Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.775314 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"20503c16da8e0a242dea890729372365471b62e53857bc200f7164d82e653bcd"} Feb 20 06:48:56 crc 
kubenswrapper[5094]: I0220 06:48:56.775336 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"bbfb98feecd2f0850dad905fc0957730e7d8b0af9c2e5ded6695aebe2f28369f"} Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.781999 5094 generic.go:334] "Generic (PLEG): container finished" podID="94d239c4-71e0-42e2-a2e7-69acd87b5986" containerID="ef3c91c789b3a8c2e2c7b970bccc8b3b862a287138831c82867ac750baa00328" exitCode=0 Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.782353 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5dnl" event={"ID":"94d239c4-71e0-42e2-a2e7-69acd87b5986","Type":"ContainerDied","Data":"ef3c91c789b3a8c2e2c7b970bccc8b3b862a287138831c82867ac750baa00328"} Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.782395 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5dnl" event={"ID":"94d239c4-71e0-42e2-a2e7-69acd87b5986","Type":"ContainerStarted","Data":"d62690bad3f55dfb0952f61bd6ad5dc34171ea69ec48d54c64252d6633321266"} Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.783895 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.786602 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85kbx" event={"ID":"4aa46aec-af59-49c6-9ff5-a08df3d68e5b","Type":"ContainerStarted","Data":"52b9be5e00a8e14d989754bcec98f6733777f2e861c8ab554e5876f65f7c7c0b"} Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.792833 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"837d9f622f86b6681941d2ed3daab1dd90f965a8ccdd36d619348d6b31d3cefb"} Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.792884 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ab92da8ed705d039b49ca77f094f134766be8706ef1bd5528abc5a65acbe03d4"} Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.800418 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"93ebb1e3c85fe9851f6799dfdd8ac3729027c0f1b0b45968e87380ff7ba3ca22"} Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.800605 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"38ed73c2ce96afb9435212081fee98c3fa6dc707df7faad283e894a5481c32f9"} Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.801376 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.816366 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c94fj" event={"ID":"88e94523-c126-4ce8-a6c7-2f83eb91d3fc","Type":"ContainerStarted","Data":"a1dea229adcbee55ddf6e0b41aedbdc8cf35c14a6f68369e5313338e917770ed"} Feb 20 06:48:56 crc kubenswrapper[5094]: I0220 06:48:56.822645 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mfphn"] Feb 20 06:48:56 crc kubenswrapper[5094]: W0220 06:48:56.971685 5094 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa6b00ff_07fb_4e9a_80da_780c22acbe69.slice/crio-cffd4cbccec1650f91f390b126272379be0c306a55bee926972d5822622f847a WatchSource:0}: Error finding container cffd4cbccec1650f91f390b126272379be0c306a55bee926972d5822622f847a: Status 404 returned error can't find the container with id cffd4cbccec1650f91f390b126272379be0c306a55bee926972d5822622f847a Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.042920 5094 patch_prober.go:28] interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 06:48:57 crc kubenswrapper[5094]: [-]has-synced failed: reason withheld Feb 20 06:48:57 crc kubenswrapper[5094]: [+]process-running ok Feb 20 06:48:57 crc kubenswrapper[5094]: healthz check failed Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.043039 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.172549 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jlc84"] Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.174440 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jlc84" Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.176915 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.187550 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jlc84"] Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.278878 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-catalog-content\") pod \"redhat-marketplace-jlc84\" (UID: \"f5c1eecf-1cc2-4480-ac22-99a970f5dc58\") " pod="openshift-marketplace/redhat-marketplace-jlc84" Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.279040 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-utilities\") pod \"redhat-marketplace-jlc84\" (UID: \"f5c1eecf-1cc2-4480-ac22-99a970f5dc58\") " pod="openshift-marketplace/redhat-marketplace-jlc84" Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.279080 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw4pb\" (UniqueName: \"kubernetes.io/projected/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-kube-api-access-dw4pb\") pod \"redhat-marketplace-jlc84\" (UID: \"f5c1eecf-1cc2-4480-ac22-99a970f5dc58\") " pod="openshift-marketplace/redhat-marketplace-jlc84" Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.380845 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-catalog-content\") pod \"redhat-marketplace-jlc84\" (UID: 
\"f5c1eecf-1cc2-4480-ac22-99a970f5dc58\") " pod="openshift-marketplace/redhat-marketplace-jlc84" Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.380973 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-utilities\") pod \"redhat-marketplace-jlc84\" (UID: \"f5c1eecf-1cc2-4480-ac22-99a970f5dc58\") " pod="openshift-marketplace/redhat-marketplace-jlc84" Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.381000 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw4pb\" (UniqueName: \"kubernetes.io/projected/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-kube-api-access-dw4pb\") pod \"redhat-marketplace-jlc84\" (UID: \"f5c1eecf-1cc2-4480-ac22-99a970f5dc58\") " pod="openshift-marketplace/redhat-marketplace-jlc84" Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.381967 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-catalog-content\") pod \"redhat-marketplace-jlc84\" (UID: \"f5c1eecf-1cc2-4480-ac22-99a970f5dc58\") " pod="openshift-marketplace/redhat-marketplace-jlc84" Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.382202 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-utilities\") pod \"redhat-marketplace-jlc84\" (UID: \"f5c1eecf-1cc2-4480-ac22-99a970f5dc58\") " pod="openshift-marketplace/redhat-marketplace-jlc84" Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.401437 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw4pb\" (UniqueName: \"kubernetes.io/projected/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-kube-api-access-dw4pb\") pod \"redhat-marketplace-jlc84\" (UID: \"f5c1eecf-1cc2-4480-ac22-99a970f5dc58\") " 
pod="openshift-marketplace/redhat-marketplace-jlc84" Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.490145 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jlc84" Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.577916 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hw2zj"] Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.580767 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hw2zj" Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.585607 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hw2zj"] Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.684671 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d307716f-90ad-4b6b-9a49-10f8b5a98721-catalog-content\") pod \"redhat-marketplace-hw2zj\" (UID: \"d307716f-90ad-4b6b-9a49-10f8b5a98721\") " pod="openshift-marketplace/redhat-marketplace-hw2zj" Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.684779 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-985sq\" (UniqueName: \"kubernetes.io/projected/d307716f-90ad-4b6b-9a49-10f8b5a98721-kube-api-access-985sq\") pod \"redhat-marketplace-hw2zj\" (UID: \"d307716f-90ad-4b6b-9a49-10f8b5a98721\") " pod="openshift-marketplace/redhat-marketplace-hw2zj" Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.684803 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d307716f-90ad-4b6b-9a49-10f8b5a98721-utilities\") pod \"redhat-marketplace-hw2zj\" (UID: \"d307716f-90ad-4b6b-9a49-10f8b5a98721\") " 
pod="openshift-marketplace/redhat-marketplace-hw2zj" Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.786696 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-985sq\" (UniqueName: \"kubernetes.io/projected/d307716f-90ad-4b6b-9a49-10f8b5a98721-kube-api-access-985sq\") pod \"redhat-marketplace-hw2zj\" (UID: \"d307716f-90ad-4b6b-9a49-10f8b5a98721\") " pod="openshift-marketplace/redhat-marketplace-hw2zj" Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.786775 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d307716f-90ad-4b6b-9a49-10f8b5a98721-utilities\") pod \"redhat-marketplace-hw2zj\" (UID: \"d307716f-90ad-4b6b-9a49-10f8b5a98721\") " pod="openshift-marketplace/redhat-marketplace-hw2zj" Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.786913 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d307716f-90ad-4b6b-9a49-10f8b5a98721-catalog-content\") pod \"redhat-marketplace-hw2zj\" (UID: \"d307716f-90ad-4b6b-9a49-10f8b5a98721\") " pod="openshift-marketplace/redhat-marketplace-hw2zj" Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.788621 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d307716f-90ad-4b6b-9a49-10f8b5a98721-catalog-content\") pod \"redhat-marketplace-hw2zj\" (UID: \"d307716f-90ad-4b6b-9a49-10f8b5a98721\") " pod="openshift-marketplace/redhat-marketplace-hw2zj" Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.791217 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d307716f-90ad-4b6b-9a49-10f8b5a98721-utilities\") pod \"redhat-marketplace-hw2zj\" (UID: \"d307716f-90ad-4b6b-9a49-10f8b5a98721\") " pod="openshift-marketplace/redhat-marketplace-hw2zj" 
Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.798934 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jlc84"] Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.809975 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-985sq\" (UniqueName: \"kubernetes.io/projected/d307716f-90ad-4b6b-9a49-10f8b5a98721-kube-api-access-985sq\") pod \"redhat-marketplace-hw2zj\" (UID: \"d307716f-90ad-4b6b-9a49-10f8b5a98721\") " pod="openshift-marketplace/redhat-marketplace-hw2zj" Feb 20 06:48:57 crc kubenswrapper[5094]: W0220 06:48:57.821998 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5c1eecf_1cc2_4480_ac22_99a970f5dc58.slice/crio-b868f7016b99c578833cf3de0ff7d8c4b2b1a5f9351afa2373e2189f66239303 WatchSource:0}: Error finding container b868f7016b99c578833cf3de0ff7d8c4b2b1a5f9351afa2373e2189f66239303: Status 404 returned error can't find the container with id b868f7016b99c578833cf3de0ff7d8c4b2b1a5f9351afa2373e2189f66239303 Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.855842 5094 generic.go:334] "Generic (PLEG): container finished" podID="88e94523-c126-4ce8-a6c7-2f83eb91d3fc" containerID="6eb6b4fa4af198a75121d1f1d8845384553bbedcf18e198cdf00cc8282d9f5b7" exitCode=0 Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.874405 5094 generic.go:334] "Generic (PLEG): container finished" podID="66c35d10-d5cc-468f-95a1-b56fde3961b3" containerID="11182af2209155c05ce50ce6f5457662dfc62f8d75b0ebcee01f179c458884f9" exitCode=0 Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.876961 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.877751 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-c94fj" event={"ID":"88e94523-c126-4ce8-a6c7-2f83eb91d3fc","Type":"ContainerDied","Data":"6eb6b4fa4af198a75121d1f1d8845384553bbedcf18e198cdf00cc8282d9f5b7"} Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.877787 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zj2v" event={"ID":"66c35d10-d5cc-468f-95a1-b56fde3961b3","Type":"ContainerDied","Data":"11182af2209155c05ce50ce6f5457662dfc62f8d75b0ebcee01f179c458884f9"} Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.880397 5094 generic.go:334] "Generic (PLEG): container finished" podID="4aa46aec-af59-49c6-9ff5-a08df3d68e5b" containerID="8667a8c7af968dab77a30064c202565770814938ef52fa50a032b1a65aa555d9" exitCode=0 Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.880482 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85kbx" event={"ID":"4aa46aec-af59-49c6-9ff5-a08df3d68e5b","Type":"ContainerDied","Data":"8667a8c7af968dab77a30064c202565770814938ef52fa50a032b1a65aa555d9"} Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.884692 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" event={"ID":"fa6b00ff-07fb-4e9a-80da-780c22acbe69","Type":"ContainerStarted","Data":"aea1000ba80dabd2eeaca77a83d09e249159833615723a3fcdb37fa198e8b5cd"} Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.884767 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" event={"ID":"fa6b00ff-07fb-4e9a-80da-780c22acbe69","Type":"ContainerStarted","Data":"cffd4cbccec1650f91f390b126272379be0c306a55bee926972d5822622f847a"} Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.899534 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hw2zj" Feb 20 06:48:57 crc kubenswrapper[5094]: I0220 06:48:57.977103 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" podStartSLOduration=130.977079948 podStartE2EDuration="2m10.977079948s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:48:57.975978022 +0000 UTC m=+152.848604733" watchObservedRunningTime="2026-02-20 06:48:57.977079948 +0000 UTC m=+152.849706659" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.045104 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.050501 5094 patch_prober.go:28] interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 06:48:58 crc kubenswrapper[5094]: [-]has-synced failed: reason withheld Feb 20 06:48:58 crc kubenswrapper[5094]: [+]process-running ok Feb 20 06:48:58 crc kubenswrapper[5094]: healthz check failed Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.050580 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.104534 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.104604 5094 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.120022 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.144932 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.144984 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.166860 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.183969 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tpwcx"] Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.185818 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tpwcx" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.188454 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.203880 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tpwcx"] Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.215121 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hw2zj"] Feb 20 06:48:58 crc kubenswrapper[5094]: W0220 06:48:58.215195 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd307716f_90ad_4b6b_9a49_10f8b5a98721.slice/crio-accfce42132240eac7ff5d1f0c619115e032c77085b53887db760a64c361f25b WatchSource:0}: Error finding container accfce42132240eac7ff5d1f0c619115e032c77085b53887db760a64c361f25b: Status 404 returned error can't find the container with id accfce42132240eac7ff5d1f0c619115e032c77085b53887db760a64c361f25b Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.308598 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ecc3e73-dd76-4a73-a366-92c78aca386e-catalog-content\") pod \"redhat-operators-tpwcx\" (UID: \"8ecc3e73-dd76-4a73-a366-92c78aca386e\") " pod="openshift-marketplace/redhat-operators-tpwcx" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.308686 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ecc3e73-dd76-4a73-a366-92c78aca386e-utilities\") pod \"redhat-operators-tpwcx\" (UID: \"8ecc3e73-dd76-4a73-a366-92c78aca386e\") " pod="openshift-marketplace/redhat-operators-tpwcx" Feb 20 06:48:58 crc kubenswrapper[5094]: 
I0220 06:48:58.308735 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5cms\" (UniqueName: \"kubernetes.io/projected/8ecc3e73-dd76-4a73-a366-92c78aca386e-kube-api-access-s5cms\") pod \"redhat-operators-tpwcx\" (UID: \"8ecc3e73-dd76-4a73-a366-92c78aca386e\") " pod="openshift-marketplace/redhat-operators-tpwcx" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.326773 5094 patch_prober.go:28] interesting pod/downloads-7954f5f757-d2l2r container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.326816 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-d2l2r" podUID="f1faaf31-f0b6-4828-90cc-51de060dc826" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.327456 5094 patch_prober.go:28] interesting pod/downloads-7954f5f757-d2l2r container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.327485 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-d2l2r" podUID="f1faaf31-f0b6-4828-90cc-51de060dc826" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.410844 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8ecc3e73-dd76-4a73-a366-92c78aca386e-catalog-content\") pod \"redhat-operators-tpwcx\" (UID: \"8ecc3e73-dd76-4a73-a366-92c78aca386e\") " pod="openshift-marketplace/redhat-operators-tpwcx" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.410904 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ecc3e73-dd76-4a73-a366-92c78aca386e-utilities\") pod \"redhat-operators-tpwcx\" (UID: \"8ecc3e73-dd76-4a73-a366-92c78aca386e\") " pod="openshift-marketplace/redhat-operators-tpwcx" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.410928 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5cms\" (UniqueName: \"kubernetes.io/projected/8ecc3e73-dd76-4a73-a366-92c78aca386e-kube-api-access-s5cms\") pod \"redhat-operators-tpwcx\" (UID: \"8ecc3e73-dd76-4a73-a366-92c78aca386e\") " pod="openshift-marketplace/redhat-operators-tpwcx" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.412093 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ecc3e73-dd76-4a73-a366-92c78aca386e-catalog-content\") pod \"redhat-operators-tpwcx\" (UID: \"8ecc3e73-dd76-4a73-a366-92c78aca386e\") " pod="openshift-marketplace/redhat-operators-tpwcx" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.412310 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ecc3e73-dd76-4a73-a366-92c78aca386e-utilities\") pod \"redhat-operators-tpwcx\" (UID: \"8ecc3e73-dd76-4a73-a366-92c78aca386e\") " pod="openshift-marketplace/redhat-operators-tpwcx" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.441555 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5cms\" (UniqueName: 
\"kubernetes.io/projected/8ecc3e73-dd76-4a73-a366-92c78aca386e-kube-api-access-s5cms\") pod \"redhat-operators-tpwcx\" (UID: \"8ecc3e73-dd76-4a73-a366-92c78aca386e\") " pod="openshift-marketplace/redhat-operators-tpwcx" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.511095 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tpwcx" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.586362 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r5gcc"] Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.587559 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r5gcc" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.605026 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r5gcc"] Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.650312 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.650375 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.664726 5094 patch_prober.go:28] interesting pod/console-f9d7485db-shq4j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.664796 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-shq4j" podUID="e130287f-996d-4ab0-8c12-351bf8d21df5" containerName="console" probeResult="failure" output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" Feb 20 
06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.715038 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkfpb\" (UniqueName: \"kubernetes.io/projected/4803c5cf-27e3-414a-8fa4-7e82730e311d-kube-api-access-vkfpb\") pod \"redhat-operators-r5gcc\" (UID: \"4803c5cf-27e3-414a-8fa4-7e82730e311d\") " pod="openshift-marketplace/redhat-operators-r5gcc" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.715111 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4803c5cf-27e3-414a-8fa4-7e82730e311d-utilities\") pod \"redhat-operators-r5gcc\" (UID: \"4803c5cf-27e3-414a-8fa4-7e82730e311d\") " pod="openshift-marketplace/redhat-operators-r5gcc" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.715147 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4803c5cf-27e3-414a-8fa4-7e82730e311d-catalog-content\") pod \"redhat-operators-r5gcc\" (UID: \"4803c5cf-27e3-414a-8fa4-7e82730e311d\") " pod="openshift-marketplace/redhat-operators-r5gcc" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.816492 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkfpb\" (UniqueName: \"kubernetes.io/projected/4803c5cf-27e3-414a-8fa4-7e82730e311d-kube-api-access-vkfpb\") pod \"redhat-operators-r5gcc\" (UID: \"4803c5cf-27e3-414a-8fa4-7e82730e311d\") " pod="openshift-marketplace/redhat-operators-r5gcc" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.816574 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4803c5cf-27e3-414a-8fa4-7e82730e311d-utilities\") pod \"redhat-operators-r5gcc\" (UID: \"4803c5cf-27e3-414a-8fa4-7e82730e311d\") " 
pod="openshift-marketplace/redhat-operators-r5gcc" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.816601 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4803c5cf-27e3-414a-8fa4-7e82730e311d-catalog-content\") pod \"redhat-operators-r5gcc\" (UID: \"4803c5cf-27e3-414a-8fa4-7e82730e311d\") " pod="openshift-marketplace/redhat-operators-r5gcc" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.817241 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4803c5cf-27e3-414a-8fa4-7e82730e311d-catalog-content\") pod \"redhat-operators-r5gcc\" (UID: \"4803c5cf-27e3-414a-8fa4-7e82730e311d\") " pod="openshift-marketplace/redhat-operators-r5gcc" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.817886 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4803c5cf-27e3-414a-8fa4-7e82730e311d-utilities\") pod \"redhat-operators-r5gcc\" (UID: \"4803c5cf-27e3-414a-8fa4-7e82730e311d\") " pod="openshift-marketplace/redhat-operators-r5gcc" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.836913 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkfpb\" (UniqueName: \"kubernetes.io/projected/4803c5cf-27e3-414a-8fa4-7e82730e311d-kube-api-access-vkfpb\") pod \"redhat-operators-r5gcc\" (UID: \"4803c5cf-27e3-414a-8fa4-7e82730e311d\") " pod="openshift-marketplace/redhat-operators-r5gcc" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.875461 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.877981 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.882579 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.882805 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.888403 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.901773 5094 generic.go:334] "Generic (PLEG): container finished" podID="806ba791-714c-4d13-b595-d4f6ccf06aea" containerID="97bfbe799eec96272d8e9a6b7afb335847d35846686c6b7385ed8fc78aa4aec5" exitCode=0 Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.901834 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" event={"ID":"806ba791-714c-4d13-b595-d4f6ccf06aea","Type":"ContainerDied","Data":"97bfbe799eec96272d8e9a6b7afb335847d35846686c6b7385ed8fc78aa4aec5"} Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.905078 5094 generic.go:334] "Generic (PLEG): container finished" podID="d307716f-90ad-4b6b-9a49-10f8b5a98721" containerID="4270db803bcdf262f5c0b9fda9d8278a355396ffba48defc3ee731db9488d8d6" exitCode=0 Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.905382 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw2zj" event={"ID":"d307716f-90ad-4b6b-9a49-10f8b5a98721","Type":"ContainerDied","Data":"4270db803bcdf262f5c0b9fda9d8278a355396ffba48defc3ee731db9488d8d6"} Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.905437 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw2zj" 
event={"ID":"d307716f-90ad-4b6b-9a49-10f8b5a98721","Type":"ContainerStarted","Data":"accfce42132240eac7ff5d1f0c619115e032c77085b53887db760a64c361f25b"} Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.909240 5094 generic.go:334] "Generic (PLEG): container finished" podID="f5c1eecf-1cc2-4480-ac22-99a970f5dc58" containerID="064b2c1894993da3d72d5837edcc05f078452e781464dc1e2a5cab9234eb9f15" exitCode=0 Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.909601 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jlc84" event={"ID":"f5c1eecf-1cc2-4480-ac22-99a970f5dc58","Type":"ContainerDied","Data":"064b2c1894993da3d72d5837edcc05f078452e781464dc1e2a5cab9234eb9f15"} Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.911934 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jlc84" event={"ID":"f5c1eecf-1cc2-4480-ac22-99a970f5dc58","Type":"ContainerStarted","Data":"b868f7016b99c578833cf3de0ff7d8c4b2b1a5f9351afa2373e2189f66239303"} Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.912406 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.930850 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-p4blk" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.936350 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-bhls2" Feb 20 06:48:58 crc kubenswrapper[5094]: I0220 06:48:58.944180 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r5gcc" Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.027633 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ecd4e39f-130b-4f63-aedb-6cda9ec7da80-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ecd4e39f-130b-4f63-aedb-6cda9ec7da80\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.027807 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ecd4e39f-130b-4f63-aedb-6cda9ec7da80-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ecd4e39f-130b-4f63-aedb-6cda9ec7da80\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.059870 5094 patch_prober.go:28] interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 06:48:59 crc kubenswrapper[5094]: [-]has-synced failed: reason withheld Feb 20 06:48:59 crc kubenswrapper[5094]: [+]process-running ok Feb 20 06:48:59 crc kubenswrapper[5094]: healthz check failed Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.059932 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.093242 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tpwcx"] Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.131848 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ecd4e39f-130b-4f63-aedb-6cda9ec7da80-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ecd4e39f-130b-4f63-aedb-6cda9ec7da80\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.131937 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ecd4e39f-130b-4f63-aedb-6cda9ec7da80-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ecd4e39f-130b-4f63-aedb-6cda9ec7da80\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.132314 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ecd4e39f-130b-4f63-aedb-6cda9ec7da80-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ecd4e39f-130b-4f63-aedb-6cda9ec7da80\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 06:48:59 crc kubenswrapper[5094]: W0220 06:48:59.158316 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ecc3e73_dd76_4a73_a366_92c78aca386e.slice/crio-0e45d21f2e06751dfd4ed57ac1f4c3ab877bd9456401ea77210957c88c331288 WatchSource:0}: Error finding container 0e45d21f2e06751dfd4ed57ac1f4c3ab877bd9456401ea77210957c88c331288: Status 404 returned error can't find the container with id 0e45d21f2e06751dfd4ed57ac1f4c3ab877bd9456401ea77210957c88c331288 Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.169496 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ecd4e39f-130b-4f63-aedb-6cda9ec7da80-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ecd4e39f-130b-4f63-aedb-6cda9ec7da80\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.202202 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.502450 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r5gcc"] Feb 20 06:48:59 crc kubenswrapper[5094]: W0220 06:48:59.600767 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4803c5cf_27e3_414a_8fa4_7e82730e311d.slice/crio-22aca6363f2206b11762898817ccc1128807f675094b438a82160e56aa1326d5 WatchSource:0}: Error finding container 22aca6363f2206b11762898817ccc1128807f675094b438a82160e56aa1326d5: Status 404 returned error can't find the container with id 22aca6363f2206b11762898817ccc1128807f675094b438a82160e56aa1326d5 Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.835617 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 20 06:48:59 crc kubenswrapper[5094]: W0220 06:48:59.861323 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podecd4e39f_130b_4f63_aedb_6cda9ec7da80.slice/crio-42b38de35ebf1c38f39a8ad94f8897c2ba7db2a7bd42d6f1e604912889bd4ced WatchSource:0}: Error finding container 42b38de35ebf1c38f39a8ad94f8897c2ba7db2a7bd42d6f1e604912889bd4ced: Status 404 returned error can't find the container with id 42b38de35ebf1c38f39a8ad94f8897c2ba7db2a7bd42d6f1e604912889bd4ced Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.951203 5094 generic.go:334] "Generic (PLEG): container finished" podID="8ecc3e73-dd76-4a73-a366-92c78aca386e" containerID="30c497a9a14091198f358ef8a78f1c6b6da4ba9ab8621394e3d848ebf94da222" exitCode=0 Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.951411 5094 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpwcx" event={"ID":"8ecc3e73-dd76-4a73-a366-92c78aca386e","Type":"ContainerDied","Data":"30c497a9a14091198f358ef8a78f1c6b6da4ba9ab8621394e3d848ebf94da222"} Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.951442 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpwcx" event={"ID":"8ecc3e73-dd76-4a73-a366-92c78aca386e","Type":"ContainerStarted","Data":"0e45d21f2e06751dfd4ed57ac1f4c3ab877bd9456401ea77210957c88c331288"} Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.955490 5094 generic.go:334] "Generic (PLEG): container finished" podID="4803c5cf-27e3-414a-8fa4-7e82730e311d" containerID="7f79938926dc3a0a2752cae1f32aa8708d69b23b4db2a3c1830917ae04282805" exitCode=0 Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.955543 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5gcc" event={"ID":"4803c5cf-27e3-414a-8fa4-7e82730e311d","Type":"ContainerDied","Data":"7f79938926dc3a0a2752cae1f32aa8708d69b23b4db2a3c1830917ae04282805"} Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.955564 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5gcc" event={"ID":"4803c5cf-27e3-414a-8fa4-7e82730e311d","Type":"ContainerStarted","Data":"22aca6363f2206b11762898817ccc1128807f675094b438a82160e56aa1326d5"} Feb 20 06:48:59 crc kubenswrapper[5094]: I0220 06:48:59.982296 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ecd4e39f-130b-4f63-aedb-6cda9ec7da80","Type":"ContainerStarted","Data":"42b38de35ebf1c38f39a8ad94f8897c2ba7db2a7bd42d6f1e604912889bd4ced"} Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.043027 5094 patch_prober.go:28] interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure 
output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 06:49:00 crc kubenswrapper[5094]: [-]has-synced failed: reason withheld Feb 20 06:49:00 crc kubenswrapper[5094]: [+]process-running ok Feb 20 06:49:00 crc kubenswrapper[5094]: healthz check failed Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.043117 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.445659 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.574114 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j498r\" (UniqueName: \"kubernetes.io/projected/806ba791-714c-4d13-b595-d4f6ccf06aea-kube-api-access-j498r\") pod \"806ba791-714c-4d13-b595-d4f6ccf06aea\" (UID: \"806ba791-714c-4d13-b595-d4f6ccf06aea\") " Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.574437 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/806ba791-714c-4d13-b595-d4f6ccf06aea-secret-volume\") pod \"806ba791-714c-4d13-b595-d4f6ccf06aea\" (UID: \"806ba791-714c-4d13-b595-d4f6ccf06aea\") " Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.574516 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/806ba791-714c-4d13-b595-d4f6ccf06aea-config-volume\") pod \"806ba791-714c-4d13-b595-d4f6ccf06aea\" (UID: \"806ba791-714c-4d13-b595-d4f6ccf06aea\") " Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.575407 5094 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/806ba791-714c-4d13-b595-d4f6ccf06aea-config-volume" (OuterVolumeSpecName: "config-volume") pod "806ba791-714c-4d13-b595-d4f6ccf06aea" (UID: "806ba791-714c-4d13-b595-d4f6ccf06aea"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.584327 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/806ba791-714c-4d13-b595-d4f6ccf06aea-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "806ba791-714c-4d13-b595-d4f6ccf06aea" (UID: "806ba791-714c-4d13-b595-d4f6ccf06aea"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.602990 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/806ba791-714c-4d13-b595-d4f6ccf06aea-kube-api-access-j498r" (OuterVolumeSpecName: "kube-api-access-j498r") pod "806ba791-714c-4d13-b595-d4f6ccf06aea" (UID: "806ba791-714c-4d13-b595-d4f6ccf06aea"). InnerVolumeSpecName "kube-api-access-j498r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.677133 5094 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/806ba791-714c-4d13-b595-d4f6ccf06aea-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.677208 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/806ba791-714c-4d13-b595-d4f6ccf06aea-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.677222 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j498r\" (UniqueName: \"kubernetes.io/projected/806ba791-714c-4d13-b595-d4f6ccf06aea-kube-api-access-j498r\") on node \"crc\" DevicePath \"\"" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.723827 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 20 06:49:00 crc kubenswrapper[5094]: E0220 06:49:00.724194 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="806ba791-714c-4d13-b595-d4f6ccf06aea" containerName="collect-profiles" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.724232 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="806ba791-714c-4d13-b595-d4f6ccf06aea" containerName="collect-profiles" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.724401 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="806ba791-714c-4d13-b595-d4f6ccf06aea" containerName="collect-profiles" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.727352 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.730640 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.730909 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.731303 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.779390 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b3e1407-37b6-4b4b-ac69-7f5acdcac274-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8b3e1407-37b6-4b4b-ac69-7f5acdcac274\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.779466 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b3e1407-37b6-4b4b-ac69-7f5acdcac274-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8b3e1407-37b6-4b4b-ac69-7f5acdcac274\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.881361 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b3e1407-37b6-4b4b-ac69-7f5acdcac274-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8b3e1407-37b6-4b4b-ac69-7f5acdcac274\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.881420 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/8b3e1407-37b6-4b4b-ac69-7f5acdcac274-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8b3e1407-37b6-4b4b-ac69-7f5acdcac274\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.881527 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b3e1407-37b6-4b4b-ac69-7f5acdcac274-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8b3e1407-37b6-4b4b-ac69-7f5acdcac274\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 06:49:00 crc kubenswrapper[5094]: I0220 06:49:00.900544 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b3e1407-37b6-4b4b-ac69-7f5acdcac274-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8b3e1407-37b6-4b4b-ac69-7f5acdcac274\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 06:49:01 crc kubenswrapper[5094]: I0220 06:49:01.006677 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" event={"ID":"806ba791-714c-4d13-b595-d4f6ccf06aea","Type":"ContainerDied","Data":"2cba69ee98eb39e06e5e23c24335f67d934d7ddd307c2f258f0da6b72887c796"} Feb 20 06:49:01 crc kubenswrapper[5094]: I0220 06:49:01.006758 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cba69ee98eb39e06e5e23c24335f67d934d7ddd307c2f258f0da6b72887c796" Feb 20 06:49:01 crc kubenswrapper[5094]: I0220 06:49:01.006834 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4" Feb 20 06:49:01 crc kubenswrapper[5094]: I0220 06:49:01.013197 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ecd4e39f-130b-4f63-aedb-6cda9ec7da80","Type":"ContainerStarted","Data":"eac6d043a26a977dd2c97385c1aee97e96990a378d70f7183c056dc53f6aea96"} Feb 20 06:49:01 crc kubenswrapper[5094]: I0220 06:49:01.032642 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.032622099 podStartE2EDuration="3.032622099s" podCreationTimestamp="2026-02-20 06:48:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:49:01.028894601 +0000 UTC m=+155.901521312" watchObservedRunningTime="2026-02-20 06:49:01.032622099 +0000 UTC m=+155.905248810" Feb 20 06:49:01 crc kubenswrapper[5094]: I0220 06:49:01.043496 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 06:49:01 crc kubenswrapper[5094]: I0220 06:49:01.044314 5094 patch_prober.go:28] interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 06:49:01 crc kubenswrapper[5094]: [-]has-synced failed: reason withheld Feb 20 06:49:01 crc kubenswrapper[5094]: [+]process-running ok Feb 20 06:49:01 crc kubenswrapper[5094]: healthz check failed Feb 20 06:49:01 crc kubenswrapper[5094]: I0220 06:49:01.044359 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 06:49:01 crc kubenswrapper[5094]: I0220 06:49:01.727116 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 20 06:49:01 crc kubenswrapper[5094]: W0220 06:49:01.772015 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8b3e1407_37b6_4b4b_ac69_7f5acdcac274.slice/crio-1f9f5ab04dc82fef1a825f24f0f0a468f7a92d6aa4dd5c0f6d0734ed6b6c45e4 WatchSource:0}: Error finding container 1f9f5ab04dc82fef1a825f24f0f0a468f7a92d6aa4dd5c0f6d0734ed6b6c45e4: Status 404 returned error can't find the container with id 1f9f5ab04dc82fef1a825f24f0f0a468f7a92d6aa4dd5c0f6d0734ed6b6c45e4 Feb 20 06:49:02 crc kubenswrapper[5094]: I0220 06:49:02.030063 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8b3e1407-37b6-4b4b-ac69-7f5acdcac274","Type":"ContainerStarted","Data":"1f9f5ab04dc82fef1a825f24f0f0a468f7a92d6aa4dd5c0f6d0734ed6b6c45e4"} Feb 20 06:49:02 crc kubenswrapper[5094]: I0220 06:49:02.048094 5094 patch_prober.go:28] 
interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 06:49:02 crc kubenswrapper[5094]: [-]has-synced failed: reason withheld Feb 20 06:49:02 crc kubenswrapper[5094]: [+]process-running ok Feb 20 06:49:02 crc kubenswrapper[5094]: healthz check failed Feb 20 06:49:02 crc kubenswrapper[5094]: I0220 06:49:02.048172 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 06:49:02 crc kubenswrapper[5094]: I0220 06:49:02.048425 5094 generic.go:334] "Generic (PLEG): container finished" podID="ecd4e39f-130b-4f63-aedb-6cda9ec7da80" containerID="eac6d043a26a977dd2c97385c1aee97e96990a378d70f7183c056dc53f6aea96" exitCode=0 Feb 20 06:49:02 crc kubenswrapper[5094]: I0220 06:49:02.048469 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ecd4e39f-130b-4f63-aedb-6cda9ec7da80","Type":"ContainerDied","Data":"eac6d043a26a977dd2c97385c1aee97e96990a378d70f7183c056dc53f6aea96"} Feb 20 06:49:03 crc kubenswrapper[5094]: I0220 06:49:03.042807 5094 patch_prober.go:28] interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 06:49:03 crc kubenswrapper[5094]: [-]has-synced failed: reason withheld Feb 20 06:49:03 crc kubenswrapper[5094]: [+]process-running ok Feb 20 06:49:03 crc kubenswrapper[5094]: healthz check failed Feb 20 06:49:03 crc kubenswrapper[5094]: I0220 06:49:03.042888 5094 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 06:49:03 crc kubenswrapper[5094]: I0220 06:49:03.060095 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8b3e1407-37b6-4b4b-ac69-7f5acdcac274","Type":"ContainerStarted","Data":"3d1815b1d3749a263d6e1a9f306addfab6cde409eabb2876a93885391fc3aaf6"} Feb 20 06:49:03 crc kubenswrapper[5094]: I0220 06:49:03.429140 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 06:49:03 crc kubenswrapper[5094]: I0220 06:49:03.460802 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.460775655 podStartE2EDuration="3.460775655s" podCreationTimestamp="2026-02-20 06:49:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:49:03.081805422 +0000 UTC m=+157.954432133" watchObservedRunningTime="2026-02-20 06:49:03.460775655 +0000 UTC m=+158.333402366" Feb 20 06:49:03 crc kubenswrapper[5094]: I0220 06:49:03.561126 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ecd4e39f-130b-4f63-aedb-6cda9ec7da80-kubelet-dir\") pod \"ecd4e39f-130b-4f63-aedb-6cda9ec7da80\" (UID: \"ecd4e39f-130b-4f63-aedb-6cda9ec7da80\") " Feb 20 06:49:03 crc kubenswrapper[5094]: I0220 06:49:03.561280 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ecd4e39f-130b-4f63-aedb-6cda9ec7da80-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ecd4e39f-130b-4f63-aedb-6cda9ec7da80" (UID: "ecd4e39f-130b-4f63-aedb-6cda9ec7da80"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 06:49:03 crc kubenswrapper[5094]: I0220 06:49:03.561350 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ecd4e39f-130b-4f63-aedb-6cda9ec7da80-kube-api-access\") pod \"ecd4e39f-130b-4f63-aedb-6cda9ec7da80\" (UID: \"ecd4e39f-130b-4f63-aedb-6cda9ec7da80\") " Feb 20 06:49:03 crc kubenswrapper[5094]: I0220 06:49:03.561742 5094 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ecd4e39f-130b-4f63-aedb-6cda9ec7da80-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 20 06:49:03 crc kubenswrapper[5094]: I0220 06:49:03.570045 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecd4e39f-130b-4f63-aedb-6cda9ec7da80-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ecd4e39f-130b-4f63-aedb-6cda9ec7da80" (UID: "ecd4e39f-130b-4f63-aedb-6cda9ec7da80"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:49:03 crc kubenswrapper[5094]: I0220 06:49:03.663806 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ecd4e39f-130b-4f63-aedb-6cda9ec7da80-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 06:49:04 crc kubenswrapper[5094]: I0220 06:49:04.043728 5094 patch_prober.go:28] interesting pod/router-default-5444994796-9sztr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 20 06:49:04 crc kubenswrapper[5094]: [-]has-synced failed: reason withheld Feb 20 06:49:04 crc kubenswrapper[5094]: [+]process-running ok Feb 20 06:49:04 crc kubenswrapper[5094]: healthz check failed Feb 20 06:49:04 crc kubenswrapper[5094]: I0220 06:49:04.044139 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9sztr" podUID="a6908ab3-3d33-4e31-b226-b6607f34ee8b" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 20 06:49:04 crc kubenswrapper[5094]: I0220 06:49:04.073395 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-l2hxn" Feb 20 06:49:04 crc kubenswrapper[5094]: I0220 06:49:04.084878 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"ecd4e39f-130b-4f63-aedb-6cda9ec7da80","Type":"ContainerDied","Data":"42b38de35ebf1c38f39a8ad94f8897c2ba7db2a7bd42d6f1e604912889bd4ced"} Feb 20 06:49:04 crc kubenswrapper[5094]: I0220 06:49:04.084956 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42b38de35ebf1c38f39a8ad94f8897c2ba7db2a7bd42d6f1e604912889bd4ced" Feb 20 06:49:04 crc kubenswrapper[5094]: I0220 06:49:04.085033 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 20 06:49:04 crc kubenswrapper[5094]: I0220 06:49:04.103321 5094 generic.go:334] "Generic (PLEG): container finished" podID="8b3e1407-37b6-4b4b-ac69-7f5acdcac274" containerID="3d1815b1d3749a263d6e1a9f306addfab6cde409eabb2876a93885391fc3aaf6" exitCode=0 Feb 20 06:49:04 crc kubenswrapper[5094]: I0220 06:49:04.104069 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8b3e1407-37b6-4b4b-ac69-7f5acdcac274","Type":"ContainerDied","Data":"3d1815b1d3749a263d6e1a9f306addfab6cde409eabb2876a93885391fc3aaf6"} Feb 20 06:49:04 crc kubenswrapper[5094]: I0220 06:49:04.107009 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 06:49:04 crc kubenswrapper[5094]: I0220 06:49:04.107058 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 06:49:05 crc kubenswrapper[5094]: I0220 06:49:05.043523 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:49:05 crc kubenswrapper[5094]: I0220 06:49:05.055037 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-9sztr" Feb 20 06:49:08 crc kubenswrapper[5094]: I0220 06:49:08.340372 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-d2l2r" Feb 20 06:49:08 
crc kubenswrapper[5094]: I0220 06:49:08.649388 5094 patch_prober.go:28] interesting pod/console-f9d7485db-shq4j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Feb 20 06:49:08 crc kubenswrapper[5094]: I0220 06:49:08.649466 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-shq4j" podUID="e130287f-996d-4ab0-8c12-351bf8d21df5" containerName="console" probeResult="failure" output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" Feb 20 06:49:09 crc kubenswrapper[5094]: I0220 06:49:09.780401 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs\") pod \"network-metrics-daemon-8ww4n\" (UID: \"da0aa093-1adc-45f2-a942-e68d7be23ed4\") " pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:49:09 crc kubenswrapper[5094]: I0220 06:49:09.794016 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da0aa093-1adc-45f2-a942-e68d7be23ed4-metrics-certs\") pod \"network-metrics-daemon-8ww4n\" (UID: \"da0aa093-1adc-45f2-a942-e68d7be23ed4\") " pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:49:09 crc kubenswrapper[5094]: I0220 06:49:09.864877 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8ww4n" Feb 20 06:49:16 crc kubenswrapper[5094]: I0220 06:49:16.030705 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 06:49:16 crc kubenswrapper[5094]: I0220 06:49:16.203738 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b3e1407-37b6-4b4b-ac69-7f5acdcac274-kube-api-access\") pod \"8b3e1407-37b6-4b4b-ac69-7f5acdcac274\" (UID: \"8b3e1407-37b6-4b4b-ac69-7f5acdcac274\") " Feb 20 06:49:16 crc kubenswrapper[5094]: I0220 06:49:16.203802 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b3e1407-37b6-4b4b-ac69-7f5acdcac274-kubelet-dir\") pod \"8b3e1407-37b6-4b4b-ac69-7f5acdcac274\" (UID: \"8b3e1407-37b6-4b4b-ac69-7f5acdcac274\") " Feb 20 06:49:16 crc kubenswrapper[5094]: I0220 06:49:16.204102 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b3e1407-37b6-4b4b-ac69-7f5acdcac274-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8b3e1407-37b6-4b4b-ac69-7f5acdcac274" (UID: "8b3e1407-37b6-4b4b-ac69-7f5acdcac274"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 06:49:16 crc kubenswrapper[5094]: I0220 06:49:16.218876 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b3e1407-37b6-4b4b-ac69-7f5acdcac274-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8b3e1407-37b6-4b4b-ac69-7f5acdcac274" (UID: "8b3e1407-37b6-4b4b-ac69-7f5acdcac274"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:49:16 crc kubenswrapper[5094]: I0220 06:49:16.229894 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8b3e1407-37b6-4b4b-ac69-7f5acdcac274","Type":"ContainerDied","Data":"1f9f5ab04dc82fef1a825f24f0f0a468f7a92d6aa4dd5c0f6d0734ed6b6c45e4"} Feb 20 06:49:16 crc kubenswrapper[5094]: I0220 06:49:16.229947 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f9f5ab04dc82fef1a825f24f0f0a468f7a92d6aa4dd5c0f6d0734ed6b6c45e4" Feb 20 06:49:16 crc kubenswrapper[5094]: I0220 06:49:16.229948 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 20 06:49:16 crc kubenswrapper[5094]: I0220 06:49:16.305651 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b3e1407-37b6-4b4b-ac69-7f5acdcac274-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 06:49:16 crc kubenswrapper[5094]: I0220 06:49:16.305720 5094 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b3e1407-37b6-4b4b-ac69-7f5acdcac274-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 20 06:49:16 crc kubenswrapper[5094]: I0220 06:49:16.466610 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:49:18 crc kubenswrapper[5094]: I0220 06:49:18.655565 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:49:18 crc kubenswrapper[5094]: I0220 06:49:18.660654 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-shq4j" Feb 20 06:49:28 crc kubenswrapper[5094]: E0220 06:49:28.148781 5094 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 20 06:49:28 crc kubenswrapper[5094]: E0220 06:49:28.149901 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gmmb7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-85kbx_openshift-marketplace(4aa46aec-af59-49c6-9ff5-a08df3d68e5b): ErrImagePull: rpc error: code 
= Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 20 06:49:28 crc kubenswrapper[5094]: E0220 06:49:28.151827 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-85kbx" podUID="4aa46aec-af59-49c6-9ff5-a08df3d68e5b" Feb 20 06:49:28 crc kubenswrapper[5094]: E0220 06:49:28.215097 5094 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 20 06:49:28 crc kubenswrapper[5094]: E0220 06:49:28.215820 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bxwcp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-7zj2v_openshift-marketplace(66c35d10-d5cc-468f-95a1-b56fde3961b3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 20 06:49:28 crc kubenswrapper[5094]: E0220 06:49:28.217195 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-7zj2v" podUID="66c35d10-d5cc-468f-95a1-b56fde3961b3" Feb 20 06:49:28 crc 
kubenswrapper[5094]: E0220 06:49:28.241225 5094 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 20 06:49:28 crc kubenswrapper[5094]: E0220 06:49:28.241394 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2f7t5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-c94fj_openshift-marketplace(88e94523-c126-4ce8-a6c7-2f83eb91d3fc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 20 06:49:28 crc kubenswrapper[5094]: E0220 06:49:28.243163 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-c94fj" podUID="88e94523-c126-4ce8-a6c7-2f83eb91d3fc" Feb 20 06:49:28 crc kubenswrapper[5094]: E0220 06:49:28.307348 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-85kbx" podUID="4aa46aec-af59-49c6-9ff5-a08df3d68e5b" Feb 20 06:49:28 crc kubenswrapper[5094]: E0220 06:49:28.307608 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-c94fj" podUID="88e94523-c126-4ce8-a6c7-2f83eb91d3fc" Feb 20 06:49:28 crc kubenswrapper[5094]: E0220 06:49:28.316038 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-7zj2v" podUID="66c35d10-d5cc-468f-95a1-b56fde3961b3" Feb 20 06:49:28 crc kubenswrapper[5094]: I0220 06:49:28.540416 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8ww4n"] Feb 20 
06:49:29 crc kubenswrapper[5094]: I0220 06:49:29.018827 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rm75q" Feb 20 06:49:29 crc kubenswrapper[5094]: I0220 06:49:29.310345 5094 generic.go:334] "Generic (PLEG): container finished" podID="8ecc3e73-dd76-4a73-a366-92c78aca386e" containerID="94a532b2709a2b2f9115437213f975f914284d32756c3ae440753867ceeb3799" exitCode=0 Feb 20 06:49:29 crc kubenswrapper[5094]: I0220 06:49:29.310445 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpwcx" event={"ID":"8ecc3e73-dd76-4a73-a366-92c78aca386e","Type":"ContainerDied","Data":"94a532b2709a2b2f9115437213f975f914284d32756c3ae440753867ceeb3799"} Feb 20 06:49:29 crc kubenswrapper[5094]: I0220 06:49:29.324334 5094 generic.go:334] "Generic (PLEG): container finished" podID="f5c1eecf-1cc2-4480-ac22-99a970f5dc58" containerID="ca9993542532855c09ba40fb79d1b2ff1916ab7e330faa07724168697397276c" exitCode=0 Feb 20 06:49:29 crc kubenswrapper[5094]: I0220 06:49:29.324420 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jlc84" event={"ID":"f5c1eecf-1cc2-4480-ac22-99a970f5dc58","Type":"ContainerDied","Data":"ca9993542532855c09ba40fb79d1b2ff1916ab7e330faa07724168697397276c"} Feb 20 06:49:29 crc kubenswrapper[5094]: I0220 06:49:29.335088 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5gcc" event={"ID":"4803c5cf-27e3-414a-8fa4-7e82730e311d","Type":"ContainerStarted","Data":"6f5b6439f6f44bc36beab797da0a74ac28cd5d6755ba7368790483f6fcf685be"} Feb 20 06:49:29 crc kubenswrapper[5094]: I0220 06:49:29.337327 5094 generic.go:334] "Generic (PLEG): container finished" podID="d307716f-90ad-4b6b-9a49-10f8b5a98721" containerID="a98e2e4e640eec309b5e0629faade0edf22365e6a1089be8c0022fcfa2fa99aa" exitCode=0 Feb 20 06:49:29 crc kubenswrapper[5094]: I0220 
06:49:29.337414 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw2zj" event={"ID":"d307716f-90ad-4b6b-9a49-10f8b5a98721","Type":"ContainerDied","Data":"a98e2e4e640eec309b5e0629faade0edf22365e6a1089be8c0022fcfa2fa99aa"} Feb 20 06:49:29 crc kubenswrapper[5094]: I0220 06:49:29.344665 5094 generic.go:334] "Generic (PLEG): container finished" podID="94d239c4-71e0-42e2-a2e7-69acd87b5986" containerID="bffc4090bcdc589daef83a404f87b0aa9489a6b07489e331199c26107f2625bc" exitCode=0 Feb 20 06:49:29 crc kubenswrapper[5094]: I0220 06:49:29.344754 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5dnl" event={"ID":"94d239c4-71e0-42e2-a2e7-69acd87b5986","Type":"ContainerDied","Data":"bffc4090bcdc589daef83a404f87b0aa9489a6b07489e331199c26107f2625bc"} Feb 20 06:49:29 crc kubenswrapper[5094]: I0220 06:49:29.346688 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" event={"ID":"da0aa093-1adc-45f2-a942-e68d7be23ed4","Type":"ContainerStarted","Data":"d85aaa7493881a069ae0b83502388eaef5721562cb786bfb34d14ba2e81caced"} Feb 20 06:49:29 crc kubenswrapper[5094]: I0220 06:49:29.346747 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" event={"ID":"da0aa093-1adc-45f2-a942-e68d7be23ed4","Type":"ContainerStarted","Data":"9753f8c6e0e76ae1629d27e1385daf8a5fb26b4937299a3ce52233b105ee6329"} Feb 20 06:49:30 crc kubenswrapper[5094]: I0220 06:49:30.357077 5094 generic.go:334] "Generic (PLEG): container finished" podID="4803c5cf-27e3-414a-8fa4-7e82730e311d" containerID="6f5b6439f6f44bc36beab797da0a74ac28cd5d6755ba7368790483f6fcf685be" exitCode=0 Feb 20 06:49:30 crc kubenswrapper[5094]: I0220 06:49:30.357155 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5gcc" 
event={"ID":"4803c5cf-27e3-414a-8fa4-7e82730e311d","Type":"ContainerDied","Data":"6f5b6439f6f44bc36beab797da0a74ac28cd5d6755ba7368790483f6fcf685be"} Feb 20 06:49:30 crc kubenswrapper[5094]: I0220 06:49:30.360532 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8ww4n" event={"ID":"da0aa093-1adc-45f2-a942-e68d7be23ed4","Type":"ContainerStarted","Data":"29f4d19d0726c8d6248c1e530378b460824c6a6903ba1d1131570c283b8ce0cf"} Feb 20 06:49:31 crc kubenswrapper[5094]: I0220 06:49:31.404096 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-8ww4n" podStartSLOduration=164.404075667 podStartE2EDuration="2m44.404075667s" podCreationTimestamp="2026-02-20 06:46:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:49:31.400211395 +0000 UTC m=+186.272838106" watchObservedRunningTime="2026-02-20 06:49:31.404075667 +0000 UTC m=+186.276702378" Feb 20 06:49:33 crc kubenswrapper[5094]: I0220 06:49:33.392627 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5dnl" event={"ID":"94d239c4-71e0-42e2-a2e7-69acd87b5986","Type":"ContainerStarted","Data":"b6bb06ccb30a2b9b8fc736c8aac43c98de0a34a475734de5f7f10809e745e22d"} Feb 20 06:49:33 crc kubenswrapper[5094]: I0220 06:49:33.396391 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpwcx" event={"ID":"8ecc3e73-dd76-4a73-a366-92c78aca386e","Type":"ContainerStarted","Data":"42ce9dcbffc673121e18daf4e5c35fd9d54f6c58c1b7c83b8be06d9c3a2788d6"} Feb 20 06:49:33 crc kubenswrapper[5094]: I0220 06:49:33.398421 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jlc84" 
event={"ID":"f5c1eecf-1cc2-4480-ac22-99a970f5dc58","Type":"ContainerStarted","Data":"289c43eb8a0b4cdc450470847de0cf4c3dce76dfdb3673f02a9643bdf93ba57c"} Feb 20 06:49:33 crc kubenswrapper[5094]: I0220 06:49:33.400285 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5gcc" event={"ID":"4803c5cf-27e3-414a-8fa4-7e82730e311d","Type":"ContainerStarted","Data":"51d312f52e87068fa0bc859138dfee26f32d3cf50be3717bdff37c98bea619cb"} Feb 20 06:49:33 crc kubenswrapper[5094]: I0220 06:49:33.402361 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw2zj" event={"ID":"d307716f-90ad-4b6b-9a49-10f8b5a98721","Type":"ContainerStarted","Data":"7a77c43df08e65efbe5f0ef6e087595759d1fd5a8dc86fd7a2d0e287b0015be0"} Feb 20 06:49:33 crc kubenswrapper[5094]: I0220 06:49:33.417147 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d5dnl" podStartSLOduration=2.68878911 podStartE2EDuration="38.417127972s" podCreationTimestamp="2026-02-20 06:48:55 +0000 UTC" firstStartedPulling="2026-02-20 06:48:56.783564766 +0000 UTC m=+151.656191477" lastFinishedPulling="2026-02-20 06:49:32.511903628 +0000 UTC m=+187.384530339" observedRunningTime="2026-02-20 06:49:33.415252898 +0000 UTC m=+188.287879629" watchObservedRunningTime="2026-02-20 06:49:33.417127972 +0000 UTC m=+188.289754683" Feb 20 06:49:33 crc kubenswrapper[5094]: I0220 06:49:33.438123 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hw2zj" podStartSLOduration=2.725728263 podStartE2EDuration="36.438103719s" podCreationTimestamp="2026-02-20 06:48:57 +0000 UTC" firstStartedPulling="2026-02-20 06:48:58.907918578 +0000 UTC m=+153.780545279" lastFinishedPulling="2026-02-20 06:49:32.620294004 +0000 UTC m=+187.492920735" observedRunningTime="2026-02-20 06:49:33.435133629 +0000 UTC m=+188.307760340" 
watchObservedRunningTime="2026-02-20 06:49:33.438103719 +0000 UTC m=+188.310730430" Feb 20 06:49:33 crc kubenswrapper[5094]: I0220 06:49:33.478681 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tpwcx" podStartSLOduration=3.215710933 podStartE2EDuration="35.478657159s" podCreationTimestamp="2026-02-20 06:48:58 +0000 UTC" firstStartedPulling="2026-02-20 06:48:59.963009162 +0000 UTC m=+154.835635873" lastFinishedPulling="2026-02-20 06:49:32.225955388 +0000 UTC m=+187.098582099" observedRunningTime="2026-02-20 06:49:33.457813106 +0000 UTC m=+188.330439817" watchObservedRunningTime="2026-02-20 06:49:33.478657159 +0000 UTC m=+188.351283870" Feb 20 06:49:33 crc kubenswrapper[5094]: I0220 06:49:33.479740 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r5gcc" podStartSLOduration=2.71596603 podStartE2EDuration="35.479732135s" podCreationTimestamp="2026-02-20 06:48:58 +0000 UTC" firstStartedPulling="2026-02-20 06:48:59.963633956 +0000 UTC m=+154.836260667" lastFinishedPulling="2026-02-20 06:49:32.727400061 +0000 UTC m=+187.600026772" observedRunningTime="2026-02-20 06:49:33.476642162 +0000 UTC m=+188.349268873" watchObservedRunningTime="2026-02-20 06:49:33.479732135 +0000 UTC m=+188.352358846" Feb 20 06:49:33 crc kubenswrapper[5094]: I0220 06:49:33.495354 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jlc84" podStartSLOduration=2.895338729 podStartE2EDuration="36.495309154s" podCreationTimestamp="2026-02-20 06:48:57 +0000 UTC" firstStartedPulling="2026-02-20 06:48:58.930078773 +0000 UTC m=+153.802705484" lastFinishedPulling="2026-02-20 06:49:32.530049208 +0000 UTC m=+187.402675909" observedRunningTime="2026-02-20 06:49:33.494147376 +0000 UTC m=+188.366774087" watchObservedRunningTime="2026-02-20 06:49:33.495309154 +0000 UTC m=+188.367935865" Feb 20 06:49:34 crc 
kubenswrapper[5094]: I0220 06:49:34.107155 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 06:49:34 crc kubenswrapper[5094]: I0220 06:49:34.107233 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 06:49:35 crc kubenswrapper[5094]: I0220 06:49:35.211286 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 20 06:49:35 crc kubenswrapper[5094]: I0220 06:49:35.667921 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:49:35 crc kubenswrapper[5094]: I0220 06:49:35.668011 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:49:35 crc kubenswrapper[5094]: I0220 06:49:35.811925 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:49:36 crc kubenswrapper[5094]: I0220 06:49:36.456021 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bslb9"] Feb 20 06:49:37 crc kubenswrapper[5094]: I0220 06:49:37.469468 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:49:37 crc kubenswrapper[5094]: I0220 06:49:37.490898 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-marketplace-jlc84" Feb 20 06:49:37 crc kubenswrapper[5094]: I0220 06:49:37.491917 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jlc84" Feb 20 06:49:37 crc kubenswrapper[5094]: I0220 06:49:37.557988 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jlc84" Feb 20 06:49:37 crc kubenswrapper[5094]: I0220 06:49:37.776619 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d5dnl"] Feb 20 06:49:37 crc kubenswrapper[5094]: I0220 06:49:37.900351 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hw2zj" Feb 20 06:49:37 crc kubenswrapper[5094]: I0220 06:49:37.900424 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hw2zj" Feb 20 06:49:37 crc kubenswrapper[5094]: I0220 06:49:37.939564 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hw2zj" Feb 20 06:49:38 crc kubenswrapper[5094]: I0220 06:49:38.472937 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hw2zj" Feb 20 06:49:38 crc kubenswrapper[5094]: I0220 06:49:38.482722 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jlc84" Feb 20 06:49:38 crc kubenswrapper[5094]: I0220 06:49:38.511903 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tpwcx" Feb 20 06:49:38 crc kubenswrapper[5094]: I0220 06:49:38.512264 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tpwcx" Feb 20 06:49:38 crc kubenswrapper[5094]: I0220 
06:49:38.945190 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r5gcc" Feb 20 06:49:38 crc kubenswrapper[5094]: I0220 06:49:38.945684 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r5gcc" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.119648 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 20 06:49:39 crc kubenswrapper[5094]: E0220 06:49:39.119912 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b3e1407-37b6-4b4b-ac69-7f5acdcac274" containerName="pruner" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.119925 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b3e1407-37b6-4b4b-ac69-7f5acdcac274" containerName="pruner" Feb 20 06:49:39 crc kubenswrapper[5094]: E0220 06:49:39.119945 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecd4e39f-130b-4f63-aedb-6cda9ec7da80" containerName="pruner" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.119950 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecd4e39f-130b-4f63-aedb-6cda9ec7da80" containerName="pruner" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.120043 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecd4e39f-130b-4f63-aedb-6cda9ec7da80" containerName="pruner" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.120062 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b3e1407-37b6-4b4b-ac69-7f5acdcac274" containerName="pruner" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.120452 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.127231 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.127482 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.132140 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.191122 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4e4b262a-e29c-492e-aba7-4d09a33ba01f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4e4b262a-e29c-492e-aba7-4d09a33ba01f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.191235 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4e4b262a-e29c-492e-aba7-4d09a33ba01f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4e4b262a-e29c-492e-aba7-4d09a33ba01f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.292422 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4e4b262a-e29c-492e-aba7-4d09a33ba01f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4e4b262a-e29c-492e-aba7-4d09a33ba01f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.292772 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/4e4b262a-e29c-492e-aba7-4d09a33ba01f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4e4b262a-e29c-492e-aba7-4d09a33ba01f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.292610 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4e4b262a-e29c-492e-aba7-4d09a33ba01f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4e4b262a-e29c-492e-aba7-4d09a33ba01f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.338994 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4e4b262a-e29c-492e-aba7-4d09a33ba01f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4e4b262a-e29c-492e-aba7-4d09a33ba01f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.439881 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d5dnl" podUID="94d239c4-71e0-42e2-a2e7-69acd87b5986" containerName="registry-server" containerID="cri-o://b6bb06ccb30a2b9b8fc736c8aac43c98de0a34a475734de5f7f10809e745e22d" gracePeriod=2 Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.440255 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.577868 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tpwcx" podUID="8ecc3e73-dd76-4a73-a366-92c78aca386e" containerName="registry-server" probeResult="failure" output=< Feb 20 06:49:39 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 06:49:39 crc kubenswrapper[5094]: > Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.913498 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 20 06:49:39 crc kubenswrapper[5094]: W0220 06:49:39.935814 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4e4b262a_e29c_492e_aba7_4d09a33ba01f.slice/crio-868e46802e379785aa9f33c6c762d513a40e3089c4411608729ba82707a26831 WatchSource:0}: Error finding container 868e46802e379785aa9f33c6c762d513a40e3089c4411608729ba82707a26831: Status 404 returned error can't find the container with id 868e46802e379785aa9f33c6c762d513a40e3089c4411608729ba82707a26831 Feb 20 06:49:39 crc kubenswrapper[5094]: I0220 06:49:39.978106 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hw2zj"] Feb 20 06:49:40 crc kubenswrapper[5094]: I0220 06:49:40.005171 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-r5gcc" podUID="4803c5cf-27e3-414a-8fa4-7e82730e311d" containerName="registry-server" probeResult="failure" output=< Feb 20 06:49:40 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 06:49:40 crc kubenswrapper[5094]: > Feb 20 06:49:40 crc kubenswrapper[5094]: I0220 06:49:40.450049 5094 generic.go:334] "Generic (PLEG): container finished" podID="94d239c4-71e0-42e2-a2e7-69acd87b5986" 
containerID="b6bb06ccb30a2b9b8fc736c8aac43c98de0a34a475734de5f7f10809e745e22d" exitCode=0 Feb 20 06:49:40 crc kubenswrapper[5094]: I0220 06:49:40.450133 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5dnl" event={"ID":"94d239c4-71e0-42e2-a2e7-69acd87b5986","Type":"ContainerDied","Data":"b6bb06ccb30a2b9b8fc736c8aac43c98de0a34a475734de5f7f10809e745e22d"} Feb 20 06:49:40 crc kubenswrapper[5094]: I0220 06:49:40.453077 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4e4b262a-e29c-492e-aba7-4d09a33ba01f","Type":"ContainerStarted","Data":"868e46802e379785aa9f33c6c762d513a40e3089c4411608729ba82707a26831"} Feb 20 06:49:40 crc kubenswrapper[5094]: I0220 06:49:40.453417 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hw2zj" podUID="d307716f-90ad-4b6b-9a49-10f8b5a98721" containerName="registry-server" containerID="cri-o://7a77c43df08e65efbe5f0ef6e087595759d1fd5a8dc86fd7a2d0e287b0015be0" gracePeriod=2 Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.341636 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.463310 5094 generic.go:334] "Generic (PLEG): container finished" podID="4e4b262a-e29c-492e-aba7-4d09a33ba01f" containerID="726b2f542d8a22099222541b412c4499abb5b4cb409d678a29dba6f8fa1aff1a" exitCode=0 Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.464257 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4e4b262a-e29c-492e-aba7-4d09a33ba01f","Type":"ContainerDied","Data":"726b2f542d8a22099222541b412c4499abb5b4cb409d678a29dba6f8fa1aff1a"} Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.470215 5094 generic.go:334] "Generic (PLEG): container finished" podID="d307716f-90ad-4b6b-9a49-10f8b5a98721" containerID="7a77c43df08e65efbe5f0ef6e087595759d1fd5a8dc86fd7a2d0e287b0015be0" exitCode=0 Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.470299 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw2zj" event={"ID":"d307716f-90ad-4b6b-9a49-10f8b5a98721","Type":"ContainerDied","Data":"7a77c43df08e65efbe5f0ef6e087595759d1fd5a8dc86fd7a2d0e287b0015be0"} Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.472457 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d5dnl" event={"ID":"94d239c4-71e0-42e2-a2e7-69acd87b5986","Type":"ContainerDied","Data":"d62690bad3f55dfb0952f61bd6ad5dc34171ea69ec48d54c64252d6633321266"} Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.472500 5094 scope.go:117] "RemoveContainer" containerID="b6bb06ccb30a2b9b8fc736c8aac43c98de0a34a475734de5f7f10809e745e22d" Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.472640 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d5dnl" Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.503262 5094 scope.go:117] "RemoveContainer" containerID="bffc4090bcdc589daef83a404f87b0aa9489a6b07489e331199c26107f2625bc" Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.522364 5094 scope.go:117] "RemoveContainer" containerID="ef3c91c789b3a8c2e2c7b970bccc8b3b862a287138831c82867ac750baa00328" Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.525692 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94d239c4-71e0-42e2-a2e7-69acd87b5986-catalog-content\") pod \"94d239c4-71e0-42e2-a2e7-69acd87b5986\" (UID: \"94d239c4-71e0-42e2-a2e7-69acd87b5986\") " Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.525813 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dk49c\" (UniqueName: \"kubernetes.io/projected/94d239c4-71e0-42e2-a2e7-69acd87b5986-kube-api-access-dk49c\") pod \"94d239c4-71e0-42e2-a2e7-69acd87b5986\" (UID: \"94d239c4-71e0-42e2-a2e7-69acd87b5986\") " Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.525981 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94d239c4-71e0-42e2-a2e7-69acd87b5986-utilities\") pod \"94d239c4-71e0-42e2-a2e7-69acd87b5986\" (UID: \"94d239c4-71e0-42e2-a2e7-69acd87b5986\") " Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.526647 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94d239c4-71e0-42e2-a2e7-69acd87b5986-utilities" (OuterVolumeSpecName: "utilities") pod "94d239c4-71e0-42e2-a2e7-69acd87b5986" (UID: "94d239c4-71e0-42e2-a2e7-69acd87b5986"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.532854 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94d239c4-71e0-42e2-a2e7-69acd87b5986-kube-api-access-dk49c" (OuterVolumeSpecName: "kube-api-access-dk49c") pod "94d239c4-71e0-42e2-a2e7-69acd87b5986" (UID: "94d239c4-71e0-42e2-a2e7-69acd87b5986"). InnerVolumeSpecName "kube-api-access-dk49c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.584192 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94d239c4-71e0-42e2-a2e7-69acd87b5986-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94d239c4-71e0-42e2-a2e7-69acd87b5986" (UID: "94d239c4-71e0-42e2-a2e7-69acd87b5986"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.601386 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hw2zj" Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.627934 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d307716f-90ad-4b6b-9a49-10f8b5a98721-catalog-content\") pod \"d307716f-90ad-4b6b-9a49-10f8b5a98721\" (UID: \"d307716f-90ad-4b6b-9a49-10f8b5a98721\") " Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.628004 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-985sq\" (UniqueName: \"kubernetes.io/projected/d307716f-90ad-4b6b-9a49-10f8b5a98721-kube-api-access-985sq\") pod \"d307716f-90ad-4b6b-9a49-10f8b5a98721\" (UID: \"d307716f-90ad-4b6b-9a49-10f8b5a98721\") " Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.628046 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d307716f-90ad-4b6b-9a49-10f8b5a98721-utilities\") pod \"d307716f-90ad-4b6b-9a49-10f8b5a98721\" (UID: \"d307716f-90ad-4b6b-9a49-10f8b5a98721\") " Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.628217 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94d239c4-71e0-42e2-a2e7-69acd87b5986-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.628230 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94d239c4-71e0-42e2-a2e7-69acd87b5986-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.628242 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dk49c\" (UniqueName: \"kubernetes.io/projected/94d239c4-71e0-42e2-a2e7-69acd87b5986-kube-api-access-dk49c\") on node \"crc\" DevicePath \"\"" Feb 20 06:49:41 crc kubenswrapper[5094]: 
I0220 06:49:41.629130 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d307716f-90ad-4b6b-9a49-10f8b5a98721-utilities" (OuterVolumeSpecName: "utilities") pod "d307716f-90ad-4b6b-9a49-10f8b5a98721" (UID: "d307716f-90ad-4b6b-9a49-10f8b5a98721"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.632522 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d307716f-90ad-4b6b-9a49-10f8b5a98721-kube-api-access-985sq" (OuterVolumeSpecName: "kube-api-access-985sq") pod "d307716f-90ad-4b6b-9a49-10f8b5a98721" (UID: "d307716f-90ad-4b6b-9a49-10f8b5a98721"). InnerVolumeSpecName "kube-api-access-985sq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.651399 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d307716f-90ad-4b6b-9a49-10f8b5a98721-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d307716f-90ad-4b6b-9a49-10f8b5a98721" (UID: "d307716f-90ad-4b6b-9a49-10f8b5a98721"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.729467 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d307716f-90ad-4b6b-9a49-10f8b5a98721-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.729507 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d307716f-90ad-4b6b-9a49-10f8b5a98721-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.729519 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-985sq\" (UniqueName: \"kubernetes.io/projected/d307716f-90ad-4b6b-9a49-10f8b5a98721-kube-api-access-985sq\") on node \"crc\" DevicePath \"\"" Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.832020 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d5dnl"] Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.838303 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d5dnl"] Feb 20 06:49:41 crc kubenswrapper[5094]: I0220 06:49:41.856329 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94d239c4-71e0-42e2-a2e7-69acd87b5986" path="/var/lib/kubelet/pods/94d239c4-71e0-42e2-a2e7-69acd87b5986/volumes" Feb 20 06:49:42 crc kubenswrapper[5094]: I0220 06:49:42.483771 5094 generic.go:334] "Generic (PLEG): container finished" podID="88e94523-c126-4ce8-a6c7-2f83eb91d3fc" containerID="a3085de07d8490c70b05f62d546c4844150be55db1f8b370f140f0fcadcb36da" exitCode=0 Feb 20 06:49:42 crc kubenswrapper[5094]: I0220 06:49:42.483817 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c94fj" 
event={"ID":"88e94523-c126-4ce8-a6c7-2f83eb91d3fc","Type":"ContainerDied","Data":"a3085de07d8490c70b05f62d546c4844150be55db1f8b370f140f0fcadcb36da"} Feb 20 06:49:42 crc kubenswrapper[5094]: I0220 06:49:42.488181 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hw2zj" Feb 20 06:49:42 crc kubenswrapper[5094]: I0220 06:49:42.488558 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hw2zj" event={"ID":"d307716f-90ad-4b6b-9a49-10f8b5a98721","Type":"ContainerDied","Data":"accfce42132240eac7ff5d1f0c619115e032c77085b53887db760a64c361f25b"} Feb 20 06:49:42 crc kubenswrapper[5094]: I0220 06:49:42.488584 5094 scope.go:117] "RemoveContainer" containerID="7a77c43df08e65efbe5f0ef6e087595759d1fd5a8dc86fd7a2d0e287b0015be0" Feb 20 06:49:42 crc kubenswrapper[5094]: I0220 06:49:42.524325 5094 scope.go:117] "RemoveContainer" containerID="a98e2e4e640eec309b5e0629faade0edf22365e6a1089be8c0022fcfa2fa99aa" Feb 20 06:49:42 crc kubenswrapper[5094]: I0220 06:49:42.535565 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hw2zj"] Feb 20 06:49:42 crc kubenswrapper[5094]: I0220 06:49:42.543320 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hw2zj"] Feb 20 06:49:42 crc kubenswrapper[5094]: I0220 06:49:42.558259 5094 scope.go:117] "RemoveContainer" containerID="4270db803bcdf262f5c0b9fda9d8278a355396ffba48defc3ee731db9488d8d6" Feb 20 06:49:42 crc kubenswrapper[5094]: I0220 06:49:42.842983 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 06:49:42 crc kubenswrapper[5094]: I0220 06:49:42.948203 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4e4b262a-e29c-492e-aba7-4d09a33ba01f-kubelet-dir\") pod \"4e4b262a-e29c-492e-aba7-4d09a33ba01f\" (UID: \"4e4b262a-e29c-492e-aba7-4d09a33ba01f\") " Feb 20 06:49:42 crc kubenswrapper[5094]: I0220 06:49:42.948251 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4e4b262a-e29c-492e-aba7-4d09a33ba01f-kube-api-access\") pod \"4e4b262a-e29c-492e-aba7-4d09a33ba01f\" (UID: \"4e4b262a-e29c-492e-aba7-4d09a33ba01f\") " Feb 20 06:49:42 crc kubenswrapper[5094]: I0220 06:49:42.948331 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4e4b262a-e29c-492e-aba7-4d09a33ba01f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4e4b262a-e29c-492e-aba7-4d09a33ba01f" (UID: "4e4b262a-e29c-492e-aba7-4d09a33ba01f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 06:49:42 crc kubenswrapper[5094]: I0220 06:49:42.948781 5094 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4e4b262a-e29c-492e-aba7-4d09a33ba01f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 20 06:49:42 crc kubenswrapper[5094]: I0220 06:49:42.952563 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e4b262a-e29c-492e-aba7-4d09a33ba01f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4e4b262a-e29c-492e-aba7-4d09a33ba01f" (UID: "4e4b262a-e29c-492e-aba7-4d09a33ba01f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:49:43 crc kubenswrapper[5094]: I0220 06:49:43.050038 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4e4b262a-e29c-492e-aba7-4d09a33ba01f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 06:49:43 crc kubenswrapper[5094]: I0220 06:49:43.503175 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c94fj" event={"ID":"88e94523-c126-4ce8-a6c7-2f83eb91d3fc","Type":"ContainerStarted","Data":"0cfcc7a212abd3270f67b4c4a04afcaf890dc3482fbe7bdb289a52eed1f95836"} Feb 20 06:49:43 crc kubenswrapper[5094]: I0220 06:49:43.508125 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4e4b262a-e29c-492e-aba7-4d09a33ba01f","Type":"ContainerDied","Data":"868e46802e379785aa9f33c6c762d513a40e3089c4411608729ba82707a26831"} Feb 20 06:49:43 crc kubenswrapper[5094]: I0220 06:49:43.508180 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="868e46802e379785aa9f33c6c762d513a40e3089c4411608729ba82707a26831" Feb 20 06:49:43 crc kubenswrapper[5094]: I0220 06:49:43.508252 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 20 06:49:43 crc kubenswrapper[5094]: I0220 06:49:43.539636 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-c94fj" podStartSLOduration=3.545939544 podStartE2EDuration="48.539604517s" podCreationTimestamp="2026-02-20 06:48:55 +0000 UTC" firstStartedPulling="2026-02-20 06:48:57.874681863 +0000 UTC m=+152.747308594" lastFinishedPulling="2026-02-20 06:49:42.868346846 +0000 UTC m=+197.740973567" observedRunningTime="2026-02-20 06:49:43.538220994 +0000 UTC m=+198.410847735" watchObservedRunningTime="2026-02-20 06:49:43.539604517 +0000 UTC m=+198.412231268" Feb 20 06:49:43 crc kubenswrapper[5094]: I0220 06:49:43.852330 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d307716f-90ad-4b6b-9a49-10f8b5a98721" path="/var/lib/kubelet/pods/d307716f-90ad-4b6b-9a49-10f8b5a98721/volumes" Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.514795 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-c94fj" Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.515410 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-c94fj" Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.519809 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 20 06:49:45 crc kubenswrapper[5094]: E0220 06:49:45.520175 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d307716f-90ad-4b6b-9a49-10f8b5a98721" containerName="extract-utilities" Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.520269 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d307716f-90ad-4b6b-9a49-10f8b5a98721" containerName="extract-utilities" Feb 20 06:49:45 crc kubenswrapper[5094]: E0220 06:49:45.520352 5094 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="d307716f-90ad-4b6b-9a49-10f8b5a98721" containerName="extract-content" Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.520428 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d307716f-90ad-4b6b-9a49-10f8b5a98721" containerName="extract-content" Feb 20 06:49:45 crc kubenswrapper[5094]: E0220 06:49:45.520508 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e4b262a-e29c-492e-aba7-4d09a33ba01f" containerName="pruner" Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.520584 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e4b262a-e29c-492e-aba7-4d09a33ba01f" containerName="pruner" Feb 20 06:49:45 crc kubenswrapper[5094]: E0220 06:49:45.520663 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94d239c4-71e0-42e2-a2e7-69acd87b5986" containerName="extract-utilities" Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.520757 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="94d239c4-71e0-42e2-a2e7-69acd87b5986" containerName="extract-utilities" Feb 20 06:49:45 crc kubenswrapper[5094]: E0220 06:49:45.520853 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d307716f-90ad-4b6b-9a49-10f8b5a98721" containerName="registry-server" Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.520946 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d307716f-90ad-4b6b-9a49-10f8b5a98721" containerName="registry-server" Feb 20 06:49:45 crc kubenswrapper[5094]: E0220 06:49:45.521044 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94d239c4-71e0-42e2-a2e7-69acd87b5986" containerName="extract-content" Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.521121 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="94d239c4-71e0-42e2-a2e7-69acd87b5986" containerName="extract-content" Feb 20 06:49:45 crc kubenswrapper[5094]: E0220 06:49:45.521201 5094 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="94d239c4-71e0-42e2-a2e7-69acd87b5986" containerName="registry-server" Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.521281 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="94d239c4-71e0-42e2-a2e7-69acd87b5986" containerName="registry-server" Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.521506 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e4b262a-e29c-492e-aba7-4d09a33ba01f" containerName="pruner" Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.521606 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="94d239c4-71e0-42e2-a2e7-69acd87b5986" containerName="registry-server" Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.521691 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d307716f-90ad-4b6b-9a49-10f8b5a98721" containerName="registry-server" Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.522260 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.525614 5094 generic.go:334] "Generic (PLEG): container finished" podID="66c35d10-d5cc-468f-95a1-b56fde3961b3" containerID="61352ad384c7169a9a29e90c914460822eb2dc45803cccf5cac7d1c7d42a40b1" exitCode=0 Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.525762 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zj2v" event={"ID":"66c35d10-d5cc-468f-95a1-b56fde3961b3","Type":"ContainerDied","Data":"61352ad384c7169a9a29e90c914460822eb2dc45803cccf5cac7d1c7d42a40b1"} Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.529126 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.530241 5094 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.540025 5094 generic.go:334] "Generic (PLEG): container finished" podID="4aa46aec-af59-49c6-9ff5-a08df3d68e5b" containerID="a331bfe9d22fc560ff4e659b1f997870c2ddce54f3c7858303fd34aa21993025" exitCode=0 Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.540095 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85kbx" event={"ID":"4aa46aec-af59-49c6-9ff5-a08df3d68e5b","Type":"ContainerDied","Data":"a331bfe9d22fc560ff4e659b1f997870c2ddce54f3c7858303fd34aa21993025"} Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.540871 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.596446 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-c94fj" Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.690726 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6d73a928-b634-44c7-a3ca-8ffc9a40277e-kube-api-access\") pod \"installer-9-crc\" (UID: \"6d73a928-b634-44c7-a3ca-8ffc9a40277e\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.690861 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d73a928-b634-44c7-a3ca-8ffc9a40277e-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6d73a928-b634-44c7-a3ca-8ffc9a40277e\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.690889 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/6d73a928-b634-44c7-a3ca-8ffc9a40277e-var-lock\") pod \"installer-9-crc\" (UID: \"6d73a928-b634-44c7-a3ca-8ffc9a40277e\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.791953 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d73a928-b634-44c7-a3ca-8ffc9a40277e-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6d73a928-b634-44c7-a3ca-8ffc9a40277e\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.792010 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6d73a928-b634-44c7-a3ca-8ffc9a40277e-var-lock\") pod \"installer-9-crc\" (UID: \"6d73a928-b634-44c7-a3ca-8ffc9a40277e\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.792077 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6d73a928-b634-44c7-a3ca-8ffc9a40277e-kube-api-access\") pod \"installer-9-crc\" (UID: \"6d73a928-b634-44c7-a3ca-8ffc9a40277e\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.792073 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d73a928-b634-44c7-a3ca-8ffc9a40277e-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6d73a928-b634-44c7-a3ca-8ffc9a40277e\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.792183 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6d73a928-b634-44c7-a3ca-8ffc9a40277e-var-lock\") pod \"installer-9-crc\" (UID: \"6d73a928-b634-44c7-a3ca-8ffc9a40277e\") " 
pod="openshift-kube-apiserver/installer-9-crc" Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.813569 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6d73a928-b634-44c7-a3ca-8ffc9a40277e-kube-api-access\") pod \"installer-9-crc\" (UID: \"6d73a928-b634-44c7-a3ca-8ffc9a40277e\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 20 06:49:45 crc kubenswrapper[5094]: I0220 06:49:45.864669 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 20 06:49:46 crc kubenswrapper[5094]: I0220 06:49:46.401060 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 20 06:49:46 crc kubenswrapper[5094]: W0220 06:49:46.414159 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6d73a928_b634_44c7_a3ca_8ffc9a40277e.slice/crio-86853f3a851a6c1e525b0770bd0f66e0826a27532e42c59551bee7e0157d5b8f WatchSource:0}: Error finding container 86853f3a851a6c1e525b0770bd0f66e0826a27532e42c59551bee7e0157d5b8f: Status 404 returned error can't find the container with id 86853f3a851a6c1e525b0770bd0f66e0826a27532e42c59551bee7e0157d5b8f Feb 20 06:49:46 crc kubenswrapper[5094]: I0220 06:49:46.547963 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6d73a928-b634-44c7-a3ca-8ffc9a40277e","Type":"ContainerStarted","Data":"86853f3a851a6c1e525b0770bd0f66e0826a27532e42c59551bee7e0157d5b8f"} Feb 20 06:49:46 crc kubenswrapper[5094]: I0220 06:49:46.552130 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zj2v" event={"ID":"66c35d10-d5cc-468f-95a1-b56fde3961b3","Type":"ContainerStarted","Data":"beb8731903e3c8312abc22dea15752098a2c970cf1d17da2323d05db3f7ea0e4"} Feb 20 06:49:46 crc kubenswrapper[5094]: I0220 06:49:46.559747 5094 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85kbx" event={"ID":"4aa46aec-af59-49c6-9ff5-a08df3d68e5b","Type":"ContainerStarted","Data":"885427d878899a96fb10ec45e403743858b214cabb65ddef410990fe2fdd6e53"} Feb 20 06:49:46 crc kubenswrapper[5094]: I0220 06:49:46.579829 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7zj2v" podStartSLOduration=4.524458684 podStartE2EDuration="52.579803331s" podCreationTimestamp="2026-02-20 06:48:54 +0000 UTC" firstStartedPulling="2026-02-20 06:48:57.879552708 +0000 UTC m=+152.752179429" lastFinishedPulling="2026-02-20 06:49:45.934897325 +0000 UTC m=+200.807524076" observedRunningTime="2026-02-20 06:49:46.57208989 +0000 UTC m=+201.444716601" watchObservedRunningTime="2026-02-20 06:49:46.579803331 +0000 UTC m=+201.452430062" Feb 20 06:49:46 crc kubenswrapper[5094]: I0220 06:49:46.595207 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-85kbx" podStartSLOduration=3.455875063 podStartE2EDuration="51.595178265s" podCreationTimestamp="2026-02-20 06:48:55 +0000 UTC" firstStartedPulling="2026-02-20 06:48:57.884175638 +0000 UTC m=+152.756802349" lastFinishedPulling="2026-02-20 06:49:46.02347882 +0000 UTC m=+200.896105551" observedRunningTime="2026-02-20 06:49:46.593209959 +0000 UTC m=+201.465836680" watchObservedRunningTime="2026-02-20 06:49:46.595178265 +0000 UTC m=+201.467804986" Feb 20 06:49:47 crc kubenswrapper[5094]: I0220 06:49:47.567317 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6d73a928-b634-44c7-a3ca-8ffc9a40277e","Type":"ContainerStarted","Data":"c577f6af9ab3888d7eaafd2ffe5bc4c1228acf1f9ea1fd93255848a9f2a96cbc"} Feb 20 06:49:47 crc kubenswrapper[5094]: I0220 06:49:47.586902 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.5868788780000003 podStartE2EDuration="2.586878878s" podCreationTimestamp="2026-02-20 06:49:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:49:47.58275918 +0000 UTC m=+202.455385891" watchObservedRunningTime="2026-02-20 06:49:47.586878878 +0000 UTC m=+202.459505589" Feb 20 06:49:48 crc kubenswrapper[5094]: I0220 06:49:48.580257 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tpwcx" Feb 20 06:49:48 crc kubenswrapper[5094]: I0220 06:49:48.664216 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tpwcx" Feb 20 06:49:49 crc kubenswrapper[5094]: I0220 06:49:49.001588 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r5gcc" Feb 20 06:49:49 crc kubenswrapper[5094]: I0220 06:49:49.079370 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r5gcc" Feb 20 06:49:52 crc kubenswrapper[5094]: I0220 06:49:52.584054 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r5gcc"] Feb 20 06:49:52 crc kubenswrapper[5094]: I0220 06:49:52.584882 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r5gcc" podUID="4803c5cf-27e3-414a-8fa4-7e82730e311d" containerName="registry-server" containerID="cri-o://51d312f52e87068fa0bc859138dfee26f32d3cf50be3717bdff37c98bea619cb" gracePeriod=2 Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.091827 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r5gcc" Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.223026 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4803c5cf-27e3-414a-8fa4-7e82730e311d-utilities\") pod \"4803c5cf-27e3-414a-8fa4-7e82730e311d\" (UID: \"4803c5cf-27e3-414a-8fa4-7e82730e311d\") " Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.223199 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkfpb\" (UniqueName: \"kubernetes.io/projected/4803c5cf-27e3-414a-8fa4-7e82730e311d-kube-api-access-vkfpb\") pod \"4803c5cf-27e3-414a-8fa4-7e82730e311d\" (UID: \"4803c5cf-27e3-414a-8fa4-7e82730e311d\") " Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.224767 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4803c5cf-27e3-414a-8fa4-7e82730e311d-utilities" (OuterVolumeSpecName: "utilities") pod "4803c5cf-27e3-414a-8fa4-7e82730e311d" (UID: "4803c5cf-27e3-414a-8fa4-7e82730e311d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.225797 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4803c5cf-27e3-414a-8fa4-7e82730e311d-catalog-content\") pod \"4803c5cf-27e3-414a-8fa4-7e82730e311d\" (UID: \"4803c5cf-27e3-414a-8fa4-7e82730e311d\") " Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.226261 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4803c5cf-27e3-414a-8fa4-7e82730e311d-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.230328 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4803c5cf-27e3-414a-8fa4-7e82730e311d-kube-api-access-vkfpb" (OuterVolumeSpecName: "kube-api-access-vkfpb") pod "4803c5cf-27e3-414a-8fa4-7e82730e311d" (UID: "4803c5cf-27e3-414a-8fa4-7e82730e311d"). InnerVolumeSpecName "kube-api-access-vkfpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.327845 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkfpb\" (UniqueName: \"kubernetes.io/projected/4803c5cf-27e3-414a-8fa4-7e82730e311d-kube-api-access-vkfpb\") on node \"crc\" DevicePath \"\"" Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.391116 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4803c5cf-27e3-414a-8fa4-7e82730e311d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4803c5cf-27e3-414a-8fa4-7e82730e311d" (UID: "4803c5cf-27e3-414a-8fa4-7e82730e311d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.429190 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4803c5cf-27e3-414a-8fa4-7e82730e311d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.616822 5094 generic.go:334] "Generic (PLEG): container finished" podID="4803c5cf-27e3-414a-8fa4-7e82730e311d" containerID="51d312f52e87068fa0bc859138dfee26f32d3cf50be3717bdff37c98bea619cb" exitCode=0 Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.616903 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5gcc" event={"ID":"4803c5cf-27e3-414a-8fa4-7e82730e311d","Type":"ContainerDied","Data":"51d312f52e87068fa0bc859138dfee26f32d3cf50be3717bdff37c98bea619cb"} Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.616948 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r5gcc" event={"ID":"4803c5cf-27e3-414a-8fa4-7e82730e311d","Type":"ContainerDied","Data":"22aca6363f2206b11762898817ccc1128807f675094b438a82160e56aa1326d5"} Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.616981 5094 scope.go:117] "RemoveContainer" containerID="51d312f52e87068fa0bc859138dfee26f32d3cf50be3717bdff37c98bea619cb" Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.616988 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r5gcc" Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.669558 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r5gcc"] Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.672608 5094 scope.go:117] "RemoveContainer" containerID="6f5b6439f6f44bc36beab797da0a74ac28cd5d6755ba7368790483f6fcf685be" Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.677948 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r5gcc"] Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.705177 5094 scope.go:117] "RemoveContainer" containerID="7f79938926dc3a0a2752cae1f32aa8708d69b23b4db2a3c1830917ae04282805" Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.728804 5094 scope.go:117] "RemoveContainer" containerID="51d312f52e87068fa0bc859138dfee26f32d3cf50be3717bdff37c98bea619cb" Feb 20 06:49:53 crc kubenswrapper[5094]: E0220 06:49:53.731236 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51d312f52e87068fa0bc859138dfee26f32d3cf50be3717bdff37c98bea619cb\": container with ID starting with 51d312f52e87068fa0bc859138dfee26f32d3cf50be3717bdff37c98bea619cb not found: ID does not exist" containerID="51d312f52e87068fa0bc859138dfee26f32d3cf50be3717bdff37c98bea619cb" Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.731341 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51d312f52e87068fa0bc859138dfee26f32d3cf50be3717bdff37c98bea619cb"} err="failed to get container status \"51d312f52e87068fa0bc859138dfee26f32d3cf50be3717bdff37c98bea619cb\": rpc error: code = NotFound desc = could not find container \"51d312f52e87068fa0bc859138dfee26f32d3cf50be3717bdff37c98bea619cb\": container with ID starting with 51d312f52e87068fa0bc859138dfee26f32d3cf50be3717bdff37c98bea619cb not found: ID does 
not exist" Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.731469 5094 scope.go:117] "RemoveContainer" containerID="6f5b6439f6f44bc36beab797da0a74ac28cd5d6755ba7368790483f6fcf685be" Feb 20 06:49:53 crc kubenswrapper[5094]: E0220 06:49:53.732271 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f5b6439f6f44bc36beab797da0a74ac28cd5d6755ba7368790483f6fcf685be\": container with ID starting with 6f5b6439f6f44bc36beab797da0a74ac28cd5d6755ba7368790483f6fcf685be not found: ID does not exist" containerID="6f5b6439f6f44bc36beab797da0a74ac28cd5d6755ba7368790483f6fcf685be" Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.732341 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f5b6439f6f44bc36beab797da0a74ac28cd5d6755ba7368790483f6fcf685be"} err="failed to get container status \"6f5b6439f6f44bc36beab797da0a74ac28cd5d6755ba7368790483f6fcf685be\": rpc error: code = NotFound desc = could not find container \"6f5b6439f6f44bc36beab797da0a74ac28cd5d6755ba7368790483f6fcf685be\": container with ID starting with 6f5b6439f6f44bc36beab797da0a74ac28cd5d6755ba7368790483f6fcf685be not found: ID does not exist" Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.732392 5094 scope.go:117] "RemoveContainer" containerID="7f79938926dc3a0a2752cae1f32aa8708d69b23b4db2a3c1830917ae04282805" Feb 20 06:49:53 crc kubenswrapper[5094]: E0220 06:49:53.732885 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f79938926dc3a0a2752cae1f32aa8708d69b23b4db2a3c1830917ae04282805\": container with ID starting with 7f79938926dc3a0a2752cae1f32aa8708d69b23b4db2a3c1830917ae04282805 not found: ID does not exist" containerID="7f79938926dc3a0a2752cae1f32aa8708d69b23b4db2a3c1830917ae04282805" Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.732964 5094 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f79938926dc3a0a2752cae1f32aa8708d69b23b4db2a3c1830917ae04282805"} err="failed to get container status \"7f79938926dc3a0a2752cae1f32aa8708d69b23b4db2a3c1830917ae04282805\": rpc error: code = NotFound desc = could not find container \"7f79938926dc3a0a2752cae1f32aa8708d69b23b4db2a3c1830917ae04282805\": container with ID starting with 7f79938926dc3a0a2752cae1f32aa8708d69b23b4db2a3c1830917ae04282805 not found: ID does not exist" Feb 20 06:49:53 crc kubenswrapper[5094]: I0220 06:49:53.854628 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4803c5cf-27e3-414a-8fa4-7e82730e311d" path="/var/lib/kubelet/pods/4803c5cf-27e3-414a-8fa4-7e82730e311d/volumes" Feb 20 06:49:55 crc kubenswrapper[5094]: I0220 06:49:55.340038 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:49:55 crc kubenswrapper[5094]: I0220 06:49:55.340101 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:49:55 crc kubenswrapper[5094]: I0220 06:49:55.419494 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:49:55 crc kubenswrapper[5094]: I0220 06:49:55.581059 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-c94fj" Feb 20 06:49:55 crc kubenswrapper[5094]: I0220 06:49:55.695602 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:49:55 crc kubenswrapper[5094]: I0220 06:49:55.996742 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:49:55 crc kubenswrapper[5094]: I0220 06:49:55.996834 5094 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:49:56 crc kubenswrapper[5094]: I0220 06:49:56.068291 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:49:56 crc kubenswrapper[5094]: I0220 06:49:56.707251 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:49:57 crc kubenswrapper[5094]: I0220 06:49:57.980564 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-85kbx"] Feb 20 06:49:58 crc kubenswrapper[5094]: I0220 06:49:58.655536 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-85kbx" podUID="4aa46aec-af59-49c6-9ff5-a08df3d68e5b" containerName="registry-server" containerID="cri-o://885427d878899a96fb10ec45e403743858b214cabb65ddef410990fe2fdd6e53" gracePeriod=2 Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.070176 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.219310 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-catalog-content\") pod \"4aa46aec-af59-49c6-9ff5-a08df3d68e5b\" (UID: \"4aa46aec-af59-49c6-9ff5-a08df3d68e5b\") " Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.219585 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmmb7\" (UniqueName: \"kubernetes.io/projected/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-kube-api-access-gmmb7\") pod \"4aa46aec-af59-49c6-9ff5-a08df3d68e5b\" (UID: \"4aa46aec-af59-49c6-9ff5-a08df3d68e5b\") " Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.219655 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-utilities\") pod \"4aa46aec-af59-49c6-9ff5-a08df3d68e5b\" (UID: \"4aa46aec-af59-49c6-9ff5-a08df3d68e5b\") " Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.220424 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-utilities" (OuterVolumeSpecName: "utilities") pod "4aa46aec-af59-49c6-9ff5-a08df3d68e5b" (UID: "4aa46aec-af59-49c6-9ff5-a08df3d68e5b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.220807 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.233176 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-kube-api-access-gmmb7" (OuterVolumeSpecName: "kube-api-access-gmmb7") pod "4aa46aec-af59-49c6-9ff5-a08df3d68e5b" (UID: "4aa46aec-af59-49c6-9ff5-a08df3d68e5b"). InnerVolumeSpecName "kube-api-access-gmmb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.275914 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4aa46aec-af59-49c6-9ff5-a08df3d68e5b" (UID: "4aa46aec-af59-49c6-9ff5-a08df3d68e5b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.322566 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.322796 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmmb7\" (UniqueName: \"kubernetes.io/projected/4aa46aec-af59-49c6-9ff5-a08df3d68e5b-kube-api-access-gmmb7\") on node \"crc\" DevicePath \"\"" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.665095 5094 generic.go:334] "Generic (PLEG): container finished" podID="4aa46aec-af59-49c6-9ff5-a08df3d68e5b" containerID="885427d878899a96fb10ec45e403743858b214cabb65ddef410990fe2fdd6e53" exitCode=0 Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.665393 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85kbx" event={"ID":"4aa46aec-af59-49c6-9ff5-a08df3d68e5b","Type":"ContainerDied","Data":"885427d878899a96fb10ec45e403743858b214cabb65ddef410990fe2fdd6e53"} Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.665482 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85kbx" event={"ID":"4aa46aec-af59-49c6-9ff5-a08df3d68e5b","Type":"ContainerDied","Data":"52b9be5e00a8e14d989754bcec98f6733777f2e861c8ab554e5876f65f7c7c0b"} Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.665588 5094 scope.go:117] "RemoveContainer" containerID="885427d878899a96fb10ec45e403743858b214cabb65ddef410990fe2fdd6e53" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.665829 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-85kbx" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.705033 5094 scope.go:117] "RemoveContainer" containerID="a331bfe9d22fc560ff4e659b1f997870c2ddce54f3c7858303fd34aa21993025" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.720523 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-85kbx"] Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.728231 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-85kbx"] Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.735922 5094 scope.go:117] "RemoveContainer" containerID="8667a8c7af968dab77a30064c202565770814938ef52fa50a032b1a65aa555d9" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.773672 5094 scope.go:117] "RemoveContainer" containerID="885427d878899a96fb10ec45e403743858b214cabb65ddef410990fe2fdd6e53" Feb 20 06:49:59 crc kubenswrapper[5094]: E0220 06:49:59.775637 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"885427d878899a96fb10ec45e403743858b214cabb65ddef410990fe2fdd6e53\": container with ID starting with 885427d878899a96fb10ec45e403743858b214cabb65ddef410990fe2fdd6e53 not found: ID does not exist" containerID="885427d878899a96fb10ec45e403743858b214cabb65ddef410990fe2fdd6e53" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.775699 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"885427d878899a96fb10ec45e403743858b214cabb65ddef410990fe2fdd6e53"} err="failed to get container status \"885427d878899a96fb10ec45e403743858b214cabb65ddef410990fe2fdd6e53\": rpc error: code = NotFound desc = could not find container \"885427d878899a96fb10ec45e403743858b214cabb65ddef410990fe2fdd6e53\": container with ID starting with 885427d878899a96fb10ec45e403743858b214cabb65ddef410990fe2fdd6e53 not 
found: ID does not exist" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.775977 5094 scope.go:117] "RemoveContainer" containerID="a331bfe9d22fc560ff4e659b1f997870c2ddce54f3c7858303fd34aa21993025" Feb 20 06:49:59 crc kubenswrapper[5094]: E0220 06:49:59.777336 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a331bfe9d22fc560ff4e659b1f997870c2ddce54f3c7858303fd34aa21993025\": container with ID starting with a331bfe9d22fc560ff4e659b1f997870c2ddce54f3c7858303fd34aa21993025 not found: ID does not exist" containerID="a331bfe9d22fc560ff4e659b1f997870c2ddce54f3c7858303fd34aa21993025" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.777430 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a331bfe9d22fc560ff4e659b1f997870c2ddce54f3c7858303fd34aa21993025"} err="failed to get container status \"a331bfe9d22fc560ff4e659b1f997870c2ddce54f3c7858303fd34aa21993025\": rpc error: code = NotFound desc = could not find container \"a331bfe9d22fc560ff4e659b1f997870c2ddce54f3c7858303fd34aa21993025\": container with ID starting with a331bfe9d22fc560ff4e659b1f997870c2ddce54f3c7858303fd34aa21993025 not found: ID does not exist" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.777495 5094 scope.go:117] "RemoveContainer" containerID="8667a8c7af968dab77a30064c202565770814938ef52fa50a032b1a65aa555d9" Feb 20 06:49:59 crc kubenswrapper[5094]: E0220 06:49:59.778896 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8667a8c7af968dab77a30064c202565770814938ef52fa50a032b1a65aa555d9\": container with ID starting with 8667a8c7af968dab77a30064c202565770814938ef52fa50a032b1a65aa555d9 not found: ID does not exist" containerID="8667a8c7af968dab77a30064c202565770814938ef52fa50a032b1a65aa555d9" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.778961 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8667a8c7af968dab77a30064c202565770814938ef52fa50a032b1a65aa555d9"} err="failed to get container status \"8667a8c7af968dab77a30064c202565770814938ef52fa50a032b1a65aa555d9\": rpc error: code = NotFound desc = could not find container \"8667a8c7af968dab77a30064c202565770814938ef52fa50a032b1a65aa555d9\": container with ID starting with 8667a8c7af968dab77a30064c202565770814938ef52fa50a032b1a65aa555d9 not found: ID does not exist" Feb 20 06:49:59 crc kubenswrapper[5094]: I0220 06:49:59.849077 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4aa46aec-af59-49c6-9ff5-a08df3d68e5b" path="/var/lib/kubelet/pods/4aa46aec-af59-49c6-9ff5-a08df3d68e5b/volumes" Feb 20 06:50:01 crc kubenswrapper[5094]: I0220 06:50:01.490404 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" podUID="3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" containerName="oauth-openshift" containerID="cri-o://756c75b3fc1e466feb787f451786b0673b156a7f8bc22ab6ff9765d0ad70b734" gracePeriod=15 Feb 20 06:50:01 crc kubenswrapper[5094]: I0220 06:50:01.689991 5094 generic.go:334] "Generic (PLEG): container finished" podID="3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" containerID="756c75b3fc1e466feb787f451786b0673b156a7f8bc22ab6ff9765d0ad70b734" exitCode=0 Feb 20 06:50:01 crc kubenswrapper[5094]: I0220 06:50:01.690051 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" event={"ID":"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c","Type":"ContainerDied","Data":"756c75b3fc1e466feb787f451786b0673b156a7f8bc22ab6ff9765d0ad70b734"} Feb 20 06:50:01 crc kubenswrapper[5094]: I0220 06:50:01.995474 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.167095 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-router-certs\") pod \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.167218 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-cliconfig\") pod \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.167332 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfrvq\" (UniqueName: \"kubernetes.io/projected/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-kube-api-access-zfrvq\") pod \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.167430 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-service-ca\") pod \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.167479 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-audit-policies\") pod \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " Feb 20 06:50:02 crc 
kubenswrapper[5094]: I0220 06:50:02.167526 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-trusted-ca-bundle\") pod \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.167621 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-idp-0-file-data\") pod \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.167662 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-audit-dir\") pod \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.167761 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-ocp-branding-template\") pod \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.167819 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-session\") pod \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.167884 5094 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-login\") pod \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.167976 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-error\") pod \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.168041 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-serving-cert\") pod \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.168109 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-provider-selection\") pod \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\" (UID: \"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c\") " Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.171348 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" (UID: "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.172319 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" (UID: "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.172781 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" (UID: "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.173457 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" (UID: "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.173949 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" (UID: "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.177518 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" (UID: "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.178156 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-kube-api-access-zfrvq" (OuterVolumeSpecName: "kube-api-access-zfrvq") pod "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" (UID: "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c"). InnerVolumeSpecName "kube-api-access-zfrvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.178339 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" (UID: "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.179430 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" (UID: "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.179642 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" (UID: "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.180461 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" (UID: "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.180752 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" (UID: "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.181762 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" (UID: "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.182021 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" (UID: "3afdb82a-4e8f-4c6c-9552-e26c0d715a7c"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.269803 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.269870 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.269910 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.269942 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.269971 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.269997 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.270027 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.270054 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.270079 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfrvq\" (UniqueName: \"kubernetes.io/projected/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-kube-api-access-zfrvq\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.270107 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.270135 5094 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.270161 5094 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.270187 5094 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.270214 5094 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.701134 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" event={"ID":"3afdb82a-4e8f-4c6c-9552-e26c0d715a7c","Type":"ContainerDied","Data":"48ac16689b00193d6e154a981653d9fe7dd39018c0acc1a7610d05cb116747a3"} Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.701241 5094 scope.go:117] "RemoveContainer" containerID="756c75b3fc1e466feb787f451786b0673b156a7f8bc22ab6ff9765d0ad70b734" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.701391 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-bslb9" Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.746436 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bslb9"] Feb 20 06:50:02 crc kubenswrapper[5094]: I0220 06:50:02.749735 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-bslb9"] Feb 20 06:50:03 crc kubenswrapper[5094]: I0220 06:50:03.848052 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" path="/var/lib/kubelet/pods/3afdb82a-4e8f-4c6c-9552-e26c0d715a7c/volumes" Feb 20 06:50:04 crc kubenswrapper[5094]: I0220 06:50:04.107035 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 06:50:04 crc kubenswrapper[5094]: I0220 06:50:04.107119 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 06:50:04 crc kubenswrapper[5094]: I0220 06:50:04.107175 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:50:04 crc kubenswrapper[5094]: I0220 06:50:04.107916 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 06:50:04 crc kubenswrapper[5094]: I0220 06:50:04.108004 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f" gracePeriod=600 Feb 20 06:50:04 crc kubenswrapper[5094]: I0220 06:50:04.718525 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f" exitCode=0 Feb 20 06:50:04 crc kubenswrapper[5094]: I0220 06:50:04.718595 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f"} Feb 20 06:50:04 crc kubenswrapper[5094]: I0220 06:50:04.718667 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"c4c7f07b9c4d1d267d7fb57a087e1e994c52fa85636dd6484f657b9804207645"} Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.277375 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7b964c775c-zqtz6"] Feb 20 06:50:11 crc kubenswrapper[5094]: E0220 06:50:11.278548 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4803c5cf-27e3-414a-8fa4-7e82730e311d" containerName="registry-server" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.278572 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4803c5cf-27e3-414a-8fa4-7e82730e311d" containerName="registry-server" Feb 20 
06:50:11 crc kubenswrapper[5094]: E0220 06:50:11.278604 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4803c5cf-27e3-414a-8fa4-7e82730e311d" containerName="extract-utilities" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.278618 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4803c5cf-27e3-414a-8fa4-7e82730e311d" containerName="extract-utilities" Feb 20 06:50:11 crc kubenswrapper[5094]: E0220 06:50:11.278633 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aa46aec-af59-49c6-9ff5-a08df3d68e5b" containerName="extract-content" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.278645 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa46aec-af59-49c6-9ff5-a08df3d68e5b" containerName="extract-content" Feb 20 06:50:11 crc kubenswrapper[5094]: E0220 06:50:11.278667 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aa46aec-af59-49c6-9ff5-a08df3d68e5b" containerName="extract-utilities" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.278680 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa46aec-af59-49c6-9ff5-a08df3d68e5b" containerName="extract-utilities" Feb 20 06:50:11 crc kubenswrapper[5094]: E0220 06:50:11.278726 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" containerName="oauth-openshift" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.278741 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" containerName="oauth-openshift" Feb 20 06:50:11 crc kubenswrapper[5094]: E0220 06:50:11.278764 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aa46aec-af59-49c6-9ff5-a08df3d68e5b" containerName="registry-server" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.278777 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa46aec-af59-49c6-9ff5-a08df3d68e5b" containerName="registry-server" Feb 20 
06:50:11 crc kubenswrapper[5094]: E0220 06:50:11.278798 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4803c5cf-27e3-414a-8fa4-7e82730e311d" containerName="extract-content" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.278810 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4803c5cf-27e3-414a-8fa4-7e82730e311d" containerName="extract-content" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.278966 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="3afdb82a-4e8f-4c6c-9552-e26c0d715a7c" containerName="oauth-openshift" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.278988 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4803c5cf-27e3-414a-8fa4-7e82730e311d" containerName="registry-server" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.279010 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aa46aec-af59-49c6-9ff5-a08df3d68e5b" containerName="registry-server" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.279654 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.289626 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.289637 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.289930 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.290099 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.290311 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.290355 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.290661 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.293167 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.293215 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.293252 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 20 06:50:11 crc 
kubenswrapper[5094]: I0220 06:50:11.293444 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.295388 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.310781 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.311384 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7b964c775c-zqtz6"] Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.322756 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.341272 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.432745 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-service-ca\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.432811 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " 
pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.432840 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.432861 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.432885 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-user-template-error\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.432911 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-session\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.432931 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-user-template-login\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.433768 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.433931 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b1b4e8ed-3d3f-4742-805b-056836f1216d-audit-policies\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.433987 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-router-certs\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.434155 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.434222 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.434317 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzcv4\" (UniqueName: \"kubernetes.io/projected/b1b4e8ed-3d3f-4742-805b-056836f1216d-kube-api-access-nzcv4\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.434440 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b1b4e8ed-3d3f-4742-805b-056836f1216d-audit-dir\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.536185 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " 
pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.536301 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.536447 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzcv4\" (UniqueName: \"kubernetes.io/projected/b1b4e8ed-3d3f-4742-805b-056836f1216d-kube-api-access-nzcv4\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.536492 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b1b4e8ed-3d3f-4742-805b-056836f1216d-audit-dir\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.536581 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-service-ca\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.536619 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.536654 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.536695 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.536785 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b1b4e8ed-3d3f-4742-805b-056836f1216d-audit-dir\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.536876 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-user-template-error\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 
06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.536940 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-session\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.536993 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-user-template-login\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.537095 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.537152 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b1b4e8ed-3d3f-4742-805b-056836f1216d-audit-policies\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.537192 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-router-certs\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.537748 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-service-ca\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.538835 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.538968 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.539115 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b1b4e8ed-3d3f-4742-805b-056836f1216d-audit-policies\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc 
kubenswrapper[5094]: I0220 06:50:11.547154 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-user-template-login\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.547193 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.548243 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-router-certs\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.551031 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.552160 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-session\") pod 
\"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.552178 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-user-template-error\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.552284 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.556295 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b1b4e8ed-3d3f-4742-805b-056836f1216d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.558904 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzcv4\" (UniqueName: \"kubernetes.io/projected/b1b4e8ed-3d3f-4742-805b-056836f1216d-kube-api-access-nzcv4\") pod \"oauth-openshift-7b964c775c-zqtz6\" (UID: \"b1b4e8ed-3d3f-4742-805b-056836f1216d\") " pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:11 crc kubenswrapper[5094]: I0220 06:50:11.621846 5094 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:12 crc kubenswrapper[5094]: I0220 06:50:12.073534 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7b964c775c-zqtz6"] Feb 20 06:50:12 crc kubenswrapper[5094]: I0220 06:50:12.781300 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" event={"ID":"b1b4e8ed-3d3f-4742-805b-056836f1216d","Type":"ContainerStarted","Data":"cf1d1661019c22ef5dbd1c613cc1e5fb4ec2e5a7460b946130fe26bdcc99110c"} Feb 20 06:50:12 crc kubenswrapper[5094]: I0220 06:50:12.781781 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:12 crc kubenswrapper[5094]: I0220 06:50:12.781812 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" event={"ID":"b1b4e8ed-3d3f-4742-805b-056836f1216d","Type":"ContainerStarted","Data":"5d2bb7f9c9b2825c507f1b0a31c03d0ab6fb140b332003f5d5188121804a3b9c"} Feb 20 06:50:12 crc kubenswrapper[5094]: I0220 06:50:12.814646 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" podStartSLOduration=36.814611669 podStartE2EDuration="36.814611669s" podCreationTimestamp="2026-02-20 06:49:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:50:12.810722597 +0000 UTC m=+227.683349308" watchObservedRunningTime="2026-02-20 06:50:12.814611669 +0000 UTC m=+227.687238410" Feb 20 06:50:12 crc kubenswrapper[5094]: I0220 06:50:12.939763 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7b964c775c-zqtz6" Feb 20 06:50:24 crc 
kubenswrapper[5094]: I0220 06:50:24.822131 5094 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.824009 5094 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.824044 5094 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 20 06:50:24 crc kubenswrapper[5094]: E0220 06:50:24.824221 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.824238 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 20 06:50:24 crc kubenswrapper[5094]: E0220 06:50:24.824251 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.824260 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 20 06:50:24 crc kubenswrapper[5094]: E0220 06:50:24.824272 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.824283 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 20 06:50:24 crc kubenswrapper[5094]: E0220 06:50:24.824298 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.824308 5094 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 20 06:50:24 crc kubenswrapper[5094]: E0220 06:50:24.824321 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.824334 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 20 06:50:24 crc kubenswrapper[5094]: E0220 06:50:24.824357 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.824385 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.824534 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.824548 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.824559 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.824573 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.824587 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver" Feb 20 06:50:24 crc kubenswrapper[5094]: E0220 06:50:24.824754 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.824764 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.824907 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.825308 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.826078 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781" gracePeriod=15 Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.826243 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23" gracePeriod=15 Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.826302 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3" gracePeriod=15 Feb 20 
06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.826345 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1" gracePeriod=15 Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.826396 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2" gracePeriod=15 Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.830312 5094 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.952574 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.953055 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.953109 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.953128 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.953151 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.953518 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.953630 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:24 crc kubenswrapper[5094]: I0220 06:50:24.954612 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.055550 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.055632 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.055757 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.055696 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.055783 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.055874 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.056741 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.056743 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.056816 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.056925 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.056976 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.057040 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.057063 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.057119 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.057136 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.057178 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.883063 5094 generic.go:334] "Generic (PLEG): container finished" podID="6d73a928-b634-44c7-a3ca-8ffc9a40277e" containerID="c577f6af9ab3888d7eaafd2ffe5bc4c1228acf1f9ea1fd93255848a9f2a96cbc" exitCode=0 Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.883174 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6d73a928-b634-44c7-a3ca-8ffc9a40277e","Type":"ContainerDied","Data":"c577f6af9ab3888d7eaafd2ffe5bc4c1228acf1f9ea1fd93255848a9f2a96cbc"} Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.884885 5094 status_manager.go:851] "Failed to get status for pod" podUID="6d73a928-b634-44c7-a3ca-8ffc9a40277e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.888483 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.891670 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 20 06:50:25 crc 
kubenswrapper[5094]: I0220 06:50:25.892993 5094 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23" exitCode=0 Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.893031 5094 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3" exitCode=0 Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.893041 5094 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1" exitCode=0 Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.893058 5094 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2" exitCode=2 Feb 20 06:50:25 crc kubenswrapper[5094]: I0220 06:50:25.893091 5094 scope.go:117] "RemoveContainer" containerID="f51fd3423d9b8342c50ce578789b64ee5f724f64351e9953baeacb647785f5f1" Feb 20 06:50:26 crc kubenswrapper[5094]: E0220 06:50:26.659301 5094 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:26 crc kubenswrapper[5094]: E0220 06:50:26.660785 5094 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:26 crc kubenswrapper[5094]: E0220 06:50:26.661442 5094 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial 
tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:26 crc kubenswrapper[5094]: E0220 06:50:26.661939 5094 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:26 crc kubenswrapper[5094]: E0220 06:50:26.662347 5094 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:26 crc kubenswrapper[5094]: I0220 06:50:26.662391 5094 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 20 06:50:26 crc kubenswrapper[5094]: E0220 06:50:26.662948 5094 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="200ms" Feb 20 06:50:26 crc kubenswrapper[5094]: E0220 06:50:26.864601 5094 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="400ms" Feb 20 06:50:26 crc kubenswrapper[5094]: I0220 06:50:26.936301 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 20 06:50:27 crc kubenswrapper[5094]: E0220 06:50:27.266336 5094 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": 
dial tcp 38.102.83.188:6443: connect: connection refused" interval="800ms" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.291543 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.292544 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.293405 5094 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.294069 5094 status_manager.go:851] "Failed to get status for pod" podUID="6d73a928-b634-44c7-a3ca-8ffc9a40277e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.296858 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.297414 5094 status_manager.go:851] "Failed to get status for pod" podUID="6d73a928-b634-44c7-a3ca-8ffc9a40277e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.298177 5094 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.314857 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.314893 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.314919 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.315058 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.315074 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.315160 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.315401 5094 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.315432 5094 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.315442 5094 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.416796 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6d73a928-b634-44c7-a3ca-8ffc9a40277e-kube-api-access\") pod \"6d73a928-b634-44c7-a3ca-8ffc9a40277e\" (UID: \"6d73a928-b634-44c7-a3ca-8ffc9a40277e\") " Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.416949 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6d73a928-b634-44c7-a3ca-8ffc9a40277e-var-lock\") pod \"6d73a928-b634-44c7-a3ca-8ffc9a40277e\" (UID: \"6d73a928-b634-44c7-a3ca-8ffc9a40277e\") " Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.417022 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d73a928-b634-44c7-a3ca-8ffc9a40277e-kubelet-dir\") pod \"6d73a928-b634-44c7-a3ca-8ffc9a40277e\" (UID: \"6d73a928-b634-44c7-a3ca-8ffc9a40277e\") " Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.417127 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/host-path/6d73a928-b634-44c7-a3ca-8ffc9a40277e-var-lock" (OuterVolumeSpecName: "var-lock") pod "6d73a928-b634-44c7-a3ca-8ffc9a40277e" (UID: "6d73a928-b634-44c7-a3ca-8ffc9a40277e"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.417266 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d73a928-b634-44c7-a3ca-8ffc9a40277e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6d73a928-b634-44c7-a3ca-8ffc9a40277e" (UID: "6d73a928-b634-44c7-a3ca-8ffc9a40277e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.417363 5094 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6d73a928-b634-44c7-a3ca-8ffc9a40277e-var-lock\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.425000 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d73a928-b634-44c7-a3ca-8ffc9a40277e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6d73a928-b634-44c7-a3ca-8ffc9a40277e" (UID: "6d73a928-b634-44c7-a3ca-8ffc9a40277e"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.519725 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6d73a928-b634-44c7-a3ca-8ffc9a40277e-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.519769 5094 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d73a928-b634-44c7-a3ca-8ffc9a40277e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.848602 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.953206 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6d73a928-b634-44c7-a3ca-8ffc9a40277e","Type":"ContainerDied","Data":"86853f3a851a6c1e525b0770bd0f66e0826a27532e42c59551bee7e0157d5b8f"} Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.953278 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86853f3a851a6c1e525b0770bd0f66e0826a27532e42c59551bee7e0157d5b8f" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.954911 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.961075 5094 status_manager.go:851] "Failed to get status for pod" podUID="6d73a928-b634-44c7-a3ca-8ffc9a40277e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.965007 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.966847 5094 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781" exitCode=0 Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.966920 5094 scope.go:117] "RemoveContainer" containerID="99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.967007 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.967957 5094 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.968555 5094 status_manager.go:851] "Failed to get status for pod" podUID="6d73a928-b634-44c7-a3ca-8ffc9a40277e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.970736 5094 status_manager.go:851] "Failed to get status for pod" podUID="6d73a928-b634-44c7-a3ca-8ffc9a40277e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.970948 5094 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:27 crc kubenswrapper[5094]: I0220 06:50:27.990931 5094 scope.go:117] "RemoveContainer" containerID="be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3" Feb 20 06:50:28 crc kubenswrapper[5094]: I0220 06:50:28.010425 5094 scope.go:117] "RemoveContainer" containerID="64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1" Feb 20 06:50:28 crc 
kubenswrapper[5094]: I0220 06:50:28.030329 5094 scope.go:117] "RemoveContainer" containerID="2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2" Feb 20 06:50:28 crc kubenswrapper[5094]: I0220 06:50:28.047475 5094 scope.go:117] "RemoveContainer" containerID="e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781" Feb 20 06:50:28 crc kubenswrapper[5094]: E0220 06:50:28.067397 5094 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="1.6s" Feb 20 06:50:28 crc kubenswrapper[5094]: I0220 06:50:28.078032 5094 scope.go:117] "RemoveContainer" containerID="c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326" Feb 20 06:50:28 crc kubenswrapper[5094]: I0220 06:50:28.100601 5094 scope.go:117] "RemoveContainer" containerID="99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23" Feb 20 06:50:28 crc kubenswrapper[5094]: E0220 06:50:28.101287 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\": container with ID starting with 99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23 not found: ID does not exist" containerID="99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23" Feb 20 06:50:28 crc kubenswrapper[5094]: I0220 06:50:28.101338 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23"} err="failed to get container status \"99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\": rpc error: code = NotFound desc = could not find container \"99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23\": container with ID starting with 
99ffcf225cd70a5f1686668ca4bc3678037b1360ad7f13d73f3049cb006c8c23 not found: ID does not exist" Feb 20 06:50:28 crc kubenswrapper[5094]: I0220 06:50:28.101378 5094 scope.go:117] "RemoveContainer" containerID="be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3" Feb 20 06:50:28 crc kubenswrapper[5094]: E0220 06:50:28.101697 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\": container with ID starting with be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3 not found: ID does not exist" containerID="be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3" Feb 20 06:50:28 crc kubenswrapper[5094]: I0220 06:50:28.101731 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3"} err="failed to get container status \"be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\": rpc error: code = NotFound desc = could not find container \"be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3\": container with ID starting with be033e9491fc6dbdc8b945199da62e82054061e825ba841f2784b8ba92251ca3 not found: ID does not exist" Feb 20 06:50:28 crc kubenswrapper[5094]: I0220 06:50:28.101748 5094 scope.go:117] "RemoveContainer" containerID="64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1" Feb 20 06:50:28 crc kubenswrapper[5094]: E0220 06:50:28.102026 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\": container with ID starting with 64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1 not found: ID does not exist" containerID="64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1" Feb 20 06:50:28 crc 
kubenswrapper[5094]: I0220 06:50:28.102055 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1"} err="failed to get container status \"64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\": rpc error: code = NotFound desc = could not find container \"64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1\": container with ID starting with 64528527771ddbfd8258f8125a6b80cef8826e12a5207b6505a30ee7758a1ce1 not found: ID does not exist" Feb 20 06:50:28 crc kubenswrapper[5094]: I0220 06:50:28.102079 5094 scope.go:117] "RemoveContainer" containerID="2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2" Feb 20 06:50:28 crc kubenswrapper[5094]: E0220 06:50:28.102361 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\": container with ID starting with 2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2 not found: ID does not exist" containerID="2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2" Feb 20 06:50:28 crc kubenswrapper[5094]: I0220 06:50:28.102384 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2"} err="failed to get container status \"2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\": rpc error: code = NotFound desc = could not find container \"2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2\": container with ID starting with 2283d24b9b88f0c8c415defef66c3e1405414d31655d9438c9479ad565c539d2 not found: ID does not exist" Feb 20 06:50:28 crc kubenswrapper[5094]: I0220 06:50:28.102398 5094 scope.go:117] "RemoveContainer" containerID="e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781" Feb 20 
06:50:28 crc kubenswrapper[5094]: E0220 06:50:28.102806 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\": container with ID starting with e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781 not found: ID does not exist" containerID="e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781" Feb 20 06:50:28 crc kubenswrapper[5094]: I0220 06:50:28.102828 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781"} err="failed to get container status \"e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\": rpc error: code = NotFound desc = could not find container \"e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781\": container with ID starting with e26a78cfed6941919466c1832a09bc61574959d53031003abcaf84fdc3e52781 not found: ID does not exist" Feb 20 06:50:28 crc kubenswrapper[5094]: I0220 06:50:28.102842 5094 scope.go:117] "RemoveContainer" containerID="c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326" Feb 20 06:50:28 crc kubenswrapper[5094]: E0220 06:50:28.103120 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\": container with ID starting with c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326 not found: ID does not exist" containerID="c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326" Feb 20 06:50:28 crc kubenswrapper[5094]: I0220 06:50:28.103160 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326"} err="failed to get container status 
\"c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\": rpc error: code = NotFound desc = could not find container \"c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326\": container with ID starting with c205f71a46e915aea88675741084a7614a4f464e51430e483dd51791c7fa6326 not found: ID does not exist" Feb 20 06:50:29 crc kubenswrapper[5094]: E0220 06:50:29.669093 5094 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="3.2s" Feb 20 06:50:29 crc kubenswrapper[5094]: E0220 06:50:29.862085 5094 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.188:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:29 crc kubenswrapper[5094]: I0220 06:50:29.862535 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:29 crc kubenswrapper[5094]: W0220 06:50:29.886481 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-6b473ffdced6481c794ee94c8670d7c42606b6651c103fea1f98f03e6c985511 WatchSource:0}: Error finding container 6b473ffdced6481c794ee94c8670d7c42606b6651c103fea1f98f03e6c985511: Status 404 returned error can't find the container with id 6b473ffdced6481c794ee94c8670d7c42606b6651c103fea1f98f03e6c985511 Feb 20 06:50:29 crc kubenswrapper[5094]: E0220 06:50:29.890443 5094 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.188:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895e1bce73e4261 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-20 06:50:29.889876577 +0000 UTC m=+244.762503318,LastTimestamp:2026-02-20 06:50:29.889876577 +0000 UTC m=+244.762503318,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 20 06:50:29 crc kubenswrapper[5094]: I0220 06:50:29.986765 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"6b473ffdced6481c794ee94c8670d7c42606b6651c103fea1f98f03e6c985511"} Feb 20 06:50:30 crc kubenswrapper[5094]: I0220 06:50:30.996385 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f56fc184c72efdb0715102f2218438947dc7e97e505f7b2b3b9699c62410b375"} Feb 20 06:50:30 crc kubenswrapper[5094]: E0220 06:50:30.997756 5094 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.188:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:30 crc kubenswrapper[5094]: I0220 06:50:30.997972 5094 status_manager.go:851] "Failed to get status for pod" podUID="6d73a928-b634-44c7-a3ca-8ffc9a40277e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:32 crc kubenswrapper[5094]: E0220 06:50:32.004137 5094 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.188:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:50:32 crc kubenswrapper[5094]: E0220 06:50:32.870342 5094 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.188:6443: connect: connection refused" interval="6.4s" Feb 20 06:50:35 crc kubenswrapper[5094]: I0220 
06:50:35.840067 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:35 crc kubenswrapper[5094]: I0220 06:50:35.843432 5094 status_manager.go:851] "Failed to get status for pod" podUID="6d73a928-b634-44c7-a3ca-8ffc9a40277e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:35 crc kubenswrapper[5094]: I0220 06:50:35.843855 5094 status_manager.go:851] "Failed to get status for pod" podUID="6d73a928-b634-44c7-a3ca-8ffc9a40277e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:35 crc kubenswrapper[5094]: I0220 06:50:35.869623 5094 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="069b1776-8adf-4339-bde2-43375d702571" Feb 20 06:50:35 crc kubenswrapper[5094]: I0220 06:50:35.869690 5094 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="069b1776-8adf-4339-bde2-43375d702571" Feb 20 06:50:35 crc kubenswrapper[5094]: E0220 06:50:35.870436 5094 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:35 crc kubenswrapper[5094]: I0220 06:50:35.871443 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:35 crc kubenswrapper[5094]: W0220 06:50:35.909531 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-dba34e838371845fa77bd0ebbc8e1a51372d4ff7dca8def5475a7a6f7bf35dd5 WatchSource:0}: Error finding container dba34e838371845fa77bd0ebbc8e1a51372d4ff7dca8def5475a7a6f7bf35dd5: Status 404 returned error can't find the container with id dba34e838371845fa77bd0ebbc8e1a51372d4ff7dca8def5475a7a6f7bf35dd5 Feb 20 06:50:36 crc kubenswrapper[5094]: I0220 06:50:36.034003 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"dba34e838371845fa77bd0ebbc8e1a51372d4ff7dca8def5475a7a6f7bf35dd5"} Feb 20 06:50:36 crc kubenswrapper[5094]: E0220 06:50:36.397417 5094 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.188:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1895e1bce73e4261 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-20 06:50:29.889876577 +0000 UTC m=+244.762503318,LastTimestamp:2026-02-20 06:50:29.889876577 +0000 UTC 
m=+244.762503318,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 20 06:50:37 crc kubenswrapper[5094]: I0220 06:50:37.047033 5094 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="494dbad7b7d8411ed8e984463240b80543a607b517f185d55b75bf813eeccbbe" exitCode=0 Feb 20 06:50:37 crc kubenswrapper[5094]: I0220 06:50:37.047316 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"494dbad7b7d8411ed8e984463240b80543a607b517f185d55b75bf813eeccbbe"} Feb 20 06:50:37 crc kubenswrapper[5094]: I0220 06:50:37.047929 5094 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="069b1776-8adf-4339-bde2-43375d702571" Feb 20 06:50:37 crc kubenswrapper[5094]: I0220 06:50:37.049413 5094 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="069b1776-8adf-4339-bde2-43375d702571" Feb 20 06:50:37 crc kubenswrapper[5094]: I0220 06:50:37.048450 5094 status_manager.go:851] "Failed to get status for pod" podUID="6d73a928-b634-44c7-a3ca-8ffc9a40277e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" Feb 20 06:50:37 crc kubenswrapper[5094]: E0220 06:50:37.050577 5094 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.188:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:38 crc kubenswrapper[5094]: I0220 06:50:38.058270 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"af8420f0b5117c09a4cdb5c47e69a98711286d7eadfe72e752164d708b1a2b0f"} Feb 20 06:50:38 crc kubenswrapper[5094]: I0220 06:50:38.058608 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c7f800776012017ad1d052ce16777d8304bc3ad26b7fbf28db344ce7a35b0301"} Feb 20 06:50:38 crc kubenswrapper[5094]: I0220 06:50:38.058618 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"904e144ecb263d525941a4e1ab624d7e8a91bfc293db2fc5f07d80e51b0cd0b8"} Feb 20 06:50:39 crc kubenswrapper[5094]: I0220 06:50:39.066915 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"25b9b98c9f4fc705d901f9620c1677b6dd9067458524adc0ca91eb739c7ee4af"} Feb 20 06:50:39 crc kubenswrapper[5094]: I0220 06:50:39.067686 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5a04797bd174bb60faa197510196c0308fed7b84b3154b23d2486c87c8ede926"} Feb 20 06:50:39 crc kubenswrapper[5094]: I0220 06:50:39.067186 5094 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="069b1776-8adf-4339-bde2-43375d702571" Feb 20 06:50:39 crc kubenswrapper[5094]: I0220 06:50:39.067793 5094 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="069b1776-8adf-4339-bde2-43375d702571" Feb 20 06:50:39 crc kubenswrapper[5094]: I0220 06:50:39.067731 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:39 crc kubenswrapper[5094]: I0220 06:50:39.070043 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 20 06:50:39 crc kubenswrapper[5094]: I0220 06:50:39.070091 5094 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d" exitCode=1 Feb 20 06:50:39 crc kubenswrapper[5094]: I0220 06:50:39.070120 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d"} Feb 20 06:50:39 crc kubenswrapper[5094]: I0220 06:50:39.070592 5094 scope.go:117] "RemoveContainer" containerID="124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d" Feb 20 06:50:40 crc kubenswrapper[5094]: I0220 06:50:40.080124 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 20 06:50:40 crc kubenswrapper[5094]: I0220 06:50:40.080195 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a44df77fb674c081b2ea537a72225268301f6777f79712ac42b7047fea3be20c"} Feb 20 06:50:40 crc kubenswrapper[5094]: I0220 06:50:40.871740 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:40 crc kubenswrapper[5094]: I0220 06:50:40.871939 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:40 crc kubenswrapper[5094]: I0220 06:50:40.881328 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:44 crc kubenswrapper[5094]: I0220 06:50:44.081204 5094 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:44 crc kubenswrapper[5094]: I0220 06:50:44.111295 5094 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="069b1776-8adf-4339-bde2-43375d702571" Feb 20 06:50:44 crc kubenswrapper[5094]: I0220 06:50:44.111345 5094 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="069b1776-8adf-4339-bde2-43375d702571" Feb 20 06:50:44 crc kubenswrapper[5094]: I0220 06:50:44.116676 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:45 crc kubenswrapper[5094]: I0220 06:50:45.127897 5094 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="069b1776-8adf-4339-bde2-43375d702571" Feb 20 06:50:45 crc kubenswrapper[5094]: I0220 06:50:45.127952 5094 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="069b1776-8adf-4339-bde2-43375d702571" Feb 20 06:50:45 crc kubenswrapper[5094]: I0220 06:50:45.861789 5094 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="6686d6e2-d4b4-4cc4-a557-70ca70f590a8" Feb 20 06:50:46 crc kubenswrapper[5094]: I0220 06:50:46.785443 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:50:46 crc kubenswrapper[5094]: I0220 06:50:46.785821 5094 
patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 20 06:50:46 crc kubenswrapper[5094]: I0220 06:50:46.785938 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 20 06:50:47 crc kubenswrapper[5094]: I0220 06:50:47.960664 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:50:50 crc kubenswrapper[5094]: I0220 06:50:50.465554 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 20 06:50:54 crc kubenswrapper[5094]: I0220 06:50:54.419047 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 20 06:50:54 crc kubenswrapper[5094]: I0220 06:50:54.555911 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 20 06:50:55 crc kubenswrapper[5094]: I0220 06:50:55.082528 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 20 06:50:55 crc kubenswrapper[5094]: I0220 06:50:55.097443 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 20 06:50:56 crc kubenswrapper[5094]: I0220 06:50:56.115793 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 20 
06:50:56 crc kubenswrapper[5094]: I0220 06:50:56.248885 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 20 06:50:56 crc kubenswrapper[5094]: I0220 06:50:56.266748 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 20 06:50:56 crc kubenswrapper[5094]: I0220 06:50:56.302823 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 20 06:50:56 crc kubenswrapper[5094]: I0220 06:50:56.624943 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 20 06:50:56 crc kubenswrapper[5094]: I0220 06:50:56.655084 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 20 06:50:56 crc kubenswrapper[5094]: I0220 06:50:56.685415 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 20 06:50:56 crc kubenswrapper[5094]: I0220 06:50:56.785106 5094 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 20 06:50:56 crc kubenswrapper[5094]: I0220 06:50:56.785183 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 20 06:50:56 
crc kubenswrapper[5094]: I0220 06:50:56.901445 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 20 06:50:57 crc kubenswrapper[5094]: I0220 06:50:57.144736 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 20 06:50:57 crc kubenswrapper[5094]: I0220 06:50:57.218046 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 20 06:50:57 crc kubenswrapper[5094]: I0220 06:50:57.309931 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 20 06:50:57 crc kubenswrapper[5094]: I0220 06:50:57.506356 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 20 06:50:57 crc kubenswrapper[5094]: I0220 06:50:57.583564 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 20 06:50:57 crc kubenswrapper[5094]: I0220 06:50:57.864900 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 20 06:50:57 crc kubenswrapper[5094]: I0220 06:50:57.955514 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 20 06:50:57 crc kubenswrapper[5094]: I0220 06:50:57.967571 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 20 06:50:58 crc kubenswrapper[5094]: I0220 06:50:58.048001 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 20 06:50:58 crc kubenswrapper[5094]: I0220 06:50:58.049127 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 20 06:50:58 
crc kubenswrapper[5094]: I0220 06:50:58.132559 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 20 06:50:58 crc kubenswrapper[5094]: I0220 06:50:58.192922 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 20 06:50:58 crc kubenswrapper[5094]: I0220 06:50:58.408002 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 20 06:50:58 crc kubenswrapper[5094]: I0220 06:50:58.438496 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 20 06:50:58 crc kubenswrapper[5094]: I0220 06:50:58.515360 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 20 06:50:58 crc kubenswrapper[5094]: I0220 06:50:58.567680 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 20 06:50:58 crc kubenswrapper[5094]: I0220 06:50:58.586764 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 20 06:50:58 crc kubenswrapper[5094]: I0220 06:50:58.612183 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 20 06:50:58 crc kubenswrapper[5094]: I0220 06:50:58.635287 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 20 06:50:58 crc kubenswrapper[5094]: I0220 06:50:58.884760 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 20 06:50:58 crc kubenswrapper[5094]: I0220 06:50:58.885878 5094 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 20 06:50:58 crc kubenswrapper[5094]: I0220 06:50:58.946331 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.034098 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.045088 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.048476 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.057033 5094 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.064935 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.065012 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.070529 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.096962 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=15.096930181 podStartE2EDuration="15.096930181s" podCreationTimestamp="2026-02-20 06:50:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:50:59.088630812 +0000 UTC m=+273.961257533" 
watchObservedRunningTime="2026-02-20 06:50:59.096930181 +0000 UTC m=+273.969556932" Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.296152 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.314943 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.468809 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.640777 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.821527 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.910165 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.949919 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 20 06:50:59 crc kubenswrapper[5094]: I0220 06:50:59.950100 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 20 06:51:00 crc kubenswrapper[5094]: I0220 06:51:00.051163 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 20 06:51:00 crc kubenswrapper[5094]: I0220 06:51:00.109000 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 20 06:51:00 crc kubenswrapper[5094]: I0220 06:51:00.123021 5094 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 20 06:51:00 crc kubenswrapper[5094]: I0220 06:51:00.137855 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 20 06:51:00 crc kubenswrapper[5094]: I0220 06:51:00.428722 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 20 06:51:00 crc kubenswrapper[5094]: I0220 06:51:00.503481 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 20 06:51:00 crc kubenswrapper[5094]: I0220 06:51:00.530481 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 20 06:51:00 crc kubenswrapper[5094]: I0220 06:51:00.627122 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 20 06:51:00 crc kubenswrapper[5094]: I0220 06:51:00.636651 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 20 06:51:00 crc kubenswrapper[5094]: I0220 06:51:00.729028 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 20 06:51:00 crc kubenswrapper[5094]: I0220 06:51:00.752492 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 20 06:51:00 crc kubenswrapper[5094]: I0220 06:51:00.851078 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 20 06:51:00 crc kubenswrapper[5094]: I0220 06:51:00.954629 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 20 06:51:00 crc kubenswrapper[5094]: I0220 06:51:00.988923 5094 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.036977 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.167653 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.253139 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.258248 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.308434 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.309853 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.314519 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.315773 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.370509 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.376271 5094 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.415421 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.533245 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.540886 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.589051 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.632649 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.655815 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.765257 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.823632 5094 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 20 06:51:01 crc kubenswrapper[5094]: I0220 06:51:01.915141 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.163004 5094 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.268872 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.289773 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.292637 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.347969 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.390055 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.416467 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.436899 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.503949 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.524550 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.581685 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.616192 
5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.631128 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.645562 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.647202 5094 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.697460 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.710664 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.716022 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.818920 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 20 06:51:02 crc kubenswrapper[5094]: I0220 06:51:02.944975 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.011241 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.020006 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 20 06:51:03 
crc kubenswrapper[5094]: I0220 06:51:03.089734 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.124747 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.315022 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.384020 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.418874 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.491243 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.522107 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.662343 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.758862 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.767774 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 
06:51:03.770348 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.826495 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.848698 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.871270 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.906869 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.954293 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 20 06:51:03 crc kubenswrapper[5094]: I0220 06:51:03.954728 5094 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 20 06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.076459 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 20 06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.096674 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 20 06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.102752 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 20 06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.233215 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 20 
06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.236299 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 20 06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.297044 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 20 06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.328150 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 20 06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.393674 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 20 06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.458327 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 20 06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.488788 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 20 06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.488891 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 20 06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.777381 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 20 06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.784231 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 20 06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.785330 5094 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 20 06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.816134 5094 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 20 06:51:04 crc kubenswrapper[5094]: I0220 06:51:04.958384 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.062325 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.064542 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.115683 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.196469 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.240683 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.247627 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.392986 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.417137 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.439455 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 
06:51:05.454620 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.476599 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.484145 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.678838 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.760606 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.878844 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.903073 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.917464 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 20 06:51:05 crc kubenswrapper[5094]: I0220 06:51:05.927588 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.021023 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.055023 5094 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.108272 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.157350 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.311141 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.342646 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.349884 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.356691 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.451244 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.452437 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.455682 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.466592 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 
06:51:06.501054 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.555911 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.570958 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.618794 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.677834 5094 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.678106 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://f56fc184c72efdb0715102f2218438947dc7e97e505f7b2b3b9699c62410b375" gracePeriod=5 Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.738038 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.744903 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.785374 5094 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" 
start-of-body= Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.785499 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.785609 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.787203 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"a44df77fb674c081b2ea537a72225268301f6777f79712ac42b7047fea3be20c"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.787436 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://a44df77fb674c081b2ea537a72225268301f6777f79712ac42b7047fea3be20c" gracePeriod=30 Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.801899 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.836347 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.960603 5094 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 20 06:51:06 crc kubenswrapper[5094]: I0220 06:51:06.968265 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.020786 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.096485 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.133827 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.276149 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.357584 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.449672 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.452372 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.606564 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.609088 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 
06:51:07.610915 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.611349 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.627455 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.756686 5094 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.803360 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.817604 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.904776 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.937494 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 20 06:51:07 crc kubenswrapper[5094]: I0220 06:51:07.954897 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.016666 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.111738 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 
20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.122522 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.171486 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.361615 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.485565 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.502052 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.539984 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.542029 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.618900 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.685935 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.709796 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.768874 5094 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.824647 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.837363 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.843656 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.849856 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.908011 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.912686 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 20 06:51:08 crc kubenswrapper[5094]: I0220 06:51:08.918375 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 20 06:51:09 crc kubenswrapper[5094]: I0220 06:51:09.045315 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 20 06:51:09 crc kubenswrapper[5094]: I0220 06:51:09.058879 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 20 06:51:09 crc kubenswrapper[5094]: I0220 06:51:09.061692 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 20 06:51:09 crc kubenswrapper[5094]: I0220 
06:51:09.086251 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 20 06:51:09 crc kubenswrapper[5094]: I0220 06:51:09.215300 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 20 06:51:09 crc kubenswrapper[5094]: I0220 06:51:09.296588 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 20 06:51:09 crc kubenswrapper[5094]: I0220 06:51:09.317801 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 20 06:51:09 crc kubenswrapper[5094]: I0220 06:51:09.361026 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 20 06:51:09 crc kubenswrapper[5094]: I0220 06:51:09.534192 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 20 06:51:09 crc kubenswrapper[5094]: I0220 06:51:09.616689 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 20 06:51:09 crc kubenswrapper[5094]: I0220 06:51:09.746772 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 20 06:51:09 crc kubenswrapper[5094]: I0220 06:51:09.785467 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 20 06:51:09 crc kubenswrapper[5094]: I0220 06:51:09.792452 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 20 06:51:09 crc kubenswrapper[5094]: I0220 06:51:09.852665 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 20 06:51:09 crc 
kubenswrapper[5094]: I0220 06:51:09.883965 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 20 06:51:10 crc kubenswrapper[5094]: I0220 06:51:10.100693 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 20 06:51:10 crc kubenswrapper[5094]: I0220 06:51:10.143518 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 20 06:51:10 crc kubenswrapper[5094]: I0220 06:51:10.171183 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 20 06:51:10 crc kubenswrapper[5094]: I0220 06:51:10.262476 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 20 06:51:10 crc kubenswrapper[5094]: I0220 06:51:10.315242 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 20 06:51:10 crc kubenswrapper[5094]: I0220 06:51:10.414592 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 20 06:51:10 crc kubenswrapper[5094]: I0220 06:51:10.483012 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 20 06:51:10 crc kubenswrapper[5094]: I0220 06:51:10.601470 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 20 06:51:10 crc kubenswrapper[5094]: I0220 06:51:10.646986 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 20 06:51:10 crc kubenswrapper[5094]: I0220 06:51:10.734194 5094 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 20 06:51:11 crc kubenswrapper[5094]: I0220 06:51:11.067035 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 20 06:51:11 crc kubenswrapper[5094]: I0220 06:51:11.344641 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 20 06:51:11 crc kubenswrapper[5094]: I0220 06:51:11.353646 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 20 06:51:11 crc kubenswrapper[5094]: I0220 06:51:11.515693 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 20 06:51:11 crc kubenswrapper[5094]: I0220 06:51:11.592573 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 20 06:51:11 crc kubenswrapper[5094]: I0220 06:51:11.665002 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 20 06:51:11 crc kubenswrapper[5094]: I0220 06:51:11.722441 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 20 06:51:11 crc kubenswrapper[5094]: I0220 06:51:11.804886 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.132646 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.284403 5094 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.284573 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.346002 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.346092 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.346115 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.346140 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.346297 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.346414 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.346491 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.346620 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.346832 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.347294 5094 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.347329 5094 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.347355 5094 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.347379 5094 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.360250 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.361094 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.361170 5094 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="f56fc184c72efdb0715102f2218438947dc7e97e505f7b2b3b9699c62410b375" exitCode=137 Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.361243 5094 scope.go:117] "RemoveContainer" containerID="f56fc184c72efdb0715102f2218438947dc7e97e505f7b2b3b9699c62410b375" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.361442 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.382397 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.428198 5094 scope.go:117] "RemoveContainer" containerID="f56fc184c72efdb0715102f2218438947dc7e97e505f7b2b3b9699c62410b375" Feb 20 06:51:12 crc kubenswrapper[5094]: E0220 06:51:12.428925 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f56fc184c72efdb0715102f2218438947dc7e97e505f7b2b3b9699c62410b375\": container with ID starting with f56fc184c72efdb0715102f2218438947dc7e97e505f7b2b3b9699c62410b375 not found: ID does not exist" containerID="f56fc184c72efdb0715102f2218438947dc7e97e505f7b2b3b9699c62410b375" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.429008 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f56fc184c72efdb0715102f2218438947dc7e97e505f7b2b3b9699c62410b375"} err="failed to get container status \"f56fc184c72efdb0715102f2218438947dc7e97e505f7b2b3b9699c62410b375\": rpc error: code = NotFound desc = could not find container \"f56fc184c72efdb0715102f2218438947dc7e97e505f7b2b3b9699c62410b375\": container with ID starting with f56fc184c72efdb0715102f2218438947dc7e97e505f7b2b3b9699c62410b375 not found: ID does not exist" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.449423 5094 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.800443 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 20 06:51:12 crc kubenswrapper[5094]: I0220 06:51:12.821385 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 20 06:51:13 crc kubenswrapper[5094]: I0220 06:51:13.853221 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 20 06:51:14 crc kubenswrapper[5094]: I0220 06:51:14.332131 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 20 06:51:14 crc kubenswrapper[5094]: I0220 06:51:14.672259 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 20 06:51:25 crc kubenswrapper[5094]: I0220 06:51:25.609567 5094 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 20 06:51:37 crc kubenswrapper[5094]: I0220 06:51:37.549873 5094 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 20 06:51:37 crc kubenswrapper[5094]: I0220 06:51:37.554426 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 20 06:51:37 crc kubenswrapper[5094]: I0220 06:51:37.554806 5094 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="a44df77fb674c081b2ea537a72225268301f6777f79712ac42b7047fea3be20c" exitCode=137 Feb 20 06:51:37 crc kubenswrapper[5094]: I0220 06:51:37.554866 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"a44df77fb674c081b2ea537a72225268301f6777f79712ac42b7047fea3be20c"} Feb 20 06:51:37 crc kubenswrapper[5094]: I0220 06:51:37.554916 5094 scope.go:117] "RemoveContainer" containerID="124bba28cff60a62e8bc9050ffdbd6d67703ddbc8adf2195f874ba4289c8b50d" Feb 20 06:51:38 crc kubenswrapper[5094]: I0220 06:51:38.567069 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 20 06:51:38 crc kubenswrapper[5094]: I0220 06:51:38.570911 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6f9d419f2b468dd916110a489ee71eeb5463e527d671db8c7f43354327174777"} Feb 20 06:51:46 crc kubenswrapper[5094]: I0220 06:51:46.785152 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:51:46 crc 
kubenswrapper[5094]: I0220 06:51:46.790773 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:51:47 crc kubenswrapper[5094]: I0220 06:51:47.631167 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:51:47 crc kubenswrapper[5094]: I0220 06:51:47.637488 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.755187 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl"] Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.757023 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" podUID="d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5" containerName="route-controller-manager" containerID="cri-o://bf96c34879e8b3b8b43d54fe3cde7b504e57f4682c37550a83f12cccfaf7d07c" gracePeriod=30 Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.763456 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fnbl8"] Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.763529 5094 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-qrtpl container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.763561 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" 
podUID="d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.763688 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" podUID="18cc290d-78be-42c6-af5b-3b8b86941eb2" containerName="controller-manager" containerID="cri-o://0f59159cf70471ec88327f095916949d208d3b4159ba3e8002ae3f04cc1b0b02" gracePeriod=30 Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.783293 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8zvf9"] Feb 20 06:51:57 crc kubenswrapper[5094]: E0220 06:51:57.783534 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.783548 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 20 06:51:57 crc kubenswrapper[5094]: E0220 06:51:57.783567 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d73a928-b634-44c7-a3ca-8ffc9a40277e" containerName="installer" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.783574 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d73a928-b634-44c7-a3ca-8ffc9a40277e" containerName="installer" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.783674 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.783690 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d73a928-b634-44c7-a3ca-8ffc9a40277e" containerName="installer" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 
06:51:57.784140 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.803557 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8zvf9"] Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.933012 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4f210823-4d80-4b35-aaef-bb100cf601dd-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.933077 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4f210823-4d80-4b35-aaef-bb100cf601dd-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.933127 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.933151 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4f210823-4d80-4b35-aaef-bb100cf601dd-registry-tls\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: 
\"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.933179 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4f210823-4d80-4b35-aaef-bb100cf601dd-registry-certificates\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.933199 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f210823-4d80-4b35-aaef-bb100cf601dd-bound-sa-token\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.933218 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhlpm\" (UniqueName: \"kubernetes.io/projected/4f210823-4d80-4b35-aaef-bb100cf601dd-kube-api-access-lhlpm\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.933247 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f210823-4d80-4b35-aaef-bb100cf601dd-trusted-ca\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:57 crc kubenswrapper[5094]: I0220 06:51:57.973024 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.034473 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4f210823-4d80-4b35-aaef-bb100cf601dd-registry-tls\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.034535 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4f210823-4d80-4b35-aaef-bb100cf601dd-registry-certificates\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.034557 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f210823-4d80-4b35-aaef-bb100cf601dd-bound-sa-token\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.034576 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhlpm\" (UniqueName: \"kubernetes.io/projected/4f210823-4d80-4b35-aaef-bb100cf601dd-kube-api-access-lhlpm\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc 
kubenswrapper[5094]: I0220 06:51:58.034606 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f210823-4d80-4b35-aaef-bb100cf601dd-trusted-ca\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.034639 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4f210823-4d80-4b35-aaef-bb100cf601dd-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.034666 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4f210823-4d80-4b35-aaef-bb100cf601dd-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.039226 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4f210823-4d80-4b35-aaef-bb100cf601dd-trusted-ca\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.039593 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4f210823-4d80-4b35-aaef-bb100cf601dd-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.040124 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4f210823-4d80-4b35-aaef-bb100cf601dd-registry-certificates\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.048677 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4f210823-4d80-4b35-aaef-bb100cf601dd-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.051693 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4f210823-4d80-4b35-aaef-bb100cf601dd-registry-tls\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.059995 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4f210823-4d80-4b35-aaef-bb100cf601dd-bound-sa-token\") pod \"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.066429 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhlpm\" (UniqueName: \"kubernetes.io/projected/4f210823-4d80-4b35-aaef-bb100cf601dd-kube-api-access-lhlpm\") pod 
\"image-registry-66df7c8f76-8zvf9\" (UID: \"4f210823-4d80-4b35-aaef-bb100cf601dd\") " pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.100735 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.185126 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.245982 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.276951 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-serving-cert\") pod \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.277474 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-config\") pod \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.277566 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-proxy-ca-bundles\") pod \"18cc290d-78be-42c6-af5b-3b8b86941eb2\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.277623 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/18cc290d-78be-42c6-af5b-3b8b86941eb2-serving-cert\") pod \"18cc290d-78be-42c6-af5b-3b8b86941eb2\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.277656 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vnhn\" (UniqueName: \"kubernetes.io/projected/18cc290d-78be-42c6-af5b-3b8b86941eb2-kube-api-access-6vnhn\") pod \"18cc290d-78be-42c6-af5b-3b8b86941eb2\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.277812 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-config\") pod \"18cc290d-78be-42c6-af5b-3b8b86941eb2\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.277858 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmphr\" (UniqueName: \"kubernetes.io/projected/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-kube-api-access-fmphr\") pod \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.277922 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-client-ca\") pod \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\" (UID: \"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5\") " Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.277941 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-client-ca\") pod \"18cc290d-78be-42c6-af5b-3b8b86941eb2\" (UID: \"18cc290d-78be-42c6-af5b-3b8b86941eb2\") " Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 
06:51:58.278887 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-config" (OuterVolumeSpecName: "config") pod "d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5" (UID: "d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.279273 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "18cc290d-78be-42c6-af5b-3b8b86941eb2" (UID: "18cc290d-78be-42c6-af5b-3b8b86941eb2"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.279400 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-config" (OuterVolumeSpecName: "config") pod "18cc290d-78be-42c6-af5b-3b8b86941eb2" (UID: "18cc290d-78be-42c6-af5b-3b8b86941eb2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.279740 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-client-ca" (OuterVolumeSpecName: "client-ca") pod "18cc290d-78be-42c6-af5b-3b8b86941eb2" (UID: "18cc290d-78be-42c6-af5b-3b8b86941eb2"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.280013 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-client-ca" (OuterVolumeSpecName: "client-ca") pod "d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5" (UID: "d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.284122 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-kube-api-access-fmphr" (OuterVolumeSpecName: "kube-api-access-fmphr") pod "d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5" (UID: "d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5"). InnerVolumeSpecName "kube-api-access-fmphr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.285396 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5" (UID: "d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.285507 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18cc290d-78be-42c6-af5b-3b8b86941eb2-kube-api-access-6vnhn" (OuterVolumeSpecName: "kube-api-access-6vnhn") pod "18cc290d-78be-42c6-af5b-3b8b86941eb2" (UID: "18cc290d-78be-42c6-af5b-3b8b86941eb2"). InnerVolumeSpecName "kube-api-access-6vnhn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.285608 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18cc290d-78be-42c6-af5b-3b8b86941eb2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "18cc290d-78be-42c6-af5b-3b8b86941eb2" (UID: "18cc290d-78be-42c6-af5b-3b8b86941eb2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.372989 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8zvf9"] Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.379957 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vnhn\" (UniqueName: \"kubernetes.io/projected/18cc290d-78be-42c6-af5b-3b8b86941eb2-kube-api-access-6vnhn\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.380009 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.380023 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmphr\" (UniqueName: \"kubernetes.io/projected/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-kube-api-access-fmphr\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.380045 5094 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-client-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.380089 5094 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-client-ca\") on node \"crc\" DevicePath \"\"" 
Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.380158 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.380172 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.380184 5094 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/18cc290d-78be-42c6-af5b-3b8b86941eb2-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.380217 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18cc290d-78be-42c6-af5b-3b8b86941eb2-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:58 crc kubenswrapper[5094]: W0220 06:51:58.382976 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f210823_4d80_4b35_aaef_bb100cf601dd.slice/crio-7948b22984973673df41f6d46e1722609ad42ee07f39532b50fde3af57f4ba94 WatchSource:0}: Error finding container 7948b22984973673df41f6d46e1722609ad42ee07f39532b50fde3af57f4ba94: Status 404 returned error can't find the container with id 7948b22984973673df41f6d46e1722609ad42ee07f39532b50fde3af57f4ba94 Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.703929 5094 generic.go:334] "Generic (PLEG): container finished" podID="d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5" containerID="bf96c34879e8b3b8b43d54fe3cde7b504e57f4682c37550a83f12cccfaf7d07c" exitCode=0 Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.704047 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" event={"ID":"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5","Type":"ContainerDied","Data":"bf96c34879e8b3b8b43d54fe3cde7b504e57f4682c37550a83f12cccfaf7d07c"} Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.704046 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.704113 5094 scope.go:117] "RemoveContainer" containerID="bf96c34879e8b3b8b43d54fe3cde7b504e57f4682c37550a83f12cccfaf7d07c" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.704094 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl" event={"ID":"d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5","Type":"ContainerDied","Data":"94f008c3fadb5b12936b7080560c6d49126d29712d46d44cae8050bc4be3dbe7"} Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.707504 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" event={"ID":"4f210823-4d80-4b35-aaef-bb100cf601dd","Type":"ContainerStarted","Data":"eb6f5628caae9cda47a27fe82dedee5a5710337704919079e0bcc13a086126fc"} Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.707551 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" event={"ID":"4f210823-4d80-4b35-aaef-bb100cf601dd","Type":"ContainerStarted","Data":"7948b22984973673df41f6d46e1722609ad42ee07f39532b50fde3af57f4ba94"} Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.707661 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.711507 5094 generic.go:334] "Generic (PLEG): container finished" 
podID="18cc290d-78be-42c6-af5b-3b8b86941eb2" containerID="0f59159cf70471ec88327f095916949d208d3b4159ba3e8002ae3f04cc1b0b02" exitCode=0 Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.711573 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" event={"ID":"18cc290d-78be-42c6-af5b-3b8b86941eb2","Type":"ContainerDied","Data":"0f59159cf70471ec88327f095916949d208d3b4159ba3e8002ae3f04cc1b0b02"} Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.711609 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" event={"ID":"18cc290d-78be-42c6-af5b-3b8b86941eb2","Type":"ContainerDied","Data":"a8a623d9daef3ed6dfb32889d5aba7e415d276108ff677b5411cb61659224d60"} Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.711678 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-fnbl8" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.731979 5094 scope.go:117] "RemoveContainer" containerID="bf96c34879e8b3b8b43d54fe3cde7b504e57f4682c37550a83f12cccfaf7d07c" Feb 20 06:51:58 crc kubenswrapper[5094]: E0220 06:51:58.734096 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf96c34879e8b3b8b43d54fe3cde7b504e57f4682c37550a83f12cccfaf7d07c\": container with ID starting with bf96c34879e8b3b8b43d54fe3cde7b504e57f4682c37550a83f12cccfaf7d07c not found: ID does not exist" containerID="bf96c34879e8b3b8b43d54fe3cde7b504e57f4682c37550a83f12cccfaf7d07c" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.734170 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf96c34879e8b3b8b43d54fe3cde7b504e57f4682c37550a83f12cccfaf7d07c"} err="failed to get container status 
\"bf96c34879e8b3b8b43d54fe3cde7b504e57f4682c37550a83f12cccfaf7d07c\": rpc error: code = NotFound desc = could not find container \"bf96c34879e8b3b8b43d54fe3cde7b504e57f4682c37550a83f12cccfaf7d07c\": container with ID starting with bf96c34879e8b3b8b43d54fe3cde7b504e57f4682c37550a83f12cccfaf7d07c not found: ID does not exist" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.734229 5094 scope.go:117] "RemoveContainer" containerID="0f59159cf70471ec88327f095916949d208d3b4159ba3e8002ae3f04cc1b0b02" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.735326 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" podStartSLOduration=1.7353025899999999 podStartE2EDuration="1.73530259s" podCreationTimestamp="2026-02-20 06:51:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:51:58.73237331 +0000 UTC m=+333.605000021" watchObservedRunningTime="2026-02-20 06:51:58.73530259 +0000 UTC m=+333.607929311" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.749117 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl"] Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.758137 5094 scope.go:117] "RemoveContainer" containerID="0f59159cf70471ec88327f095916949d208d3b4159ba3e8002ae3f04cc1b0b02" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.760044 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-qrtpl"] Feb 20 06:51:58 crc kubenswrapper[5094]: E0220 06:51:58.761857 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f59159cf70471ec88327f095916949d208d3b4159ba3e8002ae3f04cc1b0b02\": container with ID starting with 
0f59159cf70471ec88327f095916949d208d3b4159ba3e8002ae3f04cc1b0b02 not found: ID does not exist" containerID="0f59159cf70471ec88327f095916949d208d3b4159ba3e8002ae3f04cc1b0b02" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.761904 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f59159cf70471ec88327f095916949d208d3b4159ba3e8002ae3f04cc1b0b02"} err="failed to get container status \"0f59159cf70471ec88327f095916949d208d3b4159ba3e8002ae3f04cc1b0b02\": rpc error: code = NotFound desc = could not find container \"0f59159cf70471ec88327f095916949d208d3b4159ba3e8002ae3f04cc1b0b02\": container with ID starting with 0f59159cf70471ec88327f095916949d208d3b4159ba3e8002ae3f04cc1b0b02 not found: ID does not exist" Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.769353 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fnbl8"] Feb 20 06:51:58 crc kubenswrapper[5094]: I0220 06:51:58.774525 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-fnbl8"] Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.376539 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p"] Feb 20 06:51:59 crc kubenswrapper[5094]: E0220 06:51:59.377817 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5" containerName="route-controller-manager" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.377869 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5" containerName="route-controller-manager" Feb 20 06:51:59 crc kubenswrapper[5094]: E0220 06:51:59.377930 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18cc290d-78be-42c6-af5b-3b8b86941eb2" containerName="controller-manager" Feb 20 06:51:59 crc kubenswrapper[5094]: 
I0220 06:51:59.377951 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="18cc290d-78be-42c6-af5b-3b8b86941eb2" containerName="controller-manager" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.378250 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5" containerName="route-controller-manager" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.378303 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="18cc290d-78be-42c6-af5b-3b8b86941eb2" containerName="controller-manager" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.379332 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.380667 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h"] Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.381790 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.382739 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.382881 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.382895 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.385655 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.393385 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/490c709b-3530-46b0-9418-fb8e74f8ea3d-client-ca\") pod \"route-controller-manager-67fd64d77-2jt4h\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.393460 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mxsb\" (UniqueName: \"kubernetes.io/projected/490c709b-3530-46b0-9418-fb8e74f8ea3d-kube-api-access-2mxsb\") pod \"route-controller-manager-67fd64d77-2jt4h\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.393532 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mvmt\" (UniqueName: 
\"kubernetes.io/projected/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-kube-api-access-6mvmt\") pod \"controller-manager-bbbbdff7c-c2w7p\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.393534 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.393569 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/490c709b-3530-46b0-9418-fb8e74f8ea3d-serving-cert\") pod \"route-controller-manager-67fd64d77-2jt4h\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.393633 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/490c709b-3530-46b0-9418-fb8e74f8ea3d-config\") pod \"route-controller-manager-67fd64d77-2jt4h\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.393667 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-serving-cert\") pod \"controller-manager-bbbbdff7c-c2w7p\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.393715 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-client-ca\") pod \"controller-manager-bbbbdff7c-c2w7p\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.393744 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-proxy-ca-bundles\") pod \"controller-manager-bbbbdff7c-c2w7p\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.393784 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-config\") pod \"controller-manager-bbbbdff7c-c2w7p\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.393546 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.394673 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.394970 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.394975 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.395014 5094 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.395175 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.395590 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.404855 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.407106 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h"] Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.414213 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p"] Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.490768 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p"] Feb 20 06:51:59 crc kubenswrapper[5094]: E0220 06:51:59.491447 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-6mvmt proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" podUID="6cdcb672-e1fd-4cb0-8721-8327b1bca1f2" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.497268 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/490c709b-3530-46b0-9418-fb8e74f8ea3d-config\") pod \"route-controller-manager-67fd64d77-2jt4h\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " 
pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.497364 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-serving-cert\") pod \"controller-manager-bbbbdff7c-c2w7p\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.497411 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-client-ca\") pod \"controller-manager-bbbbdff7c-c2w7p\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.497446 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-proxy-ca-bundles\") pod \"controller-manager-bbbbdff7c-c2w7p\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.497524 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-config\") pod \"controller-manager-bbbbdff7c-c2w7p\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.497585 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/490c709b-3530-46b0-9418-fb8e74f8ea3d-client-ca\") pod 
\"route-controller-manager-67fd64d77-2jt4h\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.497610 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mxsb\" (UniqueName: \"kubernetes.io/projected/490c709b-3530-46b0-9418-fb8e74f8ea3d-kube-api-access-2mxsb\") pod \"route-controller-manager-67fd64d77-2jt4h\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.497640 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mvmt\" (UniqueName: \"kubernetes.io/projected/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-kube-api-access-6mvmt\") pod \"controller-manager-bbbbdff7c-c2w7p\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.497671 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/490c709b-3530-46b0-9418-fb8e74f8ea3d-serving-cert\") pod \"route-controller-manager-67fd64d77-2jt4h\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.498003 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h"] Feb 20 06:51:59 crc kubenswrapper[5094]: E0220 06:51:59.498564 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-2mxsb serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" 
pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" podUID="490c709b-3530-46b0-9418-fb8e74f8ea3d" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.499913 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-client-ca\") pod \"controller-manager-bbbbdff7c-c2w7p\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.500139 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/490c709b-3530-46b0-9418-fb8e74f8ea3d-config\") pod \"route-controller-manager-67fd64d77-2jt4h\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.500477 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-config\") pod \"controller-manager-bbbbdff7c-c2w7p\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.500677 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-proxy-ca-bundles\") pod \"controller-manager-bbbbdff7c-c2w7p\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.504547 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-serving-cert\") 
pod \"controller-manager-bbbbdff7c-c2w7p\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.505690 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/490c709b-3530-46b0-9418-fb8e74f8ea3d-client-ca\") pod \"route-controller-manager-67fd64d77-2jt4h\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.513485 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/490c709b-3530-46b0-9418-fb8e74f8ea3d-serving-cert\") pod \"route-controller-manager-67fd64d77-2jt4h\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.525788 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mvmt\" (UniqueName: \"kubernetes.io/projected/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-kube-api-access-6mvmt\") pod \"controller-manager-bbbbdff7c-c2w7p\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.536841 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mxsb\" (UniqueName: \"kubernetes.io/projected/490c709b-3530-46b0-9418-fb8e74f8ea3d-kube-api-access-2mxsb\") pod \"route-controller-manager-67fd64d77-2jt4h\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.722393 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.722455 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.733682 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.747977 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.801639 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-serving-cert\") pod \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.801767 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mxsb\" (UniqueName: \"kubernetes.io/projected/490c709b-3530-46b0-9418-fb8e74f8ea3d-kube-api-access-2mxsb\") pod \"490c709b-3530-46b0-9418-fb8e74f8ea3d\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.801836 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/490c709b-3530-46b0-9418-fb8e74f8ea3d-serving-cert\") pod \"490c709b-3530-46b0-9418-fb8e74f8ea3d\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.801891 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-config\") pod \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.801915 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mvmt\" (UniqueName: \"kubernetes.io/projected/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-kube-api-access-6mvmt\") pod \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.801937 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/490c709b-3530-46b0-9418-fb8e74f8ea3d-config\") pod \"490c709b-3530-46b0-9418-fb8e74f8ea3d\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.801963 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-proxy-ca-bundles\") pod \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.801987 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/490c709b-3530-46b0-9418-fb8e74f8ea3d-client-ca\") pod \"490c709b-3530-46b0-9418-fb8e74f8ea3d\" (UID: \"490c709b-3530-46b0-9418-fb8e74f8ea3d\") " Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.802022 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-client-ca\") pod \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\" (UID: \"6cdcb672-e1fd-4cb0-8721-8327b1bca1f2\") " Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.803382 5094 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/490c709b-3530-46b0-9418-fb8e74f8ea3d-config" (OuterVolumeSpecName: "config") pod "490c709b-3530-46b0-9418-fb8e74f8ea3d" (UID: "490c709b-3530-46b0-9418-fb8e74f8ea3d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.803935 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/490c709b-3530-46b0-9418-fb8e74f8ea3d-client-ca" (OuterVolumeSpecName: "client-ca") pod "490c709b-3530-46b0-9418-fb8e74f8ea3d" (UID: "490c709b-3530-46b0-9418-fb8e74f8ea3d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.804246 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-config" (OuterVolumeSpecName: "config") pod "6cdcb672-e1fd-4cb0-8721-8327b1bca1f2" (UID: "6cdcb672-e1fd-4cb0-8721-8327b1bca1f2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.805082 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-client-ca" (OuterVolumeSpecName: "client-ca") pod "6cdcb672-e1fd-4cb0-8721-8327b1bca1f2" (UID: "6cdcb672-e1fd-4cb0-8721-8327b1bca1f2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.805273 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6cdcb672-e1fd-4cb0-8721-8327b1bca1f2" (UID: "6cdcb672-e1fd-4cb0-8721-8327b1bca1f2"). 
InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.805780 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/490c709b-3530-46b0-9418-fb8e74f8ea3d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "490c709b-3530-46b0-9418-fb8e74f8ea3d" (UID: "490c709b-3530-46b0-9418-fb8e74f8ea3d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.807272 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/490c709b-3530-46b0-9418-fb8e74f8ea3d-kube-api-access-2mxsb" (OuterVolumeSpecName: "kube-api-access-2mxsb") pod "490c709b-3530-46b0-9418-fb8e74f8ea3d" (UID: "490c709b-3530-46b0-9418-fb8e74f8ea3d"). InnerVolumeSpecName "kube-api-access-2mxsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.807860 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-kube-api-access-6mvmt" (OuterVolumeSpecName: "kube-api-access-6mvmt") pod "6cdcb672-e1fd-4cb0-8721-8327b1bca1f2" (UID: "6cdcb672-e1fd-4cb0-8721-8327b1bca1f2"). InnerVolumeSpecName "kube-api-access-6mvmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.809115 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6cdcb672-e1fd-4cb0-8721-8327b1bca1f2" (UID: "6cdcb672-e1fd-4cb0-8721-8327b1bca1f2"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.850594 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18cc290d-78be-42c6-af5b-3b8b86941eb2" path="/var/lib/kubelet/pods/18cc290d-78be-42c6-af5b-3b8b86941eb2/volumes" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.851530 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5" path="/var/lib/kubelet/pods/d63cfc9b-e8ee-4b2a-8f36-f335dc660ca5/volumes" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.902784 5094 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/490c709b-3530-46b0-9418-fb8e74f8ea3d-client-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.902816 5094 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-client-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.902826 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.902836 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mxsb\" (UniqueName: \"kubernetes.io/projected/490c709b-3530-46b0-9418-fb8e74f8ea3d-kube-api-access-2mxsb\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.902848 5094 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/490c709b-3530-46b0-9418-fb8e74f8ea3d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.902863 5094 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.902872 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mvmt\" (UniqueName: \"kubernetes.io/projected/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-kube-api-access-6mvmt\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.902881 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/490c709b-3530-46b0-9418-fb8e74f8ea3d-config\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:59 crc kubenswrapper[5094]: I0220 06:51:59.902889 5094 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 20 06:51:59 crc kubenswrapper[5094]: E0220 06:51:59.932004 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod490c709b_3530_46b0_9418_fb8e74f8ea3d.slice\": RecentStats: unable to find data in memory cache]" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.728343 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.728370 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.760974 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h"] Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.765198 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67fd64d77-2jt4h"] Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.769895 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8"] Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.770860 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.780674 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.781175 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.781175 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.787016 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.787820 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.787880 5094 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.792797 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8"] Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.805406 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p"] Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.810366 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-bbbbdff7c-c2w7p"] Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.815724 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3-config\") pod \"route-controller-manager-54fdd8b649-fzlw8\" (UID: \"3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3\") " pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.815766 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3-client-ca\") pod \"route-controller-manager-54fdd8b649-fzlw8\" (UID: \"3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3\") " pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.815820 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5225\" (UniqueName: \"kubernetes.io/projected/3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3-kube-api-access-q5225\") pod \"route-controller-manager-54fdd8b649-fzlw8\" (UID: \"3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3\") " 
pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.815856 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3-serving-cert\") pod \"route-controller-manager-54fdd8b649-fzlw8\" (UID: \"3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3\") " pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.917783 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3-config\") pod \"route-controller-manager-54fdd8b649-fzlw8\" (UID: \"3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3\") " pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.917837 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3-client-ca\") pod \"route-controller-manager-54fdd8b649-fzlw8\" (UID: \"3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3\") " pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.917920 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5225\" (UniqueName: \"kubernetes.io/projected/3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3-kube-api-access-q5225\") pod \"route-controller-manager-54fdd8b649-fzlw8\" (UID: \"3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3\") " pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.917957 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3-serving-cert\") pod \"route-controller-manager-54fdd8b649-fzlw8\" (UID: \"3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3\") " pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.918971 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3-client-ca\") pod \"route-controller-manager-54fdd8b649-fzlw8\" (UID: \"3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3\") " pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.919307 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3-config\") pod \"route-controller-manager-54fdd8b649-fzlw8\" (UID: \"3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3\") " pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.922107 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3-serving-cert\") pod \"route-controller-manager-54fdd8b649-fzlw8\" (UID: \"3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3\") " pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" Feb 20 06:52:00 crc kubenswrapper[5094]: I0220 06:52:00.933120 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5225\" (UniqueName: \"kubernetes.io/projected/3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3-kube-api-access-q5225\") pod \"route-controller-manager-54fdd8b649-fzlw8\" (UID: \"3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3\") " pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" 
Feb 20 06:52:01 crc kubenswrapper[5094]: I0220 06:52:01.094361 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" Feb 20 06:52:01 crc kubenswrapper[5094]: I0220 06:52:01.389848 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8"] Feb 20 06:52:01 crc kubenswrapper[5094]: W0220 06:52:01.397262 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a79b08a_58f4_4ce4_a969_8cd6dd46bcb3.slice/crio-ee4a956134fd19cd37a0a5f856dfe65473a79b1a02057f20124d1d3afe62d4f8 WatchSource:0}: Error finding container ee4a956134fd19cd37a0a5f856dfe65473a79b1a02057f20124d1d3afe62d4f8: Status 404 returned error can't find the container with id ee4a956134fd19cd37a0a5f856dfe65473a79b1a02057f20124d1d3afe62d4f8 Feb 20 06:52:01 crc kubenswrapper[5094]: I0220 06:52:01.734946 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" event={"ID":"3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3","Type":"ContainerStarted","Data":"50fc91aeb169884b1c2057fd672faa52f419ffac32eb752bd24ff76767186fe3"} Feb 20 06:52:01 crc kubenswrapper[5094]: I0220 06:52:01.735004 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" event={"ID":"3a79b08a-58f4-4ce4-a969-8cd6dd46bcb3","Type":"ContainerStarted","Data":"ee4a956134fd19cd37a0a5f856dfe65473a79b1a02057f20124d1d3afe62d4f8"} Feb 20 06:52:01 crc kubenswrapper[5094]: I0220 06:52:01.735306 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" Feb 20 06:52:01 crc kubenswrapper[5094]: I0220 06:52:01.755807 5094 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" podStartSLOduration=2.75578467 podStartE2EDuration="2.75578467s" podCreationTimestamp="2026-02-20 06:51:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:52:01.75286387 +0000 UTC m=+336.625490591" watchObservedRunningTime="2026-02-20 06:52:01.75578467 +0000 UTC m=+336.628411401" Feb 20 06:52:01 crc kubenswrapper[5094]: I0220 06:52:01.847802 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="490c709b-3530-46b0-9418-fb8e74f8ea3d" path="/var/lib/kubelet/pods/490c709b-3530-46b0-9418-fb8e74f8ea3d/volumes" Feb 20 06:52:01 crc kubenswrapper[5094]: I0220 06:52:01.848518 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cdcb672-e1fd-4cb0-8721-8327b1bca1f2" path="/var/lib/kubelet/pods/6cdcb672-e1fd-4cb0-8721-8327b1bca1f2/volumes" Feb 20 06:52:02 crc kubenswrapper[5094]: I0220 06:52:02.035891 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-54fdd8b649-fzlw8" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.394407 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj"] Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.395745 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.398311 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.398544 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.400656 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj"] Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.401672 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.401849 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.402319 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.402583 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.409862 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.555616 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpttx\" (UniqueName: \"kubernetes.io/projected/c9baff55-48ba-47a8-9f75-ed2e819db14b-kube-api-access-xpttx\") pod \"controller-manager-6bbd8c4f55-srkvj\" (UID: \"c9baff55-48ba-47a8-9f75-ed2e819db14b\") " 
pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.555674 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9baff55-48ba-47a8-9f75-ed2e819db14b-serving-cert\") pod \"controller-manager-6bbd8c4f55-srkvj\" (UID: \"c9baff55-48ba-47a8-9f75-ed2e819db14b\") " pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.555734 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9baff55-48ba-47a8-9f75-ed2e819db14b-config\") pod \"controller-manager-6bbd8c4f55-srkvj\" (UID: \"c9baff55-48ba-47a8-9f75-ed2e819db14b\") " pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.555758 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c9baff55-48ba-47a8-9f75-ed2e819db14b-proxy-ca-bundles\") pod \"controller-manager-6bbd8c4f55-srkvj\" (UID: \"c9baff55-48ba-47a8-9f75-ed2e819db14b\") " pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.555784 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9baff55-48ba-47a8-9f75-ed2e819db14b-client-ca\") pod \"controller-manager-6bbd8c4f55-srkvj\" (UID: \"c9baff55-48ba-47a8-9f75-ed2e819db14b\") " pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.657555 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpttx\" (UniqueName: 
\"kubernetes.io/projected/c9baff55-48ba-47a8-9f75-ed2e819db14b-kube-api-access-xpttx\") pod \"controller-manager-6bbd8c4f55-srkvj\" (UID: \"c9baff55-48ba-47a8-9f75-ed2e819db14b\") " pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.657619 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9baff55-48ba-47a8-9f75-ed2e819db14b-serving-cert\") pod \"controller-manager-6bbd8c4f55-srkvj\" (UID: \"c9baff55-48ba-47a8-9f75-ed2e819db14b\") " pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.657720 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9baff55-48ba-47a8-9f75-ed2e819db14b-config\") pod \"controller-manager-6bbd8c4f55-srkvj\" (UID: \"c9baff55-48ba-47a8-9f75-ed2e819db14b\") " pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.657751 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c9baff55-48ba-47a8-9f75-ed2e819db14b-proxy-ca-bundles\") pod \"controller-manager-6bbd8c4f55-srkvj\" (UID: \"c9baff55-48ba-47a8-9f75-ed2e819db14b\") " pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.657781 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9baff55-48ba-47a8-9f75-ed2e819db14b-client-ca\") pod \"controller-manager-6bbd8c4f55-srkvj\" (UID: \"c9baff55-48ba-47a8-9f75-ed2e819db14b\") " pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.658983 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9baff55-48ba-47a8-9f75-ed2e819db14b-client-ca\") pod \"controller-manager-6bbd8c4f55-srkvj\" (UID: \"c9baff55-48ba-47a8-9f75-ed2e819db14b\") " pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.659856 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9baff55-48ba-47a8-9f75-ed2e819db14b-config\") pod \"controller-manager-6bbd8c4f55-srkvj\" (UID: \"c9baff55-48ba-47a8-9f75-ed2e819db14b\") " pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.660013 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c9baff55-48ba-47a8-9f75-ed2e819db14b-proxy-ca-bundles\") pod \"controller-manager-6bbd8c4f55-srkvj\" (UID: \"c9baff55-48ba-47a8-9f75-ed2e819db14b\") " pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.671506 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9baff55-48ba-47a8-9f75-ed2e819db14b-serving-cert\") pod \"controller-manager-6bbd8c4f55-srkvj\" (UID: \"c9baff55-48ba-47a8-9f75-ed2e819db14b\") " pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.684182 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpttx\" (UniqueName: \"kubernetes.io/projected/c9baff55-48ba-47a8-9f75-ed2e819db14b-kube-api-access-xpttx\") pod \"controller-manager-6bbd8c4f55-srkvj\" (UID: \"c9baff55-48ba-47a8-9f75-ed2e819db14b\") " pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 
06:52:03 crc kubenswrapper[5094]: I0220 06:52:03.734776 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:04 crc kubenswrapper[5094]: I0220 06:52:04.005293 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj"] Feb 20 06:52:04 crc kubenswrapper[5094]: W0220 06:52:04.023126 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9baff55_48ba_47a8_9f75_ed2e819db14b.slice/crio-169d20830fc8150f84b501fbf2b926b5d613948e72cb11bf713ef6fd9ba49dad WatchSource:0}: Error finding container 169d20830fc8150f84b501fbf2b926b5d613948e72cb11bf713ef6fd9ba49dad: Status 404 returned error can't find the container with id 169d20830fc8150f84b501fbf2b926b5d613948e72cb11bf713ef6fd9ba49dad Feb 20 06:52:04 crc kubenswrapper[5094]: I0220 06:52:04.108026 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 06:52:04 crc kubenswrapper[5094]: I0220 06:52:04.108162 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 06:52:04 crc kubenswrapper[5094]: I0220 06:52:04.765872 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" 
event={"ID":"c9baff55-48ba-47a8-9f75-ed2e819db14b","Type":"ContainerStarted","Data":"71ce46af593781e8d088039b07ecf8ad760da89d320957ceeb474c4e06c04dae"} Feb 20 06:52:04 crc kubenswrapper[5094]: I0220 06:52:04.766309 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" event={"ID":"c9baff55-48ba-47a8-9f75-ed2e819db14b","Type":"ContainerStarted","Data":"169d20830fc8150f84b501fbf2b926b5d613948e72cb11bf713ef6fd9ba49dad"} Feb 20 06:52:04 crc kubenswrapper[5094]: I0220 06:52:04.766412 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:04 crc kubenswrapper[5094]: I0220 06:52:04.771905 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" Feb 20 06:52:04 crc kubenswrapper[5094]: I0220 06:52:04.794082 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6bbd8c4f55-srkvj" podStartSLOduration=5.794061005 podStartE2EDuration="5.794061005s" podCreationTimestamp="2026-02-20 06:51:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:52:04.789318812 +0000 UTC m=+339.661945523" watchObservedRunningTime="2026-02-20 06:52:04.794061005 +0000 UTC m=+339.666687716" Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.635457 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7zj2v"] Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.636766 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7zj2v" podUID="66c35d10-d5cc-468f-95a1-b56fde3961b3" containerName="registry-server" 
containerID="cri-o://beb8731903e3c8312abc22dea15752098a2c970cf1d17da2323d05db3f7ea0e4" gracePeriod=30 Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.648889 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c94fj"] Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.649158 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-c94fj" podUID="88e94523-c126-4ce8-a6c7-2f83eb91d3fc" containerName="registry-server" containerID="cri-o://0cfcc7a212abd3270f67b4c4a04afcaf890dc3482fbe7bdb289a52eed1f95836" gracePeriod=30 Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.669281 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-45dt8"] Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.669572 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" podUID="ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60" containerName="marketplace-operator" containerID="cri-o://ffaff9fdb2f03b0e17c5d470bccb1479bdc3ed514c6b3c3a9637e3af949a185c" gracePeriod=30 Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.676424 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jlc84"] Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.676743 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jlc84" podUID="f5c1eecf-1cc2-4480-ac22-99a970f5dc58" containerName="registry-server" containerID="cri-o://289c43eb8a0b4cdc450470847de0cf4c3dce76dfdb3673f02a9643bdf93ba57c" gracePeriod=30 Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.692554 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tpwcx"] Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.692876 5094 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tpwcx" podUID="8ecc3e73-dd76-4a73-a366-92c78aca386e" containerName="registry-server" containerID="cri-o://42ce9dcbffc673121e18daf4e5c35fd9d54f6c58c1b7c83b8be06d9c3a2788d6" gracePeriod=30 Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.698117 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j8j9k"] Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.700555 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.722186 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j8j9k"] Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.807211 5094 generic.go:334] "Generic (PLEG): container finished" podID="66c35d10-d5cc-468f-95a1-b56fde3961b3" containerID="beb8731903e3c8312abc22dea15752098a2c970cf1d17da2323d05db3f7ea0e4" exitCode=0 Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.807281 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zj2v" event={"ID":"66c35d10-d5cc-468f-95a1-b56fde3961b3","Type":"ContainerDied","Data":"beb8731903e3c8312abc22dea15752098a2c970cf1d17da2323d05db3f7ea0e4"} Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.809585 5094 generic.go:334] "Generic (PLEG): container finished" podID="ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60" containerID="ffaff9fdb2f03b0e17c5d470bccb1479bdc3ed514c6b3c3a9637e3af949a185c" exitCode=0 Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.809649 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" 
event={"ID":"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60","Type":"ContainerDied","Data":"ffaff9fdb2f03b0e17c5d470bccb1479bdc3ed514c6b3c3a9637e3af949a185c"} Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.811865 5094 generic.go:334] "Generic (PLEG): container finished" podID="f5c1eecf-1cc2-4480-ac22-99a970f5dc58" containerID="289c43eb8a0b4cdc450470847de0cf4c3dce76dfdb3673f02a9643bdf93ba57c" exitCode=0 Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.811925 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jlc84" event={"ID":"f5c1eecf-1cc2-4480-ac22-99a970f5dc58","Type":"ContainerDied","Data":"289c43eb8a0b4cdc450470847de0cf4c3dce76dfdb3673f02a9643bdf93ba57c"} Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.825690 5094 generic.go:334] "Generic (PLEG): container finished" podID="88e94523-c126-4ce8-a6c7-2f83eb91d3fc" containerID="0cfcc7a212abd3270f67b4c4a04afcaf890dc3482fbe7bdb289a52eed1f95836" exitCode=0 Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.825789 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c94fj" event={"ID":"88e94523-c126-4ce8-a6c7-2f83eb91d3fc","Type":"ContainerDied","Data":"0cfcc7a212abd3270f67b4c4a04afcaf890dc3482fbe7bdb289a52eed1f95836"} Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.884501 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j8j9k\" (UID: \"a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.884603 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j8j9k\" (UID: \"a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.884627 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqs2q\" (UniqueName: \"kubernetes.io/projected/a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9-kube-api-access-wqs2q\") pod \"marketplace-operator-79b997595-j8j9k\" (UID: \"a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.985586 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j8j9k\" (UID: \"a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.987387 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j8j9k\" (UID: \"a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.988339 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqs2q\" (UniqueName: \"kubernetes.io/projected/a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9-kube-api-access-wqs2q\") pod \"marketplace-operator-79b997595-j8j9k\" (UID: \"a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" Feb 20 
06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.988563 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j8j9k\" (UID: \"a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" Feb 20 06:52:09 crc kubenswrapper[5094]: I0220 06:52:09.998775 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j8j9k\" (UID: \"a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.015864 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqs2q\" (UniqueName: \"kubernetes.io/projected/a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9-kube-api-access-wqs2q\") pod \"marketplace-operator-79b997595-j8j9k\" (UID: \"a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9\") " pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.016688 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.175781 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c94fj" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.192363 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-utilities\") pod \"88e94523-c126-4ce8-a6c7-2f83eb91d3fc\" (UID: \"88e94523-c126-4ce8-a6c7-2f83eb91d3fc\") " Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.192429 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-catalog-content\") pod \"88e94523-c126-4ce8-a6c7-2f83eb91d3fc\" (UID: \"88e94523-c126-4ce8-a6c7-2f83eb91d3fc\") " Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.192532 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f7t5\" (UniqueName: \"kubernetes.io/projected/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-kube-api-access-2f7t5\") pod \"88e94523-c126-4ce8-a6c7-2f83eb91d3fc\" (UID: \"88e94523-c126-4ce8-a6c7-2f83eb91d3fc\") " Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.193582 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-utilities" (OuterVolumeSpecName: "utilities") pod "88e94523-c126-4ce8-a6c7-2f83eb91d3fc" (UID: "88e94523-c126-4ce8-a6c7-2f83eb91d3fc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.207330 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-kube-api-access-2f7t5" (OuterVolumeSpecName: "kube-api-access-2f7t5") pod "88e94523-c126-4ce8-a6c7-2f83eb91d3fc" (UID: "88e94523-c126-4ce8-a6c7-2f83eb91d3fc"). InnerVolumeSpecName "kube-api-access-2f7t5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.270423 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88e94523-c126-4ce8-a6c7-2f83eb91d3fc" (UID: "88e94523-c126-4ce8-a6c7-2f83eb91d3fc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.299216 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f7t5\" (UniqueName: \"kubernetes.io/projected/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-kube-api-access-2f7t5\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.299253 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.299264 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88e94523-c126-4ce8-a6c7-2f83eb91d3fc-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.299519 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.350130 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tpwcx" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.353109 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jlc84" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.362551 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.400437 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj5hg\" (UniqueName: \"kubernetes.io/projected/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-kube-api-access-kj5hg\") pod \"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60\" (UID: \"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60\") " Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.400518 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ecc3e73-dd76-4a73-a366-92c78aca386e-utilities\") pod \"8ecc3e73-dd76-4a73-a366-92c78aca386e\" (UID: \"8ecc3e73-dd76-4a73-a366-92c78aca386e\") " Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.400574 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5cms\" (UniqueName: \"kubernetes.io/projected/8ecc3e73-dd76-4a73-a366-92c78aca386e-kube-api-access-s5cms\") pod \"8ecc3e73-dd76-4a73-a366-92c78aca386e\" (UID: \"8ecc3e73-dd76-4a73-a366-92c78aca386e\") " Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.400619 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-marketplace-trusted-ca\") pod \"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60\" (UID: \"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60\") " Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.400655 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66c35d10-d5cc-468f-95a1-b56fde3961b3-utilities\") pod 
\"66c35d10-d5cc-468f-95a1-b56fde3961b3\" (UID: \"66c35d10-d5cc-468f-95a1-b56fde3961b3\") " Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.400734 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxwcp\" (UniqueName: \"kubernetes.io/projected/66c35d10-d5cc-468f-95a1-b56fde3961b3-kube-api-access-bxwcp\") pod \"66c35d10-d5cc-468f-95a1-b56fde3961b3\" (UID: \"66c35d10-d5cc-468f-95a1-b56fde3961b3\") " Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.400770 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-utilities\") pod \"f5c1eecf-1cc2-4480-ac22-99a970f5dc58\" (UID: \"f5c1eecf-1cc2-4480-ac22-99a970f5dc58\") " Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.400796 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-marketplace-operator-metrics\") pod \"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60\" (UID: \"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60\") " Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.400865 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ecc3e73-dd76-4a73-a366-92c78aca386e-catalog-content\") pod \"8ecc3e73-dd76-4a73-a366-92c78aca386e\" (UID: \"8ecc3e73-dd76-4a73-a366-92c78aca386e\") " Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.400899 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw4pb\" (UniqueName: \"kubernetes.io/projected/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-kube-api-access-dw4pb\") pod \"f5c1eecf-1cc2-4480-ac22-99a970f5dc58\" (UID: \"f5c1eecf-1cc2-4480-ac22-99a970f5dc58\") " Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.400923 5094 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-catalog-content\") pod \"f5c1eecf-1cc2-4480-ac22-99a970f5dc58\" (UID: \"f5c1eecf-1cc2-4480-ac22-99a970f5dc58\") " Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.400958 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66c35d10-d5cc-468f-95a1-b56fde3961b3-catalog-content\") pod \"66c35d10-d5cc-468f-95a1-b56fde3961b3\" (UID: \"66c35d10-d5cc-468f-95a1-b56fde3961b3\") " Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.401712 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60" (UID: "ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.402687 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ecc3e73-dd76-4a73-a366-92c78aca386e-utilities" (OuterVolumeSpecName: "utilities") pod "8ecc3e73-dd76-4a73-a366-92c78aca386e" (UID: "8ecc3e73-dd76-4a73-a366-92c78aca386e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.405984 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60" (UID: "ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.410021 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-kube-api-access-dw4pb" (OuterVolumeSpecName: "kube-api-access-dw4pb") pod "f5c1eecf-1cc2-4480-ac22-99a970f5dc58" (UID: "f5c1eecf-1cc2-4480-ac22-99a970f5dc58"). InnerVolumeSpecName "kube-api-access-dw4pb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.411084 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-utilities" (OuterVolumeSpecName: "utilities") pod "f5c1eecf-1cc2-4480-ac22-99a970f5dc58" (UID: "f5c1eecf-1cc2-4480-ac22-99a970f5dc58"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.413650 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66c35d10-d5cc-468f-95a1-b56fde3961b3-kube-api-access-bxwcp" (OuterVolumeSpecName: "kube-api-access-bxwcp") pod "66c35d10-d5cc-468f-95a1-b56fde3961b3" (UID: "66c35d10-d5cc-468f-95a1-b56fde3961b3"). InnerVolumeSpecName "kube-api-access-bxwcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.415531 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ecc3e73-dd76-4a73-a366-92c78aca386e-kube-api-access-s5cms" (OuterVolumeSpecName: "kube-api-access-s5cms") pod "8ecc3e73-dd76-4a73-a366-92c78aca386e" (UID: "8ecc3e73-dd76-4a73-a366-92c78aca386e"). InnerVolumeSpecName "kube-api-access-s5cms". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.417021 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66c35d10-d5cc-468f-95a1-b56fde3961b3-utilities" (OuterVolumeSpecName: "utilities") pod "66c35d10-d5cc-468f-95a1-b56fde3961b3" (UID: "66c35d10-d5cc-468f-95a1-b56fde3961b3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.423303 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-kube-api-access-kj5hg" (OuterVolumeSpecName: "kube-api-access-kj5hg") pod "ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60" (UID: "ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60"). InnerVolumeSpecName "kube-api-access-kj5hg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.442633 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f5c1eecf-1cc2-4480-ac22-99a970f5dc58" (UID: "f5c1eecf-1cc2-4480-ac22-99a970f5dc58"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.473566 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66c35d10-d5cc-468f-95a1-b56fde3961b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66c35d10-d5cc-468f-95a1-b56fde3961b3" (UID: "66c35d10-d5cc-468f-95a1-b56fde3961b3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.503262 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ecc3e73-dd76-4a73-a366-92c78aca386e-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.503306 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5cms\" (UniqueName: \"kubernetes.io/projected/8ecc3e73-dd76-4a73-a366-92c78aca386e-kube-api-access-s5cms\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.503317 5094 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.503328 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66c35d10-d5cc-468f-95a1-b56fde3961b3-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.503339 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxwcp\" (UniqueName: \"kubernetes.io/projected/66c35d10-d5cc-468f-95a1-b56fde3961b3-kube-api-access-bxwcp\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.503348 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.503358 5094 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:10 crc 
kubenswrapper[5094]: I0220 06:52:10.503368 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw4pb\" (UniqueName: \"kubernetes.io/projected/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-kube-api-access-dw4pb\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.503377 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f5c1eecf-1cc2-4480-ac22-99a970f5dc58-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.503387 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66c35d10-d5cc-468f-95a1-b56fde3961b3-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.503396 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj5hg\" (UniqueName: \"kubernetes.io/projected/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60-kube-api-access-kj5hg\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.557302 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ecc3e73-dd76-4a73-a366-92c78aca386e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ecc3e73-dd76-4a73-a366-92c78aca386e" (UID: "8ecc3e73-dd76-4a73-a366-92c78aca386e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.598829 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j8j9k"] Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.604242 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ecc3e73-dd76-4a73-a366-92c78aca386e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.837830 5094 generic.go:334] "Generic (PLEG): container finished" podID="8ecc3e73-dd76-4a73-a366-92c78aca386e" containerID="42ce9dcbffc673121e18daf4e5c35fd9d54f6c58c1b7c83b8be06d9c3a2788d6" exitCode=0 Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.837916 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpwcx" event={"ID":"8ecc3e73-dd76-4a73-a366-92c78aca386e","Type":"ContainerDied","Data":"42ce9dcbffc673121e18daf4e5c35fd9d54f6c58c1b7c83b8be06d9c3a2788d6"} Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.837931 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tpwcx" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.839605 5094 scope.go:117] "RemoveContainer" containerID="42ce9dcbffc673121e18daf4e5c35fd9d54f6c58c1b7c83b8be06d9c3a2788d6" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.839611 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tpwcx" event={"ID":"8ecc3e73-dd76-4a73-a366-92c78aca386e","Type":"ContainerDied","Data":"0e45d21f2e06751dfd4ed57ac1f4c3ab877bd9456401ea77210957c88c331288"} Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.851735 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jlc84" event={"ID":"f5c1eecf-1cc2-4480-ac22-99a970f5dc58","Type":"ContainerDied","Data":"b868f7016b99c578833cf3de0ff7d8c4b2b1a5f9351afa2373e2189f66239303"} Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.851852 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jlc84" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.856503 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c94fj" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.856502 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c94fj" event={"ID":"88e94523-c126-4ce8-a6c7-2f83eb91d3fc","Type":"ContainerDied","Data":"a1dea229adcbee55ddf6e0b41aedbdc8cf35c14a6f68369e5313338e917770ed"} Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.861578 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zj2v" event={"ID":"66c35d10-d5cc-468f-95a1-b56fde3961b3","Type":"ContainerDied","Data":"e17dd6ed8a9cb1673d9dc74ffad9c9a16fa5fcf970993588d1e5f6f1eb4c4133"} Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.861623 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zj2v" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.866941 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" event={"ID":"a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9","Type":"ContainerStarted","Data":"e34846bdab6e9a22c8340dffa1a148d16bdd358d036afce89d81c4ada172b476"} Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.866976 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" event={"ID":"a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9","Type":"ContainerStarted","Data":"ae843642a9b18bca47bf28346b209a715fd9f8e7e68ae0ddfbc1b9266a543482"} Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.867960 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.870086 5094 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-j8j9k container/marketplace-operator namespace/openshift-marketplace: 
Readiness probe status=failure output="Get \"http://10.217.0.63:8080/healthz\": dial tcp 10.217.0.63:8080: connect: connection refused" start-of-body= Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.870131 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" podUID="a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.63:8080/healthz\": dial tcp 10.217.0.63:8080: connect: connection refused" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.872904 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" event={"ID":"ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60","Type":"ContainerDied","Data":"c96c01104aff3baab51a0a0d07e5aec2067827a35dc850409192571365bb82c1"} Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.873053 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-45dt8" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.878225 5094 scope.go:117] "RemoveContainer" containerID="94a532b2709a2b2f9115437213f975f914284d32756c3ae440753867ceeb3799" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.889415 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tpwcx"] Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.893366 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tpwcx"] Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.911561 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" podStartSLOduration=1.911532254 podStartE2EDuration="1.911532254s" podCreationTimestamp="2026-02-20 06:52:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 06:52:10.903729218 +0000 UTC m=+345.776355949" watchObservedRunningTime="2026-02-20 06:52:10.911532254 +0000 UTC m=+345.784158985" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.928749 5094 scope.go:117] "RemoveContainer" containerID="30c497a9a14091198f358ef8a78f1c6b6da4ba9ab8621394e3d848ebf94da222" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.943482 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c94fj"] Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.961865 5094 scope.go:117] "RemoveContainer" containerID="42ce9dcbffc673121e18daf4e5c35fd9d54f6c58c1b7c83b8be06d9c3a2788d6" Feb 20 06:52:10 crc kubenswrapper[5094]: E0220 06:52:10.962885 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42ce9dcbffc673121e18daf4e5c35fd9d54f6c58c1b7c83b8be06d9c3a2788d6\": container with ID starting with 42ce9dcbffc673121e18daf4e5c35fd9d54f6c58c1b7c83b8be06d9c3a2788d6 not found: ID does not exist" containerID="42ce9dcbffc673121e18daf4e5c35fd9d54f6c58c1b7c83b8be06d9c3a2788d6" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.962925 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42ce9dcbffc673121e18daf4e5c35fd9d54f6c58c1b7c83b8be06d9c3a2788d6"} err="failed to get container status \"42ce9dcbffc673121e18daf4e5c35fd9d54f6c58c1b7c83b8be06d9c3a2788d6\": rpc error: code = NotFound desc = could not find container \"42ce9dcbffc673121e18daf4e5c35fd9d54f6c58c1b7c83b8be06d9c3a2788d6\": container with ID starting with 42ce9dcbffc673121e18daf4e5c35fd9d54f6c58c1b7c83b8be06d9c3a2788d6 not found: ID does not exist" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.962951 5094 scope.go:117] "RemoveContainer" containerID="94a532b2709a2b2f9115437213f975f914284d32756c3ae440753867ceeb3799" Feb 20 06:52:10 
crc kubenswrapper[5094]: E0220 06:52:10.964156 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94a532b2709a2b2f9115437213f975f914284d32756c3ae440753867ceeb3799\": container with ID starting with 94a532b2709a2b2f9115437213f975f914284d32756c3ae440753867ceeb3799 not found: ID does not exist" containerID="94a532b2709a2b2f9115437213f975f914284d32756c3ae440753867ceeb3799" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.964180 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94a532b2709a2b2f9115437213f975f914284d32756c3ae440753867ceeb3799"} err="failed to get container status \"94a532b2709a2b2f9115437213f975f914284d32756c3ae440753867ceeb3799\": rpc error: code = NotFound desc = could not find container \"94a532b2709a2b2f9115437213f975f914284d32756c3ae440753867ceeb3799\": container with ID starting with 94a532b2709a2b2f9115437213f975f914284d32756c3ae440753867ceeb3799 not found: ID does not exist" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.964196 5094 scope.go:117] "RemoveContainer" containerID="30c497a9a14091198f358ef8a78f1c6b6da4ba9ab8621394e3d848ebf94da222" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.964365 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-c94fj"] Feb 20 06:52:10 crc kubenswrapper[5094]: E0220 06:52:10.964615 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30c497a9a14091198f358ef8a78f1c6b6da4ba9ab8621394e3d848ebf94da222\": container with ID starting with 30c497a9a14091198f358ef8a78f1c6b6da4ba9ab8621394e3d848ebf94da222 not found: ID does not exist" containerID="30c497a9a14091198f358ef8a78f1c6b6da4ba9ab8621394e3d848ebf94da222" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.964694 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"30c497a9a14091198f358ef8a78f1c6b6da4ba9ab8621394e3d848ebf94da222"} err="failed to get container status \"30c497a9a14091198f358ef8a78f1c6b6da4ba9ab8621394e3d848ebf94da222\": rpc error: code = NotFound desc = could not find container \"30c497a9a14091198f358ef8a78f1c6b6da4ba9ab8621394e3d848ebf94da222\": container with ID starting with 30c497a9a14091198f358ef8a78f1c6b6da4ba9ab8621394e3d848ebf94da222 not found: ID does not exist" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.964784 5094 scope.go:117] "RemoveContainer" containerID="289c43eb8a0b4cdc450470847de0cf4c3dce76dfdb3673f02a9643bdf93ba57c" Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.976050 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7zj2v"] Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.981502 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7zj2v"] Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.997913 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-45dt8"] Feb 20 06:52:10 crc kubenswrapper[5094]: I0220 06:52:10.999521 5094 scope.go:117] "RemoveContainer" containerID="ca9993542532855c09ba40fb79d1b2ff1916ab7e330faa07724168697397276c" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.005542 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-45dt8"] Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.011031 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jlc84"] Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.016895 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jlc84"] Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.022840 5094 scope.go:117] "RemoveContainer" 
containerID="064b2c1894993da3d72d5837edcc05f078452e781464dc1e2a5cab9234eb9f15" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.043073 5094 scope.go:117] "RemoveContainer" containerID="0cfcc7a212abd3270f67b4c4a04afcaf890dc3482fbe7bdb289a52eed1f95836" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.063399 5094 scope.go:117] "RemoveContainer" containerID="a3085de07d8490c70b05f62d546c4844150be55db1f8b370f140f0fcadcb36da" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.080268 5094 scope.go:117] "RemoveContainer" containerID="6eb6b4fa4af198a75121d1f1d8845384553bbedcf18e198cdf00cc8282d9f5b7" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.097744 5094 scope.go:117] "RemoveContainer" containerID="beb8731903e3c8312abc22dea15752098a2c970cf1d17da2323d05db3f7ea0e4" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.112821 5094 scope.go:117] "RemoveContainer" containerID="61352ad384c7169a9a29e90c914460822eb2dc45803cccf5cac7d1c7d42a40b1" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.130527 5094 scope.go:117] "RemoveContainer" containerID="11182af2209155c05ce50ce6f5457662dfc62f8d75b0ebcee01f179c458884f9" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.156384 5094 scope.go:117] "RemoveContainer" containerID="ffaff9fdb2f03b0e17c5d470bccb1479bdc3ed514c6b3c3a9637e3af949a185c" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.865201 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66c35d10-d5cc-468f-95a1-b56fde3961b3" path="/var/lib/kubelet/pods/66c35d10-d5cc-468f-95a1-b56fde3961b3/volumes" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.867820 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88e94523-c126-4ce8-a6c7-2f83eb91d3fc" path="/var/lib/kubelet/pods/88e94523-c126-4ce8-a6c7-2f83eb91d3fc/volumes" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.869587 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8ecc3e73-dd76-4a73-a366-92c78aca386e" path="/var/lib/kubelet/pods/8ecc3e73-dd76-4a73-a366-92c78aca386e/volumes" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.872338 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60" path="/var/lib/kubelet/pods/ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60/volumes" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.873519 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5c1eecf-1cc2-4480-ac22-99a970f5dc58" path="/var/lib/kubelet/pods/f5c1eecf-1cc2-4480-ac22-99a970f5dc58/volumes" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.874789 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zw5zl"] Feb 20 06:52:11 crc kubenswrapper[5094]: E0220 06:52:11.875140 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66c35d10-d5cc-468f-95a1-b56fde3961b3" containerName="extract-content" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.875183 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c35d10-d5cc-468f-95a1-b56fde3961b3" containerName="extract-content" Feb 20 06:52:11 crc kubenswrapper[5094]: E0220 06:52:11.875225 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60" containerName="marketplace-operator" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.875243 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60" containerName="marketplace-operator" Feb 20 06:52:11 crc kubenswrapper[5094]: E0220 06:52:11.875270 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5c1eecf-1cc2-4480-ac22-99a970f5dc58" containerName="registry-server" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.875289 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5c1eecf-1cc2-4480-ac22-99a970f5dc58" containerName="registry-server" Feb 20 
06:52:11 crc kubenswrapper[5094]: E0220 06:52:11.875313 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66c35d10-d5cc-468f-95a1-b56fde3961b3" containerName="extract-utilities" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.875331 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c35d10-d5cc-468f-95a1-b56fde3961b3" containerName="extract-utilities" Feb 20 06:52:11 crc kubenswrapper[5094]: E0220 06:52:11.875348 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5c1eecf-1cc2-4480-ac22-99a970f5dc58" containerName="extract-content" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.875365 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5c1eecf-1cc2-4480-ac22-99a970f5dc58" containerName="extract-content" Feb 20 06:52:11 crc kubenswrapper[5094]: E0220 06:52:11.875380 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ecc3e73-dd76-4a73-a366-92c78aca386e" containerName="registry-server" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.875393 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ecc3e73-dd76-4a73-a366-92c78aca386e" containerName="registry-server" Feb 20 06:52:11 crc kubenswrapper[5094]: E0220 06:52:11.875414 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88e94523-c126-4ce8-a6c7-2f83eb91d3fc" containerName="extract-utilities" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.875427 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="88e94523-c126-4ce8-a6c7-2f83eb91d3fc" containerName="extract-utilities" Feb 20 06:52:11 crc kubenswrapper[5094]: E0220 06:52:11.875445 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ecc3e73-dd76-4a73-a366-92c78aca386e" containerName="extract-content" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.875458 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ecc3e73-dd76-4a73-a366-92c78aca386e" containerName="extract-content" Feb 20 
06:52:11 crc kubenswrapper[5094]: E0220 06:52:11.875477 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88e94523-c126-4ce8-a6c7-2f83eb91d3fc" containerName="registry-server" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.875493 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="88e94523-c126-4ce8-a6c7-2f83eb91d3fc" containerName="registry-server" Feb 20 06:52:11 crc kubenswrapper[5094]: E0220 06:52:11.875508 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66c35d10-d5cc-468f-95a1-b56fde3961b3" containerName="registry-server" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.875520 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c35d10-d5cc-468f-95a1-b56fde3961b3" containerName="registry-server" Feb 20 06:52:11 crc kubenswrapper[5094]: E0220 06:52:11.875537 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5c1eecf-1cc2-4480-ac22-99a970f5dc58" containerName="extract-utilities" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.875554 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5c1eecf-1cc2-4480-ac22-99a970f5dc58" containerName="extract-utilities" Feb 20 06:52:11 crc kubenswrapper[5094]: E0220 06:52:11.875571 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88e94523-c126-4ce8-a6c7-2f83eb91d3fc" containerName="extract-content" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.875584 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="88e94523-c126-4ce8-a6c7-2f83eb91d3fc" containerName="extract-content" Feb 20 06:52:11 crc kubenswrapper[5094]: E0220 06:52:11.875607 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ecc3e73-dd76-4a73-a366-92c78aca386e" containerName="extract-utilities" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.875619 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ecc3e73-dd76-4a73-a366-92c78aca386e" containerName="extract-utilities" Feb 20 
06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.876856 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5c1eecf-1cc2-4480-ac22-99a970f5dc58" containerName="registry-server" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.876912 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="66c35d10-d5cc-468f-95a1-b56fde3961b3" containerName="registry-server" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.876940 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ecc3e73-dd76-4a73-a366-92c78aca386e" containerName="registry-server" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.876968 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="88e94523-c126-4ce8-a6c7-2f83eb91d3fc" containerName="registry-server" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.876989 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca3c622d-9c0e-43f5-a5ce-de2dbbab5f60" containerName="marketplace-operator" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.878652 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zw5zl"] Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.878892 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zw5zl" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.883066 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 20 06:52:11 crc kubenswrapper[5094]: I0220 06:52:11.900124 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-j8j9k" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.057830 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ba2a013-0ac4-4983-92e6-875272450307-utilities\") pod \"redhat-marketplace-zw5zl\" (UID: \"4ba2a013-0ac4-4983-92e6-875272450307\") " pod="openshift-marketplace/redhat-marketplace-zw5zl" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.057922 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ba2a013-0ac4-4983-92e6-875272450307-catalog-content\") pod \"redhat-marketplace-zw5zl\" (UID: \"4ba2a013-0ac4-4983-92e6-875272450307\") " pod="openshift-marketplace/redhat-marketplace-zw5zl" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.058014 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56hm5\" (UniqueName: \"kubernetes.io/projected/4ba2a013-0ac4-4983-92e6-875272450307-kube-api-access-56hm5\") pod \"redhat-marketplace-zw5zl\" (UID: \"4ba2a013-0ac4-4983-92e6-875272450307\") " pod="openshift-marketplace/redhat-marketplace-zw5zl" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.060641 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nhpxw"] Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.062594 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nhpxw" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.065816 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.072903 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nhpxw"] Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.160436 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmx6j\" (UniqueName: \"kubernetes.io/projected/061991e0-0b0a-4e47-9275-e00b323e9fb2-kube-api-access-bmx6j\") pod \"community-operators-nhpxw\" (UID: \"061991e0-0b0a-4e47-9275-e00b323e9fb2\") " pod="openshift-marketplace/community-operators-nhpxw" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.160528 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/061991e0-0b0a-4e47-9275-e00b323e9fb2-utilities\") pod \"community-operators-nhpxw\" (UID: \"061991e0-0b0a-4e47-9275-e00b323e9fb2\") " pod="openshift-marketplace/community-operators-nhpxw" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.160571 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ba2a013-0ac4-4983-92e6-875272450307-utilities\") pod \"redhat-marketplace-zw5zl\" (UID: \"4ba2a013-0ac4-4983-92e6-875272450307\") " pod="openshift-marketplace/redhat-marketplace-zw5zl" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.160613 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ba2a013-0ac4-4983-92e6-875272450307-catalog-content\") pod \"redhat-marketplace-zw5zl\" (UID: \"4ba2a013-0ac4-4983-92e6-875272450307\") 
" pod="openshift-marketplace/redhat-marketplace-zw5zl" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.160674 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/061991e0-0b0a-4e47-9275-e00b323e9fb2-catalog-content\") pod \"community-operators-nhpxw\" (UID: \"061991e0-0b0a-4e47-9275-e00b323e9fb2\") " pod="openshift-marketplace/community-operators-nhpxw" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.160752 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56hm5\" (UniqueName: \"kubernetes.io/projected/4ba2a013-0ac4-4983-92e6-875272450307-kube-api-access-56hm5\") pod \"redhat-marketplace-zw5zl\" (UID: \"4ba2a013-0ac4-4983-92e6-875272450307\") " pod="openshift-marketplace/redhat-marketplace-zw5zl" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.161070 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ba2a013-0ac4-4983-92e6-875272450307-utilities\") pod \"redhat-marketplace-zw5zl\" (UID: \"4ba2a013-0ac4-4983-92e6-875272450307\") " pod="openshift-marketplace/redhat-marketplace-zw5zl" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.161415 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ba2a013-0ac4-4983-92e6-875272450307-catalog-content\") pod \"redhat-marketplace-zw5zl\" (UID: \"4ba2a013-0ac4-4983-92e6-875272450307\") " pod="openshift-marketplace/redhat-marketplace-zw5zl" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.201272 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56hm5\" (UniqueName: \"kubernetes.io/projected/4ba2a013-0ac4-4983-92e6-875272450307-kube-api-access-56hm5\") pod \"redhat-marketplace-zw5zl\" (UID: \"4ba2a013-0ac4-4983-92e6-875272450307\") " 
pod="openshift-marketplace/redhat-marketplace-zw5zl" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.201808 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zw5zl" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.262279 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmx6j\" (UniqueName: \"kubernetes.io/projected/061991e0-0b0a-4e47-9275-e00b323e9fb2-kube-api-access-bmx6j\") pod \"community-operators-nhpxw\" (UID: \"061991e0-0b0a-4e47-9275-e00b323e9fb2\") " pod="openshift-marketplace/community-operators-nhpxw" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.262333 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/061991e0-0b0a-4e47-9275-e00b323e9fb2-utilities\") pod \"community-operators-nhpxw\" (UID: \"061991e0-0b0a-4e47-9275-e00b323e9fb2\") " pod="openshift-marketplace/community-operators-nhpxw" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.262386 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/061991e0-0b0a-4e47-9275-e00b323e9fb2-catalog-content\") pod \"community-operators-nhpxw\" (UID: \"061991e0-0b0a-4e47-9275-e00b323e9fb2\") " pod="openshift-marketplace/community-operators-nhpxw" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.262953 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/061991e0-0b0a-4e47-9275-e00b323e9fb2-catalog-content\") pod \"community-operators-nhpxw\" (UID: \"061991e0-0b0a-4e47-9275-e00b323e9fb2\") " pod="openshift-marketplace/community-operators-nhpxw" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.263130 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/061991e0-0b0a-4e47-9275-e00b323e9fb2-utilities\") pod \"community-operators-nhpxw\" (UID: \"061991e0-0b0a-4e47-9275-e00b323e9fb2\") " pod="openshift-marketplace/community-operators-nhpxw" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.285980 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmx6j\" (UniqueName: \"kubernetes.io/projected/061991e0-0b0a-4e47-9275-e00b323e9fb2-kube-api-access-bmx6j\") pod \"community-operators-nhpxw\" (UID: \"061991e0-0b0a-4e47-9275-e00b323e9fb2\") " pod="openshift-marketplace/community-operators-nhpxw" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.381897 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nhpxw" Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.655474 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zw5zl"] Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.831835 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nhpxw"] Feb 20 06:52:12 crc kubenswrapper[5094]: W0220 06:52:12.840475 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod061991e0_0b0a_4e47_9275_e00b323e9fb2.slice/crio-e9e27186df68fe530955f183de9f930dd6ce5b2ea9902cb61d24b627fd7e8b4f WatchSource:0}: Error finding container e9e27186df68fe530955f183de9f930dd6ce5b2ea9902cb61d24b627fd7e8b4f: Status 404 returned error can't find the container with id e9e27186df68fe530955f183de9f930dd6ce5b2ea9902cb61d24b627fd7e8b4f Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.902846 5094 generic.go:334] "Generic (PLEG): container finished" podID="4ba2a013-0ac4-4983-92e6-875272450307" containerID="7dc011fed8be0e19eb2f7338ebba3000b50cb6bde5998e42c6796930786d7ccc" exitCode=0 Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 
06:52:12.902954 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zw5zl" event={"ID":"4ba2a013-0ac4-4983-92e6-875272450307","Type":"ContainerDied","Data":"7dc011fed8be0e19eb2f7338ebba3000b50cb6bde5998e42c6796930786d7ccc"} Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.903023 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zw5zl" event={"ID":"4ba2a013-0ac4-4983-92e6-875272450307","Type":"ContainerStarted","Data":"1c5ebaedc38619e3f93441f65705f34966ac3b27c07b83db392d20fb8c0b8942"} Feb 20 06:52:12 crc kubenswrapper[5094]: I0220 06:52:12.904756 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhpxw" event={"ID":"061991e0-0b0a-4e47-9275-e00b323e9fb2","Type":"ContainerStarted","Data":"e9e27186df68fe530955f183de9f930dd6ce5b2ea9902cb61d24b627fd7e8b4f"} Feb 20 06:52:13 crc kubenswrapper[5094]: I0220 06:52:13.914096 5094 generic.go:334] "Generic (PLEG): container finished" podID="061991e0-0b0a-4e47-9275-e00b323e9fb2" containerID="258470f2010631a4f587bce056365561c7d5a4f1c012ad7808c46dac865f443a" exitCode=0 Feb 20 06:52:13 crc kubenswrapper[5094]: I0220 06:52:13.914195 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhpxw" event={"ID":"061991e0-0b0a-4e47-9275-e00b323e9fb2","Type":"ContainerDied","Data":"258470f2010631a4f587bce056365561c7d5a4f1c012ad7808c46dac865f443a"} Feb 20 06:52:13 crc kubenswrapper[5094]: I0220 06:52:13.918269 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zw5zl" event={"ID":"4ba2a013-0ac4-4983-92e6-875272450307","Type":"ContainerStarted","Data":"aacf0b6e60280be26e57e0c8961f993dc2cc741791b6fdeaf12dfe5d0c9f21d7"} Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.256084 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tzpc7"] Feb 20 
06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.257382 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.262064 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.266796 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tzpc7"] Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.318683 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjhpt\" (UniqueName: \"kubernetes.io/projected/c6db9ece-1aa7-4ea4-b800-b710a760edf6-kube-api-access-qjhpt\") pod \"certified-operators-tzpc7\" (UID: \"c6db9ece-1aa7-4ea4-b800-b710a760edf6\") " pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.318832 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6db9ece-1aa7-4ea4-b800-b710a760edf6-catalog-content\") pod \"certified-operators-tzpc7\" (UID: \"c6db9ece-1aa7-4ea4-b800-b710a760edf6\") " pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.318912 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6db9ece-1aa7-4ea4-b800-b710a760edf6-utilities\") pod \"certified-operators-tzpc7\" (UID: \"c6db9ece-1aa7-4ea4-b800-b710a760edf6\") " pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.421104 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjhpt\" (UniqueName: 
\"kubernetes.io/projected/c6db9ece-1aa7-4ea4-b800-b710a760edf6-kube-api-access-qjhpt\") pod \"certified-operators-tzpc7\" (UID: \"c6db9ece-1aa7-4ea4-b800-b710a760edf6\") " pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.421210 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6db9ece-1aa7-4ea4-b800-b710a760edf6-catalog-content\") pod \"certified-operators-tzpc7\" (UID: \"c6db9ece-1aa7-4ea4-b800-b710a760edf6\") " pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.421331 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6db9ece-1aa7-4ea4-b800-b710a760edf6-utilities\") pod \"certified-operators-tzpc7\" (UID: \"c6db9ece-1aa7-4ea4-b800-b710a760edf6\") " pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.422022 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6db9ece-1aa7-4ea4-b800-b710a760edf6-catalog-content\") pod \"certified-operators-tzpc7\" (UID: \"c6db9ece-1aa7-4ea4-b800-b710a760edf6\") " pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.422061 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6db9ece-1aa7-4ea4-b800-b710a760edf6-utilities\") pod \"certified-operators-tzpc7\" (UID: \"c6db9ece-1aa7-4ea4-b800-b710a760edf6\") " pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.456856 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qcxs4"] Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 
06:52:14.458762 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qcxs4" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.460677 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjhpt\" (UniqueName: \"kubernetes.io/projected/c6db9ece-1aa7-4ea4-b800-b710a760edf6-kube-api-access-qjhpt\") pod \"certified-operators-tzpc7\" (UID: \"c6db9ece-1aa7-4ea4-b800-b710a760edf6\") " pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.461533 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.472057 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qcxs4"] Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.524276 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/447e3a00-67d2-44c4-89cd-def383a3693d-utilities\") pod \"redhat-operators-qcxs4\" (UID: \"447e3a00-67d2-44c4-89cd-def383a3693d\") " pod="openshift-marketplace/redhat-operators-qcxs4" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.524572 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/447e3a00-67d2-44c4-89cd-def383a3693d-catalog-content\") pod \"redhat-operators-qcxs4\" (UID: \"447e3a00-67d2-44c4-89cd-def383a3693d\") " pod="openshift-marketplace/redhat-operators-qcxs4" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.524983 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m48fn\" (UniqueName: 
\"kubernetes.io/projected/447e3a00-67d2-44c4-89cd-def383a3693d-kube-api-access-m48fn\") pod \"redhat-operators-qcxs4\" (UID: \"447e3a00-67d2-44c4-89cd-def383a3693d\") " pod="openshift-marketplace/redhat-operators-qcxs4" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.588568 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.627829 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/447e3a00-67d2-44c4-89cd-def383a3693d-utilities\") pod \"redhat-operators-qcxs4\" (UID: \"447e3a00-67d2-44c4-89cd-def383a3693d\") " pod="openshift-marketplace/redhat-operators-qcxs4" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.627904 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/447e3a00-67d2-44c4-89cd-def383a3693d-catalog-content\") pod \"redhat-operators-qcxs4\" (UID: \"447e3a00-67d2-44c4-89cd-def383a3693d\") " pod="openshift-marketplace/redhat-operators-qcxs4" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.627931 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m48fn\" (UniqueName: \"kubernetes.io/projected/447e3a00-67d2-44c4-89cd-def383a3693d-kube-api-access-m48fn\") pod \"redhat-operators-qcxs4\" (UID: \"447e3a00-67d2-44c4-89cd-def383a3693d\") " pod="openshift-marketplace/redhat-operators-qcxs4" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.628957 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/447e3a00-67d2-44c4-89cd-def383a3693d-utilities\") pod \"redhat-operators-qcxs4\" (UID: \"447e3a00-67d2-44c4-89cd-def383a3693d\") " pod="openshift-marketplace/redhat-operators-qcxs4" Feb 20 06:52:14 crc kubenswrapper[5094]: 
I0220 06:52:14.629275 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/447e3a00-67d2-44c4-89cd-def383a3693d-catalog-content\") pod \"redhat-operators-qcxs4\" (UID: \"447e3a00-67d2-44c4-89cd-def383a3693d\") " pod="openshift-marketplace/redhat-operators-qcxs4" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.647550 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m48fn\" (UniqueName: \"kubernetes.io/projected/447e3a00-67d2-44c4-89cd-def383a3693d-kube-api-access-m48fn\") pod \"redhat-operators-qcxs4\" (UID: \"447e3a00-67d2-44c4-89cd-def383a3693d\") " pod="openshift-marketplace/redhat-operators-qcxs4" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.802122 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qcxs4" Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.930563 5094 generic.go:334] "Generic (PLEG): container finished" podID="4ba2a013-0ac4-4983-92e6-875272450307" containerID="aacf0b6e60280be26e57e0c8961f993dc2cc741791b6fdeaf12dfe5d0c9f21d7" exitCode=0 Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.930641 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zw5zl" event={"ID":"4ba2a013-0ac4-4983-92e6-875272450307","Type":"ContainerDied","Data":"aacf0b6e60280be26e57e0c8961f993dc2cc741791b6fdeaf12dfe5d0c9f21d7"} Feb 20 06:52:14 crc kubenswrapper[5094]: I0220 06:52:14.935356 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhpxw" event={"ID":"061991e0-0b0a-4e47-9275-e00b323e9fb2","Type":"ContainerStarted","Data":"6c1deece1e9db9069d22aa8f5ae753fd3cc50686aa2d45d85821104ae79591b4"} Feb 20 06:52:15 crc kubenswrapper[5094]: I0220 06:52:15.032639 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tzpc7"] 
Feb 20 06:52:15 crc kubenswrapper[5094]: W0220 06:52:15.036262 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6db9ece_1aa7_4ea4_b800_b710a760edf6.slice/crio-b005b27266dcf7c793cadc5ec806c3eba13a4c965c0c0b82e485488383c3f93f WatchSource:0}: Error finding container b005b27266dcf7c793cadc5ec806c3eba13a4c965c0c0b82e485488383c3f93f: Status 404 returned error can't find the container with id b005b27266dcf7c793cadc5ec806c3eba13a4c965c0c0b82e485488383c3f93f Feb 20 06:52:15 crc kubenswrapper[5094]: I0220 06:52:15.208026 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qcxs4"] Feb 20 06:52:15 crc kubenswrapper[5094]: W0220 06:52:15.216487 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod447e3a00_67d2_44c4_89cd_def383a3693d.slice/crio-85fa3465d015c7e21ff1075f0d4bc07fee5aef80c97796211e2764a44ea98d92 WatchSource:0}: Error finding container 85fa3465d015c7e21ff1075f0d4bc07fee5aef80c97796211e2764a44ea98d92: Status 404 returned error can't find the container with id 85fa3465d015c7e21ff1075f0d4bc07fee5aef80c97796211e2764a44ea98d92 Feb 20 06:52:15 crc kubenswrapper[5094]: I0220 06:52:15.946232 5094 generic.go:334] "Generic (PLEG): container finished" podID="c6db9ece-1aa7-4ea4-b800-b710a760edf6" containerID="0e5f06dce56f85bd90954f924e3c901b30cb5173e84b14dd5d12f82eb9ebf4a0" exitCode=0 Feb 20 06:52:15 crc kubenswrapper[5094]: I0220 06:52:15.946322 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzpc7" event={"ID":"c6db9ece-1aa7-4ea4-b800-b710a760edf6","Type":"ContainerDied","Data":"0e5f06dce56f85bd90954f924e3c901b30cb5173e84b14dd5d12f82eb9ebf4a0"} Feb 20 06:52:15 crc kubenswrapper[5094]: I0220 06:52:15.946750 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzpc7" 
event={"ID":"c6db9ece-1aa7-4ea4-b800-b710a760edf6","Type":"ContainerStarted","Data":"b005b27266dcf7c793cadc5ec806c3eba13a4c965c0c0b82e485488383c3f93f"} Feb 20 06:52:15 crc kubenswrapper[5094]: I0220 06:52:15.950004 5094 generic.go:334] "Generic (PLEG): container finished" podID="447e3a00-67d2-44c4-89cd-def383a3693d" containerID="f62030d23eeaced6be390007f7315e448784f435f42fdbb04b42cabfa6e3035a" exitCode=0 Feb 20 06:52:15 crc kubenswrapper[5094]: I0220 06:52:15.950159 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qcxs4" event={"ID":"447e3a00-67d2-44c4-89cd-def383a3693d","Type":"ContainerDied","Data":"f62030d23eeaced6be390007f7315e448784f435f42fdbb04b42cabfa6e3035a"} Feb 20 06:52:15 crc kubenswrapper[5094]: I0220 06:52:15.950191 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qcxs4" event={"ID":"447e3a00-67d2-44c4-89cd-def383a3693d","Type":"ContainerStarted","Data":"85fa3465d015c7e21ff1075f0d4bc07fee5aef80c97796211e2764a44ea98d92"} Feb 20 06:52:15 crc kubenswrapper[5094]: I0220 06:52:15.955116 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zw5zl" event={"ID":"4ba2a013-0ac4-4983-92e6-875272450307","Type":"ContainerStarted","Data":"8b624fcc4bd17032c51f91b024fb53e5316783a2b5cdf796c45c17851b976620"} Feb 20 06:52:15 crc kubenswrapper[5094]: I0220 06:52:15.957851 5094 generic.go:334] "Generic (PLEG): container finished" podID="061991e0-0b0a-4e47-9275-e00b323e9fb2" containerID="6c1deece1e9db9069d22aa8f5ae753fd3cc50686aa2d45d85821104ae79591b4" exitCode=0 Feb 20 06:52:15 crc kubenswrapper[5094]: I0220 06:52:15.957892 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhpxw" event={"ID":"061991e0-0b0a-4e47-9275-e00b323e9fb2","Type":"ContainerDied","Data":"6c1deece1e9db9069d22aa8f5ae753fd3cc50686aa2d45d85821104ae79591b4"} Feb 20 06:52:16 crc kubenswrapper[5094]: I0220 
06:52:16.047561 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zw5zl" podStartSLOduration=2.582064351 podStartE2EDuration="5.047524658s" podCreationTimestamp="2026-02-20 06:52:11 +0000 UTC" firstStartedPulling="2026-02-20 06:52:12.905004322 +0000 UTC m=+347.777631033" lastFinishedPulling="2026-02-20 06:52:15.370464639 +0000 UTC m=+350.243091340" observedRunningTime="2026-02-20 06:52:16.045870809 +0000 UTC m=+350.918497520" watchObservedRunningTime="2026-02-20 06:52:16.047524658 +0000 UTC m=+350.920151379" Feb 20 06:52:16 crc kubenswrapper[5094]: I0220 06:52:16.967003 5094 generic.go:334] "Generic (PLEG): container finished" podID="c6db9ece-1aa7-4ea4-b800-b710a760edf6" containerID="b699a1c50072acbdab88a18e4209ddc7b6f1665995200477921d9b14cf4874bc" exitCode=0 Feb 20 06:52:16 crc kubenswrapper[5094]: I0220 06:52:16.967114 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzpc7" event={"ID":"c6db9ece-1aa7-4ea4-b800-b710a760edf6","Type":"ContainerDied","Data":"b699a1c50072acbdab88a18e4209ddc7b6f1665995200477921d9b14cf4874bc"} Feb 20 06:52:16 crc kubenswrapper[5094]: I0220 06:52:16.975010 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qcxs4" event={"ID":"447e3a00-67d2-44c4-89cd-def383a3693d","Type":"ContainerStarted","Data":"4fae2eaa3d4ffc51aab4f89c4f82ef33cbcb1a20909f5dc51ac0898e5482ee62"} Feb 20 06:52:16 crc kubenswrapper[5094]: I0220 06:52:16.977768 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhpxw" event={"ID":"061991e0-0b0a-4e47-9275-e00b323e9fb2","Type":"ContainerStarted","Data":"533d4a0d04fa804ad9211edefb6a91600dc963cb9cea7ea16b1aa22fe13ba2dd"} Feb 20 06:52:17 crc kubenswrapper[5094]: I0220 06:52:17.010143 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nhpxw" 
podStartSLOduration=2.610081958 podStartE2EDuration="5.010113381s" podCreationTimestamp="2026-02-20 06:52:12 +0000 UTC" firstStartedPulling="2026-02-20 06:52:13.915907879 +0000 UTC m=+348.788534600" lastFinishedPulling="2026-02-20 06:52:16.315939312 +0000 UTC m=+351.188566023" observedRunningTime="2026-02-20 06:52:17.008516443 +0000 UTC m=+351.881143154" watchObservedRunningTime="2026-02-20 06:52:17.010113381 +0000 UTC m=+351.882740102" Feb 20 06:52:17 crc kubenswrapper[5094]: I0220 06:52:17.986606 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzpc7" event={"ID":"c6db9ece-1aa7-4ea4-b800-b710a760edf6","Type":"ContainerStarted","Data":"e09d2889270f03a1a2710e1eedb583da8696096d2d0903240186438a06d8fabb"} Feb 20 06:52:17 crc kubenswrapper[5094]: I0220 06:52:17.989965 5094 generic.go:334] "Generic (PLEG): container finished" podID="447e3a00-67d2-44c4-89cd-def383a3693d" containerID="4fae2eaa3d4ffc51aab4f89c4f82ef33cbcb1a20909f5dc51ac0898e5482ee62" exitCode=0 Feb 20 06:52:17 crc kubenswrapper[5094]: I0220 06:52:17.990098 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qcxs4" event={"ID":"447e3a00-67d2-44c4-89cd-def383a3693d","Type":"ContainerDied","Data":"4fae2eaa3d4ffc51aab4f89c4f82ef33cbcb1a20909f5dc51ac0898e5482ee62"} Feb 20 06:52:18 crc kubenswrapper[5094]: I0220 06:52:18.015098 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tzpc7" podStartSLOduration=2.6231710550000003 podStartE2EDuration="4.015077687s" podCreationTimestamp="2026-02-20 06:52:14 +0000 UTC" firstStartedPulling="2026-02-20 06:52:15.949778813 +0000 UTC m=+350.822405544" lastFinishedPulling="2026-02-20 06:52:17.341685465 +0000 UTC m=+352.214312176" observedRunningTime="2026-02-20 06:52:18.013059258 +0000 UTC m=+352.885686009" watchObservedRunningTime="2026-02-20 06:52:18.015077687 +0000 UTC m=+352.887704398" Feb 20 06:52:18 crc 
kubenswrapper[5094]: I0220 06:52:18.106236 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-8zvf9" Feb 20 06:52:18 crc kubenswrapper[5094]: I0220 06:52:18.168962 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mfphn"] Feb 20 06:52:18 crc kubenswrapper[5094]: I0220 06:52:18.997415 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qcxs4" event={"ID":"447e3a00-67d2-44c4-89cd-def383a3693d","Type":"ContainerStarted","Data":"5d1a70b01add6fd3acd49cff0d943afdd7848ce36a44eeba122d28a271fa3ccd"} Feb 20 06:52:22 crc kubenswrapper[5094]: I0220 06:52:22.202788 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zw5zl" Feb 20 06:52:22 crc kubenswrapper[5094]: I0220 06:52:22.202865 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zw5zl" Feb 20 06:52:22 crc kubenswrapper[5094]: I0220 06:52:22.265533 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zw5zl" Feb 20 06:52:22 crc kubenswrapper[5094]: I0220 06:52:22.290968 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qcxs4" podStartSLOduration=5.857528876 podStartE2EDuration="8.290946927s" podCreationTimestamp="2026-02-20 06:52:14 +0000 UTC" firstStartedPulling="2026-02-20 06:52:15.955894139 +0000 UTC m=+350.828520870" lastFinishedPulling="2026-02-20 06:52:18.38931221 +0000 UTC m=+353.261938921" observedRunningTime="2026-02-20 06:52:19.031933887 +0000 UTC m=+353.904560608" watchObservedRunningTime="2026-02-20 06:52:22.290946927 +0000 UTC m=+357.163573628" Feb 20 06:52:22 crc kubenswrapper[5094]: I0220 06:52:22.382458 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-nhpxw" Feb 20 06:52:22 crc kubenswrapper[5094]: I0220 06:52:22.382882 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nhpxw" Feb 20 06:52:22 crc kubenswrapper[5094]: I0220 06:52:22.441642 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nhpxw" Feb 20 06:52:23 crc kubenswrapper[5094]: I0220 06:52:23.081837 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nhpxw" Feb 20 06:52:23 crc kubenswrapper[5094]: I0220 06:52:23.096055 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zw5zl" Feb 20 06:52:24 crc kubenswrapper[5094]: I0220 06:52:24.588979 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 06:52:24 crc kubenswrapper[5094]: I0220 06:52:24.589567 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 06:52:24 crc kubenswrapper[5094]: I0220 06:52:24.638081 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 06:52:24 crc kubenswrapper[5094]: I0220 06:52:24.802831 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qcxs4" Feb 20 06:52:24 crc kubenswrapper[5094]: I0220 06:52:24.803830 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qcxs4" Feb 20 06:52:25 crc kubenswrapper[5094]: I0220 06:52:25.096519 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 06:52:25 crc 
kubenswrapper[5094]: I0220 06:52:25.846854 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qcxs4" podUID="447e3a00-67d2-44c4-89cd-def383a3693d" containerName="registry-server" probeResult="failure" output=< Feb 20 06:52:25 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 06:52:25 crc kubenswrapper[5094]: > Feb 20 06:52:34 crc kubenswrapper[5094]: I0220 06:52:34.106584 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 06:52:34 crc kubenswrapper[5094]: I0220 06:52:34.107900 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 06:52:34 crc kubenswrapper[5094]: I0220 06:52:34.843215 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qcxs4" Feb 20 06:52:34 crc kubenswrapper[5094]: I0220 06:52:34.885004 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qcxs4" Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.218058 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" podUID="fa6b00ff-07fb-4e9a-80da-780c22acbe69" containerName="registry" containerID="cri-o://aea1000ba80dabd2eeaca77a83d09e249159833615723a3fcdb37fa198e8b5cd" gracePeriod=30 Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.710360 5094 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.778473 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76rz6\" (UniqueName: \"kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-kube-api-access-76rz6\") pod \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.778527 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-bound-sa-token\") pod \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.778596 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa6b00ff-07fb-4e9a-80da-780c22acbe69-trusted-ca\") pod \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.778655 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa6b00ff-07fb-4e9a-80da-780c22acbe69-installation-pull-secrets\") pod \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.778692 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fa6b00ff-07fb-4e9a-80da-780c22acbe69-ca-trust-extracted\") pod \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.778853 5094 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.778886 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-registry-tls\") pod \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.778909 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fa6b00ff-07fb-4e9a-80da-780c22acbe69-registry-certificates\") pod \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\" (UID: \"fa6b00ff-07fb-4e9a-80da-780c22acbe69\") " Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.779834 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa6b00ff-07fb-4e9a-80da-780c22acbe69-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "fa6b00ff-07fb-4e9a-80da-780c22acbe69" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.781685 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa6b00ff-07fb-4e9a-80da-780c22acbe69-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "fa6b00ff-07fb-4e9a-80da-780c22acbe69" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.789400 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-kube-api-access-76rz6" (OuterVolumeSpecName: "kube-api-access-76rz6") pod "fa6b00ff-07fb-4e9a-80da-780c22acbe69" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69"). InnerVolumeSpecName "kube-api-access-76rz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.794045 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "fa6b00ff-07fb-4e9a-80da-780c22acbe69" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.794351 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa6b00ff-07fb-4e9a-80da-780c22acbe69-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "fa6b00ff-07fb-4e9a-80da-780c22acbe69" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.794905 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "fa6b00ff-07fb-4e9a-80da-780c22acbe69" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.796030 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "fa6b00ff-07fb-4e9a-80da-780c22acbe69" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.807018 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa6b00ff-07fb-4e9a-80da-780c22acbe69-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "fa6b00ff-07fb-4e9a-80da-780c22acbe69" (UID: "fa6b00ff-07fb-4e9a-80da-780c22acbe69"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.880688 5094 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa6b00ff-07fb-4e9a-80da-780c22acbe69-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.880962 5094 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa6b00ff-07fb-4e9a-80da-780c22acbe69-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.881134 5094 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fa6b00ff-07fb-4e9a-80da-780c22acbe69-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.881290 5094 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.881436 5094 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fa6b00ff-07fb-4e9a-80da-780c22acbe69-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.881656 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76rz6\" (UniqueName: \"kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-kube-api-access-76rz6\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:43 crc kubenswrapper[5094]: I0220 06:52:43.881843 5094 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa6b00ff-07fb-4e9a-80da-780c22acbe69-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 20 06:52:44 crc kubenswrapper[5094]: I0220 06:52:44.191459 5094 generic.go:334] "Generic (PLEG): container finished" podID="fa6b00ff-07fb-4e9a-80da-780c22acbe69" containerID="aea1000ba80dabd2eeaca77a83d09e249159833615723a3fcdb37fa198e8b5cd" exitCode=0 Feb 20 06:52:44 crc kubenswrapper[5094]: I0220 06:52:44.191522 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" Feb 20 06:52:44 crc kubenswrapper[5094]: I0220 06:52:44.191542 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" event={"ID":"fa6b00ff-07fb-4e9a-80da-780c22acbe69","Type":"ContainerDied","Data":"aea1000ba80dabd2eeaca77a83d09e249159833615723a3fcdb37fa198e8b5cd"} Feb 20 06:52:44 crc kubenswrapper[5094]: I0220 06:52:44.191604 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mfphn" event={"ID":"fa6b00ff-07fb-4e9a-80da-780c22acbe69","Type":"ContainerDied","Data":"cffd4cbccec1650f91f390b126272379be0c306a55bee926972d5822622f847a"} Feb 20 06:52:44 crc kubenswrapper[5094]: I0220 06:52:44.191628 5094 scope.go:117] "RemoveContainer" containerID="aea1000ba80dabd2eeaca77a83d09e249159833615723a3fcdb37fa198e8b5cd" Feb 20 06:52:44 crc kubenswrapper[5094]: I0220 06:52:44.230463 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mfphn"] Feb 20 06:52:44 crc kubenswrapper[5094]: I0220 06:52:44.233731 5094 scope.go:117] "RemoveContainer" containerID="aea1000ba80dabd2eeaca77a83d09e249159833615723a3fcdb37fa198e8b5cd" Feb 20 06:52:44 crc kubenswrapper[5094]: E0220 06:52:44.234390 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aea1000ba80dabd2eeaca77a83d09e249159833615723a3fcdb37fa198e8b5cd\": container with ID starting with aea1000ba80dabd2eeaca77a83d09e249159833615723a3fcdb37fa198e8b5cd not found: ID does not exist" containerID="aea1000ba80dabd2eeaca77a83d09e249159833615723a3fcdb37fa198e8b5cd" Feb 20 06:52:44 crc kubenswrapper[5094]: I0220 06:52:44.234432 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aea1000ba80dabd2eeaca77a83d09e249159833615723a3fcdb37fa198e8b5cd"} err="failed to 
get container status \"aea1000ba80dabd2eeaca77a83d09e249159833615723a3fcdb37fa198e8b5cd\": rpc error: code = NotFound desc = could not find container \"aea1000ba80dabd2eeaca77a83d09e249159833615723a3fcdb37fa198e8b5cd\": container with ID starting with aea1000ba80dabd2eeaca77a83d09e249159833615723a3fcdb37fa198e8b5cd not found: ID does not exist" Feb 20 06:52:44 crc kubenswrapper[5094]: I0220 06:52:44.239961 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mfphn"] Feb 20 06:52:45 crc kubenswrapper[5094]: I0220 06:52:45.859862 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa6b00ff-07fb-4e9a-80da-780c22acbe69" path="/var/lib/kubelet/pods/fa6b00ff-07fb-4e9a-80da-780c22acbe69/volumes" Feb 20 06:53:04 crc kubenswrapper[5094]: I0220 06:53:04.107121 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 06:53:04 crc kubenswrapper[5094]: I0220 06:53:04.108034 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 06:53:04 crc kubenswrapper[5094]: I0220 06:53:04.108138 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:53:04 crc kubenswrapper[5094]: I0220 06:53:04.109380 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"c4c7f07b9c4d1d267d7fb57a087e1e994c52fa85636dd6484f657b9804207645"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 06:53:04 crc kubenswrapper[5094]: I0220 06:53:04.109518 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://c4c7f07b9c4d1d267d7fb57a087e1e994c52fa85636dd6484f657b9804207645" gracePeriod=600 Feb 20 06:53:04 crc kubenswrapper[5094]: I0220 06:53:04.373007 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="c4c7f07b9c4d1d267d7fb57a087e1e994c52fa85636dd6484f657b9804207645" exitCode=0 Feb 20 06:53:04 crc kubenswrapper[5094]: I0220 06:53:04.373464 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"c4c7f07b9c4d1d267d7fb57a087e1e994c52fa85636dd6484f657b9804207645"} Feb 20 06:53:04 crc kubenswrapper[5094]: I0220 06:53:04.373526 5094 scope.go:117] "RemoveContainer" containerID="85c3a0c060f20c7b6013289dd0db507f51cafe467828ae79872d174364fabd3f" Feb 20 06:53:05 crc kubenswrapper[5094]: I0220 06:53:05.385193 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"f3c668007520ff2c7deb46a6ed520de8c2bf8999084ab5e4c0bcef19c03e1852"} Feb 20 06:55:04 crc kubenswrapper[5094]: I0220 06:55:04.107807 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 06:55:04 crc kubenswrapper[5094]: I0220 06:55:04.108642 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 06:55:34 crc kubenswrapper[5094]: I0220 06:55:34.107353 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 06:55:34 crc kubenswrapper[5094]: I0220 06:55:34.108554 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 06:56:04 crc kubenswrapper[5094]: I0220 06:56:04.106414 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 06:56:04 crc kubenswrapper[5094]: I0220 06:56:04.107322 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Feb 20 06:56:04 crc kubenswrapper[5094]: I0220 06:56:04.107370 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:56:04 crc kubenswrapper[5094]: I0220 06:56:04.107962 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f3c668007520ff2c7deb46a6ed520de8c2bf8999084ab5e4c0bcef19c03e1852"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 06:56:04 crc kubenswrapper[5094]: I0220 06:56:04.108048 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://f3c668007520ff2c7deb46a6ed520de8c2bf8999084ab5e4c0bcef19c03e1852" gracePeriod=600 Feb 20 06:56:04 crc kubenswrapper[5094]: I0220 06:56:04.907770 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="f3c668007520ff2c7deb46a6ed520de8c2bf8999084ab5e4c0bcef19c03e1852" exitCode=0 Feb 20 06:56:04 crc kubenswrapper[5094]: I0220 06:56:04.907856 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"f3c668007520ff2c7deb46a6ed520de8c2bf8999084ab5e4c0bcef19c03e1852"} Feb 20 06:56:04 crc kubenswrapper[5094]: I0220 06:56:04.908199 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"935ee674e42fed17c73f6a106b1b7b34bda161038510a874c1b2347b0ce2c2b3"} Feb 20 06:56:04 
crc kubenswrapper[5094]: I0220 06:56:04.908230 5094 scope.go:117] "RemoveContainer" containerID="c4c7f07b9c4d1d267d7fb57a087e1e994c52fa85636dd6484f657b9804207645" Feb 20 06:58:04 crc kubenswrapper[5094]: I0220 06:58:04.107560 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 06:58:04 crc kubenswrapper[5094]: I0220 06:58:04.109323 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 06:58:34 crc kubenswrapper[5094]: I0220 06:58:34.107155 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 06:58:34 crc kubenswrapper[5094]: I0220 06:58:34.108080 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 06:59:01 crc kubenswrapper[5094]: I0220 06:59:01.828622 5094 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 20 06:59:04 crc kubenswrapper[5094]: I0220 06:59:04.107154 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 06:59:04 crc kubenswrapper[5094]: I0220 06:59:04.107251 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 06:59:04 crc kubenswrapper[5094]: I0220 06:59:04.107308 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 06:59:04 crc kubenswrapper[5094]: I0220 06:59:04.108036 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"935ee674e42fed17c73f6a106b1b7b34bda161038510a874c1b2347b0ce2c2b3"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 06:59:04 crc kubenswrapper[5094]: I0220 06:59:04.108107 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://935ee674e42fed17c73f6a106b1b7b34bda161038510a874c1b2347b0ce2c2b3" gracePeriod=600 Feb 20 06:59:04 crc kubenswrapper[5094]: I0220 06:59:04.283904 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="935ee674e42fed17c73f6a106b1b7b34bda161038510a874c1b2347b0ce2c2b3" exitCode=0 Feb 20 06:59:04 crc kubenswrapper[5094]: I0220 06:59:04.283989 5094 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"935ee674e42fed17c73f6a106b1b7b34bda161038510a874c1b2347b0ce2c2b3"} Feb 20 06:59:04 crc kubenswrapper[5094]: I0220 06:59:04.284467 5094 scope.go:117] "RemoveContainer" containerID="f3c668007520ff2c7deb46a6ed520de8c2bf8999084ab5e4c0bcef19c03e1852" Feb 20 06:59:05 crc kubenswrapper[5094]: I0220 06:59:05.295271 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"8a6efb8b2f6985d19679a1508c25eeeb609faa457ea57f65ac79a9b48c9574e6"} Feb 20 06:59:19 crc kubenswrapper[5094]: I0220 06:59:19.376025 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4z8d2"] Feb 20 06:59:19 crc kubenswrapper[5094]: E0220 06:59:19.377372 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa6b00ff-07fb-4e9a-80da-780c22acbe69" containerName="registry" Feb 20 06:59:19 crc kubenswrapper[5094]: I0220 06:59:19.377396 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa6b00ff-07fb-4e9a-80da-780c22acbe69" containerName="registry" Feb 20 06:59:19 crc kubenswrapper[5094]: I0220 06:59:19.377613 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa6b00ff-07fb-4e9a-80da-780c22acbe69" containerName="registry" Feb 20 06:59:19 crc kubenswrapper[5094]: I0220 06:59:19.379139 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4z8d2" Feb 20 06:59:19 crc kubenswrapper[5094]: I0220 06:59:19.402227 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4z8d2"] Feb 20 06:59:19 crc kubenswrapper[5094]: I0220 06:59:19.543831 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq6q6\" (UniqueName: \"kubernetes.io/projected/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-kube-api-access-lq6q6\") pod \"redhat-marketplace-4z8d2\" (UID: \"821ee7ec-9cd3-4402-b70d-1c06d52aeb22\") " pod="openshift-marketplace/redhat-marketplace-4z8d2" Feb 20 06:59:19 crc kubenswrapper[5094]: I0220 06:59:19.543952 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-utilities\") pod \"redhat-marketplace-4z8d2\" (UID: \"821ee7ec-9cd3-4402-b70d-1c06d52aeb22\") " pod="openshift-marketplace/redhat-marketplace-4z8d2" Feb 20 06:59:19 crc kubenswrapper[5094]: I0220 06:59:19.544027 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-catalog-content\") pod \"redhat-marketplace-4z8d2\" (UID: \"821ee7ec-9cd3-4402-b70d-1c06d52aeb22\") " pod="openshift-marketplace/redhat-marketplace-4z8d2" Feb 20 06:59:19 crc kubenswrapper[5094]: I0220 06:59:19.645984 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq6q6\" (UniqueName: \"kubernetes.io/projected/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-kube-api-access-lq6q6\") pod \"redhat-marketplace-4z8d2\" (UID: \"821ee7ec-9cd3-4402-b70d-1c06d52aeb22\") " pod="openshift-marketplace/redhat-marketplace-4z8d2" Feb 20 06:59:19 crc kubenswrapper[5094]: I0220 06:59:19.646078 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-utilities\") pod \"redhat-marketplace-4z8d2\" (UID: \"821ee7ec-9cd3-4402-b70d-1c06d52aeb22\") " pod="openshift-marketplace/redhat-marketplace-4z8d2" Feb 20 06:59:19 crc kubenswrapper[5094]: I0220 06:59:19.646125 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-catalog-content\") pod \"redhat-marketplace-4z8d2\" (UID: \"821ee7ec-9cd3-4402-b70d-1c06d52aeb22\") " pod="openshift-marketplace/redhat-marketplace-4z8d2" Feb 20 06:59:19 crc kubenswrapper[5094]: I0220 06:59:19.646946 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-utilities\") pod \"redhat-marketplace-4z8d2\" (UID: \"821ee7ec-9cd3-4402-b70d-1c06d52aeb22\") " pod="openshift-marketplace/redhat-marketplace-4z8d2" Feb 20 06:59:19 crc kubenswrapper[5094]: I0220 06:59:19.647055 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-catalog-content\") pod \"redhat-marketplace-4z8d2\" (UID: \"821ee7ec-9cd3-4402-b70d-1c06d52aeb22\") " pod="openshift-marketplace/redhat-marketplace-4z8d2" Feb 20 06:59:19 crc kubenswrapper[5094]: I0220 06:59:19.677366 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq6q6\" (UniqueName: \"kubernetes.io/projected/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-kube-api-access-lq6q6\") pod \"redhat-marketplace-4z8d2\" (UID: \"821ee7ec-9cd3-4402-b70d-1c06d52aeb22\") " pod="openshift-marketplace/redhat-marketplace-4z8d2" Feb 20 06:59:19 crc kubenswrapper[5094]: I0220 06:59:19.760823 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4z8d2" Feb 20 06:59:20 crc kubenswrapper[5094]: I0220 06:59:20.040225 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4z8d2"] Feb 20 06:59:20 crc kubenswrapper[5094]: W0220 06:59:20.054966 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod821ee7ec_9cd3_4402_b70d_1c06d52aeb22.slice/crio-5541eaf56506c3f5d6d1d2eb83fc2e63f98af88e2b9771e22d69f20383788ecb WatchSource:0}: Error finding container 5541eaf56506c3f5d6d1d2eb83fc2e63f98af88e2b9771e22d69f20383788ecb: Status 404 returned error can't find the container with id 5541eaf56506c3f5d6d1d2eb83fc2e63f98af88e2b9771e22d69f20383788ecb Feb 20 06:59:20 crc kubenswrapper[5094]: I0220 06:59:20.458033 5094 generic.go:334] "Generic (PLEG): container finished" podID="821ee7ec-9cd3-4402-b70d-1c06d52aeb22" containerID="892bd5ba5079073acf61763945511a54b1e887f88b07c6a293f330e935e0f8e2" exitCode=0 Feb 20 06:59:20 crc kubenswrapper[5094]: I0220 06:59:20.458776 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4z8d2" event={"ID":"821ee7ec-9cd3-4402-b70d-1c06d52aeb22","Type":"ContainerDied","Data":"892bd5ba5079073acf61763945511a54b1e887f88b07c6a293f330e935e0f8e2"} Feb 20 06:59:20 crc kubenswrapper[5094]: I0220 06:59:20.458836 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4z8d2" event={"ID":"821ee7ec-9cd3-4402-b70d-1c06d52aeb22","Type":"ContainerStarted","Data":"5541eaf56506c3f5d6d1d2eb83fc2e63f98af88e2b9771e22d69f20383788ecb"} Feb 20 06:59:20 crc kubenswrapper[5094]: I0220 06:59:20.462109 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 06:59:21 crc kubenswrapper[5094]: I0220 06:59:21.469373 5094 generic.go:334] "Generic (PLEG): container finished" 
podID="821ee7ec-9cd3-4402-b70d-1c06d52aeb22" containerID="e8182412f7a171ee5e0773b3af2b1e29c4e601ab3d3e8a7ac8a8c9fe09e483f1" exitCode=0 Feb 20 06:59:21 crc kubenswrapper[5094]: I0220 06:59:21.469501 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4z8d2" event={"ID":"821ee7ec-9cd3-4402-b70d-1c06d52aeb22","Type":"ContainerDied","Data":"e8182412f7a171ee5e0773b3af2b1e29c4e601ab3d3e8a7ac8a8c9fe09e483f1"} Feb 20 06:59:22 crc kubenswrapper[5094]: I0220 06:59:22.479761 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4z8d2" event={"ID":"821ee7ec-9cd3-4402-b70d-1c06d52aeb22","Type":"ContainerStarted","Data":"2b10876c43b8f6f70135e7079fce1a03a047a31f088519b1d834d3555fef4e67"} Feb 20 06:59:22 crc kubenswrapper[5094]: I0220 06:59:22.511481 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4z8d2" podStartSLOduration=2.09378517 podStartE2EDuration="3.511454592s" podCreationTimestamp="2026-02-20 06:59:19 +0000 UTC" firstStartedPulling="2026-02-20 06:59:20.461725778 +0000 UTC m=+775.334352509" lastFinishedPulling="2026-02-20 06:59:21.87939519 +0000 UTC m=+776.752021931" observedRunningTime="2026-02-20 06:59:22.503560612 +0000 UTC m=+777.376187333" watchObservedRunningTime="2026-02-20 06:59:22.511454592 +0000 UTC m=+777.384081343" Feb 20 06:59:29 crc kubenswrapper[5094]: I0220 06:59:29.762151 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4z8d2" Feb 20 06:59:29 crc kubenswrapper[5094]: I0220 06:59:29.763225 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4z8d2" Feb 20 06:59:29 crc kubenswrapper[5094]: I0220 06:59:29.804798 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4z8d2" Feb 20 06:59:30 crc 
kubenswrapper[5094]: I0220 06:59:30.614022 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4z8d2" Feb 20 06:59:30 crc kubenswrapper[5094]: I0220 06:59:30.686626 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4z8d2"] Feb 20 06:59:32 crc kubenswrapper[5094]: I0220 06:59:32.463841 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4nkfk"] Feb 20 06:59:32 crc kubenswrapper[5094]: I0220 06:59:32.465436 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4nkfk" Feb 20 06:59:32 crc kubenswrapper[5094]: I0220 06:59:32.486336 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4nkfk"] Feb 20 06:59:32 crc kubenswrapper[5094]: I0220 06:59:32.554904 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4z8d2" podUID="821ee7ec-9cd3-4402-b70d-1c06d52aeb22" containerName="registry-server" containerID="cri-o://2b10876c43b8f6f70135e7079fce1a03a047a31f088519b1d834d3555fef4e67" gracePeriod=2 Feb 20 06:59:32 crc kubenswrapper[5094]: I0220 06:59:32.558820 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cea80939-e9e2-40e3-9a29-06cef37a5482-catalog-content\") pod \"redhat-operators-4nkfk\" (UID: \"cea80939-e9e2-40e3-9a29-06cef37a5482\") " pod="openshift-marketplace/redhat-operators-4nkfk" Feb 20 06:59:32 crc kubenswrapper[5094]: I0220 06:59:32.559024 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgm4r\" (UniqueName: \"kubernetes.io/projected/cea80939-e9e2-40e3-9a29-06cef37a5482-kube-api-access-xgm4r\") pod \"redhat-operators-4nkfk\" (UID: 
\"cea80939-e9e2-40e3-9a29-06cef37a5482\") " pod="openshift-marketplace/redhat-operators-4nkfk" Feb 20 06:59:32 crc kubenswrapper[5094]: I0220 06:59:32.559094 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cea80939-e9e2-40e3-9a29-06cef37a5482-utilities\") pod \"redhat-operators-4nkfk\" (UID: \"cea80939-e9e2-40e3-9a29-06cef37a5482\") " pod="openshift-marketplace/redhat-operators-4nkfk" Feb 20 06:59:32 crc kubenswrapper[5094]: I0220 06:59:32.660029 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgm4r\" (UniqueName: \"kubernetes.io/projected/cea80939-e9e2-40e3-9a29-06cef37a5482-kube-api-access-xgm4r\") pod \"redhat-operators-4nkfk\" (UID: \"cea80939-e9e2-40e3-9a29-06cef37a5482\") " pod="openshift-marketplace/redhat-operators-4nkfk" Feb 20 06:59:32 crc kubenswrapper[5094]: I0220 06:59:32.660096 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cea80939-e9e2-40e3-9a29-06cef37a5482-utilities\") pod \"redhat-operators-4nkfk\" (UID: \"cea80939-e9e2-40e3-9a29-06cef37a5482\") " pod="openshift-marketplace/redhat-operators-4nkfk" Feb 20 06:59:32 crc kubenswrapper[5094]: I0220 06:59:32.660149 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cea80939-e9e2-40e3-9a29-06cef37a5482-catalog-content\") pod \"redhat-operators-4nkfk\" (UID: \"cea80939-e9e2-40e3-9a29-06cef37a5482\") " pod="openshift-marketplace/redhat-operators-4nkfk" Feb 20 06:59:32 crc kubenswrapper[5094]: I0220 06:59:32.660736 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cea80939-e9e2-40e3-9a29-06cef37a5482-catalog-content\") pod \"redhat-operators-4nkfk\" (UID: \"cea80939-e9e2-40e3-9a29-06cef37a5482\") " 
pod="openshift-marketplace/redhat-operators-4nkfk" Feb 20 06:59:32 crc kubenswrapper[5094]: I0220 06:59:32.661058 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cea80939-e9e2-40e3-9a29-06cef37a5482-utilities\") pod \"redhat-operators-4nkfk\" (UID: \"cea80939-e9e2-40e3-9a29-06cef37a5482\") " pod="openshift-marketplace/redhat-operators-4nkfk" Feb 20 06:59:32 crc kubenswrapper[5094]: I0220 06:59:32.708040 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgm4r\" (UniqueName: \"kubernetes.io/projected/cea80939-e9e2-40e3-9a29-06cef37a5482-kube-api-access-xgm4r\") pod \"redhat-operators-4nkfk\" (UID: \"cea80939-e9e2-40e3-9a29-06cef37a5482\") " pod="openshift-marketplace/redhat-operators-4nkfk" Feb 20 06:59:32 crc kubenswrapper[5094]: I0220 06:59:32.798619 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4nkfk" Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.000986 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4z8d2" Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.067386 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4nkfk"] Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.068027 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq6q6\" (UniqueName: \"kubernetes.io/projected/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-kube-api-access-lq6q6\") pod \"821ee7ec-9cd3-4402-b70d-1c06d52aeb22\" (UID: \"821ee7ec-9cd3-4402-b70d-1c06d52aeb22\") " Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.068228 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-utilities\") pod \"821ee7ec-9cd3-4402-b70d-1c06d52aeb22\" (UID: \"821ee7ec-9cd3-4402-b70d-1c06d52aeb22\") " Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.068286 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-catalog-content\") pod \"821ee7ec-9cd3-4402-b70d-1c06d52aeb22\" (UID: \"821ee7ec-9cd3-4402-b70d-1c06d52aeb22\") " Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.069275 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-utilities" (OuterVolumeSpecName: "utilities") pod "821ee7ec-9cd3-4402-b70d-1c06d52aeb22" (UID: "821ee7ec-9cd3-4402-b70d-1c06d52aeb22"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.075342 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-kube-api-access-lq6q6" (OuterVolumeSpecName: "kube-api-access-lq6q6") pod "821ee7ec-9cd3-4402-b70d-1c06d52aeb22" (UID: "821ee7ec-9cd3-4402-b70d-1c06d52aeb22"). InnerVolumeSpecName "kube-api-access-lq6q6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.093768 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "821ee7ec-9cd3-4402-b70d-1c06d52aeb22" (UID: "821ee7ec-9cd3-4402-b70d-1c06d52aeb22"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.169386 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.169428 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.169446 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq6q6\" (UniqueName: \"kubernetes.io/projected/821ee7ec-9cd3-4402-b70d-1c06d52aeb22-kube-api-access-lq6q6\") on node \"crc\" DevicePath \"\"" Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.561871 5094 generic.go:334] "Generic (PLEG): container finished" podID="cea80939-e9e2-40e3-9a29-06cef37a5482" 
containerID="503e00c005f8a4aa0a101cf737ccd14fe94ec232147036287c455d933cd8c8df" exitCode=0 Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.562000 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4nkfk" event={"ID":"cea80939-e9e2-40e3-9a29-06cef37a5482","Type":"ContainerDied","Data":"503e00c005f8a4aa0a101cf737ccd14fe94ec232147036287c455d933cd8c8df"} Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.562077 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4nkfk" event={"ID":"cea80939-e9e2-40e3-9a29-06cef37a5482","Type":"ContainerStarted","Data":"5046403aa61e0aba40eb47656898bc09f17cf4dac06fc90a524dc06a79c5bd91"} Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.564921 5094 generic.go:334] "Generic (PLEG): container finished" podID="821ee7ec-9cd3-4402-b70d-1c06d52aeb22" containerID="2b10876c43b8f6f70135e7079fce1a03a047a31f088519b1d834d3555fef4e67" exitCode=0 Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.564987 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4z8d2" event={"ID":"821ee7ec-9cd3-4402-b70d-1c06d52aeb22","Type":"ContainerDied","Data":"2b10876c43b8f6f70135e7079fce1a03a047a31f088519b1d834d3555fef4e67"} Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.565033 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4z8d2" event={"ID":"821ee7ec-9cd3-4402-b70d-1c06d52aeb22","Type":"ContainerDied","Data":"5541eaf56506c3f5d6d1d2eb83fc2e63f98af88e2b9771e22d69f20383788ecb"} Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.565041 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4z8d2" Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.565057 5094 scope.go:117] "RemoveContainer" containerID="2b10876c43b8f6f70135e7079fce1a03a047a31f088519b1d834d3555fef4e67" Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.581229 5094 scope.go:117] "RemoveContainer" containerID="e8182412f7a171ee5e0773b3af2b1e29c4e601ab3d3e8a7ac8a8c9fe09e483f1" Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.610159 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4z8d2"] Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.612118 5094 scope.go:117] "RemoveContainer" containerID="892bd5ba5079073acf61763945511a54b1e887f88b07c6a293f330e935e0f8e2" Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.623881 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4z8d2"] Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.629558 5094 scope.go:117] "RemoveContainer" containerID="2b10876c43b8f6f70135e7079fce1a03a047a31f088519b1d834d3555fef4e67" Feb 20 06:59:33 crc kubenswrapper[5094]: E0220 06:59:33.630171 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b10876c43b8f6f70135e7079fce1a03a047a31f088519b1d834d3555fef4e67\": container with ID starting with 2b10876c43b8f6f70135e7079fce1a03a047a31f088519b1d834d3555fef4e67 not found: ID does not exist" containerID="2b10876c43b8f6f70135e7079fce1a03a047a31f088519b1d834d3555fef4e67" Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.630224 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b10876c43b8f6f70135e7079fce1a03a047a31f088519b1d834d3555fef4e67"} err="failed to get container status \"2b10876c43b8f6f70135e7079fce1a03a047a31f088519b1d834d3555fef4e67\": rpc error: code = NotFound desc = could not find container 
\"2b10876c43b8f6f70135e7079fce1a03a047a31f088519b1d834d3555fef4e67\": container with ID starting with 2b10876c43b8f6f70135e7079fce1a03a047a31f088519b1d834d3555fef4e67 not found: ID does not exist" Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.630257 5094 scope.go:117] "RemoveContainer" containerID="e8182412f7a171ee5e0773b3af2b1e29c4e601ab3d3e8a7ac8a8c9fe09e483f1" Feb 20 06:59:33 crc kubenswrapper[5094]: E0220 06:59:33.630496 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8182412f7a171ee5e0773b3af2b1e29c4e601ab3d3e8a7ac8a8c9fe09e483f1\": container with ID starting with e8182412f7a171ee5e0773b3af2b1e29c4e601ab3d3e8a7ac8a8c9fe09e483f1 not found: ID does not exist" containerID="e8182412f7a171ee5e0773b3af2b1e29c4e601ab3d3e8a7ac8a8c9fe09e483f1" Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.630526 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8182412f7a171ee5e0773b3af2b1e29c4e601ab3d3e8a7ac8a8c9fe09e483f1"} err="failed to get container status \"e8182412f7a171ee5e0773b3af2b1e29c4e601ab3d3e8a7ac8a8c9fe09e483f1\": rpc error: code = NotFound desc = could not find container \"e8182412f7a171ee5e0773b3af2b1e29c4e601ab3d3e8a7ac8a8c9fe09e483f1\": container with ID starting with e8182412f7a171ee5e0773b3af2b1e29c4e601ab3d3e8a7ac8a8c9fe09e483f1 not found: ID does not exist" Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.630543 5094 scope.go:117] "RemoveContainer" containerID="892bd5ba5079073acf61763945511a54b1e887f88b07c6a293f330e935e0f8e2" Feb 20 06:59:33 crc kubenswrapper[5094]: E0220 06:59:33.631475 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"892bd5ba5079073acf61763945511a54b1e887f88b07c6a293f330e935e0f8e2\": container with ID starting with 892bd5ba5079073acf61763945511a54b1e887f88b07c6a293f330e935e0f8e2 not found: ID does not exist" 
containerID="892bd5ba5079073acf61763945511a54b1e887f88b07c6a293f330e935e0f8e2" Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.631627 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"892bd5ba5079073acf61763945511a54b1e887f88b07c6a293f330e935e0f8e2"} err="failed to get container status \"892bd5ba5079073acf61763945511a54b1e887f88b07c6a293f330e935e0f8e2\": rpc error: code = NotFound desc = could not find container \"892bd5ba5079073acf61763945511a54b1e887f88b07c6a293f330e935e0f8e2\": container with ID starting with 892bd5ba5079073acf61763945511a54b1e887f88b07c6a293f330e935e0f8e2 not found: ID does not exist" Feb 20 06:59:33 crc kubenswrapper[5094]: I0220 06:59:33.849186 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="821ee7ec-9cd3-4402-b70d-1c06d52aeb22" path="/var/lib/kubelet/pods/821ee7ec-9cd3-4402-b70d-1c06d52aeb22/volumes" Feb 20 06:59:34 crc kubenswrapper[5094]: I0220 06:59:34.576581 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4nkfk" event={"ID":"cea80939-e9e2-40e3-9a29-06cef37a5482","Type":"ContainerStarted","Data":"76d64a917fb426c24c5fd09ad03b0e50da2941358dfcf209ceb4a8d5797261f4"} Feb 20 06:59:35 crc kubenswrapper[5094]: I0220 06:59:35.590674 5094 generic.go:334] "Generic (PLEG): container finished" podID="cea80939-e9e2-40e3-9a29-06cef37a5482" containerID="76d64a917fb426c24c5fd09ad03b0e50da2941358dfcf209ceb4a8d5797261f4" exitCode=0 Feb 20 06:59:35 crc kubenswrapper[5094]: I0220 06:59:35.591828 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4nkfk" event={"ID":"cea80939-e9e2-40e3-9a29-06cef37a5482","Type":"ContainerDied","Data":"76d64a917fb426c24c5fd09ad03b0e50da2941358dfcf209ceb4a8d5797261f4"} Feb 20 06:59:36 crc kubenswrapper[5094]: I0220 06:59:36.602137 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4nkfk" 
event={"ID":"cea80939-e9e2-40e3-9a29-06cef37a5482","Type":"ContainerStarted","Data":"b112ad3d8a346efd44956e8ef786de7e633cac9e475e5e014c8ead99185fa2f2"} Feb 20 06:59:36 crc kubenswrapper[5094]: I0220 06:59:36.636860 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4nkfk" podStartSLOduration=2.185496107 podStartE2EDuration="4.636831395s" podCreationTimestamp="2026-02-20 06:59:32 +0000 UTC" firstStartedPulling="2026-02-20 06:59:33.563757353 +0000 UTC m=+788.436384054" lastFinishedPulling="2026-02-20 06:59:36.015092621 +0000 UTC m=+790.887719342" observedRunningTime="2026-02-20 06:59:36.633109667 +0000 UTC m=+791.505736418" watchObservedRunningTime="2026-02-20 06:59:36.636831395 +0000 UTC m=+791.509458116" Feb 20 06:59:42 crc kubenswrapper[5094]: I0220 06:59:42.799462 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4nkfk" Feb 20 06:59:42 crc kubenswrapper[5094]: I0220 06:59:42.800734 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4nkfk" Feb 20 06:59:43 crc kubenswrapper[5094]: I0220 06:59:43.885413 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4nkfk" podUID="cea80939-e9e2-40e3-9a29-06cef37a5482" containerName="registry-server" probeResult="failure" output=< Feb 20 06:59:43 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 06:59:43 crc kubenswrapper[5094]: > Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 06:59:49.792207 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4gmch"] Feb 20 06:59:49 crc kubenswrapper[5094]: E0220 06:59:49.793213 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="821ee7ec-9cd3-4402-b70d-1c06d52aeb22" containerName="extract-utilities" Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 
06:59:49.793242 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="821ee7ec-9cd3-4402-b70d-1c06d52aeb22" containerName="extract-utilities" Feb 20 06:59:49 crc kubenswrapper[5094]: E0220 06:59:49.793296 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="821ee7ec-9cd3-4402-b70d-1c06d52aeb22" containerName="registry-server" Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 06:59:49.793310 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="821ee7ec-9cd3-4402-b70d-1c06d52aeb22" containerName="registry-server" Feb 20 06:59:49 crc kubenswrapper[5094]: E0220 06:59:49.793340 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="821ee7ec-9cd3-4402-b70d-1c06d52aeb22" containerName="extract-content" Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 06:59:49.793355 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="821ee7ec-9cd3-4402-b70d-1c06d52aeb22" containerName="extract-content" Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 06:59:49.793602 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="821ee7ec-9cd3-4402-b70d-1c06d52aeb22" containerName="registry-server" Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 06:59:49.795139 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4gmch" Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 06:59:49.814391 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4gmch"] Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 06:59:49.849575 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89113f3f-3992-4b74-af7f-4d31f0322f24-catalog-content\") pod \"certified-operators-4gmch\" (UID: \"89113f3f-3992-4b74-af7f-4d31f0322f24\") " pod="openshift-marketplace/certified-operators-4gmch" Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 06:59:49.849784 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89113f3f-3992-4b74-af7f-4d31f0322f24-utilities\") pod \"certified-operators-4gmch\" (UID: \"89113f3f-3992-4b74-af7f-4d31f0322f24\") " pod="openshift-marketplace/certified-operators-4gmch" Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 06:59:49.849828 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnn7b\" (UniqueName: \"kubernetes.io/projected/89113f3f-3992-4b74-af7f-4d31f0322f24-kube-api-access-gnn7b\") pod \"certified-operators-4gmch\" (UID: \"89113f3f-3992-4b74-af7f-4d31f0322f24\") " pod="openshift-marketplace/certified-operators-4gmch" Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 06:59:49.951260 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89113f3f-3992-4b74-af7f-4d31f0322f24-catalog-content\") pod \"certified-operators-4gmch\" (UID: \"89113f3f-3992-4b74-af7f-4d31f0322f24\") " pod="openshift-marketplace/certified-operators-4gmch" Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 06:59:49.951582 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89113f3f-3992-4b74-af7f-4d31f0322f24-utilities\") pod \"certified-operators-4gmch\" (UID: \"89113f3f-3992-4b74-af7f-4d31f0322f24\") " pod="openshift-marketplace/certified-operators-4gmch" Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 06:59:49.951653 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnn7b\" (UniqueName: \"kubernetes.io/projected/89113f3f-3992-4b74-af7f-4d31f0322f24-kube-api-access-gnn7b\") pod \"certified-operators-4gmch\" (UID: \"89113f3f-3992-4b74-af7f-4d31f0322f24\") " pod="openshift-marketplace/certified-operators-4gmch" Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 06:59:49.952470 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89113f3f-3992-4b74-af7f-4d31f0322f24-catalog-content\") pod \"certified-operators-4gmch\" (UID: \"89113f3f-3992-4b74-af7f-4d31f0322f24\") " pod="openshift-marketplace/certified-operators-4gmch" Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 06:59:49.952974 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89113f3f-3992-4b74-af7f-4d31f0322f24-utilities\") pod \"certified-operators-4gmch\" (UID: \"89113f3f-3992-4b74-af7f-4d31f0322f24\") " pod="openshift-marketplace/certified-operators-4gmch" Feb 20 06:59:49 crc kubenswrapper[5094]: I0220 06:59:49.985977 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnn7b\" (UniqueName: \"kubernetes.io/projected/89113f3f-3992-4b74-af7f-4d31f0322f24-kube-api-access-gnn7b\") pod \"certified-operators-4gmch\" (UID: \"89113f3f-3992-4b74-af7f-4d31f0322f24\") " pod="openshift-marketplace/certified-operators-4gmch" Feb 20 06:59:50 crc kubenswrapper[5094]: I0220 06:59:50.135122 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4gmch" Feb 20 06:59:50 crc kubenswrapper[5094]: I0220 06:59:50.434456 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4gmch"] Feb 20 06:59:50 crc kubenswrapper[5094]: I0220 06:59:50.708390 5094 generic.go:334] "Generic (PLEG): container finished" podID="89113f3f-3992-4b74-af7f-4d31f0322f24" containerID="95943edcca5414670df824fe9a787cee8631e2e14f0043b7c6c71162088d8a5d" exitCode=0 Feb 20 06:59:50 crc kubenswrapper[5094]: I0220 06:59:50.708790 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gmch" event={"ID":"89113f3f-3992-4b74-af7f-4d31f0322f24","Type":"ContainerDied","Data":"95943edcca5414670df824fe9a787cee8631e2e14f0043b7c6c71162088d8a5d"} Feb 20 06:59:50 crc kubenswrapper[5094]: I0220 06:59:50.708917 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gmch" event={"ID":"89113f3f-3992-4b74-af7f-4d31f0322f24","Type":"ContainerStarted","Data":"1e6e6e94902ae5b12e4f706c4f350273359bd465dfb08de600352fdbf620c8db"} Feb 20 06:59:51 crc kubenswrapper[5094]: I0220 06:59:51.717186 5094 generic.go:334] "Generic (PLEG): container finished" podID="89113f3f-3992-4b74-af7f-4d31f0322f24" containerID="6b02775e20ebc894175db6b08d70c4d79abe2c30b9e0122c0e58798e158df89f" exitCode=0 Feb 20 06:59:51 crc kubenswrapper[5094]: I0220 06:59:51.717309 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gmch" event={"ID":"89113f3f-3992-4b74-af7f-4d31f0322f24","Type":"ContainerDied","Data":"6b02775e20ebc894175db6b08d70c4d79abe2c30b9e0122c0e58798e158df89f"} Feb 20 06:59:52 crc kubenswrapper[5094]: I0220 06:59:52.729640 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gmch" 
event={"ID":"89113f3f-3992-4b74-af7f-4d31f0322f24","Type":"ContainerStarted","Data":"613bfd350078e386d2d12c32df7b2dd6b2bef25710e430584dfa155d005c6577"} Feb 20 06:59:52 crc kubenswrapper[5094]: I0220 06:59:52.765112 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4gmch" podStartSLOduration=2.34297934 podStartE2EDuration="3.765077918s" podCreationTimestamp="2026-02-20 06:59:49 +0000 UTC" firstStartedPulling="2026-02-20 06:59:50.71018237 +0000 UTC m=+805.582809081" lastFinishedPulling="2026-02-20 06:59:52.132280918 +0000 UTC m=+807.004907659" observedRunningTime="2026-02-20 06:59:52.758189882 +0000 UTC m=+807.630816613" watchObservedRunningTime="2026-02-20 06:59:52.765077918 +0000 UTC m=+807.637704669" Feb 20 06:59:52 crc kubenswrapper[5094]: I0220 06:59:52.875469 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4nkfk" Feb 20 06:59:52 crc kubenswrapper[5094]: I0220 06:59:52.943028 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4nkfk" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.176573 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4nkfk"] Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.176996 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4nkfk" podUID="cea80939-e9e2-40e3-9a29-06cef37a5482" containerName="registry-server" containerID="cri-o://b112ad3d8a346efd44956e8ef786de7e633cac9e475e5e014c8ead99185fa2f2" gracePeriod=2 Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.638059 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4nkfk" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.752910 5094 generic.go:334] "Generic (PLEG): container finished" podID="cea80939-e9e2-40e3-9a29-06cef37a5482" containerID="b112ad3d8a346efd44956e8ef786de7e633cac9e475e5e014c8ead99185fa2f2" exitCode=0 Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.752979 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4nkfk" event={"ID":"cea80939-e9e2-40e3-9a29-06cef37a5482","Type":"ContainerDied","Data":"b112ad3d8a346efd44956e8ef786de7e633cac9e475e5e014c8ead99185fa2f2"} Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.753039 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4nkfk" event={"ID":"cea80939-e9e2-40e3-9a29-06cef37a5482","Type":"ContainerDied","Data":"5046403aa61e0aba40eb47656898bc09f17cf4dac06fc90a524dc06a79c5bd91"} Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.753038 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-4nkfk" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.753074 5094 scope.go:117] "RemoveContainer" containerID="b112ad3d8a346efd44956e8ef786de7e633cac9e475e5e014c8ead99185fa2f2" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.776571 5094 scope.go:117] "RemoveContainer" containerID="76d64a917fb426c24c5fd09ad03b0e50da2941358dfcf209ceb4a8d5797261f4" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.799549 5094 scope.go:117] "RemoveContainer" containerID="503e00c005f8a4aa0a101cf737ccd14fe94ec232147036287c455d933cd8c8df" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.821418 5094 scope.go:117] "RemoveContainer" containerID="b112ad3d8a346efd44956e8ef786de7e633cac9e475e5e014c8ead99185fa2f2" Feb 20 06:59:55 crc kubenswrapper[5094]: E0220 06:59:55.822319 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b112ad3d8a346efd44956e8ef786de7e633cac9e475e5e014c8ead99185fa2f2\": container with ID starting with b112ad3d8a346efd44956e8ef786de7e633cac9e475e5e014c8ead99185fa2f2 not found: ID does not exist" containerID="b112ad3d8a346efd44956e8ef786de7e633cac9e475e5e014c8ead99185fa2f2" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.822379 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b112ad3d8a346efd44956e8ef786de7e633cac9e475e5e014c8ead99185fa2f2"} err="failed to get container status \"b112ad3d8a346efd44956e8ef786de7e633cac9e475e5e014c8ead99185fa2f2\": rpc error: code = NotFound desc = could not find container \"b112ad3d8a346efd44956e8ef786de7e633cac9e475e5e014c8ead99185fa2f2\": container with ID starting with b112ad3d8a346efd44956e8ef786de7e633cac9e475e5e014c8ead99185fa2f2 not found: ID does not exist" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.822413 5094 scope.go:117] "RemoveContainer" 
containerID="76d64a917fb426c24c5fd09ad03b0e50da2941358dfcf209ceb4a8d5797261f4" Feb 20 06:59:55 crc kubenswrapper[5094]: E0220 06:59:55.823070 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76d64a917fb426c24c5fd09ad03b0e50da2941358dfcf209ceb4a8d5797261f4\": container with ID starting with 76d64a917fb426c24c5fd09ad03b0e50da2941358dfcf209ceb4a8d5797261f4 not found: ID does not exist" containerID="76d64a917fb426c24c5fd09ad03b0e50da2941358dfcf209ceb4a8d5797261f4" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.823108 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76d64a917fb426c24c5fd09ad03b0e50da2941358dfcf209ceb4a8d5797261f4"} err="failed to get container status \"76d64a917fb426c24c5fd09ad03b0e50da2941358dfcf209ceb4a8d5797261f4\": rpc error: code = NotFound desc = could not find container \"76d64a917fb426c24c5fd09ad03b0e50da2941358dfcf209ceb4a8d5797261f4\": container with ID starting with 76d64a917fb426c24c5fd09ad03b0e50da2941358dfcf209ceb4a8d5797261f4 not found: ID does not exist" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.823146 5094 scope.go:117] "RemoveContainer" containerID="503e00c005f8a4aa0a101cf737ccd14fe94ec232147036287c455d933cd8c8df" Feb 20 06:59:55 crc kubenswrapper[5094]: E0220 06:59:55.824181 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"503e00c005f8a4aa0a101cf737ccd14fe94ec232147036287c455d933cd8c8df\": container with ID starting with 503e00c005f8a4aa0a101cf737ccd14fe94ec232147036287c455d933cd8c8df not found: ID does not exist" containerID="503e00c005f8a4aa0a101cf737ccd14fe94ec232147036287c455d933cd8c8df" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.824241 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"503e00c005f8a4aa0a101cf737ccd14fe94ec232147036287c455d933cd8c8df"} err="failed to get container status \"503e00c005f8a4aa0a101cf737ccd14fe94ec232147036287c455d933cd8c8df\": rpc error: code = NotFound desc = could not find container \"503e00c005f8a4aa0a101cf737ccd14fe94ec232147036287c455d933cd8c8df\": container with ID starting with 503e00c005f8a4aa0a101cf737ccd14fe94ec232147036287c455d933cd8c8df not found: ID does not exist" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.837918 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cea80939-e9e2-40e3-9a29-06cef37a5482-utilities\") pod \"cea80939-e9e2-40e3-9a29-06cef37a5482\" (UID: \"cea80939-e9e2-40e3-9a29-06cef37a5482\") " Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.837982 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cea80939-e9e2-40e3-9a29-06cef37a5482-catalog-content\") pod \"cea80939-e9e2-40e3-9a29-06cef37a5482\" (UID: \"cea80939-e9e2-40e3-9a29-06cef37a5482\") " Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.838072 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgm4r\" (UniqueName: \"kubernetes.io/projected/cea80939-e9e2-40e3-9a29-06cef37a5482-kube-api-access-xgm4r\") pod \"cea80939-e9e2-40e3-9a29-06cef37a5482\" (UID: \"cea80939-e9e2-40e3-9a29-06cef37a5482\") " Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.840637 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cea80939-e9e2-40e3-9a29-06cef37a5482-utilities" (OuterVolumeSpecName: "utilities") pod "cea80939-e9e2-40e3-9a29-06cef37a5482" (UID: "cea80939-e9e2-40e3-9a29-06cef37a5482"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.846312 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cea80939-e9e2-40e3-9a29-06cef37a5482-kube-api-access-xgm4r" (OuterVolumeSpecName: "kube-api-access-xgm4r") pod "cea80939-e9e2-40e3-9a29-06cef37a5482" (UID: "cea80939-e9e2-40e3-9a29-06cef37a5482"). InnerVolumeSpecName "kube-api-access-xgm4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.941663 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cea80939-e9e2-40e3-9a29-06cef37a5482-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.942845 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgm4r\" (UniqueName: \"kubernetes.io/projected/cea80939-e9e2-40e3-9a29-06cef37a5482-kube-api-access-xgm4r\") on node \"crc\" DevicePath \"\"" Feb 20 06:59:55 crc kubenswrapper[5094]: I0220 06:59:55.961204 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cea80939-e9e2-40e3-9a29-06cef37a5482-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cea80939-e9e2-40e3-9a29-06cef37a5482" (UID: "cea80939-e9e2-40e3-9a29-06cef37a5482"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 06:59:56 crc kubenswrapper[5094]: I0220 06:59:56.044568 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cea80939-e9e2-40e3-9a29-06cef37a5482-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 06:59:56 crc kubenswrapper[5094]: I0220 06:59:56.090266 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4nkfk"] Feb 20 06:59:56 crc kubenswrapper[5094]: I0220 06:59:56.097365 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4nkfk"] Feb 20 06:59:57 crc kubenswrapper[5094]: I0220 06:59:57.850724 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cea80939-e9e2-40e3-9a29-06cef37a5482" path="/var/lib/kubelet/pods/cea80939-e9e2-40e3-9a29-06cef37a5482/volumes" Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.135638 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4gmch" Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.135934 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4gmch" Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.195815 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5"] Feb 20 07:00:00 crc kubenswrapper[5094]: E0220 07:00:00.196572 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea80939-e9e2-40e3-9a29-06cef37a5482" containerName="registry-server" Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.198032 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea80939-e9e2-40e3-9a29-06cef37a5482" containerName="registry-server" Feb 20 07:00:00 crc kubenswrapper[5094]: E0220 07:00:00.198221 5094 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="cea80939-e9e2-40e3-9a29-06cef37a5482" containerName="extract-content" Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.198335 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea80939-e9e2-40e3-9a29-06cef37a5482" containerName="extract-content" Feb 20 07:00:00 crc kubenswrapper[5094]: E0220 07:00:00.198452 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cea80939-e9e2-40e3-9a29-06cef37a5482" containerName="extract-utilities" Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.198557 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="cea80939-e9e2-40e3-9a29-06cef37a5482" containerName="extract-utilities" Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.198926 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="cea80939-e9e2-40e3-9a29-06cef37a5482" containerName="registry-server" Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.199795 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5" Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.203047 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.204078 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.210264 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5"] Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.224266 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4gmch" Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.311887 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-config-volume\") pod \"collect-profiles-29526180-74wt5\" (UID: \"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5" Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.312067 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49snn\" (UniqueName: \"kubernetes.io/projected/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-kube-api-access-49snn\") pod \"collect-profiles-29526180-74wt5\" (UID: \"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5" Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.312102 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-secret-volume\") pod \"collect-profiles-29526180-74wt5\" (UID: \"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5" Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.414068 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49snn\" (UniqueName: \"kubernetes.io/projected/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-kube-api-access-49snn\") pod \"collect-profiles-29526180-74wt5\" (UID: \"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5" Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.414143 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-secret-volume\") pod \"collect-profiles-29526180-74wt5\" (UID: 
\"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5" Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.414175 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-config-volume\") pod \"collect-profiles-29526180-74wt5\" (UID: \"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5" Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.415513 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-config-volume\") pod \"collect-profiles-29526180-74wt5\" (UID: \"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5" Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.422471 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-secret-volume\") pod \"collect-profiles-29526180-74wt5\" (UID: \"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5" Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.438490 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49snn\" (UniqueName: \"kubernetes.io/projected/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-kube-api-access-49snn\") pod \"collect-profiles-29526180-74wt5\" (UID: \"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5" Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.522210 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5" Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.769865 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5"] Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.788193 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5" event={"ID":"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d","Type":"ContainerStarted","Data":"b58487640e24a04a74b5b564afcd111dac042cd6580a942d0fd81dbfd354738f"} Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.853227 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4gmch" Feb 20 07:00:00 crc kubenswrapper[5094]: I0220 07:00:00.925740 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4gmch"] Feb 20 07:00:01 crc kubenswrapper[5094]: I0220 07:00:01.796879 5094 generic.go:334] "Generic (PLEG): container finished" podID="76c5d78f-89fe-4ed3-b64e-52de9f0dec4d" containerID="a3c7984448d7f3db690223dff864550548436ad39114dac24772a87d3288c8ea" exitCode=0 Feb 20 07:00:01 crc kubenswrapper[5094]: I0220 07:00:01.796966 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5" event={"ID":"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d","Type":"ContainerDied","Data":"a3c7984448d7f3db690223dff864550548436ad39114dac24772a87d3288c8ea"} Feb 20 07:00:02 crc kubenswrapper[5094]: I0220 07:00:02.806606 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4gmch" podUID="89113f3f-3992-4b74-af7f-4d31f0322f24" containerName="registry-server" containerID="cri-o://613bfd350078e386d2d12c32df7b2dd6b2bef25710e430584dfa155d005c6577" gracePeriod=2 Feb 20 07:00:03 crc 
kubenswrapper[5094]: I0220 07:00:03.179612 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5" Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.302229 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4gmch" Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.366768 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-secret-volume\") pod \"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d\" (UID: \"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d\") " Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.366941 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49snn\" (UniqueName: \"kubernetes.io/projected/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-kube-api-access-49snn\") pod \"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d\" (UID: \"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d\") " Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.367186 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-config-volume\") pod \"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d\" (UID: \"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d\") " Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.368229 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-config-volume" (OuterVolumeSpecName: "config-volume") pod "76c5d78f-89fe-4ed3-b64e-52de9f0dec4d" (UID: "76c5d78f-89fe-4ed3-b64e-52de9f0dec4d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.373171 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "76c5d78f-89fe-4ed3-b64e-52de9f0dec4d" (UID: "76c5d78f-89fe-4ed3-b64e-52de9f0dec4d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.374970 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-kube-api-access-49snn" (OuterVolumeSpecName: "kube-api-access-49snn") pod "76c5d78f-89fe-4ed3-b64e-52de9f0dec4d" (UID: "76c5d78f-89fe-4ed3-b64e-52de9f0dec4d"). InnerVolumeSpecName "kube-api-access-49snn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.468768 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89113f3f-3992-4b74-af7f-4d31f0322f24-catalog-content\") pod \"89113f3f-3992-4b74-af7f-4d31f0322f24\" (UID: \"89113f3f-3992-4b74-af7f-4d31f0322f24\") " Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.468968 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnn7b\" (UniqueName: \"kubernetes.io/projected/89113f3f-3992-4b74-af7f-4d31f0322f24-kube-api-access-gnn7b\") pod \"89113f3f-3992-4b74-af7f-4d31f0322f24\" (UID: \"89113f3f-3992-4b74-af7f-4d31f0322f24\") " Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.469033 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89113f3f-3992-4b74-af7f-4d31f0322f24-utilities\") pod \"89113f3f-3992-4b74-af7f-4d31f0322f24\" (UID: 
\"89113f3f-3992-4b74-af7f-4d31f0322f24\") " Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.469310 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49snn\" (UniqueName: \"kubernetes.io/projected/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-kube-api-access-49snn\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.469330 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.469343 5094 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.470155 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89113f3f-3992-4b74-af7f-4d31f0322f24-utilities" (OuterVolumeSpecName: "utilities") pod "89113f3f-3992-4b74-af7f-4d31f0322f24" (UID: "89113f3f-3992-4b74-af7f-4d31f0322f24"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.473064 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89113f3f-3992-4b74-af7f-4d31f0322f24-kube-api-access-gnn7b" (OuterVolumeSpecName: "kube-api-access-gnn7b") pod "89113f3f-3992-4b74-af7f-4d31f0322f24" (UID: "89113f3f-3992-4b74-af7f-4d31f0322f24"). InnerVolumeSpecName "kube-api-access-gnn7b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.548378 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89113f3f-3992-4b74-af7f-4d31f0322f24-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "89113f3f-3992-4b74-af7f-4d31f0322f24" (UID: "89113f3f-3992-4b74-af7f-4d31f0322f24"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.586531 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89113f3f-3992-4b74-af7f-4d31f0322f24-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.586650 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnn7b\" (UniqueName: \"kubernetes.io/projected/89113f3f-3992-4b74-af7f-4d31f0322f24-kube-api-access-gnn7b\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.586684 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89113f3f-3992-4b74-af7f-4d31f0322f24-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.814963 5094 generic.go:334] "Generic (PLEG): container finished" podID="89113f3f-3992-4b74-af7f-4d31f0322f24" containerID="613bfd350078e386d2d12c32df7b2dd6b2bef25710e430584dfa155d005c6577" exitCode=0 Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.815056 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4gmch" Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.815069 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gmch" event={"ID":"89113f3f-3992-4b74-af7f-4d31f0322f24","Type":"ContainerDied","Data":"613bfd350078e386d2d12c32df7b2dd6b2bef25710e430584dfa155d005c6577"} Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.815514 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4gmch" event={"ID":"89113f3f-3992-4b74-af7f-4d31f0322f24","Type":"ContainerDied","Data":"1e6e6e94902ae5b12e4f706c4f350273359bd465dfb08de600352fdbf620c8db"} Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.815538 5094 scope.go:117] "RemoveContainer" containerID="613bfd350078e386d2d12c32df7b2dd6b2bef25710e430584dfa155d005c6577" Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.817237 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5" event={"ID":"76c5d78f-89fe-4ed3-b64e-52de9f0dec4d","Type":"ContainerDied","Data":"b58487640e24a04a74b5b564afcd111dac042cd6580a942d0fd81dbfd354738f"} Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.817264 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b58487640e24a04a74b5b564afcd111dac042cd6580a942d0fd81dbfd354738f" Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.817320 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5" Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.838566 5094 scope.go:117] "RemoveContainer" containerID="6b02775e20ebc894175db6b08d70c4d79abe2c30b9e0122c0e58798e158df89f" Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.858639 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4gmch"] Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.862293 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4gmch"] Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.871639 5094 scope.go:117] "RemoveContainer" containerID="95943edcca5414670df824fe9a787cee8631e2e14f0043b7c6c71162088d8a5d" Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.888105 5094 scope.go:117] "RemoveContainer" containerID="613bfd350078e386d2d12c32df7b2dd6b2bef25710e430584dfa155d005c6577" Feb 20 07:00:03 crc kubenswrapper[5094]: E0220 07:00:03.888643 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"613bfd350078e386d2d12c32df7b2dd6b2bef25710e430584dfa155d005c6577\": container with ID starting with 613bfd350078e386d2d12c32df7b2dd6b2bef25710e430584dfa155d005c6577 not found: ID does not exist" containerID="613bfd350078e386d2d12c32df7b2dd6b2bef25710e430584dfa155d005c6577" Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.888746 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"613bfd350078e386d2d12c32df7b2dd6b2bef25710e430584dfa155d005c6577"} err="failed to get container status \"613bfd350078e386d2d12c32df7b2dd6b2bef25710e430584dfa155d005c6577\": rpc error: code = NotFound desc = could not find container \"613bfd350078e386d2d12c32df7b2dd6b2bef25710e430584dfa155d005c6577\": container with ID starting with 
613bfd350078e386d2d12c32df7b2dd6b2bef25710e430584dfa155d005c6577 not found: ID does not exist" Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.888793 5094 scope.go:117] "RemoveContainer" containerID="6b02775e20ebc894175db6b08d70c4d79abe2c30b9e0122c0e58798e158df89f" Feb 20 07:00:03 crc kubenswrapper[5094]: E0220 07:00:03.889239 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b02775e20ebc894175db6b08d70c4d79abe2c30b9e0122c0e58798e158df89f\": container with ID starting with 6b02775e20ebc894175db6b08d70c4d79abe2c30b9e0122c0e58798e158df89f not found: ID does not exist" containerID="6b02775e20ebc894175db6b08d70c4d79abe2c30b9e0122c0e58798e158df89f" Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.889311 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b02775e20ebc894175db6b08d70c4d79abe2c30b9e0122c0e58798e158df89f"} err="failed to get container status \"6b02775e20ebc894175db6b08d70c4d79abe2c30b9e0122c0e58798e158df89f\": rpc error: code = NotFound desc = could not find container \"6b02775e20ebc894175db6b08d70c4d79abe2c30b9e0122c0e58798e158df89f\": container with ID starting with 6b02775e20ebc894175db6b08d70c4d79abe2c30b9e0122c0e58798e158df89f not found: ID does not exist" Feb 20 07:00:03 crc kubenswrapper[5094]: I0220 07:00:03.889370 5094 scope.go:117] "RemoveContainer" containerID="95943edcca5414670df824fe9a787cee8631e2e14f0043b7c6c71162088d8a5d" Feb 20 07:00:03 crc kubenswrapper[5094]: E0220 07:00:03.889860 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95943edcca5414670df824fe9a787cee8631e2e14f0043b7c6c71162088d8a5d\": container with ID starting with 95943edcca5414670df824fe9a787cee8631e2e14f0043b7c6c71162088d8a5d not found: ID does not exist" containerID="95943edcca5414670df824fe9a787cee8631e2e14f0043b7c6c71162088d8a5d" Feb 20 07:00:03 crc 
kubenswrapper[5094]: I0220 07:00:03.889935 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95943edcca5414670df824fe9a787cee8631e2e14f0043b7c6c71162088d8a5d"} err="failed to get container status \"95943edcca5414670df824fe9a787cee8631e2e14f0043b7c6c71162088d8a5d\": rpc error: code = NotFound desc = could not find container \"95943edcca5414670df824fe9a787cee8631e2e14f0043b7c6c71162088d8a5d\": container with ID starting with 95943edcca5414670df824fe9a787cee8631e2e14f0043b7c6c71162088d8a5d not found: ID does not exist" Feb 20 07:00:05 crc kubenswrapper[5094]: I0220 07:00:05.847694 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89113f3f-3992-4b74-af7f-4d31f0322f24" path="/var/lib/kubelet/pods/89113f3f-3992-4b74-af7f-4d31f0322f24/volumes" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.450021 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-29bjc"] Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.451529 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovn-controller" containerID="cri-o://9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60" gracePeriod=30 Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.451613 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf" gracePeriod=30 Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.451835 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="northd" 
containerID="cri-o://bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e" gracePeriod=30 Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.451897 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovn-acl-logging" containerID="cri-o://2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba" gracePeriod=30 Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.451819 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="nbdb" containerID="cri-o://835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9" gracePeriod=30 Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.451936 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="kube-rbac-proxy-node" containerID="cri-o://f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e" gracePeriod=30 Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.452100 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="sbdb" containerID="cri-o://7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77" gracePeriod=30 Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.558141 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" containerID="cri-o://1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9" gracePeriod=30 Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.841248 5094 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/3.log" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.844337 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovn-acl-logging/0.log" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.844853 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovn-controller/0.log" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.845496 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.885991 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovnkube-script-lib\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886050 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-env-overrides\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886059 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovnkube-controller/3.log" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886087 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovnkube-config\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886117 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-systemd\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886145 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-cni-bin\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886204 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-node-log\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886249 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-var-lib-openvswitch\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886277 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-run-netns\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886314 
5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-slash\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886341 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886375 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-etc-openvswitch\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886406 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-log-socket\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886436 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swnw6\" (UniqueName: \"kubernetes.io/projected/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-kube-api-access-swnw6\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886478 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-ovn\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886506 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-run-ovn-kubernetes\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886539 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-systemd-units\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886582 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-kubelet\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886605 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-openvswitch\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886635 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-cni-netd\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 
07:00:10.886667 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovn-node-metrics-cert\") pod \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\" (UID: \"d1c36de3-d36b-48ed-9d4d-3aa52d72add0\") " Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.886882 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.887395 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.887440 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.887818 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.887899 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-log-socket" (OuterVolumeSpecName: "log-socket") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.887966 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.888805 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-node-log" (OuterVolumeSpecName: "node-log") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.888871 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.888909 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.888941 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-slash" (OuterVolumeSpecName: "host-slash") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.888972 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.889399 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.889452 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.889464 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.889490 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.889526 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.889585 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.894309 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovn-acl-logging/0.log" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.895184 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-29bjc_d1c36de3-d36b-48ed-9d4d-3aa52d72add0/ovn-controller/0.log" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896127 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896578 5094 generic.go:334] "Generic (PLEG): container finished" podID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerID="1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9" exitCode=0 Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896617 5094 generic.go:334] "Generic (PLEG): container finished" podID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerID="7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77" exitCode=0 Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896626 5094 generic.go:334] "Generic (PLEG): container finished" podID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerID="835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9" exitCode=0 Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896634 5094 generic.go:334] "Generic (PLEG): container finished" podID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerID="bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e" exitCode=0 Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896644 5094 generic.go:334] "Generic (PLEG): container finished" podID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerID="192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf" exitCode=0 Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896652 5094 generic.go:334] "Generic (PLEG): container finished" podID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerID="f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e" exitCode=0 Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896660 5094 generic.go:334] "Generic (PLEG): container finished" podID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerID="2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba" exitCode=143 Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896672 5094 generic.go:334] "Generic (PLEG): container finished" 
podID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerID="9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60" exitCode=143 Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896766 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerDied","Data":"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896791 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerDied","Data":"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896806 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerDied","Data":"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896818 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerDied","Data":"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896828 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerDied","Data":"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896840 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" 
event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerDied","Data":"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896856 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896868 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896874 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896880 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896885 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896890 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896896 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896901 5094 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896906 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896912 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerDied","Data":"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896920 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896927 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896933 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896938 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896943 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e"} Feb 20 07:00:10 crc kubenswrapper[5094]: 
I0220 07:00:10.896949 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896954 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896960 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896966 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896971 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896979 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerDied","Data":"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896987 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.896994 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897000 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897006 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897014 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897020 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897026 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897032 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897038 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897272 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897282 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" event={"ID":"d1c36de3-d36b-48ed-9d4d-3aa52d72add0","Type":"ContainerDied","Data":"7a9fcbdbce2beb3a7269a8c12a8179221e63193036cba570355ff8c9f0adb656"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897293 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897302 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897308 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897314 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897319 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897325 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897330 5094 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897335 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897340 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897346 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897258 5094 scope.go:117] "RemoveContainer" containerID="1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.897243 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-29bjc" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.900285 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-kube-api-access-swnw6" (OuterVolumeSpecName: "kube-api-access-swnw6") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "kube-api-access-swnw6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.910670 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr8rz_c3900f6d-3035-4fc4-80a2-9e79154f4f5e/kube-multus/2.log" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.912306 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr8rz_c3900f6d-3035-4fc4-80a2-9e79154f4f5e/kube-multus/1.log" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.912812 5094 generic.go:334] "Generic (PLEG): container finished" podID="c3900f6d-3035-4fc4-80a2-9e79154f4f5e" containerID="0e16f340af41f0cc3bffd1d98c9695dc8ad9491384da855c5637478b18c6f793" exitCode=2 Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.912977 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zr8rz" event={"ID":"c3900f6d-3035-4fc4-80a2-9e79154f4f5e","Type":"ContainerDied","Data":"0e16f340af41f0cc3bffd1d98c9695dc8ad9491384da855c5637478b18c6f793"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.913166 5094 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aba2be0e4774b30500be76b546e3ffff5c136a2e26675822931a400ca3090e79"} Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.913984 5094 scope.go:117] "RemoveContainer" containerID="0e16f340af41f0cc3bffd1d98c9695dc8ad9491384da855c5637478b18c6f793" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.918775 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mfxc9"] Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919122 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="kube-rbac-proxy-ovn-metrics" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919144 5094 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="kube-rbac-proxy-ovn-metrics" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919165 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919179 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919193 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="sbdb" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919206 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="sbdb" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919219 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovn-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919229 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovn-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919244 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="northd" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919254 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="northd" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919268 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="nbdb" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919279 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" 
containerName="nbdb" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919293 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89113f3f-3992-4b74-af7f-4d31f0322f24" containerName="extract-utilities" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919303 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="89113f3f-3992-4b74-af7f-4d31f0322f24" containerName="extract-utilities" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919315 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919326 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919338 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919348 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919362 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76c5d78f-89fe-4ed3-b64e-52de9f0dec4d" containerName="collect-profiles" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919371 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="76c5d78f-89fe-4ed3-b64e-52de9f0dec4d" containerName="collect-profiles" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919387 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovn-acl-logging" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919397 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" 
containerName="ovn-acl-logging" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919411 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89113f3f-3992-4b74-af7f-4d31f0322f24" containerName="registry-server" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919462 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="89113f3f-3992-4b74-af7f-4d31f0322f24" containerName="registry-server" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919482 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="kube-rbac-proxy-node" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919493 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="kube-rbac-proxy-node" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919511 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919521 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919558 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="kubecfg-setup" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919572 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="kubecfg-setup" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.919588 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89113f3f-3992-4b74-af7f-4d31f0322f24" containerName="extract-content" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919598 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="89113f3f-3992-4b74-af7f-4d31f0322f24" 
containerName="extract-content" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919781 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919801 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="76c5d78f-89fe-4ed3-b64e-52de9f0dec4d" containerName="collect-profiles" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919818 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919830 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="sbdb" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919847 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="kube-rbac-proxy-node" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919858 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="kube-rbac-proxy-ovn-metrics" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919875 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919889 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovn-acl-logging" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919904 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="nbdb" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919918 5094 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="northd" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919930 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovn-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.919940 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="89113f3f-3992-4b74-af7f-4d31f0322f24" containerName="registry-server" Feb 20 07:00:10 crc kubenswrapper[5094]: E0220 07:00:10.920100 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.920114 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.920262 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.920553 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" containerName="ovnkube-controller" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.926454 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.929181 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "d1c36de3-d36b-48ed-9d4d-3aa52d72add0" (UID: "d1c36de3-d36b-48ed-9d4d-3aa52d72add0"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.930651 5094 scope.go:117] "RemoveContainer" containerID="0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.971633 5094 scope.go:117] "RemoveContainer" containerID="7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.989788 5094 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.989841 5094 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.989856 5094 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.989933 5094 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.989948 5094 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.989962 5094 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.989976 5094 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.989989 5094 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.990001 5094 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.990013 5094 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.990025 5094 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.990039 5094 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-node-log\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.990051 5094 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc 
kubenswrapper[5094]: I0220 07:00:10.990066 5094 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.990079 5094 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-slash\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.990096 5094 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.990114 5094 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.990130 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swnw6\" (UniqueName: \"kubernetes.io/projected/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-kube-api-access-swnw6\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.990146 5094 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-log-socket\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:10 crc kubenswrapper[5094]: I0220 07:00:10.990159 5094 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d1c36de3-d36b-48ed-9d4d-3aa52d72add0-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.011548 5094 scope.go:117] 
"RemoveContainer" containerID="835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.045301 5094 scope.go:117] "RemoveContainer" containerID="bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.067273 5094 scope.go:117] "RemoveContainer" containerID="192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.085404 5094 scope.go:117] "RemoveContainer" containerID="f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.091512 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9e4a996a-7aca-4bec-b29f-084adfb08333-ovn-node-metrics-cert\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.091562 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-node-log\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.091596 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-run-netns\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.091627 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9e4a996a-7aca-4bec-b29f-084adfb08333-env-overrides\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.091663 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-kubelet\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.092543 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9e4a996a-7aca-4bec-b29f-084adfb08333-ovnkube-script-lib\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.092632 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-slash\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.092785 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-etc-openvswitch\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.092878 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-cni-bin\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.092940 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-run-openvswitch\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.092991 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bw5z\" (UniqueName: \"kubernetes.io/projected/9e4a996a-7aca-4bec-b29f-084adfb08333-kube-api-access-4bw5z\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.093068 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-run-ovn-kubernetes\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.093210 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-var-lib-openvswitch\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.093273 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-log-socket\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.093366 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.093467 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-cni-netd\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.093526 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-run-ovn\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.093565 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9e4a996a-7aca-4bec-b29f-084adfb08333-ovnkube-config\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc 
kubenswrapper[5094]: I0220 07:00:11.093591 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-run-systemd\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.093622 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-systemd-units\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.109952 5094 scope.go:117] "RemoveContainer" containerID="2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.130284 5094 scope.go:117] "RemoveContainer" containerID="9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.149916 5094 scope.go:117] "RemoveContainer" containerID="218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.167792 5094 scope.go:117] "RemoveContainer" containerID="1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9" Feb 20 07:00:11 crc kubenswrapper[5094]: E0220 07:00:11.168314 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9\": container with ID starting with 1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9 not found: ID does not exist" containerID="1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9" Feb 20 07:00:11 crc kubenswrapper[5094]: 
I0220 07:00:11.168353 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9"} err="failed to get container status \"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9\": rpc error: code = NotFound desc = could not find container \"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9\": container with ID starting with 1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.168387 5094 scope.go:117] "RemoveContainer" containerID="0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd" Feb 20 07:00:11 crc kubenswrapper[5094]: E0220 07:00:11.168729 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\": container with ID starting with 0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd not found: ID does not exist" containerID="0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.168758 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd"} err="failed to get container status \"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\": rpc error: code = NotFound desc = could not find container \"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\": container with ID starting with 0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.168777 5094 scope.go:117] "RemoveContainer" containerID="7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77" Feb 20 07:00:11 crc 
kubenswrapper[5094]: E0220 07:00:11.169374 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\": container with ID starting with 7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77 not found: ID does not exist" containerID="7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.169402 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77"} err="failed to get container status \"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\": rpc error: code = NotFound desc = could not find container \"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\": container with ID starting with 7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.169420 5094 scope.go:117] "RemoveContainer" containerID="835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9" Feb 20 07:00:11 crc kubenswrapper[5094]: E0220 07:00:11.169691 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\": container with ID starting with 835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9 not found: ID does not exist" containerID="835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.169736 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9"} err="failed to get container status 
\"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\": rpc error: code = NotFound desc = could not find container \"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\": container with ID starting with 835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.169753 5094 scope.go:117] "RemoveContainer" containerID="bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e" Feb 20 07:00:11 crc kubenswrapper[5094]: E0220 07:00:11.170139 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\": container with ID starting with bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e not found: ID does not exist" containerID="bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.170166 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e"} err="failed to get container status \"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\": rpc error: code = NotFound desc = could not find container \"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\": container with ID starting with bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.170188 5094 scope.go:117] "RemoveContainer" containerID="192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf" Feb 20 07:00:11 crc kubenswrapper[5094]: E0220 07:00:11.170599 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\": container with ID starting with 192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf not found: ID does not exist" containerID="192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.170696 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf"} err="failed to get container status \"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\": rpc error: code = NotFound desc = could not find container \"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\": container with ID starting with 192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.170748 5094 scope.go:117] "RemoveContainer" containerID="f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e" Feb 20 07:00:11 crc kubenswrapper[5094]: E0220 07:00:11.171222 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\": container with ID starting with f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e not found: ID does not exist" containerID="f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.171258 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e"} err="failed to get container status \"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\": rpc error: code = NotFound desc = could not find container \"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\": container with ID 
starting with f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.171278 5094 scope.go:117] "RemoveContainer" containerID="2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba" Feb 20 07:00:11 crc kubenswrapper[5094]: E0220 07:00:11.171526 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\": container with ID starting with 2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba not found: ID does not exist" containerID="2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.171556 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba"} err="failed to get container status \"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\": rpc error: code = NotFound desc = could not find container \"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\": container with ID starting with 2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.171582 5094 scope.go:117] "RemoveContainer" containerID="9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60" Feb 20 07:00:11 crc kubenswrapper[5094]: E0220 07:00:11.171925 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\": container with ID starting with 9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60 not found: ID does not exist" containerID="9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60" Feb 20 
07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.171964 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60"} err="failed to get container status \"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\": rpc error: code = NotFound desc = could not find container \"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\": container with ID starting with 9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.171984 5094 scope.go:117] "RemoveContainer" containerID="218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10" Feb 20 07:00:11 crc kubenswrapper[5094]: E0220 07:00:11.172229 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\": container with ID starting with 218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10 not found: ID does not exist" containerID="218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.172258 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10"} err="failed to get container status \"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\": rpc error: code = NotFound desc = could not find container \"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\": container with ID starting with 218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.172277 5094 scope.go:117] "RemoveContainer" 
containerID="1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.172766 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9"} err="failed to get container status \"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9\": rpc error: code = NotFound desc = could not find container \"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9\": container with ID starting with 1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.172797 5094 scope.go:117] "RemoveContainer" containerID="0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.173110 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd"} err="failed to get container status \"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\": rpc error: code = NotFound desc = could not find container \"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\": container with ID starting with 0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.173139 5094 scope.go:117] "RemoveContainer" containerID="7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.173378 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77"} err="failed to get container status \"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\": rpc error: code = NotFound desc = could 
not find container \"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\": container with ID starting with 7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.173405 5094 scope.go:117] "RemoveContainer" containerID="835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.173828 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9"} err="failed to get container status \"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\": rpc error: code = NotFound desc = could not find container \"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\": container with ID starting with 835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.173857 5094 scope.go:117] "RemoveContainer" containerID="bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.174227 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e"} err="failed to get container status \"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\": rpc error: code = NotFound desc = could not find container \"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\": container with ID starting with bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.174254 5094 scope.go:117] "RemoveContainer" containerID="192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 
07:00:11.174494 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf"} err="failed to get container status \"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\": rpc error: code = NotFound desc = could not find container \"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\": container with ID starting with 192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.174521 5094 scope.go:117] "RemoveContainer" containerID="f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.174826 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e"} err="failed to get container status \"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\": rpc error: code = NotFound desc = could not find container \"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\": container with ID starting with f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.174854 5094 scope.go:117] "RemoveContainer" containerID="2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.175286 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba"} err="failed to get container status \"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\": rpc error: code = NotFound desc = could not find container \"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\": container with ID starting with 
2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.175317 5094 scope.go:117] "RemoveContainer" containerID="9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.175542 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60"} err="failed to get container status \"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\": rpc error: code = NotFound desc = could not find container \"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\": container with ID starting with 9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.175568 5094 scope.go:117] "RemoveContainer" containerID="218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.175970 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10"} err="failed to get container status \"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\": rpc error: code = NotFound desc = could not find container \"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\": container with ID starting with 218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.175997 5094 scope.go:117] "RemoveContainer" containerID="1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.176324 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9"} err="failed to get container status \"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9\": rpc error: code = NotFound desc = could not find container \"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9\": container with ID starting with 1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.176354 5094 scope.go:117] "RemoveContainer" containerID="0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.176757 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd"} err="failed to get container status \"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\": rpc error: code = NotFound desc = could not find container \"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\": container with ID starting with 0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.176821 5094 scope.go:117] "RemoveContainer" containerID="7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.177087 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77"} err="failed to get container status \"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\": rpc error: code = NotFound desc = could not find container \"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\": container with ID starting with 7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77 not found: ID does not 
exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.177116 5094 scope.go:117] "RemoveContainer" containerID="835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.177522 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9"} err="failed to get container status \"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\": rpc error: code = NotFound desc = could not find container \"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\": container with ID starting with 835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.177554 5094 scope.go:117] "RemoveContainer" containerID="bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.177867 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e"} err="failed to get container status \"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\": rpc error: code = NotFound desc = could not find container \"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\": container with ID starting with bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.177896 5094 scope.go:117] "RemoveContainer" containerID="192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.178141 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf"} err="failed to get container status 
\"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\": rpc error: code = NotFound desc = could not find container \"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\": container with ID starting with 192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.178167 5094 scope.go:117] "RemoveContainer" containerID="f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.178420 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e"} err="failed to get container status \"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\": rpc error: code = NotFound desc = could not find container \"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\": container with ID starting with f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.178462 5094 scope.go:117] "RemoveContainer" containerID="2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.179008 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba"} err="failed to get container status \"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\": rpc error: code = NotFound desc = could not find container \"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\": container with ID starting with 2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.179039 5094 scope.go:117] "RemoveContainer" 
containerID="9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.179303 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60"} err="failed to get container status \"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\": rpc error: code = NotFound desc = could not find container \"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\": container with ID starting with 9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.179335 5094 scope.go:117] "RemoveContainer" containerID="218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.179540 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10"} err="failed to get container status \"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\": rpc error: code = NotFound desc = could not find container \"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\": container with ID starting with 218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.179568 5094 scope.go:117] "RemoveContainer" containerID="1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.180044 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9"} err="failed to get container status \"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9\": rpc error: code = NotFound desc = could 
not find container \"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9\": container with ID starting with 1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.180081 5094 scope.go:117] "RemoveContainer" containerID="0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.180326 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd"} err="failed to get container status \"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\": rpc error: code = NotFound desc = could not find container \"0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd\": container with ID starting with 0a2c84591d200653f1ce45c8755b620dd97525c73d4af666921fe4d7c70478bd not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.180352 5094 scope.go:117] "RemoveContainer" containerID="7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.180560 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77"} err="failed to get container status \"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\": rpc error: code = NotFound desc = could not find container \"7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77\": container with ID starting with 7d93b7470b26d1e304e30af82f614c216d4b0040027226319a44ab5283ef4e77 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.180580 5094 scope.go:117] "RemoveContainer" containerID="835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 
07:00:11.181037 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9"} err="failed to get container status \"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\": rpc error: code = NotFound desc = could not find container \"835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9\": container with ID starting with 835bf60842bcf3cfa3f052972bb31d3a659215c274b5bab550af83e30abb91c9 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.181066 5094 scope.go:117] "RemoveContainer" containerID="bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.181308 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e"} err="failed to get container status \"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\": rpc error: code = NotFound desc = could not find container \"bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e\": container with ID starting with bede2168b9a96fd1fe4a560cef1ffd40f288423a3556403719aaa1741b5add3e not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.181334 5094 scope.go:117] "RemoveContainer" containerID="192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.181741 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf"} err="failed to get container status \"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\": rpc error: code = NotFound desc = could not find container \"192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf\": container with ID starting with 
192880cc734f78249b25c875cd8ab012e6b495ff013ebff4a3e1e9b0f141a3cf not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.181767 5094 scope.go:117] "RemoveContainer" containerID="f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.184749 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e"} err="failed to get container status \"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\": rpc error: code = NotFound desc = could not find container \"f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e\": container with ID starting with f2365e8808c67921e7ce3d6f6fbf7c060b538166b09521381967613b1641135e not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.184785 5094 scope.go:117] "RemoveContainer" containerID="2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.185135 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba"} err="failed to get container status \"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\": rpc error: code = NotFound desc = could not find container \"2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba\": container with ID starting with 2f1062cf973e180b8ba93f007cdba548a129ae8ac5063e39a2389356990daeba not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.185202 5094 scope.go:117] "RemoveContainer" containerID="9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.185659 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60"} err="failed to get container status \"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\": rpc error: code = NotFound desc = could not find container \"9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60\": container with ID starting with 9b52eb573beb3cd0b4cc8aa25fabafe2281472befde76787449ee864357f6f60 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.185691 5094 scope.go:117] "RemoveContainer" containerID="218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.186061 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10"} err="failed to get container status \"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\": rpc error: code = NotFound desc = could not find container \"218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10\": container with ID starting with 218407d57c8deb793c89a17289e4e7305d96201902d79ba9ef57422630316f10 not found: ID does not exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.186092 5094 scope.go:117] "RemoveContainer" containerID="1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.186606 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9"} err="failed to get container status \"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9\": rpc error: code = NotFound desc = could not find container \"1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9\": container with ID starting with 1916d532152118c180ffed064be63aaf1a10f77ed4c81df8ac1021612b74bbd9 not found: ID does not 
exist" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194228 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-run-netns\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194287 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9e4a996a-7aca-4bec-b29f-084adfb08333-env-overrides\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194323 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-kubelet\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194373 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9e4a996a-7aca-4bec-b29f-084adfb08333-ovnkube-script-lib\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194395 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-slash\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194421 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-etc-openvswitch\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194443 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-cni-bin\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194469 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-run-openvswitch\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194494 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bw5z\" (UniqueName: \"kubernetes.io/projected/9e4a996a-7aca-4bec-b29f-084adfb08333-kube-api-access-4bw5z\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194517 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-run-ovn-kubernetes\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194556 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-var-lib-openvswitch\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194582 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-log-socket\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194612 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194644 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-cni-netd\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194670 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-run-ovn\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194690 5094 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9e4a996a-7aca-4bec-b29f-084adfb08333-ovnkube-config\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194740 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-run-systemd\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194765 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-systemd-units\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194789 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9e4a996a-7aca-4bec-b29f-084adfb08333-ovn-node-metrics-cert\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194811 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-node-log\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194928 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-node-log\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.194980 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-run-netns\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.195373 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-kubelet\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.195748 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9e4a996a-7aca-4bec-b29f-084adfb08333-env-overrides\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.195803 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-log-socket\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.195834 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-slash\") pod \"ovnkube-node-mfxc9\" (UID: 
\"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.195860 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-etc-openvswitch\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.195888 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-cni-bin\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.195916 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-run-openvswitch\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.196193 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9e4a996a-7aca-4bec-b29f-084adfb08333-ovnkube-script-lib\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.196249 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.196286 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-cni-netd\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.196320 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-run-ovn\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.196329 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-var-lib-openvswitch\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.196943 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9e4a996a-7aca-4bec-b29f-084adfb08333-ovnkube-config\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.196998 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-systemd-units\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.197033 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-run-systemd\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.198413 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e4a996a-7aca-4bec-b29f-084adfb08333-host-run-ovn-kubernetes\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.203406 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9e4a996a-7aca-4bec-b29f-084adfb08333-ovn-node-metrics-cert\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.214550 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bw5z\" (UniqueName: \"kubernetes.io/projected/9e4a996a-7aca-4bec-b29f-084adfb08333-kube-api-access-4bw5z\") pod \"ovnkube-node-mfxc9\" (UID: \"9e4a996a-7aca-4bec-b29f-084adfb08333\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.240218 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-29bjc"] Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.245088 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-29bjc"] Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.256672 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:11 crc kubenswrapper[5094]: W0220 07:00:11.288334 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e4a996a_7aca_4bec_b29f_084adfb08333.slice/crio-37906bdec6775dcd5daa77687930c110bd1dcd9a4dee4df0b7ddf1b22b11d361 WatchSource:0}: Error finding container 37906bdec6775dcd5daa77687930c110bd1dcd9a4dee4df0b7ddf1b22b11d361: Status 404 returned error can't find the container with id 37906bdec6775dcd5daa77687930c110bd1dcd9a4dee4df0b7ddf1b22b11d361 Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.851090 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1c36de3-d36b-48ed-9d4d-3aa52d72add0" path="/var/lib/kubelet/pods/d1c36de3-d36b-48ed-9d4d-3aa52d72add0/volumes" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.927310 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr8rz_c3900f6d-3035-4fc4-80a2-9e79154f4f5e/kube-multus/2.log" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.928369 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr8rz_c3900f6d-3035-4fc4-80a2-9e79154f4f5e/kube-multus/1.log" Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.928488 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zr8rz" event={"ID":"c3900f6d-3035-4fc4-80a2-9e79154f4f5e","Type":"ContainerStarted","Data":"1d279e83ea8ce219bda95b62bfc1a070141835ac04e351b972e1f436008c1683"} Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.932471 5094 generic.go:334] "Generic (PLEG): container finished" podID="9e4a996a-7aca-4bec-b29f-084adfb08333" containerID="f77448e24adb1d4db64bba7460ba8b08e7bf9afb7b7b9e95a62bbc45afd95581" exitCode=0 Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.932524 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" event={"ID":"9e4a996a-7aca-4bec-b29f-084adfb08333","Type":"ContainerDied","Data":"f77448e24adb1d4db64bba7460ba8b08e7bf9afb7b7b9e95a62bbc45afd95581"} Feb 20 07:00:11 crc kubenswrapper[5094]: I0220 07:00:11.932545 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" event={"ID":"9e4a996a-7aca-4bec-b29f-084adfb08333","Type":"ContainerStarted","Data":"37906bdec6775dcd5daa77687930c110bd1dcd9a4dee4df0b7ddf1b22b11d361"} Feb 20 07:00:12 crc kubenswrapper[5094]: I0220 07:00:12.944651 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" event={"ID":"9e4a996a-7aca-4bec-b29f-084adfb08333","Type":"ContainerStarted","Data":"04922c8d1356f0213893ad03eb2eb98102ac071346518cf74a777806c4766952"} Feb 20 07:00:12 crc kubenswrapper[5094]: I0220 07:00:12.945389 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" event={"ID":"9e4a996a-7aca-4bec-b29f-084adfb08333","Type":"ContainerStarted","Data":"8f8e6cc3d30ef9904866577122b5f3fb523207ceea1a1103df1fb9c7f084d947"} Feb 20 07:00:12 crc kubenswrapper[5094]: I0220 07:00:12.945406 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" event={"ID":"9e4a996a-7aca-4bec-b29f-084adfb08333","Type":"ContainerStarted","Data":"91b200f21b2028e01f125b98f346d32989f273ef639b1a40fa04e342f6361bc1"} Feb 20 07:00:12 crc kubenswrapper[5094]: I0220 07:00:12.945419 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" event={"ID":"9e4a996a-7aca-4bec-b29f-084adfb08333","Type":"ContainerStarted","Data":"e65dcc683e28168ee2a8e8e100fff4170469a417a2153f8103028dbe2708a161"} Feb 20 07:00:12 crc kubenswrapper[5094]: I0220 07:00:12.945432 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" 
event={"ID":"9e4a996a-7aca-4bec-b29f-084adfb08333","Type":"ContainerStarted","Data":"88c024bcff9b37a832bdfebb468494f719fe1848779e758af6cda6462f947610"} Feb 20 07:00:13 crc kubenswrapper[5094]: I0220 07:00:13.960627 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" event={"ID":"9e4a996a-7aca-4bec-b29f-084adfb08333","Type":"ContainerStarted","Data":"c6acd2b6f417927a757a926676c9d09cddd0ab6c3955612a17294de6e0e22c20"} Feb 20 07:00:15 crc kubenswrapper[5094]: I0220 07:00:15.982507 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" event={"ID":"9e4a996a-7aca-4bec-b29f-084adfb08333","Type":"ContainerStarted","Data":"26c55ff0bec4de361a2c42c2f62fd56bb505ae7e61509c1c8b001cec3dd7c816"} Feb 20 07:00:18 crc kubenswrapper[5094]: I0220 07:00:18.004129 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" event={"ID":"9e4a996a-7aca-4bec-b29f-084adfb08333","Type":"ContainerStarted","Data":"339f90966a3a5116c39b9597ba0abbb3d3664f9fb05a6b1ef881eb423112fec4"} Feb 20 07:00:18 crc kubenswrapper[5094]: I0220 07:00:18.004558 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:18 crc kubenswrapper[5094]: I0220 07:00:18.004587 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:18 crc kubenswrapper[5094]: I0220 07:00:18.004597 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:18 crc kubenswrapper[5094]: I0220 07:00:18.042427 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" podStartSLOduration=8.042406697 podStartE2EDuration="8.042406697s" podCreationTimestamp="2026-02-20 07:00:10 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:00:18.041374312 +0000 UTC m=+832.914001063" watchObservedRunningTime="2026-02-20 07:00:18.042406697 +0000 UTC m=+832.915033418" Feb 20 07:00:18 crc kubenswrapper[5094]: I0220 07:00:18.050932 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:18 crc kubenswrapper[5094]: I0220 07:00:18.053139 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:19 crc kubenswrapper[5094]: I0220 07:00:19.996806 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-48gmd"] Feb 20 07:00:19 crc kubenswrapper[5094]: I0220 07:00:19.998471 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:20 crc kubenswrapper[5094]: I0220 07:00:20.001063 5094 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-c5nt4" Feb 20 07:00:20 crc kubenswrapper[5094]: I0220 07:00:20.003391 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Feb 20 07:00:20 crc kubenswrapper[5094]: I0220 07:00:20.003399 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Feb 20 07:00:20 crc kubenswrapper[5094]: I0220 07:00:20.003864 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Feb 20 07:00:20 crc kubenswrapper[5094]: I0220 07:00:20.010969 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-48gmd"] Feb 20 07:00:20 crc kubenswrapper[5094]: I0220 07:00:20.039130 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: 
\"kubernetes.io/host-path/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-node-mnt\") pod \"crc-storage-crc-48gmd\" (UID: \"9d8b4842-acdc-4e60-9de5-b7b6dde61b62\") " pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:20 crc kubenswrapper[5094]: I0220 07:00:20.039499 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlmfk\" (UniqueName: \"kubernetes.io/projected/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-kube-api-access-wlmfk\") pod \"crc-storage-crc-48gmd\" (UID: \"9d8b4842-acdc-4e60-9de5-b7b6dde61b62\") " pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:20 crc kubenswrapper[5094]: I0220 07:00:20.039800 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-crc-storage\") pod \"crc-storage-crc-48gmd\" (UID: \"9d8b4842-acdc-4e60-9de5-b7b6dde61b62\") " pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:20 crc kubenswrapper[5094]: I0220 07:00:20.140924 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-node-mnt\") pod \"crc-storage-crc-48gmd\" (UID: \"9d8b4842-acdc-4e60-9de5-b7b6dde61b62\") " pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:20 crc kubenswrapper[5094]: I0220 07:00:20.141423 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlmfk\" (UniqueName: \"kubernetes.io/projected/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-kube-api-access-wlmfk\") pod \"crc-storage-crc-48gmd\" (UID: \"9d8b4842-acdc-4e60-9de5-b7b6dde61b62\") " pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:20 crc kubenswrapper[5094]: I0220 07:00:20.141474 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: 
\"kubernetes.io/configmap/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-crc-storage\") pod \"crc-storage-crc-48gmd\" (UID: \"9d8b4842-acdc-4e60-9de5-b7b6dde61b62\") " pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:20 crc kubenswrapper[5094]: I0220 07:00:20.141666 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-node-mnt\") pod \"crc-storage-crc-48gmd\" (UID: \"9d8b4842-acdc-4e60-9de5-b7b6dde61b62\") " pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:20 crc kubenswrapper[5094]: I0220 07:00:20.142508 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-crc-storage\") pod \"crc-storage-crc-48gmd\" (UID: \"9d8b4842-acdc-4e60-9de5-b7b6dde61b62\") " pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:20 crc kubenswrapper[5094]: I0220 07:00:20.173407 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlmfk\" (UniqueName: \"kubernetes.io/projected/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-kube-api-access-wlmfk\") pod \"crc-storage-crc-48gmd\" (UID: \"9d8b4842-acdc-4e60-9de5-b7b6dde61b62\") " pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:20 crc kubenswrapper[5094]: I0220 07:00:20.328282 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:20 crc kubenswrapper[5094]: E0220 07:00:20.369288 5094 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-48gmd_crc-storage_9d8b4842-acdc-4e60-9de5-b7b6dde61b62_0(e239c521a0661f805585d75a5caeaeeed17de22e6929841ac5ef805c84e4d0e9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 20 07:00:20 crc kubenswrapper[5094]: E0220 07:00:20.369377 5094 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-48gmd_crc-storage_9d8b4842-acdc-4e60-9de5-b7b6dde61b62_0(e239c521a0661f805585d75a5caeaeeed17de22e6929841ac5ef805c84e4d0e9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:20 crc kubenswrapper[5094]: E0220 07:00:20.369405 5094 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-48gmd_crc-storage_9d8b4842-acdc-4e60-9de5-b7b6dde61b62_0(e239c521a0661f805585d75a5caeaeeed17de22e6929841ac5ef805c84e4d0e9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:20 crc kubenswrapper[5094]: E0220 07:00:20.369464 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-48gmd_crc-storage(9d8b4842-acdc-4e60-9de5-b7b6dde61b62)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-48gmd_crc-storage(9d8b4842-acdc-4e60-9de5-b7b6dde61b62)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-48gmd_crc-storage_9d8b4842-acdc-4e60-9de5-b7b6dde61b62_0(e239c521a0661f805585d75a5caeaeeed17de22e6929841ac5ef805c84e4d0e9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-48gmd" podUID="9d8b4842-acdc-4e60-9de5-b7b6dde61b62" Feb 20 07:00:21 crc kubenswrapper[5094]: I0220 07:00:21.024030 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:21 crc kubenswrapper[5094]: I0220 07:00:21.024871 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:21 crc kubenswrapper[5094]: E0220 07:00:21.067817 5094 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-48gmd_crc-storage_9d8b4842-acdc-4e60-9de5-b7b6dde61b62_0(5765095bf77cfaa9ecc1ed25ec88f0d1ce4f72351ae9ad105fcc1602ff307f5e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 20 07:00:21 crc kubenswrapper[5094]: E0220 07:00:21.067924 5094 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-48gmd_crc-storage_9d8b4842-acdc-4e60-9de5-b7b6dde61b62_0(5765095bf77cfaa9ecc1ed25ec88f0d1ce4f72351ae9ad105fcc1602ff307f5e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:21 crc kubenswrapper[5094]: E0220 07:00:21.067967 5094 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-48gmd_crc-storage_9d8b4842-acdc-4e60-9de5-b7b6dde61b62_0(5765095bf77cfaa9ecc1ed25ec88f0d1ce4f72351ae9ad105fcc1602ff307f5e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:21 crc kubenswrapper[5094]: E0220 07:00:21.068053 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-48gmd_crc-storage(9d8b4842-acdc-4e60-9de5-b7b6dde61b62)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-48gmd_crc-storage(9d8b4842-acdc-4e60-9de5-b7b6dde61b62)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-48gmd_crc-storage_9d8b4842-acdc-4e60-9de5-b7b6dde61b62_0(5765095bf77cfaa9ecc1ed25ec88f0d1ce4f72351ae9ad105fcc1602ff307f5e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-48gmd" podUID="9d8b4842-acdc-4e60-9de5-b7b6dde61b62" Feb 20 07:00:26 crc kubenswrapper[5094]: I0220 07:00:26.182525 5094 scope.go:117] "RemoveContainer" containerID="aba2be0e4774b30500be76b546e3ffff5c136a2e26675822931a400ca3090e79" Feb 20 07:00:27 crc kubenswrapper[5094]: I0220 07:00:27.079093 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zr8rz_c3900f6d-3035-4fc4-80a2-9e79154f4f5e/kube-multus/2.log" Feb 20 07:00:35 crc kubenswrapper[5094]: I0220 07:00:35.844290 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:35 crc kubenswrapper[5094]: I0220 07:00:35.847165 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:36 crc kubenswrapper[5094]: I0220 07:00:36.140986 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-48gmd"] Feb 20 07:00:37 crc kubenswrapper[5094]: I0220 07:00:37.156269 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-48gmd" event={"ID":"9d8b4842-acdc-4e60-9de5-b7b6dde61b62","Type":"ContainerStarted","Data":"667a553c1273f6b25d4e22a567d26c154e982736d7b65b462fc6b3be733c0965"} Feb 20 07:00:38 crc kubenswrapper[5094]: I0220 07:00:38.168183 5094 generic.go:334] "Generic (PLEG): container finished" podID="9d8b4842-acdc-4e60-9de5-b7b6dde61b62" containerID="07fcab491ccca10a02c6e686a0115bd8c0916121144d5fd12b7356bb88847cbf" exitCode=0 Feb 20 07:00:38 crc kubenswrapper[5094]: I0220 07:00:38.168282 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-48gmd" event={"ID":"9d8b4842-acdc-4e60-9de5-b7b6dde61b62","Type":"ContainerDied","Data":"07fcab491ccca10a02c6e686a0115bd8c0916121144d5fd12b7356bb88847cbf"} Feb 20 07:00:39 crc kubenswrapper[5094]: I0220 07:00:39.460518 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:39 crc kubenswrapper[5094]: I0220 07:00:39.475010 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlmfk\" (UniqueName: \"kubernetes.io/projected/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-kube-api-access-wlmfk\") pod \"9d8b4842-acdc-4e60-9de5-b7b6dde61b62\" (UID: \"9d8b4842-acdc-4e60-9de5-b7b6dde61b62\") " Feb 20 07:00:39 crc kubenswrapper[5094]: I0220 07:00:39.475077 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-crc-storage\") pod \"9d8b4842-acdc-4e60-9de5-b7b6dde61b62\" (UID: \"9d8b4842-acdc-4e60-9de5-b7b6dde61b62\") " Feb 20 07:00:39 crc kubenswrapper[5094]: I0220 07:00:39.475097 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-node-mnt\") pod \"9d8b4842-acdc-4e60-9de5-b7b6dde61b62\" (UID: \"9d8b4842-acdc-4e60-9de5-b7b6dde61b62\") " Feb 20 07:00:39 crc kubenswrapper[5094]: I0220 07:00:39.475298 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "9d8b4842-acdc-4e60-9de5-b7b6dde61b62" (UID: "9d8b4842-acdc-4e60-9de5-b7b6dde61b62"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:00:39 crc kubenswrapper[5094]: I0220 07:00:39.486505 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-kube-api-access-wlmfk" (OuterVolumeSpecName: "kube-api-access-wlmfk") pod "9d8b4842-acdc-4e60-9de5-b7b6dde61b62" (UID: "9d8b4842-acdc-4e60-9de5-b7b6dde61b62"). InnerVolumeSpecName "kube-api-access-wlmfk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:00:39 crc kubenswrapper[5094]: I0220 07:00:39.493168 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "9d8b4842-acdc-4e60-9de5-b7b6dde61b62" (UID: "9d8b4842-acdc-4e60-9de5-b7b6dde61b62"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:00:39 crc kubenswrapper[5094]: I0220 07:00:39.577003 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlmfk\" (UniqueName: \"kubernetes.io/projected/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-kube-api-access-wlmfk\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:39 crc kubenswrapper[5094]: I0220 07:00:39.577065 5094 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-crc-storage\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:39 crc kubenswrapper[5094]: I0220 07:00:39.577086 5094 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/9d8b4842-acdc-4e60-9de5-b7b6dde61b62-node-mnt\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:40 crc kubenswrapper[5094]: I0220 07:00:40.183844 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-48gmd" event={"ID":"9d8b4842-acdc-4e60-9de5-b7b6dde61b62","Type":"ContainerDied","Data":"667a553c1273f6b25d4e22a567d26c154e982736d7b65b462fc6b3be733c0965"} Feb 20 07:00:40 crc kubenswrapper[5094]: I0220 07:00:40.183978 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="667a553c1273f6b25d4e22a567d26c154e982736d7b65b462fc6b3be733c0965" Feb 20 07:00:40 crc kubenswrapper[5094]: I0220 07:00:40.184025 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-48gmd" Feb 20 07:00:41 crc kubenswrapper[5094]: I0220 07:00:41.296807 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mfxc9" Feb 20 07:00:47 crc kubenswrapper[5094]: I0220 07:00:47.729844 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr"] Feb 20 07:00:47 crc kubenswrapper[5094]: E0220 07:00:47.730473 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d8b4842-acdc-4e60-9de5-b7b6dde61b62" containerName="storage" Feb 20 07:00:47 crc kubenswrapper[5094]: I0220 07:00:47.730491 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d8b4842-acdc-4e60-9de5-b7b6dde61b62" containerName="storage" Feb 20 07:00:47 crc kubenswrapper[5094]: I0220 07:00:47.730613 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d8b4842-acdc-4e60-9de5-b7b6dde61b62" containerName="storage" Feb 20 07:00:47 crc kubenswrapper[5094]: I0220 07:00:47.731630 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" Feb 20 07:00:47 crc kubenswrapper[5094]: I0220 07:00:47.735121 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 20 07:00:47 crc kubenswrapper[5094]: I0220 07:00:47.744080 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr"] Feb 20 07:00:47 crc kubenswrapper[5094]: I0220 07:00:47.902180 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr\" (UID: \"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" Feb 20 07:00:47 crc kubenswrapper[5094]: I0220 07:00:47.903108 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr\" (UID: \"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" Feb 20 07:00:47 crc kubenswrapper[5094]: I0220 07:00:47.904268 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppxmn\" (UniqueName: \"kubernetes.io/projected/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-kube-api-access-ppxmn\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr\" (UID: \"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" Feb 20 07:00:48 crc kubenswrapper[5094]: 
I0220 07:00:48.005786 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppxmn\" (UniqueName: \"kubernetes.io/projected/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-kube-api-access-ppxmn\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr\" (UID: \"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" Feb 20 07:00:48 crc kubenswrapper[5094]: I0220 07:00:48.006357 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr\" (UID: \"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" Feb 20 07:00:48 crc kubenswrapper[5094]: I0220 07:00:48.006433 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr\" (UID: \"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" Feb 20 07:00:48 crc kubenswrapper[5094]: I0220 07:00:48.007199 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr\" (UID: \"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" Feb 20 07:00:48 crc kubenswrapper[5094]: I0220 07:00:48.007472 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr\" (UID: \"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" Feb 20 07:00:48 crc kubenswrapper[5094]: I0220 07:00:48.038654 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppxmn\" (UniqueName: \"kubernetes.io/projected/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-kube-api-access-ppxmn\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr\" (UID: \"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" Feb 20 07:00:48 crc kubenswrapper[5094]: I0220 07:00:48.049650 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" Feb 20 07:00:48 crc kubenswrapper[5094]: I0220 07:00:48.307196 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr"] Feb 20 07:00:49 crc kubenswrapper[5094]: I0220 07:00:49.266918 5094 generic.go:334] "Generic (PLEG): container finished" podID="b191dcbc-7e3b-4a36-a913-1fe0f53c83d5" containerID="f8c8b136d680dadbbed473d708411afa42e083a811c03a3a1b033f0d8631afea" exitCode=0 Feb 20 07:00:49 crc kubenswrapper[5094]: I0220 07:00:49.266994 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" event={"ID":"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5","Type":"ContainerDied","Data":"f8c8b136d680dadbbed473d708411afa42e083a811c03a3a1b033f0d8631afea"} Feb 20 07:00:49 crc kubenswrapper[5094]: I0220 07:00:49.267384 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" event={"ID":"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5","Type":"ContainerStarted","Data":"adc4324be402413f71eed86574b9f5335193dda4aaef65f248bc678eef38fdbf"} Feb 20 07:00:51 crc kubenswrapper[5094]: I0220 07:00:51.284228 5094 generic.go:334] "Generic (PLEG): container finished" podID="b191dcbc-7e3b-4a36-a913-1fe0f53c83d5" containerID="e747dcf9fca7b80536c627106751461820e39ec33525c96f1c60053606c68ad6" exitCode=0 Feb 20 07:00:51 crc kubenswrapper[5094]: I0220 07:00:51.284989 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" event={"ID":"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5","Type":"ContainerDied","Data":"e747dcf9fca7b80536c627106751461820e39ec33525c96f1c60053606c68ad6"} Feb 20 07:00:52 crc kubenswrapper[5094]: I0220 07:00:52.297917 5094 generic.go:334] "Generic (PLEG): container finished" podID="b191dcbc-7e3b-4a36-a913-1fe0f53c83d5" containerID="97f353e9e01dbc2588d65d9823ad2fa367bc88377f879c7e8bb262dd866c71e7" exitCode=0 Feb 20 07:00:52 crc kubenswrapper[5094]: I0220 07:00:52.298015 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" event={"ID":"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5","Type":"ContainerDied","Data":"97f353e9e01dbc2588d65d9823ad2fa367bc88377f879c7e8bb262dd866c71e7"} Feb 20 07:00:53 crc kubenswrapper[5094]: I0220 07:00:53.617749 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" Feb 20 07:00:53 crc kubenswrapper[5094]: I0220 07:00:53.814992 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-util\") pod \"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5\" (UID: \"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5\") " Feb 20 07:00:53 crc kubenswrapper[5094]: I0220 07:00:53.815375 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppxmn\" (UniqueName: \"kubernetes.io/projected/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-kube-api-access-ppxmn\") pod \"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5\" (UID: \"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5\") " Feb 20 07:00:53 crc kubenswrapper[5094]: I0220 07:00:53.815439 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-bundle\") pod \"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5\" (UID: \"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5\") " Feb 20 07:00:53 crc kubenswrapper[5094]: I0220 07:00:53.817300 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-bundle" (OuterVolumeSpecName: "bundle") pod "b191dcbc-7e3b-4a36-a913-1fe0f53c83d5" (UID: "b191dcbc-7e3b-4a36-a913-1fe0f53c83d5"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:00:53 crc kubenswrapper[5094]: I0220 07:00:53.827235 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-kube-api-access-ppxmn" (OuterVolumeSpecName: "kube-api-access-ppxmn") pod "b191dcbc-7e3b-4a36-a913-1fe0f53c83d5" (UID: "b191dcbc-7e3b-4a36-a913-1fe0f53c83d5"). InnerVolumeSpecName "kube-api-access-ppxmn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:00:53 crc kubenswrapper[5094]: I0220 07:00:53.917312 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppxmn\" (UniqueName: \"kubernetes.io/projected/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-kube-api-access-ppxmn\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:53 crc kubenswrapper[5094]: I0220 07:00:53.917423 5094 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:54 crc kubenswrapper[5094]: I0220 07:00:54.041432 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-util" (OuterVolumeSpecName: "util") pod "b191dcbc-7e3b-4a36-a913-1fe0f53c83d5" (UID: "b191dcbc-7e3b-4a36-a913-1fe0f53c83d5"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:00:54 crc kubenswrapper[5094]: I0220 07:00:54.120641 5094 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b191dcbc-7e3b-4a36-a913-1fe0f53c83d5-util\") on node \"crc\" DevicePath \"\"" Feb 20 07:00:54 crc kubenswrapper[5094]: I0220 07:00:54.316916 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" event={"ID":"b191dcbc-7e3b-4a36-a913-1fe0f53c83d5","Type":"ContainerDied","Data":"adc4324be402413f71eed86574b9f5335193dda4aaef65f248bc678eef38fdbf"} Feb 20 07:00:54 crc kubenswrapper[5094]: I0220 07:00:54.316993 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adc4324be402413f71eed86574b9f5335193dda4aaef65f248bc678eef38fdbf" Feb 20 07:00:54 crc kubenswrapper[5094]: I0220 07:00:54.317013 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr" Feb 20 07:00:59 crc kubenswrapper[5094]: I0220 07:00:59.330383 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-qg9ms"] Feb 20 07:00:59 crc kubenswrapper[5094]: E0220 07:00:59.331071 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b191dcbc-7e3b-4a36-a913-1fe0f53c83d5" containerName="extract" Feb 20 07:00:59 crc kubenswrapper[5094]: I0220 07:00:59.331088 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b191dcbc-7e3b-4a36-a913-1fe0f53c83d5" containerName="extract" Feb 20 07:00:59 crc kubenswrapper[5094]: E0220 07:00:59.331107 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b191dcbc-7e3b-4a36-a913-1fe0f53c83d5" containerName="util" Feb 20 07:00:59 crc kubenswrapper[5094]: I0220 07:00:59.331115 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b191dcbc-7e3b-4a36-a913-1fe0f53c83d5" containerName="util" Feb 20 07:00:59 crc kubenswrapper[5094]: E0220 07:00:59.331132 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b191dcbc-7e3b-4a36-a913-1fe0f53c83d5" containerName="pull" Feb 20 07:00:59 crc kubenswrapper[5094]: I0220 07:00:59.331141 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b191dcbc-7e3b-4a36-a913-1fe0f53c83d5" containerName="pull" Feb 20 07:00:59 crc kubenswrapper[5094]: I0220 07:00:59.331257 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b191dcbc-7e3b-4a36-a913-1fe0f53c83d5" containerName="extract" Feb 20 07:00:59 crc kubenswrapper[5094]: I0220 07:00:59.331759 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-qg9ms" Feb 20 07:00:59 crc kubenswrapper[5094]: I0220 07:00:59.335686 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 20 07:00:59 crc kubenswrapper[5094]: I0220 07:00:59.337427 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 20 07:00:59 crc kubenswrapper[5094]: I0220 07:00:59.337625 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-5hz2d" Feb 20 07:00:59 crc kubenswrapper[5094]: I0220 07:00:59.358375 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-qg9ms"] Feb 20 07:00:59 crc kubenswrapper[5094]: I0220 07:00:59.506043 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqzkm\" (UniqueName: \"kubernetes.io/projected/6804a7c3-a0d7-46d4-b317-e9c54265841e-kube-api-access-dqzkm\") pod \"nmstate-operator-694c9596b7-qg9ms\" (UID: \"6804a7c3-a0d7-46d4-b317-e9c54265841e\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-qg9ms" Feb 20 07:00:59 crc kubenswrapper[5094]: I0220 07:00:59.607953 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqzkm\" (UniqueName: \"kubernetes.io/projected/6804a7c3-a0d7-46d4-b317-e9c54265841e-kube-api-access-dqzkm\") pod \"nmstate-operator-694c9596b7-qg9ms\" (UID: \"6804a7c3-a0d7-46d4-b317-e9c54265841e\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-qg9ms" Feb 20 07:00:59 crc kubenswrapper[5094]: I0220 07:00:59.643203 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqzkm\" (UniqueName: \"kubernetes.io/projected/6804a7c3-a0d7-46d4-b317-e9c54265841e-kube-api-access-dqzkm\") pod \"nmstate-operator-694c9596b7-qg9ms\" (UID: 
\"6804a7c3-a0d7-46d4-b317-e9c54265841e\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-qg9ms" Feb 20 07:00:59 crc kubenswrapper[5094]: I0220 07:00:59.700226 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-qg9ms" Feb 20 07:00:59 crc kubenswrapper[5094]: I0220 07:00:59.933332 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-qg9ms"] Feb 20 07:01:00 crc kubenswrapper[5094]: I0220 07:01:00.363548 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-qg9ms" event={"ID":"6804a7c3-a0d7-46d4-b317-e9c54265841e","Type":"ContainerStarted","Data":"eab1a11034ff39ef5fda44571267c1e6c0ac93b83ad9c282a61ab3324839e8e3"} Feb 20 07:01:02 crc kubenswrapper[5094]: I0220 07:01:02.376840 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-qg9ms" event={"ID":"6804a7c3-a0d7-46d4-b317-e9c54265841e","Type":"ContainerStarted","Data":"740b9e396fff92319abe6b0800e4bcd00c92c4eeb0297b46d6d41e0d1337ea99"} Feb 20 07:01:02 crc kubenswrapper[5094]: I0220 07:01:02.400633 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-qg9ms" podStartSLOduration=1.310232204 podStartE2EDuration="3.400612127s" podCreationTimestamp="2026-02-20 07:00:59 +0000 UTC" firstStartedPulling="2026-02-20 07:00:59.97056232 +0000 UTC m=+874.843189061" lastFinishedPulling="2026-02-20 07:01:02.060942273 +0000 UTC m=+876.933568984" observedRunningTime="2026-02-20 07:01:02.396162401 +0000 UTC m=+877.268789112" watchObservedRunningTime="2026-02-20 07:01:02.400612127 +0000 UTC m=+877.273238838" Feb 20 07:01:04 crc kubenswrapper[5094]: I0220 07:01:04.108041 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:01:04 crc kubenswrapper[5094]: I0220 07:01:04.108178 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:01:07 crc kubenswrapper[5094]: I0220 07:01:07.817115 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-gvgqm"] Feb 20 07:01:07 crc kubenswrapper[5094]: I0220 07:01:07.818522 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-gvgqm" Feb 20 07:01:07 crc kubenswrapper[5094]: I0220 07:01:07.821490 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-s8wfg" Feb 20 07:01:07 crc kubenswrapper[5094]: I0220 07:01:07.846592 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg"] Feb 20 07:01:07 crc kubenswrapper[5094]: I0220 07:01:07.847417 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg" Feb 20 07:01:07 crc kubenswrapper[5094]: I0220 07:01:07.848332 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-gvgqm"] Feb 20 07:01:07 crc kubenswrapper[5094]: I0220 07:01:07.849771 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 20 07:01:07 crc kubenswrapper[5094]: I0220 07:01:07.859455 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg"] Feb 20 07:01:07 crc kubenswrapper[5094]: I0220 07:01:07.877645 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-jr284"] Feb 20 07:01:07 crc kubenswrapper[5094]: I0220 07:01:07.879658 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-jr284" Feb 20 07:01:07 crc kubenswrapper[5094]: I0220 07:01:07.926426 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnzp8\" (UniqueName: \"kubernetes.io/projected/93238aee-86f0-497a-8880-531338e8245f-kube-api-access-tnzp8\") pod \"nmstate-metrics-58c85c668d-gvgqm\" (UID: \"93238aee-86f0-497a-8880-531338e8245f\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-gvgqm" Feb 20 07:01:07 crc kubenswrapper[5094]: I0220 07:01:07.982339 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm"] Feb 20 07:01:07 crc kubenswrapper[5094]: I0220 07:01:07.988924 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm" Feb 20 07:01:07 crc kubenswrapper[5094]: I0220 07:01:07.991595 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-q8vqp" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.002021 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.006006 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.014612 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm"] Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.031856 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/edd001fc-3ddc-4010-8a98-54f4ffeaba72-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-74czm\" (UID: \"edd001fc-3ddc-4010-8a98-54f4ffeaba72\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.031937 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/55b1a421-7ec5-4442-b4c5-11767715cc4b-ovs-socket\") pod \"nmstate-handler-jr284\" (UID: \"55b1a421-7ec5-4442-b4c5-11767715cc4b\") " pod="openshift-nmstate/nmstate-handler-jr284" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.031962 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf4mx\" (UniqueName: \"kubernetes.io/projected/df45fab4-d183-4702-b5b6-2a4e559eff22-kube-api-access-kf4mx\") pod \"nmstate-webhook-866bcb46dc-frvdg\" (UID: \"df45fab4-d183-4702-b5b6-2a4e559eff22\") " 
pod="openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.031981 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp2f5\" (UniqueName: \"kubernetes.io/projected/edd001fc-3ddc-4010-8a98-54f4ffeaba72-kube-api-access-dp2f5\") pod \"nmstate-console-plugin-5c78fc5d65-74czm\" (UID: \"edd001fc-3ddc-4010-8a98-54f4ffeaba72\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.032002 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/df45fab4-d183-4702-b5b6-2a4e559eff22-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-frvdg\" (UID: \"df45fab4-d183-4702-b5b6-2a4e559eff22\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.032020 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/55b1a421-7ec5-4442-b4c5-11767715cc4b-dbus-socket\") pod \"nmstate-handler-jr284\" (UID: \"55b1a421-7ec5-4442-b4c5-11767715cc4b\") " pod="openshift-nmstate/nmstate-handler-jr284" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.032047 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnzp8\" (UniqueName: \"kubernetes.io/projected/93238aee-86f0-497a-8880-531338e8245f-kube-api-access-tnzp8\") pod \"nmstate-metrics-58c85c668d-gvgqm\" (UID: \"93238aee-86f0-497a-8880-531338e8245f\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-gvgqm" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.032065 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/edd001fc-3ddc-4010-8a98-54f4ffeaba72-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-74czm\" (UID: \"edd001fc-3ddc-4010-8a98-54f4ffeaba72\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.032087 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffcc6\" (UniqueName: \"kubernetes.io/projected/55b1a421-7ec5-4442-b4c5-11767715cc4b-kube-api-access-ffcc6\") pod \"nmstate-handler-jr284\" (UID: \"55b1a421-7ec5-4442-b4c5-11767715cc4b\") " pod="openshift-nmstate/nmstate-handler-jr284" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.032108 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/55b1a421-7ec5-4442-b4c5-11767715cc4b-nmstate-lock\") pod \"nmstate-handler-jr284\" (UID: \"55b1a421-7ec5-4442-b4c5-11767715cc4b\") " pod="openshift-nmstate/nmstate-handler-jr284" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.061209 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnzp8\" (UniqueName: \"kubernetes.io/projected/93238aee-86f0-497a-8880-531338e8245f-kube-api-access-tnzp8\") pod \"nmstate-metrics-58c85c668d-gvgqm\" (UID: \"93238aee-86f0-497a-8880-531338e8245f\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-gvgqm" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.133308 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/edd001fc-3ddc-4010-8a98-54f4ffeaba72-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-74czm\" (UID: \"edd001fc-3ddc-4010-8a98-54f4ffeaba72\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.133397 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/55b1a421-7ec5-4442-b4c5-11767715cc4b-ovs-socket\") pod \"nmstate-handler-jr284\" (UID: \"55b1a421-7ec5-4442-b4c5-11767715cc4b\") " pod="openshift-nmstate/nmstate-handler-jr284" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.133425 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kf4mx\" (UniqueName: \"kubernetes.io/projected/df45fab4-d183-4702-b5b6-2a4e559eff22-kube-api-access-kf4mx\") pod \"nmstate-webhook-866bcb46dc-frvdg\" (UID: \"df45fab4-d183-4702-b5b6-2a4e559eff22\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.133447 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp2f5\" (UniqueName: \"kubernetes.io/projected/edd001fc-3ddc-4010-8a98-54f4ffeaba72-kube-api-access-dp2f5\") pod \"nmstate-console-plugin-5c78fc5d65-74czm\" (UID: \"edd001fc-3ddc-4010-8a98-54f4ffeaba72\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.133471 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/df45fab4-d183-4702-b5b6-2a4e559eff22-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-frvdg\" (UID: \"df45fab4-d183-4702-b5b6-2a4e559eff22\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.133490 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/55b1a421-7ec5-4442-b4c5-11767715cc4b-dbus-socket\") pod \"nmstate-handler-jr284\" (UID: \"55b1a421-7ec5-4442-b4c5-11767715cc4b\") " pod="openshift-nmstate/nmstate-handler-jr284" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.133518 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/edd001fc-3ddc-4010-8a98-54f4ffeaba72-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-74czm\" (UID: \"edd001fc-3ddc-4010-8a98-54f4ffeaba72\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.133525 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/55b1a421-7ec5-4442-b4c5-11767715cc4b-ovs-socket\") pod \"nmstate-handler-jr284\" (UID: \"55b1a421-7ec5-4442-b4c5-11767715cc4b\") " pod="openshift-nmstate/nmstate-handler-jr284" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.133537 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffcc6\" (UniqueName: \"kubernetes.io/projected/55b1a421-7ec5-4442-b4c5-11767715cc4b-kube-api-access-ffcc6\") pod \"nmstate-handler-jr284\" (UID: \"55b1a421-7ec5-4442-b4c5-11767715cc4b\") " pod="openshift-nmstate/nmstate-handler-jr284" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.133635 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/55b1a421-7ec5-4442-b4c5-11767715cc4b-nmstate-lock\") pod \"nmstate-handler-jr284\" (UID: \"55b1a421-7ec5-4442-b4c5-11767715cc4b\") " pod="openshift-nmstate/nmstate-handler-jr284" Feb 20 07:01:08 crc kubenswrapper[5094]: E0220 07:01:08.133795 5094 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.133818 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/55b1a421-7ec5-4442-b4c5-11767715cc4b-nmstate-lock\") pod \"nmstate-handler-jr284\" (UID: 
\"55b1a421-7ec5-4442-b4c5-11767715cc4b\") " pod="openshift-nmstate/nmstate-handler-jr284" Feb 20 07:01:08 crc kubenswrapper[5094]: E0220 07:01:08.133917 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/edd001fc-3ddc-4010-8a98-54f4ffeaba72-plugin-serving-cert podName:edd001fc-3ddc-4010-8a98-54f4ffeaba72 nodeName:}" failed. No retries permitted until 2026-02-20 07:01:08.633887364 +0000 UTC m=+883.506514075 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/edd001fc-3ddc-4010-8a98-54f4ffeaba72-plugin-serving-cert") pod "nmstate-console-plugin-5c78fc5d65-74czm" (UID: "edd001fc-3ddc-4010-8a98-54f4ffeaba72") : secret "plugin-serving-cert" not found Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.133945 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/55b1a421-7ec5-4442-b4c5-11767715cc4b-dbus-socket\") pod \"nmstate-handler-jr284\" (UID: \"55b1a421-7ec5-4442-b4c5-11767715cc4b\") " pod="openshift-nmstate/nmstate-handler-jr284" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.135658 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/edd001fc-3ddc-4010-8a98-54f4ffeaba72-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-74czm\" (UID: \"edd001fc-3ddc-4010-8a98-54f4ffeaba72\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.140301 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-gvgqm" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.148901 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/df45fab4-d183-4702-b5b6-2a4e559eff22-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-frvdg\" (UID: \"df45fab4-d183-4702-b5b6-2a4e559eff22\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.157718 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffcc6\" (UniqueName: \"kubernetes.io/projected/55b1a421-7ec5-4442-b4c5-11767715cc4b-kube-api-access-ffcc6\") pod \"nmstate-handler-jr284\" (UID: \"55b1a421-7ec5-4442-b4c5-11767715cc4b\") " pod="openshift-nmstate/nmstate-handler-jr284" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.161223 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp2f5\" (UniqueName: \"kubernetes.io/projected/edd001fc-3ddc-4010-8a98-54f4ffeaba72-kube-api-access-dp2f5\") pod \"nmstate-console-plugin-5c78fc5d65-74czm\" (UID: \"edd001fc-3ddc-4010-8a98-54f4ffeaba72\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.167946 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf4mx\" (UniqueName: \"kubernetes.io/projected/df45fab4-d183-4702-b5b6-2a4e559eff22-kube-api-access-kf4mx\") pod \"nmstate-webhook-866bcb46dc-frvdg\" (UID: \"df45fab4-d183-4702-b5b6-2a4e559eff22\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.173138 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.197286 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-576d49774f-hnd7r"] Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.198205 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.203735 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-jr284" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.221083 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-576d49774f-hnd7r"] Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.238100 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-trusted-ca-bundle\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.238195 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-console-config\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.238216 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksqgt\" (UniqueName: \"kubernetes.io/projected/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-kube-api-access-ksqgt\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " 
pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.238292 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-oauth-serving-cert\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.238373 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-console-serving-cert\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.238404 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-service-ca\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.238469 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-console-oauth-config\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.339791 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-trusted-ca-bundle\") pod \"console-576d49774f-hnd7r\" 
(UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.340287 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-console-config\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.340318 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksqgt\" (UniqueName: \"kubernetes.io/projected/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-kube-api-access-ksqgt\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.340338 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-oauth-serving-cert\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.340380 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-console-serving-cert\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.340406 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-service-ca\") pod \"console-576d49774f-hnd7r\" (UID: 
\"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.340427 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-console-oauth-config\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.340992 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-trusted-ca-bundle\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.341861 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-service-ca\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.342181 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-console-config\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.342543 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-oauth-serving-cert\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " 
pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.350241 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-console-serving-cert\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.353101 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-console-oauth-config\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.359070 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksqgt\" (UniqueName: \"kubernetes.io/projected/731a7ef0-b1ea-4ad3-96d4-8668ebbe871b-kube-api-access-ksqgt\") pod \"console-576d49774f-hnd7r\" (UID: \"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b\") " pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.413816 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-gvgqm"] Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.559575 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.615573 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-jr284" event={"ID":"55b1a421-7ec5-4442-b4c5-11767715cc4b","Type":"ContainerStarted","Data":"c6a4943977a48d5175b7fd4b510b3e380a8721941eafb11c1cb04bd173f5b2f9"} Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.620330 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-gvgqm" event={"ID":"93238aee-86f0-497a-8880-531338e8245f","Type":"ContainerStarted","Data":"7bf3c0518fdab7634cd49d528214cd2710cb999119a452fee819345c75715e60"} Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.645512 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/edd001fc-3ddc-4010-8a98-54f4ffeaba72-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-74czm\" (UID: \"edd001fc-3ddc-4010-8a98-54f4ffeaba72\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.657424 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/edd001fc-3ddc-4010-8a98-54f4ffeaba72-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-74czm\" (UID: \"edd001fc-3ddc-4010-8a98-54f4ffeaba72\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm" Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.698012 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg"] Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.817798 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-576d49774f-hnd7r"] Feb 20 07:01:08 crc kubenswrapper[5094]: W0220 07:01:08.831301 5094 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod731a7ef0_b1ea_4ad3_96d4_8668ebbe871b.slice/crio-9d637a988014941be6193cc0048507bc75538a885ca68a3dd469ac5eaace1209 WatchSource:0}: Error finding container 9d637a988014941be6193cc0048507bc75538a885ca68a3dd469ac5eaace1209: Status 404 returned error can't find the container with id 9d637a988014941be6193cc0048507bc75538a885ca68a3dd469ac5eaace1209 Feb 20 07:01:08 crc kubenswrapper[5094]: I0220 07:01:08.901960 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm" Feb 20 07:01:09 crc kubenswrapper[5094]: I0220 07:01:09.208415 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm"] Feb 20 07:01:09 crc kubenswrapper[5094]: W0220 07:01:09.214971 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedd001fc_3ddc_4010_8a98_54f4ffeaba72.slice/crio-da350b6a21209ea8c6d1ed74a94780433633c7d9bb0d268b94c0dc1fd5e3880d WatchSource:0}: Error finding container da350b6a21209ea8c6d1ed74a94780433633c7d9bb0d268b94c0dc1fd5e3880d: Status 404 returned error can't find the container with id da350b6a21209ea8c6d1ed74a94780433633c7d9bb0d268b94c0dc1fd5e3880d Feb 20 07:01:09 crc kubenswrapper[5094]: I0220 07:01:09.629599 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg" event={"ID":"df45fab4-d183-4702-b5b6-2a4e559eff22","Type":"ContainerStarted","Data":"0a97949b43680f0ac65c2584124f8331bb477d979486e053779dd6a56de3a9e3"} Feb 20 07:01:09 crc kubenswrapper[5094]: I0220 07:01:09.630812 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm" 
event={"ID":"edd001fc-3ddc-4010-8a98-54f4ffeaba72","Type":"ContainerStarted","Data":"da350b6a21209ea8c6d1ed74a94780433633c7d9bb0d268b94c0dc1fd5e3880d"} Feb 20 07:01:09 crc kubenswrapper[5094]: I0220 07:01:09.632452 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-576d49774f-hnd7r" event={"ID":"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b","Type":"ContainerStarted","Data":"570a069da75de306ed661786db7b4736fffbcee8209d9f5c2a6bc25200d0c644"} Feb 20 07:01:09 crc kubenswrapper[5094]: I0220 07:01:09.632480 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-576d49774f-hnd7r" event={"ID":"731a7ef0-b1ea-4ad3-96d4-8668ebbe871b","Type":"ContainerStarted","Data":"9d637a988014941be6193cc0048507bc75538a885ca68a3dd469ac5eaace1209"} Feb 20 07:01:09 crc kubenswrapper[5094]: I0220 07:01:09.658012 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-576d49774f-hnd7r" podStartSLOduration=1.6579836860000001 podStartE2EDuration="1.657983686s" podCreationTimestamp="2026-02-20 07:01:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:01:09.652160105 +0000 UTC m=+884.524786826" watchObservedRunningTime="2026-02-20 07:01:09.657983686 +0000 UTC m=+884.530610397" Feb 20 07:01:11 crc kubenswrapper[5094]: I0220 07:01:11.658671 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-jr284" event={"ID":"55b1a421-7ec5-4442-b4c5-11767715cc4b","Type":"ContainerStarted","Data":"dd1809a65bcabb6d4cf68c2bc01cfea4e44d1732f241daaf2a9030621bf66bc2"} Feb 20 07:01:11 crc kubenswrapper[5094]: I0220 07:01:11.659492 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-jr284" Feb 20 07:01:11 crc kubenswrapper[5094]: I0220 07:01:11.661536 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg" event={"ID":"df45fab4-d183-4702-b5b6-2a4e559eff22","Type":"ContainerStarted","Data":"1ce204f47b0796a3ad7bbd6d3057ed5ba4b01cef0dbb4c2c09a67125c0a9f31f"} Feb 20 07:01:11 crc kubenswrapper[5094]: I0220 07:01:11.661672 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg" Feb 20 07:01:11 crc kubenswrapper[5094]: I0220 07:01:11.663901 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-gvgqm" event={"ID":"93238aee-86f0-497a-8880-531338e8245f","Type":"ContainerStarted","Data":"336ae33ea64168293ed52b20fad9e99a463a68f0f5079ea299128fa36e633b69"} Feb 20 07:01:11 crc kubenswrapper[5094]: I0220 07:01:11.686693 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-jr284" podStartSLOduration=2.160956592 podStartE2EDuration="4.68665875s" podCreationTimestamp="2026-02-20 07:01:07 +0000 UTC" firstStartedPulling="2026-02-20 07:01:08.248904869 +0000 UTC m=+883.121531580" lastFinishedPulling="2026-02-20 07:01:10.774606987 +0000 UTC m=+885.647233738" observedRunningTime="2026-02-20 07:01:11.67618634 +0000 UTC m=+886.548813081" watchObservedRunningTime="2026-02-20 07:01:11.68665875 +0000 UTC m=+886.559285491" Feb 20 07:01:11 crc kubenswrapper[5094]: I0220 07:01:11.696832 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg" podStartSLOduration=2.650744791 podStartE2EDuration="4.696807124s" podCreationTimestamp="2026-02-20 07:01:07 +0000 UTC" firstStartedPulling="2026-02-20 07:01:08.711335383 +0000 UTC m=+883.583962094" lastFinishedPulling="2026-02-20 07:01:10.757397716 +0000 UTC m=+885.630024427" observedRunningTime="2026-02-20 07:01:11.695209026 +0000 UTC m=+886.567835737" watchObservedRunningTime="2026-02-20 07:01:11.696807124 +0000 UTC m=+886.569433825" Feb 20 07:01:12 crc 
kubenswrapper[5094]: I0220 07:01:12.675094 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm" event={"ID":"edd001fc-3ddc-4010-8a98-54f4ffeaba72","Type":"ContainerStarted","Data":"a83266cf9701e5d698bf6dcbd7f4d1e65dea29c5fc42d18ddab48ff95ae8275c"} Feb 20 07:01:12 crc kubenswrapper[5094]: I0220 07:01:12.700964 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-74czm" podStartSLOduration=2.869391759 podStartE2EDuration="5.700462951s" podCreationTimestamp="2026-02-20 07:01:07 +0000 UTC" firstStartedPulling="2026-02-20 07:01:09.217297891 +0000 UTC m=+884.089924602" lastFinishedPulling="2026-02-20 07:01:12.048369083 +0000 UTC m=+886.920995794" observedRunningTime="2026-02-20 07:01:12.693213596 +0000 UTC m=+887.565840317" watchObservedRunningTime="2026-02-20 07:01:12.700462951 +0000 UTC m=+887.573089662" Feb 20 07:01:13 crc kubenswrapper[5094]: I0220 07:01:13.688159 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-gvgqm" event={"ID":"93238aee-86f0-497a-8880-531338e8245f","Type":"ContainerStarted","Data":"2edca5c4f7a834ac8a12a5d615a5f1d7706444ebe10163410c24ccf2fa989edf"} Feb 20 07:01:18 crc kubenswrapper[5094]: I0220 07:01:18.234553 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-jr284" Feb 20 07:01:18 crc kubenswrapper[5094]: I0220 07:01:18.260349 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-gvgqm" podStartSLOduration=6.557505205 podStartE2EDuration="11.260317444s" podCreationTimestamp="2026-02-20 07:01:07 +0000 UTC" firstStartedPulling="2026-02-20 07:01:08.432419554 +0000 UTC m=+883.305046265" lastFinishedPulling="2026-02-20 07:01:13.135231803 +0000 UTC m=+888.007858504" observedRunningTime="2026-02-20 07:01:13.724993667 +0000 UTC 
m=+888.597620388" watchObservedRunningTime="2026-02-20 07:01:18.260317444 +0000 UTC m=+893.132944185" Feb 20 07:01:18 crc kubenswrapper[5094]: I0220 07:01:18.559837 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:18 crc kubenswrapper[5094]: I0220 07:01:18.559914 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:18 crc kubenswrapper[5094]: I0220 07:01:18.581611 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:18 crc kubenswrapper[5094]: I0220 07:01:18.737162 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-576d49774f-hnd7r" Feb 20 07:01:18 crc kubenswrapper[5094]: I0220 07:01:18.820203 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-shq4j"] Feb 20 07:01:28 crc kubenswrapper[5094]: I0220 07:01:28.183648 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-frvdg" Feb 20 07:01:34 crc kubenswrapper[5094]: I0220 07:01:34.106538 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:01:34 crc kubenswrapper[5094]: I0220 07:01:34.107424 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:01:43 crc kubenswrapper[5094]: I0220 
07:01:43.891013 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-shq4j" podUID="e130287f-996d-4ab0-8c12-351bf8d21df5" containerName="console" containerID="cri-o://d6b6a4d5efc598b2291d4820a26080dd11c2b20b96ec9c20aea95f5a7b87fc1a" gracePeriod=15 Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.270477 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-shq4j_e130287f-996d-4ab0-8c12-351bf8d21df5/console/0.log" Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.270950 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-shq4j" Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.382022 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-trusted-ca-bundle\") pod \"e130287f-996d-4ab0-8c12-351bf8d21df5\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.382071 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8jnn\" (UniqueName: \"kubernetes.io/projected/e130287f-996d-4ab0-8c12-351bf8d21df5-kube-api-access-x8jnn\") pod \"e130287f-996d-4ab0-8c12-351bf8d21df5\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.382158 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-service-ca\") pod \"e130287f-996d-4ab0-8c12-351bf8d21df5\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.382235 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-console-config\") pod \"e130287f-996d-4ab0-8c12-351bf8d21df5\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.382332 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e130287f-996d-4ab0-8c12-351bf8d21df5-console-oauth-config\") pod \"e130287f-996d-4ab0-8c12-351bf8d21df5\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.382354 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e130287f-996d-4ab0-8c12-351bf8d21df5-console-serving-cert\") pod \"e130287f-996d-4ab0-8c12-351bf8d21df5\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.382374 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-oauth-serving-cert\") pod \"e130287f-996d-4ab0-8c12-351bf8d21df5\" (UID: \"e130287f-996d-4ab0-8c12-351bf8d21df5\") " Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.383318 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e130287f-996d-4ab0-8c12-351bf8d21df5" (UID: "e130287f-996d-4ab0-8c12-351bf8d21df5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.383407 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e130287f-996d-4ab0-8c12-351bf8d21df5" (UID: "e130287f-996d-4ab0-8c12-351bf8d21df5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.383514 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-console-config" (OuterVolumeSpecName: "console-config") pod "e130287f-996d-4ab0-8c12-351bf8d21df5" (UID: "e130287f-996d-4ab0-8c12-351bf8d21df5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.384073 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-service-ca" (OuterVolumeSpecName: "service-ca") pod "e130287f-996d-4ab0-8c12-351bf8d21df5" (UID: "e130287f-996d-4ab0-8c12-351bf8d21df5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.391855 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e130287f-996d-4ab0-8c12-351bf8d21df5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e130287f-996d-4ab0-8c12-351bf8d21df5" (UID: "e130287f-996d-4ab0-8c12-351bf8d21df5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.392520 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e130287f-996d-4ab0-8c12-351bf8d21df5-kube-api-access-x8jnn" (OuterVolumeSpecName: "kube-api-access-x8jnn") pod "e130287f-996d-4ab0-8c12-351bf8d21df5" (UID: "e130287f-996d-4ab0-8c12-351bf8d21df5"). InnerVolumeSpecName "kube-api-access-x8jnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.392575 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e130287f-996d-4ab0-8c12-351bf8d21df5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e130287f-996d-4ab0-8c12-351bf8d21df5" (UID: "e130287f-996d-4ab0-8c12-351bf8d21df5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.483620 5094 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e130287f-996d-4ab0-8c12-351bf8d21df5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.483657 5094 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e130287f-996d-4ab0-8c12-351bf8d21df5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.483672 5094 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.483686 5094 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.483697 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8jnn\" (UniqueName: \"kubernetes.io/projected/e130287f-996d-4ab0-8c12-351bf8d21df5-kube-api-access-x8jnn\") on node \"crc\" DevicePath \"\"" Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.483732 5094 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.483745 5094 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e130287f-996d-4ab0-8c12-351bf8d21df5-console-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.954611 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-shq4j_e130287f-996d-4ab0-8c12-351bf8d21df5/console/0.log" Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.954682 5094 generic.go:334] "Generic (PLEG): container finished" podID="e130287f-996d-4ab0-8c12-351bf8d21df5" containerID="d6b6a4d5efc598b2291d4820a26080dd11c2b20b96ec9c20aea95f5a7b87fc1a" exitCode=2 Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.954742 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-shq4j" event={"ID":"e130287f-996d-4ab0-8c12-351bf8d21df5","Type":"ContainerDied","Data":"d6b6a4d5efc598b2291d4820a26080dd11c2b20b96ec9c20aea95f5a7b87fc1a"} Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.954787 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-shq4j" 
event={"ID":"e130287f-996d-4ab0-8c12-351bf8d21df5","Type":"ContainerDied","Data":"9e9ebe837e9f43c76e6f912fde9f9a4d76af0096fe554a67909ac3cf138a323a"} Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.954809 5094 scope.go:117] "RemoveContainer" containerID="d6b6a4d5efc598b2291d4820a26080dd11c2b20b96ec9c20aea95f5a7b87fc1a" Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.954825 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-shq4j" Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.980875 5094 scope.go:117] "RemoveContainer" containerID="d6b6a4d5efc598b2291d4820a26080dd11c2b20b96ec9c20aea95f5a7b87fc1a" Feb 20 07:01:44 crc kubenswrapper[5094]: E0220 07:01:44.982242 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6b6a4d5efc598b2291d4820a26080dd11c2b20b96ec9c20aea95f5a7b87fc1a\": container with ID starting with d6b6a4d5efc598b2291d4820a26080dd11c2b20b96ec9c20aea95f5a7b87fc1a not found: ID does not exist" containerID="d6b6a4d5efc598b2291d4820a26080dd11c2b20b96ec9c20aea95f5a7b87fc1a" Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.982297 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6b6a4d5efc598b2291d4820a26080dd11c2b20b96ec9c20aea95f5a7b87fc1a"} err="failed to get container status \"d6b6a4d5efc598b2291d4820a26080dd11c2b20b96ec9c20aea95f5a7b87fc1a\": rpc error: code = NotFound desc = could not find container \"d6b6a4d5efc598b2291d4820a26080dd11c2b20b96ec9c20aea95f5a7b87fc1a\": container with ID starting with d6b6a4d5efc598b2291d4820a26080dd11c2b20b96ec9c20aea95f5a7b87fc1a not found: ID does not exist" Feb 20 07:01:44 crc kubenswrapper[5094]: I0220 07:01:44.997809 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-shq4j"] Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.004383 5094 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-shq4j"] Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.559395 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx"] Feb 20 07:01:45 crc kubenswrapper[5094]: E0220 07:01:45.559955 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e130287f-996d-4ab0-8c12-351bf8d21df5" containerName="console" Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.560003 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="e130287f-996d-4ab0-8c12-351bf8d21df5" containerName="console" Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.560261 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="e130287f-996d-4ab0-8c12-351bf8d21df5" containerName="console" Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.562108 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx" Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.565599 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.578357 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx"] Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.702641 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ffda3d5-82a2-4a0c-9052-2546188c107a-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx\" (UID: \"4ffda3d5-82a2-4a0c-9052-2546188c107a\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx" Feb 20 07:01:45 crc 
kubenswrapper[5094]: I0220 07:01:45.702747 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ffda3d5-82a2-4a0c-9052-2546188c107a-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx\" (UID: \"4ffda3d5-82a2-4a0c-9052-2546188c107a\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx" Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.702837 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhtj2\" (UniqueName: \"kubernetes.io/projected/4ffda3d5-82a2-4a0c-9052-2546188c107a-kube-api-access-rhtj2\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx\" (UID: \"4ffda3d5-82a2-4a0c-9052-2546188c107a\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx" Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.804082 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhtj2\" (UniqueName: \"kubernetes.io/projected/4ffda3d5-82a2-4a0c-9052-2546188c107a-kube-api-access-rhtj2\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx\" (UID: \"4ffda3d5-82a2-4a0c-9052-2546188c107a\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx" Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.804266 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ffda3d5-82a2-4a0c-9052-2546188c107a-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx\" (UID: \"4ffda3d5-82a2-4a0c-9052-2546188c107a\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx" Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.804312 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ffda3d5-82a2-4a0c-9052-2546188c107a-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx\" (UID: \"4ffda3d5-82a2-4a0c-9052-2546188c107a\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx" Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.805327 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ffda3d5-82a2-4a0c-9052-2546188c107a-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx\" (UID: \"4ffda3d5-82a2-4a0c-9052-2546188c107a\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx" Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.805379 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ffda3d5-82a2-4a0c-9052-2546188c107a-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx\" (UID: \"4ffda3d5-82a2-4a0c-9052-2546188c107a\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx" Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.837826 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhtj2\" (UniqueName: \"kubernetes.io/projected/4ffda3d5-82a2-4a0c-9052-2546188c107a-kube-api-access-rhtj2\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx\" (UID: \"4ffda3d5-82a2-4a0c-9052-2546188c107a\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx" Feb 20 07:01:45 crc kubenswrapper[5094]: I0220 07:01:45.869595 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e130287f-996d-4ab0-8c12-351bf8d21df5" path="/var/lib/kubelet/pods/e130287f-996d-4ab0-8c12-351bf8d21df5/volumes" Feb 20 07:01:45 crc 
kubenswrapper[5094]: I0220 07:01:45.879162 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx" Feb 20 07:01:46 crc kubenswrapper[5094]: I0220 07:01:46.399780 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx"] Feb 20 07:01:46 crc kubenswrapper[5094]: I0220 07:01:46.980558 5094 generic.go:334] "Generic (PLEG): container finished" podID="4ffda3d5-82a2-4a0c-9052-2546188c107a" containerID="de973fce7e21a1784784f7a7a78d34e6b9fa55160cbf7d0ce0c93ae083f6be75" exitCode=0 Feb 20 07:01:46 crc kubenswrapper[5094]: I0220 07:01:46.980633 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx" event={"ID":"4ffda3d5-82a2-4a0c-9052-2546188c107a","Type":"ContainerDied","Data":"de973fce7e21a1784784f7a7a78d34e6b9fa55160cbf7d0ce0c93ae083f6be75"} Feb 20 07:01:46 crc kubenswrapper[5094]: I0220 07:01:46.980678 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx" event={"ID":"4ffda3d5-82a2-4a0c-9052-2546188c107a","Type":"ContainerStarted","Data":"6a7724051ddaaa78a6755cc4cbefdaa122af49429cca7f0feaad15fbed9ac2e9"} Feb 20 07:01:49 crc kubenswrapper[5094]: I0220 07:01:49.002329 5094 generic.go:334] "Generic (PLEG): container finished" podID="4ffda3d5-82a2-4a0c-9052-2546188c107a" containerID="47051f79c300c7ff20786f74ddda68a16f5e36cd8b898ed81cf7f5aceabdc46b" exitCode=0 Feb 20 07:01:49 crc kubenswrapper[5094]: I0220 07:01:49.002496 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx" 
event={"ID":"4ffda3d5-82a2-4a0c-9052-2546188c107a","Type":"ContainerDied","Data":"47051f79c300c7ff20786f74ddda68a16f5e36cd8b898ed81cf7f5aceabdc46b"} Feb 20 07:01:50 crc kubenswrapper[5094]: I0220 07:01:50.013227 5094 generic.go:334] "Generic (PLEG): container finished" podID="4ffda3d5-82a2-4a0c-9052-2546188c107a" containerID="86d72c9f6066ce2d15d3e5f4133f7e9f32b85c81cea4d75c03ff8fef2121e66b" exitCode=0 Feb 20 07:01:50 crc kubenswrapper[5094]: I0220 07:01:50.013286 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx" event={"ID":"4ffda3d5-82a2-4a0c-9052-2546188c107a","Type":"ContainerDied","Data":"86d72c9f6066ce2d15d3e5f4133f7e9f32b85c81cea4d75c03ff8fef2121e66b"} Feb 20 07:01:51 crc kubenswrapper[5094]: I0220 07:01:51.383750 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx" Feb 20 07:01:51 crc kubenswrapper[5094]: I0220 07:01:51.517577 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ffda3d5-82a2-4a0c-9052-2546188c107a-bundle\") pod \"4ffda3d5-82a2-4a0c-9052-2546188c107a\" (UID: \"4ffda3d5-82a2-4a0c-9052-2546188c107a\") " Feb 20 07:01:51 crc kubenswrapper[5094]: I0220 07:01:51.518091 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhtj2\" (UniqueName: \"kubernetes.io/projected/4ffda3d5-82a2-4a0c-9052-2546188c107a-kube-api-access-rhtj2\") pod \"4ffda3d5-82a2-4a0c-9052-2546188c107a\" (UID: \"4ffda3d5-82a2-4a0c-9052-2546188c107a\") " Feb 20 07:01:51 crc kubenswrapper[5094]: I0220 07:01:51.518212 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ffda3d5-82a2-4a0c-9052-2546188c107a-util\") pod \"4ffda3d5-82a2-4a0c-9052-2546188c107a\" (UID: 
\"4ffda3d5-82a2-4a0c-9052-2546188c107a\") " Feb 20 07:01:51 crc kubenswrapper[5094]: I0220 07:01:51.520815 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ffda3d5-82a2-4a0c-9052-2546188c107a-bundle" (OuterVolumeSpecName: "bundle") pod "4ffda3d5-82a2-4a0c-9052-2546188c107a" (UID: "4ffda3d5-82a2-4a0c-9052-2546188c107a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:01:51 crc kubenswrapper[5094]: I0220 07:01:51.526068 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ffda3d5-82a2-4a0c-9052-2546188c107a-kube-api-access-rhtj2" (OuterVolumeSpecName: "kube-api-access-rhtj2") pod "4ffda3d5-82a2-4a0c-9052-2546188c107a" (UID: "4ffda3d5-82a2-4a0c-9052-2546188c107a"). InnerVolumeSpecName "kube-api-access-rhtj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:01:51 crc kubenswrapper[5094]: I0220 07:01:51.537997 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ffda3d5-82a2-4a0c-9052-2546188c107a-util" (OuterVolumeSpecName: "util") pod "4ffda3d5-82a2-4a0c-9052-2546188c107a" (UID: "4ffda3d5-82a2-4a0c-9052-2546188c107a"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:01:51 crc kubenswrapper[5094]: I0220 07:01:51.620687 5094 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ffda3d5-82a2-4a0c-9052-2546188c107a-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:01:51 crc kubenswrapper[5094]: I0220 07:01:51.620772 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhtj2\" (UniqueName: \"kubernetes.io/projected/4ffda3d5-82a2-4a0c-9052-2546188c107a-kube-api-access-rhtj2\") on node \"crc\" DevicePath \"\"" Feb 20 07:01:51 crc kubenswrapper[5094]: I0220 07:01:51.620788 5094 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ffda3d5-82a2-4a0c-9052-2546188c107a-util\") on node \"crc\" DevicePath \"\"" Feb 20 07:01:52 crc kubenswrapper[5094]: I0220 07:01:52.033866 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx" Feb 20 07:01:52 crc kubenswrapper[5094]: I0220 07:01:52.033887 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx" event={"ID":"4ffda3d5-82a2-4a0c-9052-2546188c107a","Type":"ContainerDied","Data":"6a7724051ddaaa78a6755cc4cbefdaa122af49429cca7f0feaad15fbed9ac2e9"} Feb 20 07:01:52 crc kubenswrapper[5094]: I0220 07:01:52.033953 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a7724051ddaaa78a6755cc4cbefdaa122af49429cca7f0feaad15fbed9ac2e9" Feb 20 07:01:57 crc kubenswrapper[5094]: I0220 07:01:57.874060 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-87drp"] Feb 20 07:01:57 crc kubenswrapper[5094]: E0220 07:01:57.875153 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ffda3d5-82a2-4a0c-9052-2546188c107a" 
containerName="util" Feb 20 07:01:57 crc kubenswrapper[5094]: I0220 07:01:57.875177 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ffda3d5-82a2-4a0c-9052-2546188c107a" containerName="util" Feb 20 07:01:57 crc kubenswrapper[5094]: E0220 07:01:57.875202 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ffda3d5-82a2-4a0c-9052-2546188c107a" containerName="pull" Feb 20 07:01:57 crc kubenswrapper[5094]: I0220 07:01:57.875217 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ffda3d5-82a2-4a0c-9052-2546188c107a" containerName="pull" Feb 20 07:01:57 crc kubenswrapper[5094]: E0220 07:01:57.875253 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ffda3d5-82a2-4a0c-9052-2546188c107a" containerName="extract" Feb 20 07:01:57 crc kubenswrapper[5094]: I0220 07:01:57.875265 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ffda3d5-82a2-4a0c-9052-2546188c107a" containerName="extract" Feb 20 07:01:57 crc kubenswrapper[5094]: I0220 07:01:57.875473 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ffda3d5-82a2-4a0c-9052-2546188c107a" containerName="extract" Feb 20 07:01:57 crc kubenswrapper[5094]: I0220 07:01:57.877016 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-87drp" Feb 20 07:01:57 crc kubenswrapper[5094]: I0220 07:01:57.884786 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-87drp"] Feb 20 07:01:58 crc kubenswrapper[5094]: I0220 07:01:58.026042 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15dcf959-ddef-4835-a1fc-21247f8d81d4-catalog-content\") pod \"community-operators-87drp\" (UID: \"15dcf959-ddef-4835-a1fc-21247f8d81d4\") " pod="openshift-marketplace/community-operators-87drp" Feb 20 07:01:58 crc kubenswrapper[5094]: I0220 07:01:58.026119 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15dcf959-ddef-4835-a1fc-21247f8d81d4-utilities\") pod \"community-operators-87drp\" (UID: \"15dcf959-ddef-4835-a1fc-21247f8d81d4\") " pod="openshift-marketplace/community-operators-87drp" Feb 20 07:01:58 crc kubenswrapper[5094]: I0220 07:01:58.026211 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj5wf\" (UniqueName: \"kubernetes.io/projected/15dcf959-ddef-4835-a1fc-21247f8d81d4-kube-api-access-vj5wf\") pod \"community-operators-87drp\" (UID: \"15dcf959-ddef-4835-a1fc-21247f8d81d4\") " pod="openshift-marketplace/community-operators-87drp" Feb 20 07:01:58 crc kubenswrapper[5094]: I0220 07:01:58.127408 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15dcf959-ddef-4835-a1fc-21247f8d81d4-catalog-content\") pod \"community-operators-87drp\" (UID: \"15dcf959-ddef-4835-a1fc-21247f8d81d4\") " pod="openshift-marketplace/community-operators-87drp" Feb 20 07:01:58 crc kubenswrapper[5094]: I0220 07:01:58.127479 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15dcf959-ddef-4835-a1fc-21247f8d81d4-utilities\") pod \"community-operators-87drp\" (UID: \"15dcf959-ddef-4835-a1fc-21247f8d81d4\") " pod="openshift-marketplace/community-operators-87drp" Feb 20 07:01:58 crc kubenswrapper[5094]: I0220 07:01:58.127553 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj5wf\" (UniqueName: \"kubernetes.io/projected/15dcf959-ddef-4835-a1fc-21247f8d81d4-kube-api-access-vj5wf\") pod \"community-operators-87drp\" (UID: \"15dcf959-ddef-4835-a1fc-21247f8d81d4\") " pod="openshift-marketplace/community-operators-87drp" Feb 20 07:01:58 crc kubenswrapper[5094]: I0220 07:01:58.127999 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15dcf959-ddef-4835-a1fc-21247f8d81d4-catalog-content\") pod \"community-operators-87drp\" (UID: \"15dcf959-ddef-4835-a1fc-21247f8d81d4\") " pod="openshift-marketplace/community-operators-87drp" Feb 20 07:01:58 crc kubenswrapper[5094]: I0220 07:01:58.128329 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15dcf959-ddef-4835-a1fc-21247f8d81d4-utilities\") pod \"community-operators-87drp\" (UID: \"15dcf959-ddef-4835-a1fc-21247f8d81d4\") " pod="openshift-marketplace/community-operators-87drp" Feb 20 07:01:58 crc kubenswrapper[5094]: I0220 07:01:58.162048 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj5wf\" (UniqueName: \"kubernetes.io/projected/15dcf959-ddef-4835-a1fc-21247f8d81d4-kube-api-access-vj5wf\") pod \"community-operators-87drp\" (UID: \"15dcf959-ddef-4835-a1fc-21247f8d81d4\") " pod="openshift-marketplace/community-operators-87drp" Feb 20 07:01:58 crc kubenswrapper[5094]: I0220 07:01:58.214411 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-87drp" Feb 20 07:01:58 crc kubenswrapper[5094]: I0220 07:01:58.549523 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-87drp"] Feb 20 07:01:59 crc kubenswrapper[5094]: I0220 07:01:59.084757 5094 generic.go:334] "Generic (PLEG): container finished" podID="15dcf959-ddef-4835-a1fc-21247f8d81d4" containerID="ee705756cd462fc8503463c9f1feec3f3088c9f60a1f9bcbdd33ccd0b49c34dc" exitCode=0 Feb 20 07:01:59 crc kubenswrapper[5094]: I0220 07:01:59.084865 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-87drp" event={"ID":"15dcf959-ddef-4835-a1fc-21247f8d81d4","Type":"ContainerDied","Data":"ee705756cd462fc8503463c9f1feec3f3088c9f60a1f9bcbdd33ccd0b49c34dc"} Feb 20 07:01:59 crc kubenswrapper[5094]: I0220 07:01:59.085087 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-87drp" event={"ID":"15dcf959-ddef-4835-a1fc-21247f8d81d4","Type":"ContainerStarted","Data":"a942b404d650c5574e3a557cf6ef05b40134fdcd0171aaf5ba7a37ca35943970"} Feb 20 07:02:00 crc kubenswrapper[5094]: I0220 07:02:00.106450 5094 generic.go:334] "Generic (PLEG): container finished" podID="15dcf959-ddef-4835-a1fc-21247f8d81d4" containerID="1a192e2a4da6e1385050ac77e7224362961f35c186a994879fded218b12b2977" exitCode=0 Feb 20 07:02:00 crc kubenswrapper[5094]: I0220 07:02:00.106516 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-87drp" event={"ID":"15dcf959-ddef-4835-a1fc-21247f8d81d4","Type":"ContainerDied","Data":"1a192e2a4da6e1385050ac77e7224362961f35c186a994879fded218b12b2977"} Feb 20 07:02:01 crc kubenswrapper[5094]: I0220 07:02:01.116876 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-87drp" 
event={"ID":"15dcf959-ddef-4835-a1fc-21247f8d81d4","Type":"ContainerStarted","Data":"40e3a68506e2b6947da889b0b7ed2574ae46c81e0eb0bf99ba467197f227a4a0"} Feb 20 07:02:01 crc kubenswrapper[5094]: I0220 07:02:01.150470 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-87drp" podStartSLOduration=2.541090217 podStartE2EDuration="4.15044742s" podCreationTimestamp="2026-02-20 07:01:57 +0000 UTC" firstStartedPulling="2026-02-20 07:01:59.087166986 +0000 UTC m=+933.959793697" lastFinishedPulling="2026-02-20 07:02:00.696524179 +0000 UTC m=+935.569150900" observedRunningTime="2026-02-20 07:02:01.145460961 +0000 UTC m=+936.018087682" watchObservedRunningTime="2026-02-20 07:02:01.15044742 +0000 UTC m=+936.023074131" Feb 20 07:02:04 crc kubenswrapper[5094]: I0220 07:02:04.106954 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:02:04 crc kubenswrapper[5094]: I0220 07:02:04.107561 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:02:04 crc kubenswrapper[5094]: I0220 07:02:04.107648 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 07:02:04 crc kubenswrapper[5094]: I0220 07:02:04.108817 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"8a6efb8b2f6985d19679a1508c25eeeb609faa457ea57f65ac79a9b48c9574e6"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 07:02:04 crc kubenswrapper[5094]: I0220 07:02:04.108928 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://8a6efb8b2f6985d19679a1508c25eeeb609faa457ea57f65ac79a9b48c9574e6" gracePeriod=600 Feb 20 07:02:05 crc kubenswrapper[5094]: I0220 07:02:05.156033 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="8a6efb8b2f6985d19679a1508c25eeeb609faa457ea57f65ac79a9b48c9574e6" exitCode=0 Feb 20 07:02:05 crc kubenswrapper[5094]: I0220 07:02:05.156136 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"8a6efb8b2f6985d19679a1508c25eeeb609faa457ea57f65ac79a9b48c9574e6"} Feb 20 07:02:05 crc kubenswrapper[5094]: I0220 07:02:05.156803 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"2fef18a95cc33ac92b0e7f9a9dd8176f054fb1a2db35c994afe1e4273592eb9d"} Feb 20 07:02:05 crc kubenswrapper[5094]: I0220 07:02:05.156834 5094 scope.go:117] "RemoveContainer" containerID="935ee674e42fed17c73f6a106b1b7b34bda161038510a874c1b2347b0ce2c2b3" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.697914 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv"] Feb 20 07:02:06 crc kubenswrapper[5094]: 
I0220 07:02:06.698875 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.701449 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.702295 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.702756 5094 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.702820 5094 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-mmdpk" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.703163 5094 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.724474 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv"] Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.865503 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/059e3724-d657-4f2e-beec-f4f55e09e498-apiservice-cert\") pod \"metallb-operator-controller-manager-6d969f468d-fd5gv\" (UID: \"059e3724-d657-4f2e-beec-f4f55e09e498\") " pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.865582 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7nrl\" (UniqueName: 
\"kubernetes.io/projected/059e3724-d657-4f2e-beec-f4f55e09e498-kube-api-access-w7nrl\") pod \"metallb-operator-controller-manager-6d969f468d-fd5gv\" (UID: \"059e3724-d657-4f2e-beec-f4f55e09e498\") " pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.865843 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/059e3724-d657-4f2e-beec-f4f55e09e498-webhook-cert\") pod \"metallb-operator-controller-manager-6d969f468d-fd5gv\" (UID: \"059e3724-d657-4f2e-beec-f4f55e09e498\") " pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.958992 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf"] Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.959825 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.965455 5094 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.965471 5094 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.965553 5094 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-dvdgj" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.967449 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/059e3724-d657-4f2e-beec-f4f55e09e498-apiservice-cert\") pod \"metallb-operator-controller-manager-6d969f468d-fd5gv\" (UID: \"059e3724-d657-4f2e-beec-f4f55e09e498\") " pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.967514 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7nrl\" (UniqueName: \"kubernetes.io/projected/059e3724-d657-4f2e-beec-f4f55e09e498-kube-api-access-w7nrl\") pod \"metallb-operator-controller-manager-6d969f468d-fd5gv\" (UID: \"059e3724-d657-4f2e-beec-f4f55e09e498\") " pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.967589 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/059e3724-d657-4f2e-beec-f4f55e09e498-webhook-cert\") pod \"metallb-operator-controller-manager-6d969f468d-fd5gv\" (UID: \"059e3724-d657-4f2e-beec-f4f55e09e498\") " pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" Feb 20 07:02:06 crc 
kubenswrapper[5094]: I0220 07:02:06.985260 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/059e3724-d657-4f2e-beec-f4f55e09e498-webhook-cert\") pod \"metallb-operator-controller-manager-6d969f468d-fd5gv\" (UID: \"059e3724-d657-4f2e-beec-f4f55e09e498\") " pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" Feb 20 07:02:06 crc kubenswrapper[5094]: I0220 07:02:06.985729 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/059e3724-d657-4f2e-beec-f4f55e09e498-apiservice-cert\") pod \"metallb-operator-controller-manager-6d969f468d-fd5gv\" (UID: \"059e3724-d657-4f2e-beec-f4f55e09e498\") " pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" Feb 20 07:02:07 crc kubenswrapper[5094]: I0220 07:02:07.003609 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7nrl\" (UniqueName: \"kubernetes.io/projected/059e3724-d657-4f2e-beec-f4f55e09e498-kube-api-access-w7nrl\") pod \"metallb-operator-controller-manager-6d969f468d-fd5gv\" (UID: \"059e3724-d657-4f2e-beec-f4f55e09e498\") " pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" Feb 20 07:02:07 crc kubenswrapper[5094]: I0220 07:02:07.003739 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf"] Feb 20 07:02:07 crc kubenswrapper[5094]: I0220 07:02:07.026224 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" Feb 20 07:02:07 crc kubenswrapper[5094]: I0220 07:02:07.069432 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2dde7604-2a93-4dc0-9b15-b8fe41f79e1e-webhook-cert\") pod \"metallb-operator-webhook-server-c5fbff78-jk6cf\" (UID: \"2dde7604-2a93-4dc0-9b15-b8fe41f79e1e\") " pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" Feb 20 07:02:07 crc kubenswrapper[5094]: I0220 07:02:07.069546 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmt9k\" (UniqueName: \"kubernetes.io/projected/2dde7604-2a93-4dc0-9b15-b8fe41f79e1e-kube-api-access-hmt9k\") pod \"metallb-operator-webhook-server-c5fbff78-jk6cf\" (UID: \"2dde7604-2a93-4dc0-9b15-b8fe41f79e1e\") " pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" Feb 20 07:02:07 crc kubenswrapper[5094]: I0220 07:02:07.069616 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2dde7604-2a93-4dc0-9b15-b8fe41f79e1e-apiservice-cert\") pod \"metallb-operator-webhook-server-c5fbff78-jk6cf\" (UID: \"2dde7604-2a93-4dc0-9b15-b8fe41f79e1e\") " pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" Feb 20 07:02:07 crc kubenswrapper[5094]: I0220 07:02:07.171801 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2dde7604-2a93-4dc0-9b15-b8fe41f79e1e-apiservice-cert\") pod \"metallb-operator-webhook-server-c5fbff78-jk6cf\" (UID: \"2dde7604-2a93-4dc0-9b15-b8fe41f79e1e\") " pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" Feb 20 07:02:07 crc kubenswrapper[5094]: I0220 07:02:07.172291 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2dde7604-2a93-4dc0-9b15-b8fe41f79e1e-webhook-cert\") pod \"metallb-operator-webhook-server-c5fbff78-jk6cf\" (UID: \"2dde7604-2a93-4dc0-9b15-b8fe41f79e1e\") " pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" Feb 20 07:02:07 crc kubenswrapper[5094]: I0220 07:02:07.172332 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmt9k\" (UniqueName: \"kubernetes.io/projected/2dde7604-2a93-4dc0-9b15-b8fe41f79e1e-kube-api-access-hmt9k\") pod \"metallb-operator-webhook-server-c5fbff78-jk6cf\" (UID: \"2dde7604-2a93-4dc0-9b15-b8fe41f79e1e\") " pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" Feb 20 07:02:07 crc kubenswrapper[5094]: I0220 07:02:07.179594 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2dde7604-2a93-4dc0-9b15-b8fe41f79e1e-apiservice-cert\") pod \"metallb-operator-webhook-server-c5fbff78-jk6cf\" (UID: \"2dde7604-2a93-4dc0-9b15-b8fe41f79e1e\") " pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" Feb 20 07:02:07 crc kubenswrapper[5094]: I0220 07:02:07.180630 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2dde7604-2a93-4dc0-9b15-b8fe41f79e1e-webhook-cert\") pod \"metallb-operator-webhook-server-c5fbff78-jk6cf\" (UID: \"2dde7604-2a93-4dc0-9b15-b8fe41f79e1e\") " pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" Feb 20 07:02:07 crc kubenswrapper[5094]: I0220 07:02:07.197260 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmt9k\" (UniqueName: \"kubernetes.io/projected/2dde7604-2a93-4dc0-9b15-b8fe41f79e1e-kube-api-access-hmt9k\") pod \"metallb-operator-webhook-server-c5fbff78-jk6cf\" (UID: \"2dde7604-2a93-4dc0-9b15-b8fe41f79e1e\") " 
pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" Feb 20 07:02:07 crc kubenswrapper[5094]: I0220 07:02:07.298039 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv"] Feb 20 07:02:07 crc kubenswrapper[5094]: I0220 07:02:07.336541 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" Feb 20 07:02:07 crc kubenswrapper[5094]: I0220 07:02:07.550779 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf"] Feb 20 07:02:07 crc kubenswrapper[5094]: W0220 07:02:07.559064 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2dde7604_2a93_4dc0_9b15_b8fe41f79e1e.slice/crio-322c8144164d6f2b2fb535de382420d37d195a35068710e6a9e89938f724128b WatchSource:0}: Error finding container 322c8144164d6f2b2fb535de382420d37d195a35068710e6a9e89938f724128b: Status 404 returned error can't find the container with id 322c8144164d6f2b2fb535de382420d37d195a35068710e6a9e89938f724128b Feb 20 07:02:08 crc kubenswrapper[5094]: I0220 07:02:08.203740 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" event={"ID":"059e3724-d657-4f2e-beec-f4f55e09e498","Type":"ContainerStarted","Data":"bf300391a6d05acf75f7ec345df4af7a3803a3f8893829a2dc835ae88ddb3feb"} Feb 20 07:02:08 crc kubenswrapper[5094]: I0220 07:02:08.205990 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" event={"ID":"2dde7604-2a93-4dc0-9b15-b8fe41f79e1e","Type":"ContainerStarted","Data":"322c8144164d6f2b2fb535de382420d37d195a35068710e6a9e89938f724128b"} Feb 20 07:02:08 crc kubenswrapper[5094]: I0220 07:02:08.215800 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/community-operators-87drp" Feb 20 07:02:08 crc kubenswrapper[5094]: I0220 07:02:08.215871 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-87drp" Feb 20 07:02:08 crc kubenswrapper[5094]: I0220 07:02:08.273731 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-87drp" Feb 20 07:02:09 crc kubenswrapper[5094]: I0220 07:02:09.261179 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-87drp" Feb 20 07:02:11 crc kubenswrapper[5094]: I0220 07:02:11.868186 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-87drp"] Feb 20 07:02:11 crc kubenswrapper[5094]: I0220 07:02:11.868853 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-87drp" podUID="15dcf959-ddef-4835-a1fc-21247f8d81d4" containerName="registry-server" containerID="cri-o://40e3a68506e2b6947da889b0b7ed2574ae46c81e0eb0bf99ba467197f227a4a0" gracePeriod=2 Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.231965 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-87drp" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.260176 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" event={"ID":"2dde7604-2a93-4dc0-9b15-b8fe41f79e1e","Type":"ContainerStarted","Data":"7b078e96d2ded7833eeddb5001d0076622f42a122bc025e2075d3c5da6e5fc85"} Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.260298 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.262624 5094 generic.go:334] "Generic (PLEG): container finished" podID="15dcf959-ddef-4835-a1fc-21247f8d81d4" containerID="40e3a68506e2b6947da889b0b7ed2574ae46c81e0eb0bf99ba467197f227a4a0" exitCode=0 Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.262681 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-87drp" event={"ID":"15dcf959-ddef-4835-a1fc-21247f8d81d4","Type":"ContainerDied","Data":"40e3a68506e2b6947da889b0b7ed2574ae46c81e0eb0bf99ba467197f227a4a0"} Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.262733 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-87drp" event={"ID":"15dcf959-ddef-4835-a1fc-21247f8d81d4","Type":"ContainerDied","Data":"a942b404d650c5574e3a557cf6ef05b40134fdcd0171aaf5ba7a37ca35943970"} Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.262755 5094 scope.go:117] "RemoveContainer" containerID="40e3a68506e2b6947da889b0b7ed2574ae46c81e0eb0bf99ba467197f227a4a0" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.262873 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-87drp" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.266012 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" event={"ID":"059e3724-d657-4f2e-beec-f4f55e09e498","Type":"ContainerStarted","Data":"4ca87069728b97927b3f16eaaf6d81d28815ea82f1ca04ff56de39baad6ac450"} Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.266519 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.278983 5094 scope.go:117] "RemoveContainer" containerID="1a192e2a4da6e1385050ac77e7224362961f35c186a994879fded218b12b2977" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.282677 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" podStartSLOduration=1.897471326 podStartE2EDuration="6.282664848s" podCreationTimestamp="2026-02-20 07:02:06 +0000 UTC" firstStartedPulling="2026-02-20 07:02:07.563274682 +0000 UTC m=+942.435901403" lastFinishedPulling="2026-02-20 07:02:11.948468214 +0000 UTC m=+946.821094925" observedRunningTime="2026-02-20 07:02:12.280583828 +0000 UTC m=+947.153210539" watchObservedRunningTime="2026-02-20 07:02:12.282664848 +0000 UTC m=+947.155291559" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.301025 5094 scope.go:117] "RemoveContainer" containerID="ee705756cd462fc8503463c9f1feec3f3088c9f60a1f9bcbdd33ccd0b49c34dc" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.310089 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" podStartSLOduration=3.50765722 podStartE2EDuration="6.310068655s" podCreationTimestamp="2026-02-20 07:02:06 +0000 UTC" firstStartedPulling="2026-02-20 
07:02:07.321781289 +0000 UTC m=+942.194408000" lastFinishedPulling="2026-02-20 07:02:10.124192724 +0000 UTC m=+944.996819435" observedRunningTime="2026-02-20 07:02:12.304658145 +0000 UTC m=+947.177284856" watchObservedRunningTime="2026-02-20 07:02:12.310068655 +0000 UTC m=+947.182695366" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.319982 5094 scope.go:117] "RemoveContainer" containerID="40e3a68506e2b6947da889b0b7ed2574ae46c81e0eb0bf99ba467197f227a4a0" Feb 20 07:02:12 crc kubenswrapper[5094]: E0220 07:02:12.320658 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40e3a68506e2b6947da889b0b7ed2574ae46c81e0eb0bf99ba467197f227a4a0\": container with ID starting with 40e3a68506e2b6947da889b0b7ed2574ae46c81e0eb0bf99ba467197f227a4a0 not found: ID does not exist" containerID="40e3a68506e2b6947da889b0b7ed2574ae46c81e0eb0bf99ba467197f227a4a0" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.320724 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40e3a68506e2b6947da889b0b7ed2574ae46c81e0eb0bf99ba467197f227a4a0"} err="failed to get container status \"40e3a68506e2b6947da889b0b7ed2574ae46c81e0eb0bf99ba467197f227a4a0\": rpc error: code = NotFound desc = could not find container \"40e3a68506e2b6947da889b0b7ed2574ae46c81e0eb0bf99ba467197f227a4a0\": container with ID starting with 40e3a68506e2b6947da889b0b7ed2574ae46c81e0eb0bf99ba467197f227a4a0 not found: ID does not exist" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.320755 5094 scope.go:117] "RemoveContainer" containerID="1a192e2a4da6e1385050ac77e7224362961f35c186a994879fded218b12b2977" Feb 20 07:02:12 crc kubenswrapper[5094]: E0220 07:02:12.321175 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a192e2a4da6e1385050ac77e7224362961f35c186a994879fded218b12b2977\": container with ID starting with 
1a192e2a4da6e1385050ac77e7224362961f35c186a994879fded218b12b2977 not found: ID does not exist" containerID="1a192e2a4da6e1385050ac77e7224362961f35c186a994879fded218b12b2977" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.321208 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a192e2a4da6e1385050ac77e7224362961f35c186a994879fded218b12b2977"} err="failed to get container status \"1a192e2a4da6e1385050ac77e7224362961f35c186a994879fded218b12b2977\": rpc error: code = NotFound desc = could not find container \"1a192e2a4da6e1385050ac77e7224362961f35c186a994879fded218b12b2977\": container with ID starting with 1a192e2a4da6e1385050ac77e7224362961f35c186a994879fded218b12b2977 not found: ID does not exist" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.321229 5094 scope.go:117] "RemoveContainer" containerID="ee705756cd462fc8503463c9f1feec3f3088c9f60a1f9bcbdd33ccd0b49c34dc" Feb 20 07:02:12 crc kubenswrapper[5094]: E0220 07:02:12.321561 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee705756cd462fc8503463c9f1feec3f3088c9f60a1f9bcbdd33ccd0b49c34dc\": container with ID starting with ee705756cd462fc8503463c9f1feec3f3088c9f60a1f9bcbdd33ccd0b49c34dc not found: ID does not exist" containerID="ee705756cd462fc8503463c9f1feec3f3088c9f60a1f9bcbdd33ccd0b49c34dc" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.321584 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee705756cd462fc8503463c9f1feec3f3088c9f60a1f9bcbdd33ccd0b49c34dc"} err="failed to get container status \"ee705756cd462fc8503463c9f1feec3f3088c9f60a1f9bcbdd33ccd0b49c34dc\": rpc error: code = NotFound desc = could not find container \"ee705756cd462fc8503463c9f1feec3f3088c9f60a1f9bcbdd33ccd0b49c34dc\": container with ID starting with ee705756cd462fc8503463c9f1feec3f3088c9f60a1f9bcbdd33ccd0b49c34dc not found: ID does not 
exist" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.353860 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vj5wf\" (UniqueName: \"kubernetes.io/projected/15dcf959-ddef-4835-a1fc-21247f8d81d4-kube-api-access-vj5wf\") pod \"15dcf959-ddef-4835-a1fc-21247f8d81d4\" (UID: \"15dcf959-ddef-4835-a1fc-21247f8d81d4\") " Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.353926 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15dcf959-ddef-4835-a1fc-21247f8d81d4-utilities\") pod \"15dcf959-ddef-4835-a1fc-21247f8d81d4\" (UID: \"15dcf959-ddef-4835-a1fc-21247f8d81d4\") " Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.353961 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15dcf959-ddef-4835-a1fc-21247f8d81d4-catalog-content\") pod \"15dcf959-ddef-4835-a1fc-21247f8d81d4\" (UID: \"15dcf959-ddef-4835-a1fc-21247f8d81d4\") " Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.356801 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15dcf959-ddef-4835-a1fc-21247f8d81d4-utilities" (OuterVolumeSpecName: "utilities") pod "15dcf959-ddef-4835-a1fc-21247f8d81d4" (UID: "15dcf959-ddef-4835-a1fc-21247f8d81d4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.361524 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15dcf959-ddef-4835-a1fc-21247f8d81d4-kube-api-access-vj5wf" (OuterVolumeSpecName: "kube-api-access-vj5wf") pod "15dcf959-ddef-4835-a1fc-21247f8d81d4" (UID: "15dcf959-ddef-4835-a1fc-21247f8d81d4"). InnerVolumeSpecName "kube-api-access-vj5wf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.412191 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15dcf959-ddef-4835-a1fc-21247f8d81d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "15dcf959-ddef-4835-a1fc-21247f8d81d4" (UID: "15dcf959-ddef-4835-a1fc-21247f8d81d4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.455516 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15dcf959-ddef-4835-a1fc-21247f8d81d4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.455556 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vj5wf\" (UniqueName: \"kubernetes.io/projected/15dcf959-ddef-4835-a1fc-21247f8d81d4-kube-api-access-vj5wf\") on node \"crc\" DevicePath \"\"" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.455570 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15dcf959-ddef-4835-a1fc-21247f8d81d4-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.593343 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-87drp"] Feb 20 07:02:12 crc kubenswrapper[5094]: I0220 07:02:12.599401 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-87drp"] Feb 20 07:02:13 crc kubenswrapper[5094]: I0220 07:02:13.846852 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15dcf959-ddef-4835-a1fc-21247f8d81d4" path="/var/lib/kubelet/pods/15dcf959-ddef-4835-a1fc-21247f8d81d4/volumes" Feb 20 07:02:27 crc kubenswrapper[5094]: I0220 07:02:27.340280 5094 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-c5fbff78-jk6cf" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.030660 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.822684 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-4p57m"] Feb 20 07:02:47 crc kubenswrapper[5094]: E0220 07:02:47.823459 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15dcf959-ddef-4835-a1fc-21247f8d81d4" containerName="extract-utilities" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.823478 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="15dcf959-ddef-4835-a1fc-21247f8d81d4" containerName="extract-utilities" Feb 20 07:02:47 crc kubenswrapper[5094]: E0220 07:02:47.823500 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15dcf959-ddef-4835-a1fc-21247f8d81d4" containerName="registry-server" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.823511 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="15dcf959-ddef-4835-a1fc-21247f8d81d4" containerName="registry-server" Feb 20 07:02:47 crc kubenswrapper[5094]: E0220 07:02:47.823534 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15dcf959-ddef-4835-a1fc-21247f8d81d4" containerName="extract-content" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.823543 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="15dcf959-ddef-4835-a1fc-21247f8d81d4" containerName="extract-content" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.823677 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="15dcf959-ddef-4835-a1fc-21247f8d81d4" containerName="registry-server" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.826207 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.832149 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6"] Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.833101 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.837096 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.837106 5094 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.837358 5094 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-26hgd" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.856154 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6"] Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.861986 5094 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.914507 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxmcp\" (UniqueName: \"kubernetes.io/projected/f065adc1-f6c1-4895-a933-906a708555c1-kube-api-access-rxmcp\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.914588 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe469d05-edeb-4d23-b06b-6bdbfc646e99-cert\") pod 
\"frr-k8s-webhook-server-78b44bf5bb-dljs6\" (UID: \"fe469d05-edeb-4d23-b06b-6bdbfc646e99\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.914617 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f065adc1-f6c1-4895-a933-906a708555c1-reloader\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.914640 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f065adc1-f6c1-4895-a933-906a708555c1-frr-conf\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.914675 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f065adc1-f6c1-4895-a933-906a708555c1-frr-sockets\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.914831 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f065adc1-f6c1-4895-a933-906a708555c1-metrics\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.914907 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh9qq\" (UniqueName: \"kubernetes.io/projected/fe469d05-edeb-4d23-b06b-6bdbfc646e99-kube-api-access-vh9qq\") pod \"frr-k8s-webhook-server-78b44bf5bb-dljs6\" (UID: 
\"fe469d05-edeb-4d23-b06b-6bdbfc646e99\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.915039 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f065adc1-f6c1-4895-a933-906a708555c1-metrics-certs\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.915113 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f065adc1-f6c1-4895-a933-906a708555c1-frr-startup\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.939476 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-gjp5f"] Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.940554 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-gjp5f" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.943369 5094 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.945362 5094 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-nrnbq" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.945487 5094 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.945523 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.960635 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-s7ndd"] Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.961766 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-s7ndd" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.965264 5094 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 20 07:02:47 crc kubenswrapper[5094]: I0220 07:02:47.995507 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-s7ndd"] Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.016908 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f065adc1-f6c1-4895-a933-906a708555c1-reloader\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.016971 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t86x\" (UniqueName: \"kubernetes.io/projected/4d145cb8-0c5c-40f7-a99c-15f1575629c3-kube-api-access-8t86x\") pod \"speaker-gjp5f\" (UID: \"4d145cb8-0c5c-40f7-a99c-15f1575629c3\") " pod="metallb-system/speaker-gjp5f" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.016994 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f065adc1-f6c1-4895-a933-906a708555c1-frr-conf\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.017023 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4d145cb8-0c5c-40f7-a99c-15f1575629c3-memberlist\") pod \"speaker-gjp5f\" (UID: \"4d145cb8-0c5c-40f7-a99c-15f1575629c3\") " pod="metallb-system/speaker-gjp5f" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.017136 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2a03b7d3-8e22-4a62-98f0-8d72500fab69-metrics-certs\") pod \"controller-69bbfbf88f-s7ndd\" (UID: \"2a03b7d3-8e22-4a62-98f0-8d72500fab69\") " pod="metallb-system/controller-69bbfbf88f-s7ndd" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.017277 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f065adc1-f6c1-4895-a933-906a708555c1-frr-sockets\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.017313 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/f065adc1-f6c1-4895-a933-906a708555c1-metrics\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.017492 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/f065adc1-f6c1-4895-a933-906a708555c1-reloader\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.017541 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh9qq\" (UniqueName: \"kubernetes.io/projected/fe469d05-edeb-4d23-b06b-6bdbfc646e99-kube-api-access-vh9qq\") pod \"frr-k8s-webhook-server-78b44bf5bb-dljs6\" (UID: \"fe469d05-edeb-4d23-b06b-6bdbfc646e99\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.017772 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/f065adc1-f6c1-4895-a933-906a708555c1-metrics\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.017930 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/f065adc1-f6c1-4895-a933-906a708555c1-frr-conf\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.017990 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/f065adc1-f6c1-4895-a933-906a708555c1-frr-sockets\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.019868 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4d145cb8-0c5c-40f7-a99c-15f1575629c3-metallb-excludel2\") pod \"speaker-gjp5f\" (UID: \"4d145cb8-0c5c-40f7-a99c-15f1575629c3\") " pod="metallb-system/speaker-gjp5f" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.020018 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f065adc1-f6c1-4895-a933-906a708555c1-metrics-certs\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.020070 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f065adc1-f6c1-4895-a933-906a708555c1-frr-startup\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:48 
crc kubenswrapper[5094]: I0220 07:02:48.020389 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxmcp\" (UniqueName: \"kubernetes.io/projected/f065adc1-f6c1-4895-a933-906a708555c1-kube-api-access-rxmcp\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.020601 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a03b7d3-8e22-4a62-98f0-8d72500fab69-cert\") pod \"controller-69bbfbf88f-s7ndd\" (UID: \"2a03b7d3-8e22-4a62-98f0-8d72500fab69\") " pod="metallb-system/controller-69bbfbf88f-s7ndd" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.020631 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d145cb8-0c5c-40f7-a99c-15f1575629c3-metrics-certs\") pod \"speaker-gjp5f\" (UID: \"4d145cb8-0c5c-40f7-a99c-15f1575629c3\") " pod="metallb-system/speaker-gjp5f" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.020657 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmxh8\" (UniqueName: \"kubernetes.io/projected/2a03b7d3-8e22-4a62-98f0-8d72500fab69-kube-api-access-tmxh8\") pod \"controller-69bbfbf88f-s7ndd\" (UID: \"2a03b7d3-8e22-4a62-98f0-8d72500fab69\") " pod="metallb-system/controller-69bbfbf88f-s7ndd" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.020686 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe469d05-edeb-4d23-b06b-6bdbfc646e99-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-dljs6\" (UID: \"fe469d05-edeb-4d23-b06b-6bdbfc646e99\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 
07:02:48.021173 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/f065adc1-f6c1-4895-a933-906a708555c1-frr-startup\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.029549 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f065adc1-f6c1-4895-a933-906a708555c1-metrics-certs\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.035286 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxmcp\" (UniqueName: \"kubernetes.io/projected/f065adc1-f6c1-4895-a933-906a708555c1-kube-api-access-rxmcp\") pod \"frr-k8s-4p57m\" (UID: \"f065adc1-f6c1-4895-a933-906a708555c1\") " pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.035980 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh9qq\" (UniqueName: \"kubernetes.io/projected/fe469d05-edeb-4d23-b06b-6bdbfc646e99-kube-api-access-vh9qq\") pod \"frr-k8s-webhook-server-78b44bf5bb-dljs6\" (UID: \"fe469d05-edeb-4d23-b06b-6bdbfc646e99\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.055830 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fe469d05-edeb-4d23-b06b-6bdbfc646e99-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-dljs6\" (UID: \"fe469d05-edeb-4d23-b06b-6bdbfc646e99\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.121913 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/2a03b7d3-8e22-4a62-98f0-8d72500fab69-cert\") pod \"controller-69bbfbf88f-s7ndd\" (UID: \"2a03b7d3-8e22-4a62-98f0-8d72500fab69\") " pod="metallb-system/controller-69bbfbf88f-s7ndd" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.121961 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d145cb8-0c5c-40f7-a99c-15f1575629c3-metrics-certs\") pod \"speaker-gjp5f\" (UID: \"4d145cb8-0c5c-40f7-a99c-15f1575629c3\") " pod="metallb-system/speaker-gjp5f" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.121987 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmxh8\" (UniqueName: \"kubernetes.io/projected/2a03b7d3-8e22-4a62-98f0-8d72500fab69-kube-api-access-tmxh8\") pod \"controller-69bbfbf88f-s7ndd\" (UID: \"2a03b7d3-8e22-4a62-98f0-8d72500fab69\") " pod="metallb-system/controller-69bbfbf88f-s7ndd" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.122022 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t86x\" (UniqueName: \"kubernetes.io/projected/4d145cb8-0c5c-40f7-a99c-15f1575629c3-kube-api-access-8t86x\") pod \"speaker-gjp5f\" (UID: \"4d145cb8-0c5c-40f7-a99c-15f1575629c3\") " pod="metallb-system/speaker-gjp5f" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.122047 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4d145cb8-0c5c-40f7-a99c-15f1575629c3-memberlist\") pod \"speaker-gjp5f\" (UID: \"4d145cb8-0c5c-40f7-a99c-15f1575629c3\") " pod="metallb-system/speaker-gjp5f" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.122067 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2a03b7d3-8e22-4a62-98f0-8d72500fab69-metrics-certs\") pod \"controller-69bbfbf88f-s7ndd\" (UID: 
\"2a03b7d3-8e22-4a62-98f0-8d72500fab69\") " pod="metallb-system/controller-69bbfbf88f-s7ndd" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.122091 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4d145cb8-0c5c-40f7-a99c-15f1575629c3-metallb-excludel2\") pod \"speaker-gjp5f\" (UID: \"4d145cb8-0c5c-40f7-a99c-15f1575629c3\") " pod="metallb-system/speaker-gjp5f" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.122863 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4d145cb8-0c5c-40f7-a99c-15f1575629c3-metallb-excludel2\") pod \"speaker-gjp5f\" (UID: \"4d145cb8-0c5c-40f7-a99c-15f1575629c3\") " pod="metallb-system/speaker-gjp5f" Feb 20 07:02:48 crc kubenswrapper[5094]: E0220 07:02:48.122974 5094 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 20 07:02:48 crc kubenswrapper[5094]: E0220 07:02:48.123029 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d145cb8-0c5c-40f7-a99c-15f1575629c3-memberlist podName:4d145cb8-0c5c-40f7-a99c-15f1575629c3 nodeName:}" failed. No retries permitted until 2026-02-20 07:02:48.623010817 +0000 UTC m=+983.495637528 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/4d145cb8-0c5c-40f7-a99c-15f1575629c3-memberlist") pod "speaker-gjp5f" (UID: "4d145cb8-0c5c-40f7-a99c-15f1575629c3") : secret "metallb-memberlist" not found Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.124869 5094 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.127555 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d145cb8-0c5c-40f7-a99c-15f1575629c3-metrics-certs\") pod \"speaker-gjp5f\" (UID: \"4d145cb8-0c5c-40f7-a99c-15f1575629c3\") " pod="metallb-system/speaker-gjp5f" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.127770 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2a03b7d3-8e22-4a62-98f0-8d72500fab69-metrics-certs\") pod \"controller-69bbfbf88f-s7ndd\" (UID: \"2a03b7d3-8e22-4a62-98f0-8d72500fab69\") " pod="metallb-system/controller-69bbfbf88f-s7ndd" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.136385 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a03b7d3-8e22-4a62-98f0-8d72500fab69-cert\") pod \"controller-69bbfbf88f-s7ndd\" (UID: \"2a03b7d3-8e22-4a62-98f0-8d72500fab69\") " pod="metallb-system/controller-69bbfbf88f-s7ndd" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.140468 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmxh8\" (UniqueName: \"kubernetes.io/projected/2a03b7d3-8e22-4a62-98f0-8d72500fab69-kube-api-access-tmxh8\") pod \"controller-69bbfbf88f-s7ndd\" (UID: \"2a03b7d3-8e22-4a62-98f0-8d72500fab69\") " pod="metallb-system/controller-69bbfbf88f-s7ndd" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.141723 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t86x\" (UniqueName: \"kubernetes.io/projected/4d145cb8-0c5c-40f7-a99c-15f1575629c3-kube-api-access-8t86x\") pod \"speaker-gjp5f\" (UID: \"4d145cb8-0c5c-40f7-a99c-15f1575629c3\") " pod="metallb-system/speaker-gjp5f" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.149230 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.164496 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.280253 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-69bbfbf88f-s7ndd" Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.493397 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-s7ndd"] Feb 20 07:02:48 crc kubenswrapper[5094]: W0220 07:02:48.500209 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a03b7d3_8e22_4a62_98f0_8d72500fab69.slice/crio-194f3b628ec5b75608cf438ca9a8aaf9c27c5fb0b5f65a21faeb197c62701c5d WatchSource:0}: Error finding container 194f3b628ec5b75608cf438ca9a8aaf9c27c5fb0b5f65a21faeb197c62701c5d: Status 404 returned error can't find the container with id 194f3b628ec5b75608cf438ca9a8aaf9c27c5fb0b5f65a21faeb197c62701c5d Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.543903 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4p57m" event={"ID":"f065adc1-f6c1-4895-a933-906a708555c1","Type":"ContainerStarted","Data":"db851ee29f42c86621bfd89bef428bf083277405675601d774f432ee4fd656f8"} Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.545454 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/controller-69bbfbf88f-s7ndd" event={"ID":"2a03b7d3-8e22-4a62-98f0-8d72500fab69","Type":"ContainerStarted","Data":"194f3b628ec5b75608cf438ca9a8aaf9c27c5fb0b5f65a21faeb197c62701c5d"} Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.592834 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6"] Feb 20 07:02:48 crc kubenswrapper[5094]: W0220 07:02:48.594528 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe469d05_edeb_4d23_b06b_6bdbfc646e99.slice/crio-f495c902158f8adc1974f3c51c7357ebae7fcce6111f9164310daa3b57b99ad3 WatchSource:0}: Error finding container f495c902158f8adc1974f3c51c7357ebae7fcce6111f9164310daa3b57b99ad3: Status 404 returned error can't find the container with id f495c902158f8adc1974f3c51c7357ebae7fcce6111f9164310daa3b57b99ad3 Feb 20 07:02:48 crc kubenswrapper[5094]: I0220 07:02:48.641038 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4d145cb8-0c5c-40f7-a99c-15f1575629c3-memberlist\") pod \"speaker-gjp5f\" (UID: \"4d145cb8-0c5c-40f7-a99c-15f1575629c3\") " pod="metallb-system/speaker-gjp5f" Feb 20 07:02:48 crc kubenswrapper[5094]: E0220 07:02:48.641255 5094 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 20 07:02:48 crc kubenswrapper[5094]: E0220 07:02:48.641317 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d145cb8-0c5c-40f7-a99c-15f1575629c3-memberlist podName:4d145cb8-0c5c-40f7-a99c-15f1575629c3 nodeName:}" failed. No retries permitted until 2026-02-20 07:02:49.641300649 +0000 UTC m=+984.513927360 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/4d145cb8-0c5c-40f7-a99c-15f1575629c3-memberlist") pod "speaker-gjp5f" (UID: "4d145cb8-0c5c-40f7-a99c-15f1575629c3") : secret "metallb-memberlist" not found Feb 20 07:02:49 crc kubenswrapper[5094]: I0220 07:02:49.555335 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-s7ndd" event={"ID":"2a03b7d3-8e22-4a62-98f0-8d72500fab69","Type":"ContainerStarted","Data":"82d0bf0b027a6c2af72dd4a0802d131e09d95fe1c345e059a39a39206cb92bd8"} Feb 20 07:02:49 crc kubenswrapper[5094]: I0220 07:02:49.555921 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-s7ndd" event={"ID":"2a03b7d3-8e22-4a62-98f0-8d72500fab69","Type":"ContainerStarted","Data":"01b95510a80fea6eeec2b14943604b8e6613411df3e75aa19ad70a120cc86791"} Feb 20 07:02:49 crc kubenswrapper[5094]: I0220 07:02:49.555942 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-s7ndd" Feb 20 07:02:49 crc kubenswrapper[5094]: I0220 07:02:49.558193 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6" event={"ID":"fe469d05-edeb-4d23-b06b-6bdbfc646e99","Type":"ContainerStarted","Data":"f495c902158f8adc1974f3c51c7357ebae7fcce6111f9164310daa3b57b99ad3"} Feb 20 07:02:49 crc kubenswrapper[5094]: I0220 07:02:49.585151 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-s7ndd" podStartSLOduration=2.585127843 podStartE2EDuration="2.585127843s" podCreationTimestamp="2026-02-20 07:02:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:02:49.578160186 +0000 UTC m=+984.450786897" watchObservedRunningTime="2026-02-20 07:02:49.585127843 +0000 UTC m=+984.457754554" Feb 20 07:02:49 crc 
kubenswrapper[5094]: I0220 07:02:49.668869 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4d145cb8-0c5c-40f7-a99c-15f1575629c3-memberlist\") pod \"speaker-gjp5f\" (UID: \"4d145cb8-0c5c-40f7-a99c-15f1575629c3\") " pod="metallb-system/speaker-gjp5f" Feb 20 07:02:49 crc kubenswrapper[5094]: I0220 07:02:49.675556 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4d145cb8-0c5c-40f7-a99c-15f1575629c3-memberlist\") pod \"speaker-gjp5f\" (UID: \"4d145cb8-0c5c-40f7-a99c-15f1575629c3\") " pod="metallb-system/speaker-gjp5f" Feb 20 07:02:49 crc kubenswrapper[5094]: I0220 07:02:49.755203 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-gjp5f" Feb 20 07:02:49 crc kubenswrapper[5094]: W0220 07:02:49.782193 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d145cb8_0c5c_40f7_a99c_15f1575629c3.slice/crio-4a1ae9dff887422ddf24fabb2f8cd51e0df3b31c0eeb4c7797c825b69c59a972 WatchSource:0}: Error finding container 4a1ae9dff887422ddf24fabb2f8cd51e0df3b31c0eeb4c7797c825b69c59a972: Status 404 returned error can't find the container with id 4a1ae9dff887422ddf24fabb2f8cd51e0df3b31c0eeb4c7797c825b69c59a972 Feb 20 07:02:50 crc kubenswrapper[5094]: I0220 07:02:50.568679 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gjp5f" event={"ID":"4d145cb8-0c5c-40f7-a99c-15f1575629c3","Type":"ContainerStarted","Data":"488664b5bc5adbeb38221a500978db23d753c6a8ee17485fbf0e6f5a3dd75897"} Feb 20 07:02:50 crc kubenswrapper[5094]: I0220 07:02:50.569209 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gjp5f" 
event={"ID":"4d145cb8-0c5c-40f7-a99c-15f1575629c3","Type":"ContainerStarted","Data":"8be93dc71a92facf56e5abb04779c569f66346b846440a9174bad148ecbc58f6"} Feb 20 07:02:50 crc kubenswrapper[5094]: I0220 07:02:50.569227 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gjp5f" event={"ID":"4d145cb8-0c5c-40f7-a99c-15f1575629c3","Type":"ContainerStarted","Data":"4a1ae9dff887422ddf24fabb2f8cd51e0df3b31c0eeb4c7797c825b69c59a972"} Feb 20 07:02:50 crc kubenswrapper[5094]: I0220 07:02:50.569458 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-gjp5f" Feb 20 07:02:50 crc kubenswrapper[5094]: I0220 07:02:50.594893 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-gjp5f" podStartSLOduration=3.594872206 podStartE2EDuration="3.594872206s" podCreationTimestamp="2026-02-20 07:02:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:02:50.58582415 +0000 UTC m=+985.458450871" watchObservedRunningTime="2026-02-20 07:02:50.594872206 +0000 UTC m=+985.467498937" Feb 20 07:02:55 crc kubenswrapper[5094]: I0220 07:02:55.621682 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6" event={"ID":"fe469d05-edeb-4d23-b06b-6bdbfc646e99","Type":"ContainerStarted","Data":"a4f4e3d94829bcb20bb68586c26fe960c2872a13289025271ed153f7f35abced"} Feb 20 07:02:55 crc kubenswrapper[5094]: I0220 07:02:55.622822 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6" Feb 20 07:02:55 crc kubenswrapper[5094]: I0220 07:02:55.627249 5094 generic.go:334] "Generic (PLEG): container finished" podID="f065adc1-f6c1-4895-a933-906a708555c1" containerID="a3af44e0f4e02e8a8325d470de9b0b4bc5dbd143660fd690507fd27f9ba720c9" exitCode=0 Feb 20 07:02:55 crc 
kubenswrapper[5094]: I0220 07:02:55.627293 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4p57m" event={"ID":"f065adc1-f6c1-4895-a933-906a708555c1","Type":"ContainerDied","Data":"a3af44e0f4e02e8a8325d470de9b0b4bc5dbd143660fd690507fd27f9ba720c9"} Feb 20 07:02:55 crc kubenswrapper[5094]: I0220 07:02:55.644821 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6" podStartSLOduration=1.939134543 podStartE2EDuration="8.644803398s" podCreationTimestamp="2026-02-20 07:02:47 +0000 UTC" firstStartedPulling="2026-02-20 07:02:48.598405782 +0000 UTC m=+983.471032493" lastFinishedPulling="2026-02-20 07:02:55.304074627 +0000 UTC m=+990.176701348" observedRunningTime="2026-02-20 07:02:55.640012313 +0000 UTC m=+990.512639034" watchObservedRunningTime="2026-02-20 07:02:55.644803398 +0000 UTC m=+990.517430109" Feb 20 07:02:56 crc kubenswrapper[5094]: I0220 07:02:56.651817 5094 generic.go:334] "Generic (PLEG): container finished" podID="f065adc1-f6c1-4895-a933-906a708555c1" containerID="3bbff53e5707494cd376f154bb19901ef4f1364b1823ba367ced8c70ab66dde0" exitCode=0 Feb 20 07:02:56 crc kubenswrapper[5094]: I0220 07:02:56.652377 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4p57m" event={"ID":"f065adc1-f6c1-4895-a933-906a708555c1","Type":"ContainerDied","Data":"3bbff53e5707494cd376f154bb19901ef4f1364b1823ba367ced8c70ab66dde0"} Feb 20 07:02:57 crc kubenswrapper[5094]: I0220 07:02:57.663855 5094 generic.go:334] "Generic (PLEG): container finished" podID="f065adc1-f6c1-4895-a933-906a708555c1" containerID="21c349f70a9e4f427492dfd3250103d18c4b74f84647fb9189af408beec72719" exitCode=0 Feb 20 07:02:57 crc kubenswrapper[5094]: I0220 07:02:57.663945 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4p57m" 
event={"ID":"f065adc1-f6c1-4895-a933-906a708555c1","Type":"ContainerDied","Data":"21c349f70a9e4f427492dfd3250103d18c4b74f84647fb9189af408beec72719"} Feb 20 07:02:58 crc kubenswrapper[5094]: I0220 07:02:58.285405 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-s7ndd" Feb 20 07:02:58 crc kubenswrapper[5094]: I0220 07:02:58.677247 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4p57m" event={"ID":"f065adc1-f6c1-4895-a933-906a708555c1","Type":"ContainerStarted","Data":"3582500ece9162a401385f04947e726da321096541453bb35d380a32493b15ab"} Feb 20 07:02:58 crc kubenswrapper[5094]: I0220 07:02:58.677307 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4p57m" event={"ID":"f065adc1-f6c1-4895-a933-906a708555c1","Type":"ContainerStarted","Data":"c7ef767e01941b707e320790173c642449e6d1247c24d7d543261cdb6b3774bc"} Feb 20 07:02:58 crc kubenswrapper[5094]: I0220 07:02:58.677318 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4p57m" event={"ID":"f065adc1-f6c1-4895-a933-906a708555c1","Type":"ContainerStarted","Data":"479971f48bcdfa12fa91407bc456683e1f99728d0366fb8fd1a1e825a16c85fe"} Feb 20 07:02:58 crc kubenswrapper[5094]: I0220 07:02:58.677328 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4p57m" event={"ID":"f065adc1-f6c1-4895-a933-906a708555c1","Type":"ContainerStarted","Data":"b2ec4dc1a1aacaf00209de2de8a58cefb5293c65b6ccf1ebab4f97134c587fc2"} Feb 20 07:02:58 crc kubenswrapper[5094]: I0220 07:02:58.677337 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-4p57m" event={"ID":"f065adc1-f6c1-4895-a933-906a708555c1","Type":"ContainerStarted","Data":"af1c47ff236eab0ec8b623533b0f5c7ae98507e0c3d152394d037fec855ad13f"} Feb 20 07:02:59 crc kubenswrapper[5094]: I0220 07:02:59.693689 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-4p57m" event={"ID":"f065adc1-f6c1-4895-a933-906a708555c1","Type":"ContainerStarted","Data":"cd6b82792d0d03f7b56ed72a30cb17fc94fc0279488e1b7feec76953d082b3d2"} Feb 20 07:02:59 crc kubenswrapper[5094]: I0220 07:02:59.693982 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-4p57m" Feb 20 07:02:59 crc kubenswrapper[5094]: I0220 07:02:59.743576 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-4p57m" podStartSLOduration=5.777964179 podStartE2EDuration="12.743538279s" podCreationTimestamp="2026-02-20 07:02:47 +0000 UTC" firstStartedPulling="2026-02-20 07:02:48.307799173 +0000 UTC m=+983.180425904" lastFinishedPulling="2026-02-20 07:02:55.273373293 +0000 UTC m=+990.146000004" observedRunningTime="2026-02-20 07:02:59.737916465 +0000 UTC m=+994.610543166" watchObservedRunningTime="2026-02-20 07:02:59.743538279 +0000 UTC m=+994.616165030" Feb 20 07:03:03 crc kubenswrapper[5094]: I0220 07:03:03.150110 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-4p57m" Feb 20 07:03:03 crc kubenswrapper[5094]: I0220 07:03:03.216021 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-4p57m" Feb 20 07:03:08 crc kubenswrapper[5094]: I0220 07:03:08.155865 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-4p57m" Feb 20 07:03:08 crc kubenswrapper[5094]: I0220 07:03:08.169428 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6" Feb 20 07:03:09 crc kubenswrapper[5094]: I0220 07:03:09.760776 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-gjp5f" Feb 20 07:03:11 crc kubenswrapper[5094]: I0220 07:03:11.315686 5094 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26"] Feb 20 07:03:11 crc kubenswrapper[5094]: I0220 07:03:11.318752 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" Feb 20 07:03:11 crc kubenswrapper[5094]: I0220 07:03:11.320852 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26"] Feb 20 07:03:11 crc kubenswrapper[5094]: I0220 07:03:11.325739 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 20 07:03:11 crc kubenswrapper[5094]: I0220 07:03:11.367939 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67055673-f25d-44d3-99e5-2ac1474b1872-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26\" (UID: \"67055673-f25d-44d3-99e5-2ac1474b1872\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" Feb 20 07:03:11 crc kubenswrapper[5094]: I0220 07:03:11.367995 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67055673-f25d-44d3-99e5-2ac1474b1872-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26\" (UID: \"67055673-f25d-44d3-99e5-2ac1474b1872\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" Feb 20 07:03:11 crc kubenswrapper[5094]: I0220 07:03:11.368024 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgqz2\" (UniqueName: \"kubernetes.io/projected/67055673-f25d-44d3-99e5-2ac1474b1872-kube-api-access-vgqz2\") pod 
\"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26\" (UID: \"67055673-f25d-44d3-99e5-2ac1474b1872\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" Feb 20 07:03:11 crc kubenswrapper[5094]: I0220 07:03:11.469258 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67055673-f25d-44d3-99e5-2ac1474b1872-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26\" (UID: \"67055673-f25d-44d3-99e5-2ac1474b1872\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" Feb 20 07:03:11 crc kubenswrapper[5094]: I0220 07:03:11.469363 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67055673-f25d-44d3-99e5-2ac1474b1872-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26\" (UID: \"67055673-f25d-44d3-99e5-2ac1474b1872\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" Feb 20 07:03:11 crc kubenswrapper[5094]: I0220 07:03:11.469420 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgqz2\" (UniqueName: \"kubernetes.io/projected/67055673-f25d-44d3-99e5-2ac1474b1872-kube-api-access-vgqz2\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26\" (UID: \"67055673-f25d-44d3-99e5-2ac1474b1872\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" Feb 20 07:03:11 crc kubenswrapper[5094]: I0220 07:03:11.470576 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67055673-f25d-44d3-99e5-2ac1474b1872-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26\" (UID: \"67055673-f25d-44d3-99e5-2ac1474b1872\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" Feb 20 07:03:11 crc kubenswrapper[5094]: I0220 07:03:11.470857 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67055673-f25d-44d3-99e5-2ac1474b1872-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26\" (UID: \"67055673-f25d-44d3-99e5-2ac1474b1872\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" Feb 20 07:03:11 crc kubenswrapper[5094]: I0220 07:03:11.494631 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgqz2\" (UniqueName: \"kubernetes.io/projected/67055673-f25d-44d3-99e5-2ac1474b1872-kube-api-access-vgqz2\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26\" (UID: \"67055673-f25d-44d3-99e5-2ac1474b1872\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" Feb 20 07:03:11 crc kubenswrapper[5094]: I0220 07:03:11.647855 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" Feb 20 07:03:11 crc kubenswrapper[5094]: I0220 07:03:11.875746 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26"] Feb 20 07:03:11 crc kubenswrapper[5094]: W0220 07:03:11.888032 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67055673_f25d_44d3_99e5_2ac1474b1872.slice/crio-a06fb2e63973050f3ef6d3689ccd00add9e5f2c7009337beef4233767ff022a2 WatchSource:0}: Error finding container a06fb2e63973050f3ef6d3689ccd00add9e5f2c7009337beef4233767ff022a2: Status 404 returned error can't find the container with id a06fb2e63973050f3ef6d3689ccd00add9e5f2c7009337beef4233767ff022a2 Feb 20 07:03:12 crc kubenswrapper[5094]: I0220 07:03:12.817601 5094 generic.go:334] "Generic (PLEG): container finished" podID="67055673-f25d-44d3-99e5-2ac1474b1872" containerID="1da52198b68563bd53f89d4575f494e4f72f2f9554fa289e7566682c09e4ef00" exitCode=0 Feb 20 07:03:12 crc kubenswrapper[5094]: I0220 07:03:12.817760 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" event={"ID":"67055673-f25d-44d3-99e5-2ac1474b1872","Type":"ContainerDied","Data":"1da52198b68563bd53f89d4575f494e4f72f2f9554fa289e7566682c09e4ef00"} Feb 20 07:03:12 crc kubenswrapper[5094]: I0220 07:03:12.819036 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" event={"ID":"67055673-f25d-44d3-99e5-2ac1474b1872","Type":"ContainerStarted","Data":"a06fb2e63973050f3ef6d3689ccd00add9e5f2c7009337beef4233767ff022a2"} Feb 20 07:03:16 crc kubenswrapper[5094]: I0220 07:03:16.854849 5094 generic.go:334] "Generic (PLEG): container finished" 
podID="67055673-f25d-44d3-99e5-2ac1474b1872" containerID="46af8a535f540b4e725b7d665b3d691bcbfe11d0f6da44e3eeb4d68f82a2f9af" exitCode=0 Feb 20 07:03:16 crc kubenswrapper[5094]: I0220 07:03:16.854913 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" event={"ID":"67055673-f25d-44d3-99e5-2ac1474b1872","Type":"ContainerDied","Data":"46af8a535f540b4e725b7d665b3d691bcbfe11d0f6da44e3eeb4d68f82a2f9af"} Feb 20 07:03:17 crc kubenswrapper[5094]: I0220 07:03:17.873296 5094 generic.go:334] "Generic (PLEG): container finished" podID="67055673-f25d-44d3-99e5-2ac1474b1872" containerID="cdefbd01dfaf8a33479b0145b8ed11e8938f4b0935863e19a04a08a68cef9f59" exitCode=0 Feb 20 07:03:17 crc kubenswrapper[5094]: I0220 07:03:17.873377 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" event={"ID":"67055673-f25d-44d3-99e5-2ac1474b1872","Type":"ContainerDied","Data":"cdefbd01dfaf8a33479b0145b8ed11e8938f4b0935863e19a04a08a68cef9f59"} Feb 20 07:03:19 crc kubenswrapper[5094]: I0220 07:03:19.328764 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" Feb 20 07:03:19 crc kubenswrapper[5094]: I0220 07:03:19.436138 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgqz2\" (UniqueName: \"kubernetes.io/projected/67055673-f25d-44d3-99e5-2ac1474b1872-kube-api-access-vgqz2\") pod \"67055673-f25d-44d3-99e5-2ac1474b1872\" (UID: \"67055673-f25d-44d3-99e5-2ac1474b1872\") " Feb 20 07:03:19 crc kubenswrapper[5094]: I0220 07:03:19.436354 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67055673-f25d-44d3-99e5-2ac1474b1872-util\") pod \"67055673-f25d-44d3-99e5-2ac1474b1872\" (UID: \"67055673-f25d-44d3-99e5-2ac1474b1872\") " Feb 20 07:03:19 crc kubenswrapper[5094]: I0220 07:03:19.436439 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67055673-f25d-44d3-99e5-2ac1474b1872-bundle\") pod \"67055673-f25d-44d3-99e5-2ac1474b1872\" (UID: \"67055673-f25d-44d3-99e5-2ac1474b1872\") " Feb 20 07:03:19 crc kubenswrapper[5094]: I0220 07:03:19.438677 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67055673-f25d-44d3-99e5-2ac1474b1872-bundle" (OuterVolumeSpecName: "bundle") pod "67055673-f25d-44d3-99e5-2ac1474b1872" (UID: "67055673-f25d-44d3-99e5-2ac1474b1872"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:03:19 crc kubenswrapper[5094]: I0220 07:03:19.447329 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67055673-f25d-44d3-99e5-2ac1474b1872-kube-api-access-vgqz2" (OuterVolumeSpecName: "kube-api-access-vgqz2") pod "67055673-f25d-44d3-99e5-2ac1474b1872" (UID: "67055673-f25d-44d3-99e5-2ac1474b1872"). InnerVolumeSpecName "kube-api-access-vgqz2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:03:19 crc kubenswrapper[5094]: I0220 07:03:19.459261 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67055673-f25d-44d3-99e5-2ac1474b1872-util" (OuterVolumeSpecName: "util") pod "67055673-f25d-44d3-99e5-2ac1474b1872" (UID: "67055673-f25d-44d3-99e5-2ac1474b1872"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:03:19 crc kubenswrapper[5094]: I0220 07:03:19.538804 5094 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67055673-f25d-44d3-99e5-2ac1474b1872-util\") on node \"crc\" DevicePath \"\"" Feb 20 07:03:19 crc kubenswrapper[5094]: I0220 07:03:19.538869 5094 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67055673-f25d-44d3-99e5-2ac1474b1872-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:03:19 crc kubenswrapper[5094]: I0220 07:03:19.538891 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgqz2\" (UniqueName: \"kubernetes.io/projected/67055673-f25d-44d3-99e5-2ac1474b1872-kube-api-access-vgqz2\") on node \"crc\" DevicePath \"\"" Feb 20 07:03:19 crc kubenswrapper[5094]: I0220 07:03:19.900396 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" event={"ID":"67055673-f25d-44d3-99e5-2ac1474b1872","Type":"ContainerDied","Data":"a06fb2e63973050f3ef6d3689ccd00add9e5f2c7009337beef4233767ff022a2"} Feb 20 07:03:19 crc kubenswrapper[5094]: I0220 07:03:19.900880 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a06fb2e63973050f3ef6d3689ccd00add9e5f2c7009337beef4233767ff022a2" Feb 20 07:03:19 crc kubenswrapper[5094]: I0220 07:03:19.900466 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.406595 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fc864"] Feb 20 07:03:24 crc kubenswrapper[5094]: E0220 07:03:24.407624 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67055673-f25d-44d3-99e5-2ac1474b1872" containerName="pull" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.407642 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="67055673-f25d-44d3-99e5-2ac1474b1872" containerName="pull" Feb 20 07:03:24 crc kubenswrapper[5094]: E0220 07:03:24.407656 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67055673-f25d-44d3-99e5-2ac1474b1872" containerName="util" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.407662 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="67055673-f25d-44d3-99e5-2ac1474b1872" containerName="util" Feb 20 07:03:24 crc kubenswrapper[5094]: E0220 07:03:24.407675 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67055673-f25d-44d3-99e5-2ac1474b1872" containerName="extract" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.407681 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="67055673-f25d-44d3-99e5-2ac1474b1872" containerName="extract" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.407831 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="67055673-f25d-44d3-99e5-2ac1474b1872" containerName="extract" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.408367 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fc864" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.419693 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8736e3bf-949d-48fb-a246-83adc37708df-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-fc864\" (UID: \"8736e3bf-949d-48fb-a246-83adc37708df\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fc864" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.419691 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.419964 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b446m\" (UniqueName: \"kubernetes.io/projected/8736e3bf-949d-48fb-a246-83adc37708df-kube-api-access-b446m\") pod \"cert-manager-operator-controller-manager-66c8bdd694-fc864\" (UID: \"8736e3bf-949d-48fb-a246-83adc37708df\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fc864" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.420407 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.422194 5094 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-gj5vk" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.443290 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fc864"] Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.521925 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/8736e3bf-949d-48fb-a246-83adc37708df-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-fc864\" (UID: \"8736e3bf-949d-48fb-a246-83adc37708df\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fc864" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.522027 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b446m\" (UniqueName: \"kubernetes.io/projected/8736e3bf-949d-48fb-a246-83adc37708df-kube-api-access-b446m\") pod \"cert-manager-operator-controller-manager-66c8bdd694-fc864\" (UID: \"8736e3bf-949d-48fb-a246-83adc37708df\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fc864" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.522909 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8736e3bf-949d-48fb-a246-83adc37708df-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-fc864\" (UID: \"8736e3bf-949d-48fb-a246-83adc37708df\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fc864" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.556880 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b446m\" (UniqueName: \"kubernetes.io/projected/8736e3bf-949d-48fb-a246-83adc37708df-kube-api-access-b446m\") pod \"cert-manager-operator-controller-manager-66c8bdd694-fc864\" (UID: \"8736e3bf-949d-48fb-a246-83adc37708df\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fc864" Feb 20 07:03:24 crc kubenswrapper[5094]: I0220 07:03:24.734153 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fc864" Feb 20 07:03:25 crc kubenswrapper[5094]: I0220 07:03:25.019721 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fc864"] Feb 20 07:03:25 crc kubenswrapper[5094]: I0220 07:03:25.965019 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fc864" event={"ID":"8736e3bf-949d-48fb-a246-83adc37708df","Type":"ContainerStarted","Data":"90fd8fcb765ed206b923255dde2caf620239eb46c6dd821a503c263e73384e7f"} Feb 20 07:03:29 crc kubenswrapper[5094]: I0220 07:03:29.000527 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fc864" event={"ID":"8736e3bf-949d-48fb-a246-83adc37708df","Type":"ContainerStarted","Data":"276d433b83ee5155b9dc708d6dc7de577bd267260a14381bcca1d992a65a461c"} Feb 20 07:03:29 crc kubenswrapper[5094]: I0220 07:03:29.035037 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fc864" podStartSLOduration=1.626340323 podStartE2EDuration="5.035007907s" podCreationTimestamp="2026-02-20 07:03:24 +0000 UTC" firstStartedPulling="2026-02-20 07:03:25.039113489 +0000 UTC m=+1019.911740200" lastFinishedPulling="2026-02-20 07:03:28.447781063 +0000 UTC m=+1023.320407784" observedRunningTime="2026-02-20 07:03:29.027165229 +0000 UTC m=+1023.899791950" watchObservedRunningTime="2026-02-20 07:03:29.035007907 +0000 UTC m=+1023.907634658" Feb 20 07:03:33 crc kubenswrapper[5094]: I0220 07:03:33.650158 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-sxrw7"] Feb 20 07:03:33 crc kubenswrapper[5094]: I0220 07:03:33.651883 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-sxrw7" Feb 20 07:03:33 crc kubenswrapper[5094]: I0220 07:03:33.654340 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 20 07:03:33 crc kubenswrapper[5094]: I0220 07:03:33.654765 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 20 07:03:33 crc kubenswrapper[5094]: I0220 07:03:33.655125 5094 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-pp4px" Feb 20 07:03:33 crc kubenswrapper[5094]: I0220 07:03:33.666958 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-sxrw7"] Feb 20 07:03:33 crc kubenswrapper[5094]: I0220 07:03:33.771662 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-795dd\" (UniqueName: \"kubernetes.io/projected/bc1f2312-eb97-4f63-b37b-975d9dfb5a73-kube-api-access-795dd\") pod \"cert-manager-webhook-6888856db4-sxrw7\" (UID: \"bc1f2312-eb97-4f63-b37b-975d9dfb5a73\") " pod="cert-manager/cert-manager-webhook-6888856db4-sxrw7" Feb 20 07:03:33 crc kubenswrapper[5094]: I0220 07:03:33.771752 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bc1f2312-eb97-4f63-b37b-975d9dfb5a73-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-sxrw7\" (UID: \"bc1f2312-eb97-4f63-b37b-975d9dfb5a73\") " pod="cert-manager/cert-manager-webhook-6888856db4-sxrw7" Feb 20 07:03:33 crc kubenswrapper[5094]: I0220 07:03:33.873229 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-795dd\" (UniqueName: \"kubernetes.io/projected/bc1f2312-eb97-4f63-b37b-975d9dfb5a73-kube-api-access-795dd\") pod \"cert-manager-webhook-6888856db4-sxrw7\" (UID: 
\"bc1f2312-eb97-4f63-b37b-975d9dfb5a73\") " pod="cert-manager/cert-manager-webhook-6888856db4-sxrw7" Feb 20 07:03:33 crc kubenswrapper[5094]: I0220 07:03:33.873326 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bc1f2312-eb97-4f63-b37b-975d9dfb5a73-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-sxrw7\" (UID: \"bc1f2312-eb97-4f63-b37b-975d9dfb5a73\") " pod="cert-manager/cert-manager-webhook-6888856db4-sxrw7" Feb 20 07:03:33 crc kubenswrapper[5094]: I0220 07:03:33.897293 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-795dd\" (UniqueName: \"kubernetes.io/projected/bc1f2312-eb97-4f63-b37b-975d9dfb5a73-kube-api-access-795dd\") pod \"cert-manager-webhook-6888856db4-sxrw7\" (UID: \"bc1f2312-eb97-4f63-b37b-975d9dfb5a73\") " pod="cert-manager/cert-manager-webhook-6888856db4-sxrw7" Feb 20 07:03:33 crc kubenswrapper[5094]: I0220 07:03:33.901834 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bc1f2312-eb97-4f63-b37b-975d9dfb5a73-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-sxrw7\" (UID: \"bc1f2312-eb97-4f63-b37b-975d9dfb5a73\") " pod="cert-manager/cert-manager-webhook-6888856db4-sxrw7" Feb 20 07:03:33 crc kubenswrapper[5094]: I0220 07:03:33.982327 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-sxrw7" Feb 20 07:03:34 crc kubenswrapper[5094]: I0220 07:03:34.494903 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-sxrw7"] Feb 20 07:03:35 crc kubenswrapper[5094]: I0220 07:03:35.045555 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-sxrw7" event={"ID":"bc1f2312-eb97-4f63-b37b-975d9dfb5a73","Type":"ContainerStarted","Data":"a040f4dfe7b135477a3354c3b1349616346ab73cadd25e233d7882bd6138c00f"} Feb 20 07:03:36 crc kubenswrapper[5094]: I0220 07:03:36.449783 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-mtw89"] Feb 20 07:03:36 crc kubenswrapper[5094]: I0220 07:03:36.454478 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-mtw89" Feb 20 07:03:36 crc kubenswrapper[5094]: I0220 07:03:36.457488 5094 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-zjv8f" Feb 20 07:03:36 crc kubenswrapper[5094]: I0220 07:03:36.470500 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-mtw89"] Feb 20 07:03:36 crc kubenswrapper[5094]: I0220 07:03:36.515723 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/34f53f0e-6a22-42c9-a953-3ec38e87a70f-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-mtw89\" (UID: \"34f53f0e-6a22-42c9-a953-3ec38e87a70f\") " pod="cert-manager/cert-manager-cainjector-5545bd876-mtw89" Feb 20 07:03:36 crc kubenswrapper[5094]: I0220 07:03:36.515772 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c847p\" (UniqueName: 
\"kubernetes.io/projected/34f53f0e-6a22-42c9-a953-3ec38e87a70f-kube-api-access-c847p\") pod \"cert-manager-cainjector-5545bd876-mtw89\" (UID: \"34f53f0e-6a22-42c9-a953-3ec38e87a70f\") " pod="cert-manager/cert-manager-cainjector-5545bd876-mtw89" Feb 20 07:03:36 crc kubenswrapper[5094]: I0220 07:03:36.617034 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/34f53f0e-6a22-42c9-a953-3ec38e87a70f-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-mtw89\" (UID: \"34f53f0e-6a22-42c9-a953-3ec38e87a70f\") " pod="cert-manager/cert-manager-cainjector-5545bd876-mtw89" Feb 20 07:03:36 crc kubenswrapper[5094]: I0220 07:03:36.617093 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c847p\" (UniqueName: \"kubernetes.io/projected/34f53f0e-6a22-42c9-a953-3ec38e87a70f-kube-api-access-c847p\") pod \"cert-manager-cainjector-5545bd876-mtw89\" (UID: \"34f53f0e-6a22-42c9-a953-3ec38e87a70f\") " pod="cert-manager/cert-manager-cainjector-5545bd876-mtw89" Feb 20 07:03:36 crc kubenswrapper[5094]: I0220 07:03:36.643651 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/34f53f0e-6a22-42c9-a953-3ec38e87a70f-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-mtw89\" (UID: \"34f53f0e-6a22-42c9-a953-3ec38e87a70f\") " pod="cert-manager/cert-manager-cainjector-5545bd876-mtw89" Feb 20 07:03:36 crc kubenswrapper[5094]: I0220 07:03:36.646555 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c847p\" (UniqueName: \"kubernetes.io/projected/34f53f0e-6a22-42c9-a953-3ec38e87a70f-kube-api-access-c847p\") pod \"cert-manager-cainjector-5545bd876-mtw89\" (UID: \"34f53f0e-6a22-42c9-a953-3ec38e87a70f\") " pod="cert-manager/cert-manager-cainjector-5545bd876-mtw89" Feb 20 07:03:36 crc kubenswrapper[5094]: I0220 07:03:36.780007 5094 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-mtw89" Feb 20 07:03:37 crc kubenswrapper[5094]: I0220 07:03:37.197549 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-mtw89"] Feb 20 07:03:37 crc kubenswrapper[5094]: W0220 07:03:37.210187 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34f53f0e_6a22_42c9_a953_3ec38e87a70f.slice/crio-6499dbac1ebbf4f6972f6acd2f422eeb14cd49350e286826a6d9dd9d215ce031 WatchSource:0}: Error finding container 6499dbac1ebbf4f6972f6acd2f422eeb14cd49350e286826a6d9dd9d215ce031: Status 404 returned error can't find the container with id 6499dbac1ebbf4f6972f6acd2f422eeb14cd49350e286826a6d9dd9d215ce031 Feb 20 07:03:38 crc kubenswrapper[5094]: I0220 07:03:38.070770 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-mtw89" event={"ID":"34f53f0e-6a22-42c9-a953-3ec38e87a70f","Type":"ContainerStarted","Data":"6499dbac1ebbf4f6972f6acd2f422eeb14cd49350e286826a6d9dd9d215ce031"} Feb 20 07:03:40 crc kubenswrapper[5094]: I0220 07:03:40.084579 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-sxrw7" event={"ID":"bc1f2312-eb97-4f63-b37b-975d9dfb5a73","Type":"ContainerStarted","Data":"f4c1835b58e3139b5342090792e852f6fca1eb07f87f163c9d882a331e7766d5"} Feb 20 07:03:40 crc kubenswrapper[5094]: I0220 07:03:40.085141 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-sxrw7" Feb 20 07:03:40 crc kubenswrapper[5094]: I0220 07:03:40.086173 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-mtw89" 
event={"ID":"34f53f0e-6a22-42c9-a953-3ec38e87a70f","Type":"ContainerStarted","Data":"af7e1f46a75d66bd1c0de4843cf28caaa41cfb30822018e359a0a2fecc214a0b"} Feb 20 07:03:40 crc kubenswrapper[5094]: I0220 07:03:40.134349 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-mtw89" podStartSLOduration=1.516528272 podStartE2EDuration="4.134311637s" podCreationTimestamp="2026-02-20 07:03:36 +0000 UTC" firstStartedPulling="2026-02-20 07:03:37.213742781 +0000 UTC m=+1032.086369492" lastFinishedPulling="2026-02-20 07:03:39.831526146 +0000 UTC m=+1034.704152857" observedRunningTime="2026-02-20 07:03:40.123405225 +0000 UTC m=+1034.996031976" watchObservedRunningTime="2026-02-20 07:03:40.134311637 +0000 UTC m=+1035.006938368" Feb 20 07:03:40 crc kubenswrapper[5094]: I0220 07:03:40.137050 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-sxrw7" podStartSLOduration=1.7821294060000001 podStartE2EDuration="7.137018522s" podCreationTimestamp="2026-02-20 07:03:33 +0000 UTC" firstStartedPulling="2026-02-20 07:03:34.506144616 +0000 UTC m=+1029.378771337" lastFinishedPulling="2026-02-20 07:03:39.861033742 +0000 UTC m=+1034.733660453" observedRunningTime="2026-02-20 07:03:40.103691383 +0000 UTC m=+1034.976318094" watchObservedRunningTime="2026-02-20 07:03:40.137018522 +0000 UTC m=+1035.009645243" Feb 20 07:03:43 crc kubenswrapper[5094]: I0220 07:03:43.406638 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-pdnlx"] Feb 20 07:03:43 crc kubenswrapper[5094]: I0220 07:03:43.408944 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-pdnlx" Feb 20 07:03:43 crc kubenswrapper[5094]: I0220 07:03:43.412162 5094 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-99mmx" Feb 20 07:03:43 crc kubenswrapper[5094]: I0220 07:03:43.422862 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-pdnlx"] Feb 20 07:03:43 crc kubenswrapper[5094]: I0220 07:03:43.564844 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk4d5\" (UniqueName: \"kubernetes.io/projected/d6360113-cdd8-48a4-a145-4b54eb5510eb-kube-api-access-gk4d5\") pod \"cert-manager-545d4d4674-pdnlx\" (UID: \"d6360113-cdd8-48a4-a145-4b54eb5510eb\") " pod="cert-manager/cert-manager-545d4d4674-pdnlx" Feb 20 07:03:43 crc kubenswrapper[5094]: I0220 07:03:43.564961 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d6360113-cdd8-48a4-a145-4b54eb5510eb-bound-sa-token\") pod \"cert-manager-545d4d4674-pdnlx\" (UID: \"d6360113-cdd8-48a4-a145-4b54eb5510eb\") " pod="cert-manager/cert-manager-545d4d4674-pdnlx" Feb 20 07:03:43 crc kubenswrapper[5094]: I0220 07:03:43.668268 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk4d5\" (UniqueName: \"kubernetes.io/projected/d6360113-cdd8-48a4-a145-4b54eb5510eb-kube-api-access-gk4d5\") pod \"cert-manager-545d4d4674-pdnlx\" (UID: \"d6360113-cdd8-48a4-a145-4b54eb5510eb\") " pod="cert-manager/cert-manager-545d4d4674-pdnlx" Feb 20 07:03:43 crc kubenswrapper[5094]: I0220 07:03:43.668851 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d6360113-cdd8-48a4-a145-4b54eb5510eb-bound-sa-token\") pod \"cert-manager-545d4d4674-pdnlx\" (UID: 
\"d6360113-cdd8-48a4-a145-4b54eb5510eb\") " pod="cert-manager/cert-manager-545d4d4674-pdnlx" Feb 20 07:03:43 crc kubenswrapper[5094]: I0220 07:03:43.703952 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk4d5\" (UniqueName: \"kubernetes.io/projected/d6360113-cdd8-48a4-a145-4b54eb5510eb-kube-api-access-gk4d5\") pod \"cert-manager-545d4d4674-pdnlx\" (UID: \"d6360113-cdd8-48a4-a145-4b54eb5510eb\") " pod="cert-manager/cert-manager-545d4d4674-pdnlx" Feb 20 07:03:43 crc kubenswrapper[5094]: I0220 07:03:43.706571 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d6360113-cdd8-48a4-a145-4b54eb5510eb-bound-sa-token\") pod \"cert-manager-545d4d4674-pdnlx\" (UID: \"d6360113-cdd8-48a4-a145-4b54eb5510eb\") " pod="cert-manager/cert-manager-545d4d4674-pdnlx" Feb 20 07:03:43 crc kubenswrapper[5094]: I0220 07:03:43.740328 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-pdnlx" Feb 20 07:03:44 crc kubenswrapper[5094]: I0220 07:03:44.323628 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-pdnlx"] Feb 20 07:03:44 crc kubenswrapper[5094]: W0220 07:03:44.343297 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6360113_cdd8_48a4_a145_4b54eb5510eb.slice/crio-ead8a41c806bfe66337ce671ca0b252303a3993e14017bcf3de19807af393c52 WatchSource:0}: Error finding container ead8a41c806bfe66337ce671ca0b252303a3993e14017bcf3de19807af393c52: Status 404 returned error can't find the container with id ead8a41c806bfe66337ce671ca0b252303a3993e14017bcf3de19807af393c52 Feb 20 07:03:45 crc kubenswrapper[5094]: I0220 07:03:45.127831 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-pdnlx" 
event={"ID":"d6360113-cdd8-48a4-a145-4b54eb5510eb","Type":"ContainerStarted","Data":"b79a77daa180badf772abd691f2f5997fd80d06cc147083eb970116ad6b44067"} Feb 20 07:03:45 crc kubenswrapper[5094]: I0220 07:03:45.128163 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-pdnlx" event={"ID":"d6360113-cdd8-48a4-a145-4b54eb5510eb","Type":"ContainerStarted","Data":"ead8a41c806bfe66337ce671ca0b252303a3993e14017bcf3de19807af393c52"} Feb 20 07:03:45 crc kubenswrapper[5094]: I0220 07:03:45.151693 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-pdnlx" podStartSLOduration=2.151668518 podStartE2EDuration="2.151668518s" podCreationTimestamp="2026-02-20 07:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:03:45.150356427 +0000 UTC m=+1040.022983138" watchObservedRunningTime="2026-02-20 07:03:45.151668518 +0000 UTC m=+1040.024295229" Feb 20 07:03:48 crc kubenswrapper[5094]: I0220 07:03:48.986075 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-sxrw7" Feb 20 07:03:52 crc kubenswrapper[5094]: I0220 07:03:52.531286 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-nmt6q"] Feb 20 07:03:52 crc kubenswrapper[5094]: I0220 07:03:52.535780 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-nmt6q" Feb 20 07:03:52 crc kubenswrapper[5094]: I0220 07:03:52.543413 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 20 07:03:52 crc kubenswrapper[5094]: I0220 07:03:52.545009 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-m77f6" Feb 20 07:03:52 crc kubenswrapper[5094]: I0220 07:03:52.545047 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 20 07:03:52 crc kubenswrapper[5094]: I0220 07:03:52.611073 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nmt6q"] Feb 20 07:03:52 crc kubenswrapper[5094]: I0220 07:03:52.638910 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzqsm\" (UniqueName: \"kubernetes.io/projected/84ef4848-2558-4cf8-bac6-2f5ed78a74af-kube-api-access-fzqsm\") pod \"openstack-operator-index-nmt6q\" (UID: \"84ef4848-2558-4cf8-bac6-2f5ed78a74af\") " pod="openstack-operators/openstack-operator-index-nmt6q" Feb 20 07:03:52 crc kubenswrapper[5094]: I0220 07:03:52.740637 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzqsm\" (UniqueName: \"kubernetes.io/projected/84ef4848-2558-4cf8-bac6-2f5ed78a74af-kube-api-access-fzqsm\") pod \"openstack-operator-index-nmt6q\" (UID: \"84ef4848-2558-4cf8-bac6-2f5ed78a74af\") " pod="openstack-operators/openstack-operator-index-nmt6q" Feb 20 07:03:52 crc kubenswrapper[5094]: I0220 07:03:52.764636 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzqsm\" (UniqueName: \"kubernetes.io/projected/84ef4848-2558-4cf8-bac6-2f5ed78a74af-kube-api-access-fzqsm\") pod \"openstack-operator-index-nmt6q\" (UID: 
\"84ef4848-2558-4cf8-bac6-2f5ed78a74af\") " pod="openstack-operators/openstack-operator-index-nmt6q" Feb 20 07:03:52 crc kubenswrapper[5094]: I0220 07:03:52.862431 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nmt6q" Feb 20 07:03:53 crc kubenswrapper[5094]: I0220 07:03:53.366548 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nmt6q"] Feb 20 07:03:53 crc kubenswrapper[5094]: W0220 07:03:53.375573 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84ef4848_2558_4cf8_bac6_2f5ed78a74af.slice/crio-815a90dae64743f7ad59b41441f7594e7eca46d0f3dab56ecdb9ab0d3c224a44 WatchSource:0}: Error finding container 815a90dae64743f7ad59b41441f7594e7eca46d0f3dab56ecdb9ab0d3c224a44: Status 404 returned error can't find the container with id 815a90dae64743f7ad59b41441f7594e7eca46d0f3dab56ecdb9ab0d3c224a44 Feb 20 07:03:54 crc kubenswrapper[5094]: I0220 07:03:54.202053 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nmt6q" event={"ID":"84ef4848-2558-4cf8-bac6-2f5ed78a74af","Type":"ContainerStarted","Data":"815a90dae64743f7ad59b41441f7594e7eca46d0f3dab56ecdb9ab0d3c224a44"} Feb 20 07:03:55 crc kubenswrapper[5094]: I0220 07:03:55.214102 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nmt6q" event={"ID":"84ef4848-2558-4cf8-bac6-2f5ed78a74af","Type":"ContainerStarted","Data":"ad4ce4677efdfcf754c1cbcc9ee5621ccc1cb4030f623f01b62bbf2b0e05c251"} Feb 20 07:03:55 crc kubenswrapper[5094]: I0220 07:03:55.242263 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-nmt6q" podStartSLOduration=2.546658751 podStartE2EDuration="3.242234559s" podCreationTimestamp="2026-02-20 07:03:52 +0000 UTC" 
firstStartedPulling="2026-02-20 07:03:53.379659142 +0000 UTC m=+1048.252285893" lastFinishedPulling="2026-02-20 07:03:54.07523498 +0000 UTC m=+1048.947861701" observedRunningTime="2026-02-20 07:03:55.237880124 +0000 UTC m=+1050.110506875" watchObservedRunningTime="2026-02-20 07:03:55.242234559 +0000 UTC m=+1050.114861280" Feb 20 07:03:55 crc kubenswrapper[5094]: I0220 07:03:55.297945 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-nmt6q"] Feb 20 07:03:55 crc kubenswrapper[5094]: I0220 07:03:55.922012 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-72vrj"] Feb 20 07:03:55 crc kubenswrapper[5094]: I0220 07:03:55.925272 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-72vrj" Feb 20 07:03:55 crc kubenswrapper[5094]: I0220 07:03:55.944350 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-72vrj"] Feb 20 07:03:56 crc kubenswrapper[5094]: I0220 07:03:56.002219 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cpnf\" (UniqueName: \"kubernetes.io/projected/be2cc842-778e-4963-80f8-bb5c7426f175-kube-api-access-4cpnf\") pod \"openstack-operator-index-72vrj\" (UID: \"be2cc842-778e-4963-80f8-bb5c7426f175\") " pod="openstack-operators/openstack-operator-index-72vrj" Feb 20 07:03:56 crc kubenswrapper[5094]: I0220 07:03:56.103579 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cpnf\" (UniqueName: \"kubernetes.io/projected/be2cc842-778e-4963-80f8-bb5c7426f175-kube-api-access-4cpnf\") pod \"openstack-operator-index-72vrj\" (UID: \"be2cc842-778e-4963-80f8-bb5c7426f175\") " pod="openstack-operators/openstack-operator-index-72vrj" Feb 20 07:03:56 crc kubenswrapper[5094]: I0220 07:03:56.138687 5094 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4cpnf\" (UniqueName: \"kubernetes.io/projected/be2cc842-778e-4963-80f8-bb5c7426f175-kube-api-access-4cpnf\") pod \"openstack-operator-index-72vrj\" (UID: \"be2cc842-778e-4963-80f8-bb5c7426f175\") " pod="openstack-operators/openstack-operator-index-72vrj" Feb 20 07:03:56 crc kubenswrapper[5094]: I0220 07:03:56.267244 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-72vrj" Feb 20 07:03:56 crc kubenswrapper[5094]: I0220 07:03:56.848490 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-72vrj"] Feb 20 07:03:57 crc kubenswrapper[5094]: I0220 07:03:57.228842 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-72vrj" event={"ID":"be2cc842-778e-4963-80f8-bb5c7426f175","Type":"ContainerStarted","Data":"f5c4c71d04ff2231e43246d1822cb9a245ea195f3ded88f709d65978fdc2022b"} Feb 20 07:03:57 crc kubenswrapper[5094]: I0220 07:03:57.228966 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-nmt6q" podUID="84ef4848-2558-4cf8-bac6-2f5ed78a74af" containerName="registry-server" containerID="cri-o://ad4ce4677efdfcf754c1cbcc9ee5621ccc1cb4030f623f01b62bbf2b0e05c251" gracePeriod=2 Feb 20 07:03:57 crc kubenswrapper[5094]: I0220 07:03:57.653823 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-nmt6q" Feb 20 07:03:57 crc kubenswrapper[5094]: I0220 07:03:57.739616 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzqsm\" (UniqueName: \"kubernetes.io/projected/84ef4848-2558-4cf8-bac6-2f5ed78a74af-kube-api-access-fzqsm\") pod \"84ef4848-2558-4cf8-bac6-2f5ed78a74af\" (UID: \"84ef4848-2558-4cf8-bac6-2f5ed78a74af\") " Feb 20 07:03:57 crc kubenswrapper[5094]: I0220 07:03:57.749564 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84ef4848-2558-4cf8-bac6-2f5ed78a74af-kube-api-access-fzqsm" (OuterVolumeSpecName: "kube-api-access-fzqsm") pod "84ef4848-2558-4cf8-bac6-2f5ed78a74af" (UID: "84ef4848-2558-4cf8-bac6-2f5ed78a74af"). InnerVolumeSpecName "kube-api-access-fzqsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:03:57 crc kubenswrapper[5094]: I0220 07:03:57.841514 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzqsm\" (UniqueName: \"kubernetes.io/projected/84ef4848-2558-4cf8-bac6-2f5ed78a74af-kube-api-access-fzqsm\") on node \"crc\" DevicePath \"\"" Feb 20 07:03:58 crc kubenswrapper[5094]: I0220 07:03:58.245853 5094 generic.go:334] "Generic (PLEG): container finished" podID="84ef4848-2558-4cf8-bac6-2f5ed78a74af" containerID="ad4ce4677efdfcf754c1cbcc9ee5621ccc1cb4030f623f01b62bbf2b0e05c251" exitCode=0 Feb 20 07:03:58 crc kubenswrapper[5094]: I0220 07:03:58.245941 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-nmt6q" Feb 20 07:03:58 crc kubenswrapper[5094]: I0220 07:03:58.245958 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nmt6q" event={"ID":"84ef4848-2558-4cf8-bac6-2f5ed78a74af","Type":"ContainerDied","Data":"ad4ce4677efdfcf754c1cbcc9ee5621ccc1cb4030f623f01b62bbf2b0e05c251"} Feb 20 07:03:58 crc kubenswrapper[5094]: I0220 07:03:58.246699 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nmt6q" event={"ID":"84ef4848-2558-4cf8-bac6-2f5ed78a74af","Type":"ContainerDied","Data":"815a90dae64743f7ad59b41441f7594e7eca46d0f3dab56ecdb9ab0d3c224a44"} Feb 20 07:03:58 crc kubenswrapper[5094]: I0220 07:03:58.246762 5094 scope.go:117] "RemoveContainer" containerID="ad4ce4677efdfcf754c1cbcc9ee5621ccc1cb4030f623f01b62bbf2b0e05c251" Feb 20 07:03:58 crc kubenswrapper[5094]: I0220 07:03:58.248632 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-72vrj" event={"ID":"be2cc842-778e-4963-80f8-bb5c7426f175","Type":"ContainerStarted","Data":"3bfa36b9236a3b7f7fea4cd454e8e0192b64db37a8cd11d33836efaef3df6216"} Feb 20 07:03:58 crc kubenswrapper[5094]: I0220 07:03:58.274856 5094 scope.go:117] "RemoveContainer" containerID="ad4ce4677efdfcf754c1cbcc9ee5621ccc1cb4030f623f01b62bbf2b0e05c251" Feb 20 07:03:58 crc kubenswrapper[5094]: E0220 07:03:58.275528 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad4ce4677efdfcf754c1cbcc9ee5621ccc1cb4030f623f01b62bbf2b0e05c251\": container with ID starting with ad4ce4677efdfcf754c1cbcc9ee5621ccc1cb4030f623f01b62bbf2b0e05c251 not found: ID does not exist" containerID="ad4ce4677efdfcf754c1cbcc9ee5621ccc1cb4030f623f01b62bbf2b0e05c251" Feb 20 07:03:58 crc kubenswrapper[5094]: I0220 07:03:58.275576 5094 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"ad4ce4677efdfcf754c1cbcc9ee5621ccc1cb4030f623f01b62bbf2b0e05c251"} err="failed to get container status \"ad4ce4677efdfcf754c1cbcc9ee5621ccc1cb4030f623f01b62bbf2b0e05c251\": rpc error: code = NotFound desc = could not find container \"ad4ce4677efdfcf754c1cbcc9ee5621ccc1cb4030f623f01b62bbf2b0e05c251\": container with ID starting with ad4ce4677efdfcf754c1cbcc9ee5621ccc1cb4030f623f01b62bbf2b0e05c251 not found: ID does not exist" Feb 20 07:03:58 crc kubenswrapper[5094]: I0220 07:03:58.275522 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-72vrj" podStartSLOduration=2.861411185 podStartE2EDuration="3.275497483s" podCreationTimestamp="2026-02-20 07:03:55 +0000 UTC" firstStartedPulling="2026-02-20 07:03:56.866028307 +0000 UTC m=+1051.738655038" lastFinishedPulling="2026-02-20 07:03:57.280114585 +0000 UTC m=+1052.152741336" observedRunningTime="2026-02-20 07:03:58.271090267 +0000 UTC m=+1053.143716978" watchObservedRunningTime="2026-02-20 07:03:58.275497483 +0000 UTC m=+1053.148124194" Feb 20 07:03:58 crc kubenswrapper[5094]: I0220 07:03:58.290096 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-nmt6q"] Feb 20 07:03:58 crc kubenswrapper[5094]: I0220 07:03:58.295831 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-nmt6q"] Feb 20 07:03:59 crc kubenswrapper[5094]: I0220 07:03:59.866898 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84ef4848-2558-4cf8-bac6-2f5ed78a74af" path="/var/lib/kubelet/pods/84ef4848-2558-4cf8-bac6-2f5ed78a74af/volumes" Feb 20 07:04:04 crc kubenswrapper[5094]: I0220 07:04:04.108321 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:04:04 crc kubenswrapper[5094]: I0220 07:04:04.109132 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:04:06 crc kubenswrapper[5094]: I0220 07:04:06.268246 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-72vrj" Feb 20 07:04:06 crc kubenswrapper[5094]: I0220 07:04:06.268761 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-72vrj" Feb 20 07:04:06 crc kubenswrapper[5094]: I0220 07:04:06.324898 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-72vrj" Feb 20 07:04:07 crc kubenswrapper[5094]: I0220 07:04:07.376779 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-72vrj" Feb 20 07:04:08 crc kubenswrapper[5094]: I0220 07:04:08.182406 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9"] Feb 20 07:04:08 crc kubenswrapper[5094]: E0220 07:04:08.183341 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84ef4848-2558-4cf8-bac6-2f5ed78a74af" containerName="registry-server" Feb 20 07:04:08 crc kubenswrapper[5094]: I0220 07:04:08.183365 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="84ef4848-2558-4cf8-bac6-2f5ed78a74af" containerName="registry-server" Feb 20 07:04:08 crc kubenswrapper[5094]: I0220 07:04:08.183592 5094 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="84ef4848-2558-4cf8-bac6-2f5ed78a74af" containerName="registry-server" Feb 20 07:04:08 crc kubenswrapper[5094]: I0220 07:04:08.185381 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9" Feb 20 07:04:08 crc kubenswrapper[5094]: I0220 07:04:08.188531 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-djg5h" Feb 20 07:04:08 crc kubenswrapper[5094]: I0220 07:04:08.211623 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9"] Feb 20 07:04:08 crc kubenswrapper[5094]: I0220 07:04:08.376251 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-bundle\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9\" (UID: \"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9" Feb 20 07:04:08 crc kubenswrapper[5094]: I0220 07:04:08.376407 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-util\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9\" (UID: \"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9" Feb 20 07:04:08 crc kubenswrapper[5094]: I0220 07:04:08.376554 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9swrr\" (UniqueName: \"kubernetes.io/projected/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-kube-api-access-9swrr\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9\" 
(UID: \"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9" Feb 20 07:04:08 crc kubenswrapper[5094]: I0220 07:04:08.479028 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-bundle\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9\" (UID: \"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9" Feb 20 07:04:08 crc kubenswrapper[5094]: I0220 07:04:08.479147 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-util\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9\" (UID: \"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9" Feb 20 07:04:08 crc kubenswrapper[5094]: I0220 07:04:08.479297 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9swrr\" (UniqueName: \"kubernetes.io/projected/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-kube-api-access-9swrr\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9\" (UID: \"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9" Feb 20 07:04:08 crc kubenswrapper[5094]: I0220 07:04:08.480193 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-util\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9\" (UID: \"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9" Feb 20 07:04:08 crc 
kubenswrapper[5094]: I0220 07:04:08.480334 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-bundle\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9\" (UID: \"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9" Feb 20 07:04:08 crc kubenswrapper[5094]: I0220 07:04:08.527105 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9swrr\" (UniqueName: \"kubernetes.io/projected/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-kube-api-access-9swrr\") pod \"8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9\" (UID: \"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8\") " pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9" Feb 20 07:04:08 crc kubenswrapper[5094]: I0220 07:04:08.545988 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9" Feb 20 07:04:09 crc kubenswrapper[5094]: I0220 07:04:09.113094 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9"] Feb 20 07:04:09 crc kubenswrapper[5094]: I0220 07:04:09.353840 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9" event={"ID":"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8","Type":"ContainerStarted","Data":"1dbf16311d7a0ac4186a4a3270744782a0fa09c450c7a74b1f6b9e7716c6c9fa"} Feb 20 07:04:09 crc kubenswrapper[5094]: I0220 07:04:09.354276 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9" event={"ID":"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8","Type":"ContainerStarted","Data":"5bb6e1a7942b18183fdd9a00f5713cbb7a86224eb665594b5faadb12cd12d191"} Feb 20 07:04:10 crc kubenswrapper[5094]: I0220 07:04:10.368369 5094 generic.go:334] "Generic (PLEG): container finished" podID="9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8" containerID="1dbf16311d7a0ac4186a4a3270744782a0fa09c450c7a74b1f6b9e7716c6c9fa" exitCode=0 Feb 20 07:04:10 crc kubenswrapper[5094]: I0220 07:04:10.368514 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9" event={"ID":"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8","Type":"ContainerDied","Data":"1dbf16311d7a0ac4186a4a3270744782a0fa09c450c7a74b1f6b9e7716c6c9fa"} Feb 20 07:04:11 crc kubenswrapper[5094]: I0220 07:04:11.380765 5094 generic.go:334] "Generic (PLEG): container finished" podID="9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8" containerID="f70005987e48bddefbb3bf68ca3f16bff1c9e7d0a808b97397a6f4f05c1592a2" exitCode=0 Feb 20 07:04:11 crc kubenswrapper[5094]: I0220 07:04:11.380895 5094 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9" event={"ID":"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8","Type":"ContainerDied","Data":"f70005987e48bddefbb3bf68ca3f16bff1c9e7d0a808b97397a6f4f05c1592a2"} Feb 20 07:04:12 crc kubenswrapper[5094]: I0220 07:04:12.397199 5094 generic.go:334] "Generic (PLEG): container finished" podID="9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8" containerID="6daec5c360406e0c342eeb7a99717114607f5233e49dd5dcd337f4f5cf56d753" exitCode=0 Feb 20 07:04:12 crc kubenswrapper[5094]: I0220 07:04:12.397279 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9" event={"ID":"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8","Type":"ContainerDied","Data":"6daec5c360406e0c342eeb7a99717114607f5233e49dd5dcd337f4f5cf56d753"} Feb 20 07:04:13 crc kubenswrapper[5094]: I0220 07:04:13.807988 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9" Feb 20 07:04:14 crc kubenswrapper[5094]: I0220 07:04:14.002987 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-bundle\") pod \"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8\" (UID: \"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8\") " Feb 20 07:04:14 crc kubenswrapper[5094]: I0220 07:04:14.003128 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-util\") pod \"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8\" (UID: \"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8\") " Feb 20 07:04:14 crc kubenswrapper[5094]: I0220 07:04:14.003249 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9swrr\" (UniqueName: \"kubernetes.io/projected/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-kube-api-access-9swrr\") pod \"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8\" (UID: \"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8\") " Feb 20 07:04:14 crc kubenswrapper[5094]: I0220 07:04:14.004662 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-bundle" (OuterVolumeSpecName: "bundle") pod "9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8" (UID: "9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:04:14 crc kubenswrapper[5094]: I0220 07:04:14.013611 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-kube-api-access-9swrr" (OuterVolumeSpecName: "kube-api-access-9swrr") pod "9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8" (UID: "9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8"). InnerVolumeSpecName "kube-api-access-9swrr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:04:14 crc kubenswrapper[5094]: I0220 07:04:14.034617 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-util" (OuterVolumeSpecName: "util") pod "9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8" (UID: "9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:04:14 crc kubenswrapper[5094]: I0220 07:04:14.105376 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9swrr\" (UniqueName: \"kubernetes.io/projected/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-kube-api-access-9swrr\") on node \"crc\" DevicePath \"\"" Feb 20 07:04:14 crc kubenswrapper[5094]: I0220 07:04:14.105421 5094 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:04:14 crc kubenswrapper[5094]: I0220 07:04:14.105435 5094 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8-util\") on node \"crc\" DevicePath \"\"" Feb 20 07:04:14 crc kubenswrapper[5094]: I0220 07:04:14.417496 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9" Feb 20 07:04:14 crc kubenswrapper[5094]: I0220 07:04:14.417485 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9" event={"ID":"9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8","Type":"ContainerDied","Data":"5bb6e1a7942b18183fdd9a00f5713cbb7a86224eb665594b5faadb12cd12d191"} Feb 20 07:04:14 crc kubenswrapper[5094]: I0220 07:04:14.417664 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bb6e1a7942b18183fdd9a00f5713cbb7a86224eb665594b5faadb12cd12d191" Feb 20 07:04:19 crc kubenswrapper[5094]: I0220 07:04:19.632185 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6679bf9b57-9glnw"] Feb 20 07:04:19 crc kubenswrapper[5094]: E0220 07:04:19.632935 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8" containerName="extract" Feb 20 07:04:19 crc kubenswrapper[5094]: I0220 07:04:19.632950 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8" containerName="extract" Feb 20 07:04:19 crc kubenswrapper[5094]: E0220 07:04:19.632963 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8" containerName="util" Feb 20 07:04:19 crc kubenswrapper[5094]: I0220 07:04:19.632970 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8" containerName="util" Feb 20 07:04:19 crc kubenswrapper[5094]: E0220 07:04:19.632987 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8" containerName="pull" Feb 20 07:04:19 crc kubenswrapper[5094]: I0220 07:04:19.632992 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8" 
containerName="pull" Feb 20 07:04:19 crc kubenswrapper[5094]: I0220 07:04:19.633125 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8" containerName="extract" Feb 20 07:04:19 crc kubenswrapper[5094]: I0220 07:04:19.633624 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-9glnw" Feb 20 07:04:19 crc kubenswrapper[5094]: I0220 07:04:19.637062 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-7wwrf" Feb 20 07:04:19 crc kubenswrapper[5094]: I0220 07:04:19.675914 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6679bf9b57-9glnw"] Feb 20 07:04:19 crc kubenswrapper[5094]: I0220 07:04:19.809937 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t982r\" (UniqueName: \"kubernetes.io/projected/234632e4-6191-4ec8-94c5-c93d71c13ad0-kube-api-access-t982r\") pod \"openstack-operator-controller-init-6679bf9b57-9glnw\" (UID: \"234632e4-6191-4ec8-94c5-c93d71c13ad0\") " pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-9glnw" Feb 20 07:04:19 crc kubenswrapper[5094]: I0220 07:04:19.911516 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t982r\" (UniqueName: \"kubernetes.io/projected/234632e4-6191-4ec8-94c5-c93d71c13ad0-kube-api-access-t982r\") pod \"openstack-operator-controller-init-6679bf9b57-9glnw\" (UID: \"234632e4-6191-4ec8-94c5-c93d71c13ad0\") " pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-9glnw" Feb 20 07:04:19 crc kubenswrapper[5094]: I0220 07:04:19.931780 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t982r\" (UniqueName: 
\"kubernetes.io/projected/234632e4-6191-4ec8-94c5-c93d71c13ad0-kube-api-access-t982r\") pod \"openstack-operator-controller-init-6679bf9b57-9glnw\" (UID: \"234632e4-6191-4ec8-94c5-c93d71c13ad0\") " pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-9glnw" Feb 20 07:04:19 crc kubenswrapper[5094]: I0220 07:04:19.957409 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-9glnw" Feb 20 07:04:20 crc kubenswrapper[5094]: I0220 07:04:20.207764 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6679bf9b57-9glnw"] Feb 20 07:04:20 crc kubenswrapper[5094]: I0220 07:04:20.463882 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-9glnw" event={"ID":"234632e4-6191-4ec8-94c5-c93d71c13ad0","Type":"ContainerStarted","Data":"d5053b001dd9f5b362508d148cc8d4e057cd56f7263bb415d58b44bd56fc15c0"} Feb 20 07:04:26 crc kubenswrapper[5094]: I0220 07:04:26.520477 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-9glnw" event={"ID":"234632e4-6191-4ec8-94c5-c93d71c13ad0","Type":"ContainerStarted","Data":"d5069cfe9313c7bb2e8bde9749c8e36f2b3253e24164df4741ab0743381ff918"} Feb 20 07:04:26 crc kubenswrapper[5094]: I0220 07:04:26.521363 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-9glnw" Feb 20 07:04:26 crc kubenswrapper[5094]: I0220 07:04:26.570217 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-9glnw" podStartSLOduration=2.347067089 podStartE2EDuration="7.570105066s" podCreationTimestamp="2026-02-20 07:04:19 +0000 UTC" firstStartedPulling="2026-02-20 07:04:20.218815898 +0000 UTC 
m=+1075.091442609" lastFinishedPulling="2026-02-20 07:04:25.441853875 +0000 UTC m=+1080.314480586" observedRunningTime="2026-02-20 07:04:26.560185529 +0000 UTC m=+1081.432812240" watchObservedRunningTime="2026-02-20 07:04:26.570105066 +0000 UTC m=+1081.442731857" Feb 20 07:04:34 crc kubenswrapper[5094]: I0220 07:04:34.107494 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:04:34 crc kubenswrapper[5094]: I0220 07:04:34.108383 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:04:39 crc kubenswrapper[5094]: I0220 07:04:39.961487 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6679bf9b57-9glnw" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.148619 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-k5dkn"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.150457 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5dkn" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.154961 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-24cv7"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.155580 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-zj5lz" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.156003 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-24cv7" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.157754 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-5qvsj" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.171521 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-24cv7"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.178618 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-k5dkn"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.189499 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-xjng5"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.190334 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-xjng5" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.192752 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-2nzgh" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.202785 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-26vtn"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.203778 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-26vtn" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.207194 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-2nt96" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.212772 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-xjng5"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.227779 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-26vtn"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.234133 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4v8q\" (UniqueName: \"kubernetes.io/projected/6b09cc76-8cba-42ed-bb2c-fdf4473c9afe-kube-api-access-x4v8q\") pod \"barbican-operator-controller-manager-868647ff47-k5dkn\" (UID: \"6b09cc76-8cba-42ed-bb2c-fdf4473c9afe\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5dkn" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.234214 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw7ck\" 
(UniqueName: \"kubernetes.io/projected/a91a9b82-fc6b-4900-becb-6dc3c100e429-kube-api-access-kw7ck\") pod \"designate-operator-controller-manager-6d8bf5c495-xjng5\" (UID: \"a91a9b82-fc6b-4900-becb-6dc3c100e429\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-xjng5" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.234237 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xhkb\" (UniqueName: \"kubernetes.io/projected/f6c8e20e-ecca-42d4-9e0e-5547ae567d9f-kube-api-access-5xhkb\") pod \"glance-operator-controller-manager-77987464f4-26vtn\" (UID: \"f6c8e20e-ecca-42d4-9e0e-5547ae567d9f\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-26vtn" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.234261 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnh9x\" (UniqueName: \"kubernetes.io/projected/32338b54-c33f-4dc5-b328-9cf4d92d1db6-kube-api-access-hnh9x\") pod \"cinder-operator-controller-manager-5d946d989d-24cv7\" (UID: \"32338b54-c33f-4dc5-b328-9cf4d92d1db6\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-24cv7" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.250755 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-p689m"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.251758 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-p689m" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.253563 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-47cqd" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.264044 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-p689m"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.295320 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-mnd7v"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.296388 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-mnd7v" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.300123 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-9l9lb" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.309944 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.310968 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.316016 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.316785 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-twhdx" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.338857 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw9kw\" (UniqueName: \"kubernetes.io/projected/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-kube-api-access-bw9kw\") pod \"infra-operator-controller-manager-79d975b745-nxtc7\" (UID: \"eb67a9bc-35a6-4ce3-bca8-a08ee824cda7\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.338909 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbsk2\" (UniqueName: \"kubernetes.io/projected/36d60210-52d5-4f28-ae0b-28cce632d5cb-kube-api-access-xbsk2\") pod \"heat-operator-controller-manager-69f49c598c-p689m\" (UID: \"36d60210-52d5-4f28-ae0b-28cce632d5cb\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-p689m" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.338929 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr97x\" (UniqueName: \"kubernetes.io/projected/292cb132-b03c-4d20-8bee-c90ad3c4486b-kube-api-access-rr97x\") pod \"horizon-operator-controller-manager-5b9b8895d5-mnd7v\" (UID: \"292cb132-b03c-4d20-8bee-c90ad3c4486b\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-mnd7v" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.338946 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert\") pod \"infra-operator-controller-manager-79d975b745-nxtc7\" (UID: \"eb67a9bc-35a6-4ce3-bca8-a08ee824cda7\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.338995 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4v8q\" (UniqueName: \"kubernetes.io/projected/6b09cc76-8cba-42ed-bb2c-fdf4473c9afe-kube-api-access-x4v8q\") pod \"barbican-operator-controller-manager-868647ff47-k5dkn\" (UID: \"6b09cc76-8cba-42ed-bb2c-fdf4473c9afe\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5dkn" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.339035 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw7ck\" (UniqueName: \"kubernetes.io/projected/a91a9b82-fc6b-4900-becb-6dc3c100e429-kube-api-access-kw7ck\") pod \"designate-operator-controller-manager-6d8bf5c495-xjng5\" (UID: \"a91a9b82-fc6b-4900-becb-6dc3c100e429\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-xjng5" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.339056 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xhkb\" (UniqueName: \"kubernetes.io/projected/f6c8e20e-ecca-42d4-9e0e-5547ae567d9f-kube-api-access-5xhkb\") pod \"glance-operator-controller-manager-77987464f4-26vtn\" (UID: \"f6c8e20e-ecca-42d4-9e0e-5547ae567d9f\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-26vtn" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.339078 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnh9x\" (UniqueName: 
\"kubernetes.io/projected/32338b54-c33f-4dc5-b328-9cf4d92d1db6-kube-api-access-hnh9x\") pod \"cinder-operator-controller-manager-5d946d989d-24cv7\" (UID: \"32338b54-c33f-4dc5-b328-9cf4d92d1db6\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-24cv7" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.359068 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-mnd7v"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.405490 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.412833 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnh9x\" (UniqueName: \"kubernetes.io/projected/32338b54-c33f-4dc5-b328-9cf4d92d1db6-kube-api-access-hnh9x\") pod \"cinder-operator-controller-manager-5d946d989d-24cv7\" (UID: \"32338b54-c33f-4dc5-b328-9cf4d92d1db6\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-24cv7" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.417025 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xhkb\" (UniqueName: \"kubernetes.io/projected/f6c8e20e-ecca-42d4-9e0e-5547ae567d9f-kube-api-access-5xhkb\") pod \"glance-operator-controller-manager-77987464f4-26vtn\" (UID: \"f6c8e20e-ecca-42d4-9e0e-5547ae567d9f\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-26vtn" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.422118 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-8j8pv"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.423912 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-8j8pv" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.428294 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-lghsf" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.429418 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw7ck\" (UniqueName: \"kubernetes.io/projected/a91a9b82-fc6b-4900-becb-6dc3c100e429-kube-api-access-kw7ck\") pod \"designate-operator-controller-manager-6d8bf5c495-xjng5\" (UID: \"a91a9b82-fc6b-4900-becb-6dc3c100e429\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-xjng5" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.455164 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4v8q\" (UniqueName: \"kubernetes.io/projected/6b09cc76-8cba-42ed-bb2c-fdf4473c9afe-kube-api-access-x4v8q\") pod \"barbican-operator-controller-manager-868647ff47-k5dkn\" (UID: \"6b09cc76-8cba-42ed-bb2c-fdf4473c9afe\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5dkn" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.456850 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25cw9\" (UniqueName: \"kubernetes.io/projected/fcf15128-56ef-42dc-b230-1cd8b7638d33-kube-api-access-25cw9\") pod \"ironic-operator-controller-manager-554564d7fc-8j8pv\" (UID: \"fcf15128-56ef-42dc-b230-1cd8b7638d33\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-8j8pv" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.456927 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-86bxl"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.457183 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bw9kw\" (UniqueName: \"kubernetes.io/projected/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-kube-api-access-bw9kw\") pod \"infra-operator-controller-manager-79d975b745-nxtc7\" (UID: \"eb67a9bc-35a6-4ce3-bca8-a08ee824cda7\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.457225 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbsk2\" (UniqueName: \"kubernetes.io/projected/36d60210-52d5-4f28-ae0b-28cce632d5cb-kube-api-access-xbsk2\") pod \"heat-operator-controller-manager-69f49c598c-p689m\" (UID: \"36d60210-52d5-4f28-ae0b-28cce632d5cb\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-p689m" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.457491 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert\") pod \"infra-operator-controller-manager-79d975b745-nxtc7\" (UID: \"eb67a9bc-35a6-4ce3-bca8-a08ee824cda7\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.457539 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr97x\" (UniqueName: \"kubernetes.io/projected/292cb132-b03c-4d20-8bee-c90ad3c4486b-kube-api-access-rr97x\") pod \"horizon-operator-controller-manager-5b9b8895d5-mnd7v\" (UID: \"292cb132-b03c-4d20-8bee-c90ad3c4486b\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-mnd7v" Feb 20 07:05:00 crc kubenswrapper[5094]: E0220 07:05:00.457780 5094 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 20 07:05:00 crc kubenswrapper[5094]: E0220 07:05:00.457856 5094 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert podName:eb67a9bc-35a6-4ce3-bca8-a08ee824cda7 nodeName:}" failed. No retries permitted until 2026-02-20 07:05:00.95783095 +0000 UTC m=+1115.830457661 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert") pod "infra-operator-controller-manager-79d975b745-nxtc7" (UID: "eb67a9bc-35a6-4ce3-bca8-a08ee824cda7") : secret "infra-operator-webhook-server-cert" not found Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.474666 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-86bxl" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.475379 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5dkn" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.477832 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-mc85t" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.488274 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-24cv7" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.508382 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-8j8pv"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.514612 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr97x\" (UniqueName: \"kubernetes.io/projected/292cb132-b03c-4d20-8bee-c90ad3c4486b-kube-api-access-rr97x\") pod \"horizon-operator-controller-manager-5b9b8895d5-mnd7v\" (UID: \"292cb132-b03c-4d20-8bee-c90ad3c4486b\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-mnd7v" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.515215 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbsk2\" (UniqueName: \"kubernetes.io/projected/36d60210-52d5-4f28-ae0b-28cce632d5cb-kube-api-access-xbsk2\") pod \"heat-operator-controller-manager-69f49c598c-p689m\" (UID: \"36d60210-52d5-4f28-ae0b-28cce632d5cb\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-p689m" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.521769 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-xjng5" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.528564 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-26vtn" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.529413 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-86bxl"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.535507 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw9kw\" (UniqueName: \"kubernetes.io/projected/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-kube-api-access-bw9kw\") pod \"infra-operator-controller-manager-79d975b745-nxtc7\" (UID: \"eb67a9bc-35a6-4ce3-bca8-a08ee824cda7\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.573837 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgzj5\" (UniqueName: \"kubernetes.io/projected/8a1c02cd-3546-45fa-b7db-5903c80681a4-kube-api-access-tgzj5\") pod \"keystone-operator-controller-manager-b4d948c87-86bxl\" (UID: \"8a1c02cd-3546-45fa-b7db-5903c80681a4\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-86bxl" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.574057 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-p689m" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.580115 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25cw9\" (UniqueName: \"kubernetes.io/projected/fcf15128-56ef-42dc-b230-1cd8b7638d33-kube-api-access-25cw9\") pod \"ironic-operator-controller-manager-554564d7fc-8j8pv\" (UID: \"fcf15128-56ef-42dc-b230-1cd8b7638d33\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-8j8pv" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.589192 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-hfkff"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.590336 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-hfkff" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.615613 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-swrx9" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.616062 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-mnd7v" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.632230 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25cw9\" (UniqueName: \"kubernetes.io/projected/fcf15128-56ef-42dc-b230-1cd8b7638d33-kube-api-access-25cw9\") pod \"ironic-operator-controller-manager-554564d7fc-8j8pv\" (UID: \"fcf15128-56ef-42dc-b230-1cd8b7638d33\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-8j8pv" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.632307 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-n5dgn"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.633271 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-n5dgn" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.641740 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-wmcg8" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.681315 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cc8w\" (UniqueName: \"kubernetes.io/projected/b863d4f9-063a-4102-8c3d-f7e092e4e2c0-kube-api-access-7cc8w\") pod \"manila-operator-controller-manager-54f6768c69-hfkff\" (UID: \"b863d4f9-063a-4102-8c3d-f7e092e4e2c0\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-hfkff" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.681391 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgzj5\" (UniqueName: \"kubernetes.io/projected/8a1c02cd-3546-45fa-b7db-5903c80681a4-kube-api-access-tgzj5\") pod \"keystone-operator-controller-manager-b4d948c87-86bxl\" (UID: 
\"8a1c02cd-3546-45fa-b7db-5903c80681a4\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-86bxl" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.681428 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsjg7\" (UniqueName: \"kubernetes.io/projected/6177108e-bc02-497c-80ab-312f61fbd1c2-kube-api-access-hsjg7\") pod \"mariadb-operator-controller-manager-6994f66f48-n5dgn\" (UID: \"6177108e-bc02-497c-80ab-312f61fbd1c2\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-n5dgn" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.701807 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-hfkff"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.707227 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgzj5\" (UniqueName: \"kubernetes.io/projected/8a1c02cd-3546-45fa-b7db-5903c80681a4-kube-api-access-tgzj5\") pod \"keystone-operator-controller-manager-b4d948c87-86bxl\" (UID: \"8a1c02cd-3546-45fa-b7db-5903c80681a4\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-86bxl" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.743994 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-n5dgn"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.785820 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsjg7\" (UniqueName: \"kubernetes.io/projected/6177108e-bc02-497c-80ab-312f61fbd1c2-kube-api-access-hsjg7\") pod \"mariadb-operator-controller-manager-6994f66f48-n5dgn\" (UID: \"6177108e-bc02-497c-80ab-312f61fbd1c2\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-n5dgn" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.785928 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cc8w\" (UniqueName: \"kubernetes.io/projected/b863d4f9-063a-4102-8c3d-f7e092e4e2c0-kube-api-access-7cc8w\") pod \"manila-operator-controller-manager-54f6768c69-hfkff\" (UID: \"b863d4f9-063a-4102-8c3d-f7e092e4e2c0\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-hfkff" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.786776 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bd9tr"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.787849 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-2ftdz"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.788494 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-2ftdz" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.789165 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bd9tr" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.824075 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bd9tr"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.824585 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-8zgn5" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.825128 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-7bhrk" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.861770 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-2ftdz"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.886982 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rc9s\" (UniqueName: \"kubernetes.io/projected/93dbc041-00c2-4189-abca-6bb3a00abc2d-kube-api-access-6rc9s\") pod \"nova-operator-controller-manager-567668f5cf-2ftdz\" (UID: \"93dbc041-00c2-4189-abca-6bb3a00abc2d\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-2ftdz" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.887549 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhj9q\" (UniqueName: \"kubernetes.io/projected/eba1b8e0-b529-47ad-a657-75ce01bad56a-kube-api-access-bhj9q\") pod \"neutron-operator-controller-manager-64ddbf8bb-bd9tr\" (UID: \"eba1b8e0-b529-47ad-a657-75ce01bad56a\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bd9tr" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.888656 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hsjg7\" (UniqueName: \"kubernetes.io/projected/6177108e-bc02-497c-80ab-312f61fbd1c2-kube-api-access-hsjg7\") pod \"mariadb-operator-controller-manager-6994f66f48-n5dgn\" (UID: \"6177108e-bc02-497c-80ab-312f61fbd1c2\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-n5dgn" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.893690 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cc8w\" (UniqueName: \"kubernetes.io/projected/b863d4f9-063a-4102-8c3d-f7e092e4e2c0-kube-api-access-7cc8w\") pod \"manila-operator-controller-manager-54f6768c69-hfkff\" (UID: \"b863d4f9-063a-4102-8c3d-f7e092e4e2c0\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-hfkff" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.915412 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-8j8pv" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.919868 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-ngkcq"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.922426 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ngkcq" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.925320 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-psclz" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.936791 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-ngkcq"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.960970 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.962098 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.962152 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-86bxl" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.966055 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-kkvr8" Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.987000 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7"] Feb 20 07:05:00 crc kubenswrapper[5094]: I0220 07:05:00.989643 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-hfkff" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:00.999454 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhj9q\" (UniqueName: \"kubernetes.io/projected/eba1b8e0-b529-47ad-a657-75ce01bad56a-kube-api-access-bhj9q\") pod \"neutron-operator-controller-manager-64ddbf8bb-bd9tr\" (UID: \"eba1b8e0-b529-47ad-a657-75ce01bad56a\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bd9tr" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:00.999537 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rc9s\" (UniqueName: \"kubernetes.io/projected/93dbc041-00c2-4189-abca-6bb3a00abc2d-kube-api-access-6rc9s\") pod \"nova-operator-controller-manager-567668f5cf-2ftdz\" (UID: \"93dbc041-00c2-4189-abca-6bb3a00abc2d\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-2ftdz" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:00.999579 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert\") pod \"infra-operator-controller-manager-79d975b745-nxtc7\" (UID: \"eb67a9bc-35a6-4ce3-bca8-a08ee824cda7\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:00.999730 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk9bv\" (UniqueName: \"kubernetes.io/projected/683351ac-f508-4961-b07a-eaac9c26a4f3-kube-api-access-pk9bv\") pod \"octavia-operator-controller-manager-69f8888797-ngkcq\" (UID: \"683351ac-f508-4961-b07a-eaac9c26a4f3\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ngkcq" Feb 20 07:05:01 crc kubenswrapper[5094]: E0220 07:05:01.000229 5094 secret.go:188] 
Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 20 07:05:01 crc kubenswrapper[5094]: E0220 07:05:01.005834 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert podName:eb67a9bc-35a6-4ce3-bca8-a08ee824cda7 nodeName:}" failed. No retries permitted until 2026-02-20 07:05:02.005805184 +0000 UTC m=+1116.878431895 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert") pod "infra-operator-controller-manager-79d975b745-nxtc7" (UID: "eb67a9bc-35a6-4ce3-bca8-a08ee824cda7") : secret "infra-operator-webhook-server-cert" not found Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.018785 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8"] Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.019866 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-n5dgn" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.019893 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.022416 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.023192 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-j64q2" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.026740 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rc9s\" (UniqueName: \"kubernetes.io/projected/93dbc041-00c2-4189-abca-6bb3a00abc2d-kube-api-access-6rc9s\") pod \"nova-operator-controller-manager-567668f5cf-2ftdz\" (UID: \"93dbc041-00c2-4189-abca-6bb3a00abc2d\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-2ftdz" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.028183 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhj9q\" (UniqueName: \"kubernetes.io/projected/eba1b8e0-b529-47ad-a657-75ce01bad56a-kube-api-access-bhj9q\") pod \"neutron-operator-controller-manager-64ddbf8bb-bd9tr\" (UID: \"eba1b8e0-b529-47ad-a657-75ce01bad56a\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bd9tr" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.037895 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-9c2k5"] Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.045955 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9c2k5" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.052808 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-kkw69" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.063049 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-dh48q"] Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.064121 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dh48q" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.068400 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-kxv5p" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.073246 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-9c2k5"] Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.086234 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8"] Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.091146 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-cbtkd"] Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.092394 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-cbtkd" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.100348 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-f6gzp" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.108358 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk9bv\" (UniqueName: \"kubernetes.io/projected/683351ac-f508-4961-b07a-eaac9c26a4f3-kube-api-access-pk9bv\") pod \"octavia-operator-controller-manager-69f8888797-ngkcq\" (UID: \"683351ac-f508-4961-b07a-eaac9c26a4f3\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ngkcq" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.108421 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8\" (UID: \"57b4cb2c-e7bc-4430-bfb8-3642dab61d84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.108457 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg9gs\" (UniqueName: \"kubernetes.io/projected/74861845-de37-4091-9226-bcb1bbe64b35-kube-api-access-pg9gs\") pod \"ovn-operator-controller-manager-d44cf6b75-ct2h7\" (UID: \"74861845-de37-4091-9226-bcb1bbe64b35\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.108494 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv4q2\" (UniqueName: \"kubernetes.io/projected/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-kube-api-access-fv4q2\") 
pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8\" (UID: \"57b4cb2c-e7bc-4430-bfb8-3642dab61d84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.108526 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rws2w\" (UniqueName: \"kubernetes.io/projected/c510ecc1-53ce-4611-af6a-09488f9317ed-kube-api-access-rws2w\") pod \"placement-operator-controller-manager-8497b45c89-9c2k5\" (UID: \"c510ecc1-53ce-4611-af6a-09488f9317ed\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9c2k5" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.130878 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk9bv\" (UniqueName: \"kubernetes.io/projected/683351ac-f508-4961-b07a-eaac9c26a4f3-kube-api-access-pk9bv\") pod \"octavia-operator-controller-manager-69f8888797-ngkcq\" (UID: \"683351ac-f508-4961-b07a-eaac9c26a4f3\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ngkcq" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.131449 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-dh48q"] Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.139881 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-cbtkd"] Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.199940 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-nll74"] Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.206836 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-nll74" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.213029 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-8hrbp" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.222380 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8\" (UID: \"57b4cb2c-e7bc-4430-bfb8-3642dab61d84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.222569 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg9gs\" (UniqueName: \"kubernetes.io/projected/74861845-de37-4091-9226-bcb1bbe64b35-kube-api-access-pg9gs\") pod \"ovn-operator-controller-manager-d44cf6b75-ct2h7\" (UID: \"74861845-de37-4091-9226-bcb1bbe64b35\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.222749 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv4q2\" (UniqueName: \"kubernetes.io/projected/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-kube-api-access-fv4q2\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8\" (UID: \"57b4cb2c-e7bc-4430-bfb8-3642dab61d84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.222837 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zldt2\" (UniqueName: \"kubernetes.io/projected/a0d3c29b-4f57-4647-b1a5-bfd6c887b0b5-kube-api-access-zldt2\") pod 
\"swift-operator-controller-manager-68f46476f-dh48q\" (UID: \"a0d3c29b-4f57-4647-b1a5-bfd6c887b0b5\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-dh48q" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.222869 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptz6g\" (UniqueName: \"kubernetes.io/projected/c45dcc1f-a95d-4492-9139-16d550809a8e-kube-api-access-ptz6g\") pod \"telemetry-operator-controller-manager-7f45b4ff68-cbtkd\" (UID: \"c45dcc1f-a95d-4492-9139-16d550809a8e\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-cbtkd" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.222903 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rws2w\" (UniqueName: \"kubernetes.io/projected/c510ecc1-53ce-4611-af6a-09488f9317ed-kube-api-access-rws2w\") pod \"placement-operator-controller-manager-8497b45c89-9c2k5\" (UID: \"c510ecc1-53ce-4611-af6a-09488f9317ed\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9c2k5" Feb 20 07:05:01 crc kubenswrapper[5094]: E0220 07:05:01.222932 5094 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 07:05:01 crc kubenswrapper[5094]: E0220 07:05:01.223021 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert podName:57b4cb2c-e7bc-4430-bfb8-3642dab61d84 nodeName:}" failed. No retries permitted until 2026-02-20 07:05:01.722994596 +0000 UTC m=+1116.595621307 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" (UID: "57b4cb2c-e7bc-4430-bfb8-3642dab61d84") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.233354 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-nll74"] Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.246001 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv4q2\" (UniqueName: \"kubernetes.io/projected/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-kube-api-access-fv4q2\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8\" (UID: \"57b4cb2c-e7bc-4430-bfb8-3642dab61d84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.246747 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rws2w\" (UniqueName: \"kubernetes.io/projected/c510ecc1-53ce-4611-af6a-09488f9317ed-kube-api-access-rws2w\") pod \"placement-operator-controller-manager-8497b45c89-9c2k5\" (UID: \"c510ecc1-53ce-4611-af6a-09488f9317ed\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9c2k5" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.247826 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg9gs\" (UniqueName: \"kubernetes.io/projected/74861845-de37-4091-9226-bcb1bbe64b35-kube-api-access-pg9gs\") pod \"ovn-operator-controller-manager-d44cf6b75-ct2h7\" (UID: \"74861845-de37-4091-9226-bcb1bbe64b35\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.256492 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-2ftdz" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.258831 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q"] Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.261232 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.265886 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-ms752" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.271864 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q"] Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.279551 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bd9tr" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.301426 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ngkcq" Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.301453 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v"] Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.303162 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v"
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.307220 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert"
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.307503 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-prrqj"
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.307675 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.309779 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v"]
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.321234 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7"
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.321881 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fq9n6"]
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.325458 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ckl9\" (UniqueName: \"kubernetes.io/projected/4413dc36-58b0-447a-ba69-cdd2cee9589c-kube-api-access-2ckl9\") pod \"watcher-operator-controller-manager-5db88f68c-lz57q\" (UID: \"4413dc36-58b0-447a-ba69-cdd2cee9589c\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q"
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.325549 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zldt2\" (UniqueName:
\"kubernetes.io/projected/a0d3c29b-4f57-4647-b1a5-bfd6c887b0b5-kube-api-access-zldt2\") pod \"swift-operator-controller-manager-68f46476f-dh48q\" (UID: \"a0d3c29b-4f57-4647-b1a5-bfd6c887b0b5\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-dh48q"
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.325579 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49rgc\" (UniqueName: \"kubernetes.io/projected/fce6c9b3-2075-479d-9a16-738831a871c4-kube-api-access-49rgc\") pod \"test-operator-controller-manager-7866795846-nll74\" (UID: \"fce6c9b3-2075-479d-9a16-738831a871c4\") " pod="openstack-operators/test-operator-controller-manager-7866795846-nll74"
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.325603 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptz6g\" (UniqueName: \"kubernetes.io/projected/c45dcc1f-a95d-4492-9139-16d550809a8e-kube-api-access-ptz6g\") pod \"telemetry-operator-controller-manager-7f45b4ff68-cbtkd\" (UID: \"c45dcc1f-a95d-4492-9139-16d550809a8e\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-cbtkd"
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.330628 5094 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fq9n6"
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.345037 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fq9n6"]
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.351317 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-7kj87"
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.357470 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zldt2\" (UniqueName: \"kubernetes.io/projected/a0d3c29b-4f57-4647-b1a5-bfd6c887b0b5-kube-api-access-zldt2\") pod \"swift-operator-controller-manager-68f46476f-dh48q\" (UID: \"a0d3c29b-4f57-4647-b1a5-bfd6c887b0b5\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-dh48q"
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.363624 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.369925 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptz6g\" (UniqueName: \"kubernetes.io/projected/c45dcc1f-a95d-4492-9139-16d550809a8e-kube-api-access-ptz6g\") pod \"telemetry-operator-controller-manager-7f45b4ff68-cbtkd\" (UID: \"c45dcc1f-a95d-4492-9139-16d550809a8e\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-cbtkd"
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.382096 5094 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9c2k5"
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.391887 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-k5dkn"]
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.416685 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dh48q"
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.425887 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-cbtkd"
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.428327 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v"
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.428697 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxl72\" (UniqueName: \"kubernetes.io/projected/f45a4211-8890-4e4a-af96-ccffec62160c-kube-api-access-fxl72\") pod \"rabbitmq-cluster-operator-manager-668c99d594-fq9n6\" (UID: \"f45a4211-8890-4e4a-af96-ccffec62160c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fq9n6"
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.428772 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs\") pod
\"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v"
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.428845 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78z96\" (UniqueName: \"kubernetes.io/projected/a1b74404-906b-4466-a3bd-289458ef90ea-kube-api-access-78z96\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v"
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.428944 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ckl9\" (UniqueName: \"kubernetes.io/projected/4413dc36-58b0-447a-ba69-cdd2cee9589c-kube-api-access-2ckl9\") pod \"watcher-operator-controller-manager-5db88f68c-lz57q\" (UID: \"4413dc36-58b0-447a-ba69-cdd2cee9589c\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q"
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.428995 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49rgc\" (UniqueName: \"kubernetes.io/projected/fce6c9b3-2075-479d-9a16-738831a871c4-kube-api-access-49rgc\") pod \"test-operator-controller-manager-7866795846-nll74\" (UID: \"fce6c9b3-2075-479d-9a16-738831a871c4\") " pod="openstack-operators/test-operator-controller-manager-7866795846-nll74"
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.470375 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49rgc\" (UniqueName: \"kubernetes.io/projected/fce6c9b3-2075-479d-9a16-738831a871c4-kube-api-access-49rgc\") pod \"test-operator-controller-manager-7866795846-nll74\" (UID: \"fce6c9b3-2075-479d-9a16-738831a871c4\") "
pod="openstack-operators/test-operator-controller-manager-7866795846-nll74"
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.478925 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ckl9\" (UniqueName: \"kubernetes.io/projected/4413dc36-58b0-447a-ba69-cdd2cee9589c-kube-api-access-2ckl9\") pod \"watcher-operator-controller-manager-5db88f68c-lz57q\" (UID: \"4413dc36-58b0-447a-ba69-cdd2cee9589c\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q"
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.513357 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-24cv7"]
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.530815 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78z96\" (UniqueName: \"kubernetes.io/projected/a1b74404-906b-4466-a3bd-289458ef90ea-kube-api-access-78z96\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v"
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.530934 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v"
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.530974 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxl72\" (UniqueName: \"kubernetes.io/projected/f45a4211-8890-4e4a-af96-ccffec62160c-kube-api-access-fxl72\") pod \"rabbitmq-cluster-operator-manager-668c99d594-fq9n6\" (UID:
\"f45a4211-8890-4e4a-af96-ccffec62160c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fq9n6"
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.531003 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v"
Feb 20 07:05:01 crc kubenswrapper[5094]: E0220 07:05:01.531143 5094 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Feb 20 07:05:01 crc kubenswrapper[5094]: E0220 07:05:01.531212 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs podName:a1b74404-906b-4466-a3bd-289458ef90ea nodeName:}" failed. No retries permitted until 2026-02-20 07:05:02.031194117 +0000 UTC m=+1116.903820828 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-57z9v" (UID: "a1b74404-906b-4466-a3bd-289458ef90ea") : secret "metrics-server-cert" not found
Feb 20 07:05:01 crc kubenswrapper[5094]: E0220 07:05:01.531401 5094 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Feb 20 07:05:01 crc kubenswrapper[5094]: E0220 07:05:01.531469 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs podName:a1b74404-906b-4466-a3bd-289458ef90ea nodeName:}" failed. No retries permitted until 2026-02-20 07:05:03.041246382 +0000 UTC m=+1116.904074934 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-57z9v" (UID: "a1b74404-906b-4466-a3bd-289458ef90ea") : secret "webhook-server-cert" not found
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.546099 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-nll74"
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.550577 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxl72\" (UniqueName: \"kubernetes.io/projected/f45a4211-8890-4e4a-af96-ccffec62160c-kube-api-access-fxl72\") pod \"rabbitmq-cluster-operator-manager-668c99d594-fq9n6\" (UID: \"f45a4211-8890-4e4a-af96-ccffec62160c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fq9n6"
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.552622 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78z96\" (UniqueName: \"kubernetes.io/projected/a1b74404-906b-4466-a3bd-289458ef90ea-kube-api-access-78z96\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v"
Feb 20 07:05:01 crc kubenswrapper[5094]: W0220 07:05:01.594592 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32338b54_c33f_4dc5_b328_9cf4d92d1db6.slice/crio-ac144efc2d6e85620522f37359c351e382952854b83910c1f79ae4cbbdfdecd1 WatchSource:0}: Error finding container ac144efc2d6e85620522f37359c351e382952854b83910c1f79ae4cbbdfdecd1: Status 404 returned error can't find the container with id ac144efc2d6e85620522f37359c351e382952854b83910c1f79ae4cbbdfdecd1
Feb 20 07:05:01 crc kubenswrapper[5094]:
I0220 07:05:01.599595 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q"
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.672753 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fq9n6"
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.674631 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-26vtn"]
Feb 20 07:05:01 crc kubenswrapper[5094]: W0220 07:05:01.689293 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6c8e20e_ecca_42d4_9e0e_5547ae567d9f.slice/crio-e5dd85e8d7a6343df7ca59cedf2d67210e1591793b5c752aed6e28e07c6e852b WatchSource:0}: Error finding container e5dd85e8d7a6343df7ca59cedf2d67210e1591793b5c752aed6e28e07c6e852b: Status 404 returned error can't find the container with id e5dd85e8d7a6343df7ca59cedf2d67210e1591793b5c752aed6e28e07c6e852b
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.694791 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-mnd7v"]
Feb 20 07:05:01 crc kubenswrapper[5094]: W0220 07:05:01.703200 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod292cb132_b03c_4d20_8bee_c90ad3c4486b.slice/crio-a99ad4d19cd3b8825b8454b38f2efbf88ab56b3ed7d4277a43ed743b03428a76 WatchSource:0}: Error finding container a99ad4d19cd3b8825b8454b38f2efbf88ab56b3ed7d4277a43ed743b03428a76: Status 404 returned error can't find the container with id a99ad4d19cd3b8825b8454b38f2efbf88ab56b3ed7d4277a43ed743b03428a76
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.735726 5094 reconciler_common.go:218] "operationExecutor.MountVolume started
for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8\" (UID: \"57b4cb2c-e7bc-4430-bfb8-3642dab61d84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8"
Feb 20 07:05:01 crc kubenswrapper[5094]: E0220 07:05:01.736031 5094 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 20 07:05:01 crc kubenswrapper[5094]: E0220 07:05:01.736098 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert podName:57b4cb2c-e7bc-4430-bfb8-3642dab61d84 nodeName:}" failed. No retries permitted until 2026-02-20 07:05:02.736077194 +0000 UTC m=+1117.608703905 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" (UID: "57b4cb2c-e7bc-4430-bfb8-3642dab61d84") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.852680 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-26vtn" event={"ID":"f6c8e20e-ecca-42d4-9e0e-5547ae567d9f","Type":"ContainerStarted","Data":"e5dd85e8d7a6343df7ca59cedf2d67210e1591793b5c752aed6e28e07c6e852b"}
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.853888 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-mnd7v" event={"ID":"292cb132-b03c-4d20-8bee-c90ad3c4486b","Type":"ContainerStarted","Data":"a99ad4d19cd3b8825b8454b38f2efbf88ab56b3ed7d4277a43ed743b03428a76"}
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.854573 5094
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-24cv7" event={"ID":"32338b54-c33f-4dc5-b328-9cf4d92d1db6","Type":"ContainerStarted","Data":"ac144efc2d6e85620522f37359c351e382952854b83910c1f79ae4cbbdfdecd1"}
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.856217 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5dkn" event={"ID":"6b09cc76-8cba-42ed-bb2c-fdf4473c9afe","Type":"ContainerStarted","Data":"2546da07e9828a80836e443d394ea45756a6da690df77f82e48dffa15b8a38fb"}
Feb 20 07:05:01 crc kubenswrapper[5094]: I0220 07:05:01.989680 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-xjng5"]
Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.013842 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-8j8pv"]
Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.030430 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-86bxl"]
Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.040925 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert\") pod \"infra-operator-controller-manager-79d975b745-nxtc7\" (UID: \"eb67a9bc-35a6-4ce3-bca8-a08ee824cda7\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7"
Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.040984 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID:
\"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v"
Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.041052 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v"
Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.041134 5094 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.041191 5094 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.041236 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert podName:eb67a9bc-35a6-4ce3-bca8-a08ee824cda7 nodeName:}" failed. No retries permitted until 2026-02-20 07:05:04.041209211 +0000 UTC m=+1118.913835922 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert") pod "infra-operator-controller-manager-79d975b745-nxtc7" (UID: "eb67a9bc-35a6-4ce3-bca8-a08ee824cda7") : secret "infra-operator-webhook-server-cert" not found
Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.041276 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs podName:a1b74404-906b-4466-a3bd-289458ef90ea nodeName:}" failed. No retries permitted until 2026-02-20 07:05:03.041246382 +0000 UTC m=+1117.913873093 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-57z9v" (UID: "a1b74404-906b-4466-a3bd-289458ef90ea") : secret "webhook-server-cert" not found
Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.041326 5094 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.041419 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs podName:a1b74404-906b-4466-a3bd-289458ef90ea nodeName:}" failed. No retries permitted until 2026-02-20 07:05:03.041393056 +0000 UTC m=+1117.914019957 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-57z9v" (UID: "a1b74404-906b-4466-a3bd-289458ef90ea") : secret "metrics-server-cert" not found
Feb 20 07:05:02 crc kubenswrapper[5094]: W0220 07:05:02.041967 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcf15128_56ef_42dc_b230_1cd8b7638d33.slice/crio-c27cc197ae3cfb8322a01660c7caa68cb61067b0047361c994e7c67c15520de9 WatchSource:0}: Error finding container c27cc197ae3cfb8322a01660c7caa68cb61067b0047361c994e7c67c15520de9: Status 404 returned error can't find the container with id c27cc197ae3cfb8322a01660c7caa68cb61067b0047361c994e7c67c15520de9
Feb 20 07:05:02 crc kubenswrapper[5094]: W0220 07:05:02.042214 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a1c02cd_3546_45fa_b7db_5903c80681a4.slice/crio-94e198a52bae9685c7e612a6273059e65100e87d1b3365d57754e26fe56218a8
WatchSource:0}: Error finding container 94e198a52bae9685c7e612a6273059e65100e87d1b3365d57754e26fe56218a8: Status 404 returned error can't find the container with id 94e198a52bae9685c7e612a6273059e65100e87d1b3365d57754e26fe56218a8
Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.052175 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-p689m"]
Feb 20 07:05:02 crc kubenswrapper[5094]: W0220 07:05:02.055662 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6177108e_bc02_497c_80ab_312f61fbd1c2.slice/crio-e2937e5f8d3d26a5dbf243abc30abcfe4d7c4d0103f4a75837e8a78d25776177 WatchSource:0}: Error finding container e2937e5f8d3d26a5dbf243abc30abcfe4d7c4d0103f4a75837e8a78d25776177: Status 404 returned error can't find the container with id e2937e5f8d3d26a5dbf243abc30abcfe4d7c4d0103f4a75837e8a78d25776177
Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.072606 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-n5dgn"]
Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.078949 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-ngkcq"]
Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.083248 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bd9tr"]
Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.087065 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-hfkff"]
Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.110086 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-2ftdz"]
Feb 20 07:05:02 crc kubenswrapper[5094]: I0220
07:05:02.219026 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q"]
Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.231428 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7"]
Feb 20 07:05:02 crc kubenswrapper[5094]: W0220 07:05:02.239317 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0d3c29b_4f57_4647_b1a5_bfd6c887b0b5.slice/crio-32f946e4607a39a8e0dfa954eeaa0f471004b2b97208dfe5bb5437461ccef7a4 WatchSource:0}: Error finding container 32f946e4607a39a8e0dfa954eeaa0f471004b2b97208dfe5bb5437461ccef7a4: Status 404 returned error can't find the container with id 32f946e4607a39a8e0dfa954eeaa0f471004b2b97208dfe5bb5437461ccef7a4
Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.240579 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-9c2k5"]
Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.240774 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {}
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pg9gs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-d44cf6b75-ct2h7_openstack-operators(74861845-de37-4091-9226-bcb1bbe64b35): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.241945 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7" podUID="74861845-de37-4091-9226-bcb1bbe64b35"
Feb 20 07:05:02 crc
kubenswrapper[5094]: E0220 07:05:02.242572 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2ckl9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-5db88f68c-lz57q_openstack-operators(4413dc36-58b0-447a-ba69-cdd2cee9589c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.242803 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zldt2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68f46476f-dh48q_openstack-operators(a0d3c29b-4f57-4647-b1a5-bfd6c887b0b5): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.243848 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dh48q" podUID="a0d3c29b-4f57-4647-b1a5-bfd6c887b0b5" Feb 20 07:05:02 crc 
kubenswrapper[5094]: E0220 07:05:02.243888 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q" podUID="4413dc36-58b0-447a-ba69-cdd2cee9589c" Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.246267 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-dh48q"] Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.385596 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-nll74"] Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.393363 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-cbtkd"] Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.398036 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fq9n6"] Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.406123 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-49rgc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7866795846-nll74_openstack-operators(fce6c9b3-2075-479d-9a16-738831a871c4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.407279 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-7866795846-nll74" podUID="fce6c9b3-2075-479d-9a16-738831a871c4" Feb 20 
07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.433597 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fxl72,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in 
pod rabbitmq-cluster-operator-manager-668c99d594-fq9n6_openstack-operators(f45a4211-8890-4e4a-af96-ccffec62160c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.435495 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fq9n6" podUID="f45a4211-8890-4e4a-af96-ccffec62160c" Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.753792 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8\" (UID: \"57b4cb2c-e7bc-4430-bfb8-3642dab61d84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.753999 5094 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.754055 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert podName:57b4cb2c-e7bc-4430-bfb8-3642dab61d84 nodeName:}" failed. No retries permitted until 2026-02-20 07:05:04.754039843 +0000 UTC m=+1119.626666554 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" (UID: "57b4cb2c-e7bc-4430-bfb8-3642dab61d84") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.874164 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-86bxl" event={"ID":"8a1c02cd-3546-45fa-b7db-5903c80681a4","Type":"ContainerStarted","Data":"94e198a52bae9685c7e612a6273059e65100e87d1b3365d57754e26fe56218a8"} Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.876660 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dh48q" event={"ID":"a0d3c29b-4f57-4647-b1a5-bfd6c887b0b5","Type":"ContainerStarted","Data":"32f946e4607a39a8e0dfa954eeaa0f471004b2b97208dfe5bb5437461ccef7a4"} Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.879764 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dh48q" podUID="a0d3c29b-4f57-4647-b1a5-bfd6c887b0b5" Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.883630 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7" event={"ID":"74861845-de37-4091-9226-bcb1bbe64b35","Type":"ContainerStarted","Data":"1226a272d4576b1c0f5b17f0133b715d6616e04e7a6af25043e09c3a5535b3fa"} Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.885132 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7" podUID="74861845-de37-4091-9226-bcb1bbe64b35" Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.888206 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-8j8pv" event={"ID":"fcf15128-56ef-42dc-b230-1cd8b7638d33","Type":"ContainerStarted","Data":"c27cc197ae3cfb8322a01660c7caa68cb61067b0047361c994e7c67c15520de9"} Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.901074 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fq9n6" event={"ID":"f45a4211-8890-4e4a-af96-ccffec62160c","Type":"ContainerStarted","Data":"977319233c272af4c112ca04768045fd464b4196ee3e93e05889715106d9f8d8"} Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.903496 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fq9n6" podUID="f45a4211-8890-4e4a-af96-ccffec62160c" Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.905431 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-n5dgn" event={"ID":"6177108e-bc02-497c-80ab-312f61fbd1c2","Type":"ContainerStarted","Data":"e2937e5f8d3d26a5dbf243abc30abcfe4d7c4d0103f4a75837e8a78d25776177"} Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.912865 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bd9tr" event={"ID":"eba1b8e0-b529-47ad-a657-75ce01bad56a","Type":"ContainerStarted","Data":"f1543b3c2008b61d7a8c5098b0f18e5e652ea6483cc681db043f746c50f6e9e5"} Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.933846 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-p689m" event={"ID":"36d60210-52d5-4f28-ae0b-28cce632d5cb","Type":"ContainerStarted","Data":"ff6c2fb5b9eae6a01844bd0c8a44ddaaf03e5f4cc39c426c041a41154f90d4bd"} Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.935661 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-cbtkd" event={"ID":"c45dcc1f-a95d-4492-9139-16d550809a8e","Type":"ContainerStarted","Data":"dbd2eb3298c5c979ede34dd908ae0c73cf9b7f02ff7d949ed98293cd33f68023"} Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.937208 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q" event={"ID":"4413dc36-58b0-447a-ba69-cdd2cee9589c","Type":"ContainerStarted","Data":"8fcedb8f6c462ae8e7b5be314a0fd227e808595a19ba58a9a54306c677684035"} Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.939982 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q" podUID="4413dc36-58b0-447a-ba69-cdd2cee9589c" Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.940887 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-nll74" 
event={"ID":"fce6c9b3-2075-479d-9a16-738831a871c4","Type":"ContainerStarted","Data":"cd111fcc519bebf20aba9638139d03fe4c850a8a3a4db049045f83d9be5cd50a"} Feb 20 07:05:02 crc kubenswrapper[5094]: E0220 07:05:02.946235 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-nll74" podUID="fce6c9b3-2075-479d-9a16-738831a871c4" Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.947467 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-xjng5" event={"ID":"a91a9b82-fc6b-4900-becb-6dc3c100e429","Type":"ContainerStarted","Data":"945797c64dae2ce83c388ea43f46c0236748a7466110c6458b5981969d8e34ed"} Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.952610 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ngkcq" event={"ID":"683351ac-f508-4961-b07a-eaac9c26a4f3","Type":"ContainerStarted","Data":"b110eae661d3734a215563cf14698a21d98de1f29eae56bda957525836393a10"} Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.954108 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-2ftdz" event={"ID":"93dbc041-00c2-4189-abca-6bb3a00abc2d","Type":"ContainerStarted","Data":"cf42301ed1a82263f83e2b5d3c2a0a708ac6d8e891bc7952422ccfc2e50fc430"} Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.973834 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9c2k5" 
event={"ID":"c510ecc1-53ce-4611-af6a-09488f9317ed","Type":"ContainerStarted","Data":"8f0fbeddce022b4d7e052550a161d07abfb4ca488977490ac278930c29fa26d0"} Feb 20 07:05:02 crc kubenswrapper[5094]: I0220 07:05:02.980773 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-hfkff" event={"ID":"b863d4f9-063a-4102-8c3d-f7e092e4e2c0","Type":"ContainerStarted","Data":"b006f500e4680d825e2f399ca940bd6b93e1075ebea8faf060f34ba1f608f4f6"} Feb 20 07:05:03 crc kubenswrapper[5094]: I0220 07:05:03.058793 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 07:05:03 crc kubenswrapper[5094]: I0220 07:05:03.058889 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 07:05:03 crc kubenswrapper[5094]: E0220 07:05:03.059434 5094 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 20 07:05:03 crc kubenswrapper[5094]: E0220 07:05:03.059506 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs podName:a1b74404-906b-4466-a3bd-289458ef90ea nodeName:}" failed. No retries permitted until 2026-02-20 07:05:05.059483508 +0000 UTC m=+1119.932110219 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-57z9v" (UID: "a1b74404-906b-4466-a3bd-289458ef90ea") : secret "metrics-server-cert" not found Feb 20 07:05:03 crc kubenswrapper[5094]: E0220 07:05:03.060874 5094 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 20 07:05:03 crc kubenswrapper[5094]: E0220 07:05:03.060914 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs podName:a1b74404-906b-4466-a3bd-289458ef90ea nodeName:}" failed. No retries permitted until 2026-02-20 07:05:05.060903142 +0000 UTC m=+1119.933529853 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-57z9v" (UID: "a1b74404-906b-4466-a3bd-289458ef90ea") : secret "webhook-server-cert" not found Feb 20 07:05:04 crc kubenswrapper[5094]: E0220 07:05:04.041250 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-nll74" podUID="fce6c9b3-2075-479d-9a16-738831a871c4" Feb 20 07:05:04 crc kubenswrapper[5094]: E0220 07:05:04.041597 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" 
pod="openstack-operators/swift-operator-controller-manager-68f46476f-dh48q" podUID="a0d3c29b-4f57-4647-b1a5-bfd6c887b0b5" Feb 20 07:05:04 crc kubenswrapper[5094]: E0220 07:05:04.041351 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fq9n6" podUID="f45a4211-8890-4e4a-af96-ccffec62160c" Feb 20 07:05:04 crc kubenswrapper[5094]: E0220 07:05:04.041423 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7" podUID="74861845-de37-4091-9226-bcb1bbe64b35" Feb 20 07:05:04 crc kubenswrapper[5094]: E0220 07:05:04.041538 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q" podUID="4413dc36-58b0-447a-ba69-cdd2cee9589c" Feb 20 07:05:04 crc kubenswrapper[5094]: I0220 07:05:04.082375 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert\") pod \"infra-operator-controller-manager-79d975b745-nxtc7\" (UID: \"eb67a9bc-35a6-4ce3-bca8-a08ee824cda7\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7" Feb 20 07:05:04 crc 
kubenswrapper[5094]: E0220 07:05:04.083864 5094 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 20 07:05:04 crc kubenswrapper[5094]: E0220 07:05:04.083953 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert podName:eb67a9bc-35a6-4ce3-bca8-a08ee824cda7 nodeName:}" failed. No retries permitted until 2026-02-20 07:05:08.083931473 +0000 UTC m=+1122.956558184 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert") pod "infra-operator-controller-manager-79d975b745-nxtc7" (UID: "eb67a9bc-35a6-4ce3-bca8-a08ee824cda7") : secret "infra-operator-webhook-server-cert" not found Feb 20 07:05:04 crc kubenswrapper[5094]: I0220 07:05:04.107534 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:05:04 crc kubenswrapper[5094]: I0220 07:05:04.107624 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:05:04 crc kubenswrapper[5094]: I0220 07:05:04.107688 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 07:05:04 crc kubenswrapper[5094]: I0220 07:05:04.112125 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"2fef18a95cc33ac92b0e7f9a9dd8176f054fb1a2db35c994afe1e4273592eb9d"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 07:05:04 crc kubenswrapper[5094]: I0220 07:05:04.112216 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://2fef18a95cc33ac92b0e7f9a9dd8176f054fb1a2db35c994afe1e4273592eb9d" gracePeriod=600 Feb 20 07:05:04 crc kubenswrapper[5094]: I0220 07:05:04.802537 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8\" (UID: \"57b4cb2c-e7bc-4430-bfb8-3642dab61d84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" Feb 20 07:05:04 crc kubenswrapper[5094]: E0220 07:05:04.802898 5094 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 07:05:04 crc kubenswrapper[5094]: E0220 07:05:04.803723 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert podName:57b4cb2c-e7bc-4430-bfb8-3642dab61d84 nodeName:}" failed. No retries permitted until 2026-02-20 07:05:08.80367465 +0000 UTC m=+1123.676301361 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" (UID: "57b4cb2c-e7bc-4430-bfb8-3642dab61d84") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 07:05:05 crc kubenswrapper[5094]: I0220 07:05:05.070319 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="2fef18a95cc33ac92b0e7f9a9dd8176f054fb1a2db35c994afe1e4273592eb9d" exitCode=0 Feb 20 07:05:05 crc kubenswrapper[5094]: I0220 07:05:05.070383 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"2fef18a95cc33ac92b0e7f9a9dd8176f054fb1a2db35c994afe1e4273592eb9d"} Feb 20 07:05:05 crc kubenswrapper[5094]: I0220 07:05:05.070420 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"b476ab5b54b82460e3be1d827bfff187825879d93c9cc19cc5170f59943c3ef7"} Feb 20 07:05:05 crc kubenswrapper[5094]: I0220 07:05:05.070441 5094 scope.go:117] "RemoveContainer" containerID="8a6efb8b2f6985d19679a1508c25eeeb609faa457ea57f65ac79a9b48c9574e6" Feb 20 07:05:05 crc kubenswrapper[5094]: I0220 07:05:05.123374 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 07:05:05 crc kubenswrapper[5094]: I0220 07:05:05.123681 5094 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 07:05:05 crc kubenswrapper[5094]: E0220 07:05:05.123905 5094 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 20 07:05:05 crc kubenswrapper[5094]: E0220 07:05:05.124061 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs podName:a1b74404-906b-4466-a3bd-289458ef90ea nodeName:}" failed. No retries permitted until 2026-02-20 07:05:09.124036923 +0000 UTC m=+1123.996663624 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-57z9v" (UID: "a1b74404-906b-4466-a3bd-289458ef90ea") : secret "metrics-server-cert" not found Feb 20 07:05:05 crc kubenswrapper[5094]: E0220 07:05:05.124056 5094 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 20 07:05:05 crc kubenswrapper[5094]: E0220 07:05:05.124173 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs podName:a1b74404-906b-4466-a3bd-289458ef90ea nodeName:}" failed. No retries permitted until 2026-02-20 07:05:09.124146995 +0000 UTC m=+1123.996773706 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-57z9v" (UID: "a1b74404-906b-4466-a3bd-289458ef90ea") : secret "webhook-server-cert" not found Feb 20 07:05:08 crc kubenswrapper[5094]: I0220 07:05:08.185338 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert\") pod \"infra-operator-controller-manager-79d975b745-nxtc7\" (UID: \"eb67a9bc-35a6-4ce3-bca8-a08ee824cda7\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7" Feb 20 07:05:08 crc kubenswrapper[5094]: E0220 07:05:08.185800 5094 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 20 07:05:08 crc kubenswrapper[5094]: E0220 07:05:08.185971 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert podName:eb67a9bc-35a6-4ce3-bca8-a08ee824cda7 nodeName:}" failed. No retries permitted until 2026-02-20 07:05:16.185950293 +0000 UTC m=+1131.058577004 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert") pod "infra-operator-controller-manager-79d975b745-nxtc7" (UID: "eb67a9bc-35a6-4ce3-bca8-a08ee824cda7") : secret "infra-operator-webhook-server-cert" not found Feb 20 07:05:08 crc kubenswrapper[5094]: I0220 07:05:08.898383 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8\" (UID: \"57b4cb2c-e7bc-4430-bfb8-3642dab61d84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" Feb 20 07:05:08 crc kubenswrapper[5094]: E0220 07:05:08.899176 5094 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 07:05:08 crc kubenswrapper[5094]: E0220 07:05:08.899283 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert podName:57b4cb2c-e7bc-4430-bfb8-3642dab61d84 nodeName:}" failed. No retries permitted until 2026-02-20 07:05:16.899257986 +0000 UTC m=+1131.771884697 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" (UID: "57b4cb2c-e7bc-4430-bfb8-3642dab61d84") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 07:05:09 crc kubenswrapper[5094]: I0220 07:05:09.208463 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 07:05:09 crc kubenswrapper[5094]: E0220 07:05:09.208722 5094 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 20 07:05:09 crc kubenswrapper[5094]: I0220 07:05:09.208740 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 07:05:09 crc kubenswrapper[5094]: E0220 07:05:09.208835 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs podName:a1b74404-906b-4466-a3bd-289458ef90ea nodeName:}" failed. No retries permitted until 2026-02-20 07:05:17.20880525 +0000 UTC m=+1132.081431981 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-57z9v" (UID: "a1b74404-906b-4466-a3bd-289458ef90ea") : secret "metrics-server-cert" not found Feb 20 07:05:09 crc kubenswrapper[5094]: E0220 07:05:09.208899 5094 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 20 07:05:09 crc kubenswrapper[5094]: E0220 07:05:09.209051 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs podName:a1b74404-906b-4466-a3bd-289458ef90ea nodeName:}" failed. No retries permitted until 2026-02-20 07:05:17.209030105 +0000 UTC m=+1132.081656816 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-57z9v" (UID: "a1b74404-906b-4466-a3bd-289458ef90ea") : secret "webhook-server-cert" not found Feb 20 07:05:15 crc kubenswrapper[5094]: E0220 07:05:15.195440 5094 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:1ab3ec59cd8e30dd8423e91ad832403bdefbae3b8ac47e15578d5a677d7ba0df" Feb 20 07:05:15 crc kubenswrapper[5094]: E0220 07:05:15.196234 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:1ab3ec59cd8e30dd8423e91ad832403bdefbae3b8ac47e15578d5a677d7ba0df,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5xhkb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-77987464f4-26vtn_openstack-operators(f6c8e20e-ecca-42d4-9e0e-5547ae567d9f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 07:05:15 crc kubenswrapper[5094]: E0220 07:05:15.197447 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-77987464f4-26vtn" podUID="f6c8e20e-ecca-42d4-9e0e-5547ae567d9f" Feb 20 07:05:15 crc kubenswrapper[5094]: E0220 07:05:15.891129 5094 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c" Feb 20 07:05:15 crc kubenswrapper[5094]: E0220 07:05:15.891424 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7cc8w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-54f6768c69-hfkff_openstack-operators(b863d4f9-063a-4102-8c3d-f7e092e4e2c0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 07:05:15 crc kubenswrapper[5094]: E0220 07:05:15.892829 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-hfkff" podUID="b863d4f9-063a-4102-8c3d-f7e092e4e2c0" Feb 20 07:05:16 crc kubenswrapper[5094]: E0220 07:05:16.161910 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:1ab3ec59cd8e30dd8423e91ad832403bdefbae3b8ac47e15578d5a677d7ba0df\\\"\"" pod="openstack-operators/glance-operator-controller-manager-77987464f4-26vtn" podUID="f6c8e20e-ecca-42d4-9e0e-5547ae567d9f" Feb 20 07:05:16 crc kubenswrapper[5094]: E0220 07:05:16.167163 5094 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:8fb0a33b8d93cf9f84f079af5f2ceb680afada4e44542514959146779f57f64c\\\"\"" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-hfkff" podUID="b863d4f9-063a-4102-8c3d-f7e092e4e2c0" Feb 20 07:05:16 crc kubenswrapper[5094]: I0220 07:05:16.270649 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert\") pod \"infra-operator-controller-manager-79d975b745-nxtc7\" (UID: \"eb67a9bc-35a6-4ce3-bca8-a08ee824cda7\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7" Feb 20 07:05:16 crc kubenswrapper[5094]: E0220 07:05:16.270876 5094 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 20 07:05:16 crc kubenswrapper[5094]: E0220 07:05:16.270952 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert podName:eb67a9bc-35a6-4ce3-bca8-a08ee824cda7 nodeName:}" failed. No retries permitted until 2026-02-20 07:05:32.270927081 +0000 UTC m=+1147.143553782 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert") pod "infra-operator-controller-manager-79d975b745-nxtc7" (UID: "eb67a9bc-35a6-4ce3-bca8-a08ee824cda7") : secret "infra-operator-webhook-server-cert" not found Feb 20 07:05:16 crc kubenswrapper[5094]: E0220 07:05:16.665991 5094 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf" Feb 20 07:05:16 crc kubenswrapper[5094]: E0220 07:05:16.666460 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bhj9q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-64ddbf8bb-bd9tr_openstack-operators(eba1b8e0-b529-47ad-a657-75ce01bad56a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 07:05:16 crc kubenswrapper[5094]: E0220 07:05:16.668838 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bd9tr" podUID="eba1b8e0-b529-47ad-a657-75ce01bad56a" Feb 20 07:05:16 crc kubenswrapper[5094]: I0220 07:05:16.985960 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert\") pod 
\"openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8\" (UID: \"57b4cb2c-e7bc-4430-bfb8-3642dab61d84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" Feb 20 07:05:16 crc kubenswrapper[5094]: E0220 07:05:16.986166 5094 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 07:05:16 crc kubenswrapper[5094]: E0220 07:05:16.986744 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert podName:57b4cb2c-e7bc-4430-bfb8-3642dab61d84 nodeName:}" failed. No retries permitted until 2026-02-20 07:05:32.986720134 +0000 UTC m=+1147.859346845 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert") pod "openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" (UID: "57b4cb2c-e7bc-4430-bfb8-3642dab61d84") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 20 07:05:17 crc kubenswrapper[5094]: E0220 07:05:17.179590 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:e4689246ae78635dc3c1db9c677d8b16b8f94276df15fb9c84bfc57cc6578fcf\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bd9tr" podUID="eba1b8e0-b529-47ad-a657-75ce01bad56a" Feb 20 07:05:17 crc kubenswrapper[5094]: I0220 07:05:17.295729 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " 
pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 07:05:17 crc kubenswrapper[5094]: I0220 07:05:17.295817 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 07:05:17 crc kubenswrapper[5094]: E0220 07:05:17.296024 5094 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 20 07:05:17 crc kubenswrapper[5094]: E0220 07:05:17.296056 5094 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 20 07:05:17 crc kubenswrapper[5094]: E0220 07:05:17.296093 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs podName:a1b74404-906b-4466-a3bd-289458ef90ea nodeName:}" failed. No retries permitted until 2026-02-20 07:05:33.296075492 +0000 UTC m=+1148.168702203 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs") pod "openstack-operator-controller-manager-69ff7bc449-57z9v" (UID: "a1b74404-906b-4466-a3bd-289458ef90ea") : secret "metrics-server-cert" not found Feb 20 07:05:17 crc kubenswrapper[5094]: E0220 07:05:17.296186 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs podName:a1b74404-906b-4466-a3bd-289458ef90ea nodeName:}" failed. No retries permitted until 2026-02-20 07:05:33.296158354 +0000 UTC m=+1148.168785155 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs") pod "openstack-operator-controller-manager-69ff7bc449-57z9v" (UID: "a1b74404-906b-4466-a3bd-289458ef90ea") : secret "webhook-server-cert" not found Feb 20 07:05:17 crc kubenswrapper[5094]: E0220 07:05:17.410268 5094 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838" Feb 20 07:05:17 crc kubenswrapper[5094]: E0220 07:05:17.410502 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6rc9s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-2ftdz_openstack-operators(93dbc041-00c2-4189-abca-6bb3a00abc2d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 07:05:17 crc kubenswrapper[5094]: E0220 07:05:17.411722 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-2ftdz" podUID="93dbc041-00c2-4189-abca-6bb3a00abc2d" Feb 20 07:05:17 crc kubenswrapper[5094]: E0220 07:05:17.982039 5094 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1" Feb 20 07:05:17 crc kubenswrapper[5094]: E0220 07:05:17.982403 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tgzj5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-b4d948c87-86bxl_openstack-operators(8a1c02cd-3546-45fa-b7db-5903c80681a4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 07:05:17 crc kubenswrapper[5094]: E0220 07:05:17.983664 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-86bxl" podUID="8a1c02cd-3546-45fa-b7db-5903c80681a4" Feb 20 07:05:18 crc kubenswrapper[5094]: E0220 07:05:18.185682 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-86bxl" podUID="8a1c02cd-3546-45fa-b7db-5903c80681a4" Feb 20 07:05:18 crc kubenswrapper[5094]: E0220 07:05:18.186053 5094 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-2ftdz" podUID="93dbc041-00c2-4189-abca-6bb3a00abc2d" Feb 20 07:05:20 crc kubenswrapper[5094]: I0220 07:05:20.203022 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ngkcq" event={"ID":"683351ac-f508-4961-b07a-eaac9c26a4f3","Type":"ContainerStarted","Data":"636f0837c2e6575d9e75e03bc4f8506710b5cc62fa99829f7eecb7b7ec543aaa"} Feb 20 07:05:20 crc kubenswrapper[5094]: I0220 07:05:20.203553 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ngkcq" Feb 20 07:05:20 crc kubenswrapper[5094]: I0220 07:05:20.205932 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-8j8pv" event={"ID":"fcf15128-56ef-42dc-b230-1cd8b7638d33","Type":"ContainerStarted","Data":"473fcdb9f8b94742ba0beb96434c6166f8469323ad6c9b75af3c10d454df84ef"} Feb 20 07:05:20 crc kubenswrapper[5094]: I0220 07:05:20.206082 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-8j8pv" Feb 20 07:05:20 crc kubenswrapper[5094]: I0220 07:05:20.208659 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-n5dgn" event={"ID":"6177108e-bc02-497c-80ab-312f61fbd1c2","Type":"ContainerStarted","Data":"a16e4827a448d0324601fa9c38fa56da5f23ba1117c3be7efb0bc1bcb40517c2"} Feb 20 07:05:20 crc kubenswrapper[5094]: I0220 07:05:20.209025 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-n5dgn" Feb 20 07:05:20 crc kubenswrapper[5094]: I0220 07:05:20.210179 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9c2k5" event={"ID":"c510ecc1-53ce-4611-af6a-09488f9317ed","Type":"ContainerStarted","Data":"e8a259b0c995ebbe163e1105c929a394dabd010c2fd03ca3ac6aa334dd10430c"} Feb 20 07:05:20 crc kubenswrapper[5094]: I0220 07:05:20.210368 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9c2k5" Feb 20 07:05:20 crc kubenswrapper[5094]: I0220 07:05:20.229883 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ngkcq" podStartSLOduration=3.140056058 podStartE2EDuration="20.229847614s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:02.108538584 +0000 UTC m=+1116.981165295" lastFinishedPulling="2026-02-20 07:05:19.19833014 +0000 UTC m=+1134.070956851" observedRunningTime="2026-02-20 07:05:20.217861757 +0000 UTC m=+1135.090488468" watchObservedRunningTime="2026-02-20 07:05:20.229847614 +0000 UTC m=+1135.102474325" Feb 20 07:05:20 crc kubenswrapper[5094]: I0220 07:05:20.260177 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-n5dgn" podStartSLOduration=3.124349291 podStartE2EDuration="20.26015113s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:02.062178583 +0000 UTC m=+1116.934805294" lastFinishedPulling="2026-02-20 07:05:19.197980422 +0000 UTC m=+1134.070607133" observedRunningTime="2026-02-20 07:05:20.23886663 +0000 UTC m=+1135.111493341" watchObservedRunningTime="2026-02-20 07:05:20.26015113 +0000 UTC m=+1135.132777841" Feb 20 07:05:20 crc 
kubenswrapper[5094]: I0220 07:05:20.260887 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9c2k5" podStartSLOduration=4.554018202 podStartE2EDuration="20.260879888s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:02.240085414 +0000 UTC m=+1117.112712125" lastFinishedPulling="2026-02-20 07:05:17.94694708 +0000 UTC m=+1132.819573811" observedRunningTime="2026-02-20 07:05:20.256648076 +0000 UTC m=+1135.129274787" watchObservedRunningTime="2026-02-20 07:05:20.260879888 +0000 UTC m=+1135.133506599" Feb 20 07:05:20 crc kubenswrapper[5094]: I0220 07:05:20.282350 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-8j8pv" podStartSLOduration=3.132777944 podStartE2EDuration="20.282328721s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:02.051024626 +0000 UTC m=+1116.923651337" lastFinishedPulling="2026-02-20 07:05:19.200575403 +0000 UTC m=+1134.073202114" observedRunningTime="2026-02-20 07:05:20.280075647 +0000 UTC m=+1135.152702358" watchObservedRunningTime="2026-02-20 07:05:20.282328721 +0000 UTC m=+1135.154955432" Feb 20 07:05:27 crc kubenswrapper[5094]: I0220 07:05:27.281556 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-xjng5" event={"ID":"a91a9b82-fc6b-4900-becb-6dc3c100e429","Type":"ContainerStarted","Data":"b03564b29955f9e95c748c28ab6fecc3eeb609037719f6f48604e68481b2f640"} Feb 20 07:05:27 crc kubenswrapper[5094]: I0220 07:05:27.282784 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-xjng5" Feb 20 07:05:27 crc kubenswrapper[5094]: I0220 07:05:27.287763 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-mnd7v" event={"ID":"292cb132-b03c-4d20-8bee-c90ad3c4486b","Type":"ContainerStarted","Data":"8b4ed3acff59323b9b5362b8c86698480082e25a2f5d7ba8ba718f3c183b82f1"} Feb 20 07:05:27 crc kubenswrapper[5094]: I0220 07:05:27.288843 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-mnd7v" Feb 20 07:05:27 crc kubenswrapper[5094]: I0220 07:05:27.307798 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-xjng5" podStartSLOduration=11.377370575 podStartE2EDuration="27.307772375s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:02.015474185 +0000 UTC m=+1116.888100886" lastFinishedPulling="2026-02-20 07:05:17.945875975 +0000 UTC m=+1132.818502686" observedRunningTime="2026-02-20 07:05:27.300340866 +0000 UTC m=+1142.172967587" watchObservedRunningTime="2026-02-20 07:05:27.307772375 +0000 UTC m=+1142.180399106" Feb 20 07:05:27 crc kubenswrapper[5094]: I0220 07:05:27.325981 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-mnd7v" podStartSLOduration=11.087351749 podStartE2EDuration="27.32594919s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:01.707255163 +0000 UTC m=+1116.579881874" lastFinishedPulling="2026-02-20 07:05:17.945852614 +0000 UTC m=+1132.818479315" observedRunningTime="2026-02-20 07:05:27.320533331 +0000 UTC m=+1142.193160042" watchObservedRunningTime="2026-02-20 07:05:27.32594919 +0000 UTC m=+1142.198575921" Feb 20 07:05:29 crc kubenswrapper[5094]: I0220 07:05:29.329970 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-cbtkd" 
event={"ID":"c45dcc1f-a95d-4492-9139-16d550809a8e","Type":"ContainerStarted","Data":"1cb81276893e4adf3bf82e13819f46c33ccc912a39584e0cad9bf24855f4bb33"} Feb 20 07:05:29 crc kubenswrapper[5094]: I0220 07:05:29.331801 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-cbtkd" Feb 20 07:05:29 crc kubenswrapper[5094]: I0220 07:05:29.335912 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-nll74" event={"ID":"fce6c9b3-2075-479d-9a16-738831a871c4","Type":"ContainerStarted","Data":"3cc28b52343224699ce5bec9612f82ea520b5f3fb0d60b0eb025cfedec2b6aae"} Feb 20 07:05:29 crc kubenswrapper[5094]: I0220 07:05:29.336366 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7866795846-nll74" Feb 20 07:05:29 crc kubenswrapper[5094]: I0220 07:05:29.337925 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-p689m" event={"ID":"36d60210-52d5-4f28-ae0b-28cce632d5cb","Type":"ContainerStarted","Data":"3c0c2ea8b0cbf1c81bd5a6ba841d7dea29d115d4686f0e0a075a6cf71285e8b2"} Feb 20 07:05:29 crc kubenswrapper[5094]: I0220 07:05:29.338052 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-p689m" Feb 20 07:05:29 crc kubenswrapper[5094]: I0220 07:05:29.339522 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5dkn" event={"ID":"6b09cc76-8cba-42ed-bb2c-fdf4473c9afe","Type":"ContainerStarted","Data":"d8c9ae7389671c7b3dd5af71666815e4bcf18f376d81ba1876d1b4989a6699f3"} Feb 20 07:05:29 crc kubenswrapper[5094]: I0220 07:05:29.339643 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5dkn" Feb 20 07:05:29 crc kubenswrapper[5094]: I0220 07:05:29.362541 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-24cv7" event={"ID":"32338b54-c33f-4dc5-b328-9cf4d92d1db6","Type":"ContainerStarted","Data":"3b8131ad97940a5d11910fe6adada88383a8d3997e80bed72efd5db8a3b2230c"} Feb 20 07:05:29 crc kubenswrapper[5094]: I0220 07:05:29.362590 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-24cv7" Feb 20 07:05:29 crc kubenswrapper[5094]: I0220 07:05:29.393431 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-24cv7" podStartSLOduration=11.810348723 podStartE2EDuration="29.393391593s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:01.61736919 +0000 UTC m=+1116.489995901" lastFinishedPulling="2026-02-20 07:05:19.20041206 +0000 UTC m=+1134.073038771" observedRunningTime="2026-02-20 07:05:29.385637188 +0000 UTC m=+1144.258263899" watchObservedRunningTime="2026-02-20 07:05:29.393391593 +0000 UTC m=+1144.266018304" Feb 20 07:05:29 crc kubenswrapper[5094]: I0220 07:05:29.395458 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-cbtkd" podStartSLOduration=12.62146673 podStartE2EDuration="29.395450443s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:02.424290536 +0000 UTC m=+1117.296917267" lastFinishedPulling="2026-02-20 07:05:19.198274269 +0000 UTC m=+1134.070900980" observedRunningTime="2026-02-20 07:05:29.370554327 +0000 UTC m=+1144.243181038" watchObservedRunningTime="2026-02-20 07:05:29.395450443 +0000 UTC m=+1144.268077154" Feb 20 07:05:29 crc 
kubenswrapper[5094]: I0220 07:05:29.412097 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5dkn" podStartSLOduration=11.577333024 podStartE2EDuration="29.412073531s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:01.363335547 +0000 UTC m=+1116.235962268" lastFinishedPulling="2026-02-20 07:05:19.198076064 +0000 UTC m=+1134.070702775" observedRunningTime="2026-02-20 07:05:29.409862198 +0000 UTC m=+1144.282488909" watchObservedRunningTime="2026-02-20 07:05:29.412073531 +0000 UTC m=+1144.284700242" Feb 20 07:05:29 crc kubenswrapper[5094]: I0220 07:05:29.438400 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-p689m" podStartSLOduration=12.270509115 podStartE2EDuration="29.438382211s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:02.032461072 +0000 UTC m=+1116.905087783" lastFinishedPulling="2026-02-20 07:05:19.200334168 +0000 UTC m=+1134.072960879" observedRunningTime="2026-02-20 07:05:29.437229224 +0000 UTC m=+1144.309855935" watchObservedRunningTime="2026-02-20 07:05:29.438382211 +0000 UTC m=+1144.311008922" Feb 20 07:05:29 crc kubenswrapper[5094]: I0220 07:05:29.469170 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7866795846-nll74" podStartSLOduration=12.610971068 podStartE2EDuration="29.469145228s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:02.405884884 +0000 UTC m=+1117.278511595" lastFinishedPulling="2026-02-20 07:05:19.264059044 +0000 UTC m=+1134.136685755" observedRunningTime="2026-02-20 07:05:29.461648748 +0000 UTC m=+1144.334275459" watchObservedRunningTime="2026-02-20 07:05:29.469145228 +0000 UTC m=+1144.341771939" Feb 20 07:05:30 crc 
kubenswrapper[5094]: I0220 07:05:30.371019 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fq9n6" event={"ID":"f45a4211-8890-4e4a-af96-ccffec62160c","Type":"ContainerStarted","Data":"d6b14c87e61a5ce0c28866ab7791e9f9c288b6ac76ac375e138e71d8ed8c709a"} Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.373011 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-26vtn" event={"ID":"f6c8e20e-ecca-42d4-9e0e-5547ae567d9f","Type":"ContainerStarted","Data":"86ae7dab61271ba568191fd4743d167f4d87e005eddbbba8b77fee05909c1284"} Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.373314 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987464f4-26vtn" Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.374940 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-86bxl" event={"ID":"8a1c02cd-3546-45fa-b7db-5903c80681a4","Type":"ContainerStarted","Data":"3684e8b81fc66683808c13368b66854e3de68171ce23e01070b8e0970743c3c4"} Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.375324 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-86bxl" Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.379234 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dh48q" event={"ID":"a0d3c29b-4f57-4647-b1a5-bfd6c887b0b5","Type":"ContainerStarted","Data":"0c90819418a8e3dd75259c7f23eb53e15d434c602fe6a4469131d205261dd6d7"} Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.379416 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dh48q" 
Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.381022 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-hfkff" event={"ID":"b863d4f9-063a-4102-8c3d-f7e092e4e2c0","Type":"ContainerStarted","Data":"baaccfe1c5f13a3900e7cc76f1d91037031f5b1f68927234f39ff96baa8f2cd2"} Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.381297 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-hfkff" Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.382621 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7" event={"ID":"74861845-de37-4091-9226-bcb1bbe64b35","Type":"ContainerStarted","Data":"1c7b8de53e9ca51fc133d387dcd05968671c66159d44551b9a5ca284e5643cdf"} Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.382987 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7" Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.385084 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-2ftdz" event={"ID":"93dbc041-00c2-4189-abca-6bb3a00abc2d","Type":"ContainerStarted","Data":"2e8c12073614c7188ef7925e53ed028ed710e9a944b9c21866b720f228e64a3f"} Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.385477 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-2ftdz" Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.391743 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q" 
event={"ID":"4413dc36-58b0-447a-ba69-cdd2cee9589c","Type":"ContainerStarted","Data":"831a296804a99995767f55cd675f9382bdb0141f31d26430c827697dd599456a"} Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.392099 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q" Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.397783 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-fq9n6" podStartSLOduration=2.595486685 podStartE2EDuration="29.397771868s" podCreationTimestamp="2026-02-20 07:05:01 +0000 UTC" firstStartedPulling="2026-02-20 07:05:02.433370493 +0000 UTC m=+1117.305997214" lastFinishedPulling="2026-02-20 07:05:29.235655676 +0000 UTC m=+1144.108282397" observedRunningTime="2026-02-20 07:05:30.393793253 +0000 UTC m=+1145.266419964" watchObservedRunningTime="2026-02-20 07:05:30.397771868 +0000 UTC m=+1145.270398579" Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.423595 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987464f4-26vtn" podStartSLOduration=2.71215947 podStartE2EDuration="30.423576326s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:01.692877249 +0000 UTC m=+1116.565503960" lastFinishedPulling="2026-02-20 07:05:29.404294105 +0000 UTC m=+1144.276920816" observedRunningTime="2026-02-20 07:05:30.415909923 +0000 UTC m=+1145.288536634" watchObservedRunningTime="2026-02-20 07:05:30.423576326 +0000 UTC m=+1145.296203037" Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.441148 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q" podStartSLOduration=3.559958525 podStartE2EDuration="30.441134597s" podCreationTimestamp="2026-02-20 
07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:02.24241084 +0000 UTC m=+1117.115037551" lastFinishedPulling="2026-02-20 07:05:29.123586902 +0000 UTC m=+1143.996213623" observedRunningTime="2026-02-20 07:05:30.436870034 +0000 UTC m=+1145.309496745" watchObservedRunningTime="2026-02-20 07:05:30.441134597 +0000 UTC m=+1145.313761298" Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.465367 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-2ftdz" podStartSLOduration=3.112861476 podStartE2EDuration="30.465348706s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:02.140732595 +0000 UTC m=+1117.013359306" lastFinishedPulling="2026-02-20 07:05:29.493219825 +0000 UTC m=+1144.365846536" observedRunningTime="2026-02-20 07:05:30.460281895 +0000 UTC m=+1145.332908606" watchObservedRunningTime="2026-02-20 07:05:30.465348706 +0000 UTC m=+1145.337975417" Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.486093 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7" podStartSLOduration=3.6046175639999998 podStartE2EDuration="30.486075853s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:02.240556465 +0000 UTC m=+1117.113183176" lastFinishedPulling="2026-02-20 07:05:29.122014744 +0000 UTC m=+1143.994641465" observedRunningTime="2026-02-20 07:05:30.482416025 +0000 UTC m=+1145.355042736" watchObservedRunningTime="2026-02-20 07:05:30.486075853 +0000 UTC m=+1145.358702564" Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.512737 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dh48q" podStartSLOduration=13.442435192 podStartE2EDuration="30.512721721s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" 
firstStartedPulling="2026-02-20 07:05:02.242660836 +0000 UTC m=+1117.115287547" lastFinishedPulling="2026-02-20 07:05:19.312947365 +0000 UTC m=+1134.185574076" observedRunningTime="2026-02-20 07:05:30.50812772 +0000 UTC m=+1145.380754431" watchObservedRunningTime="2026-02-20 07:05:30.512721721 +0000 UTC m=+1145.385348432" Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.534253 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-86bxl" podStartSLOduration=3.168427447 podStartE2EDuration="30.534233666s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:02.051019546 +0000 UTC m=+1116.923646247" lastFinishedPulling="2026-02-20 07:05:29.416825765 +0000 UTC m=+1144.289452466" observedRunningTime="2026-02-20 07:05:30.527583396 +0000 UTC m=+1145.400210107" watchObservedRunningTime="2026-02-20 07:05:30.534233666 +0000 UTC m=+1145.406860367" Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.575218 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-hfkff" podStartSLOduration=3.481030053 podStartE2EDuration="30.575206127s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:02.138689926 +0000 UTC m=+1117.011316637" lastFinishedPulling="2026-02-20 07:05:29.232866 +0000 UTC m=+1144.105492711" observedRunningTime="2026-02-20 07:05:30.573417744 +0000 UTC m=+1145.446044455" watchObservedRunningTime="2026-02-20 07:05:30.575206127 +0000 UTC m=+1145.447832838" Feb 20 07:05:30 crc kubenswrapper[5094]: I0220 07:05:30.919281 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-8j8pv" Feb 20 07:05:31 crc kubenswrapper[5094]: I0220 07:05:31.027497 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-n5dgn" Feb 20 07:05:31 crc kubenswrapper[5094]: I0220 07:05:31.305692 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-ngkcq" Feb 20 07:05:31 crc kubenswrapper[5094]: I0220 07:05:31.391889 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-9c2k5" Feb 20 07:05:32 crc kubenswrapper[5094]: I0220 07:05:32.347063 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert\") pod \"infra-operator-controller-manager-79d975b745-nxtc7\" (UID: \"eb67a9bc-35a6-4ce3-bca8-a08ee824cda7\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7" Feb 20 07:05:32 crc kubenswrapper[5094]: I0220 07:05:32.363942 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/eb67a9bc-35a6-4ce3-bca8-a08ee824cda7-cert\") pod \"infra-operator-controller-manager-79d975b745-nxtc7\" (UID: \"eb67a9bc-35a6-4ce3-bca8-a08ee824cda7\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7" Feb 20 07:05:32 crc kubenswrapper[5094]: I0220 07:05:32.439824 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-twhdx" Feb 20 07:05:32 crc kubenswrapper[5094]: I0220 07:05:32.441929 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bd9tr" event={"ID":"eba1b8e0-b529-47ad-a657-75ce01bad56a","Type":"ContainerStarted","Data":"dc9033c6796474544bbfb7242972c4be3d2a328bec53f134465730ab0b3828d4"} Feb 20 07:05:32 crc kubenswrapper[5094]: I0220 07:05:32.442316 5094 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bd9tr" Feb 20 07:05:32 crc kubenswrapper[5094]: I0220 07:05:32.445016 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7" Feb 20 07:05:32 crc kubenswrapper[5094]: I0220 07:05:32.482275 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bd9tr" podStartSLOduration=3.321730228 podStartE2EDuration="32.482250589s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:02.111111535 +0000 UTC m=+1116.983738246" lastFinishedPulling="2026-02-20 07:05:31.271631896 +0000 UTC m=+1146.144258607" observedRunningTime="2026-02-20 07:05:32.479585775 +0000 UTC m=+1147.352212526" watchObservedRunningTime="2026-02-20 07:05:32.482250589 +0000 UTC m=+1147.354877300" Feb 20 07:05:32 crc kubenswrapper[5094]: I0220 07:05:32.952763 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7"] Feb 20 07:05:32 crc kubenswrapper[5094]: W0220 07:05:32.962143 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb67a9bc_35a6_4ce3_bca8_a08ee824cda7.slice/crio-da2ff4eb721f51bbecbf21ba917d3c7cfb5dd4ea81f818d4707cd8d3327c0b3a WatchSource:0}: Error finding container da2ff4eb721f51bbecbf21ba917d3c7cfb5dd4ea81f818d4707cd8d3327c0b3a: Status 404 returned error can't find the container with id da2ff4eb721f51bbecbf21ba917d3c7cfb5dd4ea81f818d4707cd8d3327c0b3a Feb 20 07:05:33 crc kubenswrapper[5094]: I0220 07:05:33.062770 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert\") pod 
\"openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8\" (UID: \"57b4cb2c-e7bc-4430-bfb8-3642dab61d84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" Feb 20 07:05:33 crc kubenswrapper[5094]: I0220 07:05:33.070564 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57b4cb2c-e7bc-4430-bfb8-3642dab61d84-cert\") pod \"openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8\" (UID: \"57b4cb2c-e7bc-4430-bfb8-3642dab61d84\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" Feb 20 07:05:33 crc kubenswrapper[5094]: I0220 07:05:33.143687 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-j64q2" Feb 20 07:05:33 crc kubenswrapper[5094]: I0220 07:05:33.150978 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" Feb 20 07:05:33 crc kubenswrapper[5094]: I0220 07:05:33.367517 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 07:05:33 crc kubenswrapper[5094]: I0220 07:05:33.368059 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 
07:05:33 crc kubenswrapper[5094]: I0220 07:05:33.374734 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-metrics-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 07:05:33 crc kubenswrapper[5094]: I0220 07:05:33.375390 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a1b74404-906b-4466-a3bd-289458ef90ea-webhook-certs\") pod \"openstack-operator-controller-manager-69ff7bc449-57z9v\" (UID: \"a1b74404-906b-4466-a3bd-289458ef90ea\") " pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 07:05:33 crc kubenswrapper[5094]: I0220 07:05:33.457501 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-prrqj" Feb 20 07:05:33 crc kubenswrapper[5094]: I0220 07:05:33.457886 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7" event={"ID":"eb67a9bc-35a6-4ce3-bca8-a08ee824cda7","Type":"ContainerStarted","Data":"da2ff4eb721f51bbecbf21ba917d3c7cfb5dd4ea81f818d4707cd8d3327c0b3a"} Feb 20 07:05:33 crc kubenswrapper[5094]: I0220 07:05:33.464582 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 07:05:33 crc kubenswrapper[5094]: I0220 07:05:33.639082 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8"] Feb 20 07:05:33 crc kubenswrapper[5094]: W0220 07:05:33.651914 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57b4cb2c_e7bc_4430_bfb8_3642dab61d84.slice/crio-d6d5c8723b1634e26915dcb313cde75827ae5d73ad72295741ffd667abb304c9 WatchSource:0}: Error finding container d6d5c8723b1634e26915dcb313cde75827ae5d73ad72295741ffd667abb304c9: Status 404 returned error can't find the container with id d6d5c8723b1634e26915dcb313cde75827ae5d73ad72295741ffd667abb304c9 Feb 20 07:05:33 crc kubenswrapper[5094]: I0220 07:05:33.970951 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v"] Feb 20 07:05:33 crc kubenswrapper[5094]: W0220 07:05:33.973473 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1b74404_906b_4466_a3bd_289458ef90ea.slice/crio-26b1776deae46d8de81c72bfe440be76c6862386fabf2ba453cdedf61ff83dbe WatchSource:0}: Error finding container 26b1776deae46d8de81c72bfe440be76c6862386fabf2ba453cdedf61ff83dbe: Status 404 returned error can't find the container with id 26b1776deae46d8de81c72bfe440be76c6862386fabf2ba453cdedf61ff83dbe Feb 20 07:05:34 crc kubenswrapper[5094]: I0220 07:05:34.469188 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" event={"ID":"a1b74404-906b-4466-a3bd-289458ef90ea","Type":"ContainerStarted","Data":"92972e17b7597d7d246ec89d508ca3dc503e4c27b0890de6d8d8f76defbcf1dd"} Feb 20 07:05:34 crc kubenswrapper[5094]: I0220 
07:05:34.469261 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" event={"ID":"a1b74404-906b-4466-a3bd-289458ef90ea","Type":"ContainerStarted","Data":"26b1776deae46d8de81c72bfe440be76c6862386fabf2ba453cdedf61ff83dbe"} Feb 20 07:05:34 crc kubenswrapper[5094]: I0220 07:05:34.469401 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 07:05:34 crc kubenswrapper[5094]: I0220 07:05:34.472869 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" event={"ID":"57b4cb2c-e7bc-4430-bfb8-3642dab61d84","Type":"ContainerStarted","Data":"d6d5c8723b1634e26915dcb313cde75827ae5d73ad72295741ffd667abb304c9"} Feb 20 07:05:34 crc kubenswrapper[5094]: I0220 07:05:34.503330 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" podStartSLOduration=34.503304972 podStartE2EDuration="34.503304972s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:05:34.496800876 +0000 UTC m=+1149.369427597" watchObservedRunningTime="2026-02-20 07:05:34.503304972 +0000 UTC m=+1149.375931693" Feb 20 07:05:36 crc kubenswrapper[5094]: I0220 07:05:36.490762 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7" event={"ID":"eb67a9bc-35a6-4ce3-bca8-a08ee824cda7","Type":"ContainerStarted","Data":"f8dbef55b69f2b26129eb4aff60bb4f83ca3198a9863d5f2edc242f42986c98a"} Feb 20 07:05:36 crc kubenswrapper[5094]: I0220 07:05:36.491236 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7" Feb 20 07:05:36 crc kubenswrapper[5094]: I0220 07:05:36.493802 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" event={"ID":"57b4cb2c-e7bc-4430-bfb8-3642dab61d84","Type":"ContainerStarted","Data":"9b0b47dfc0d39af6f26080c06a44ddae0ffc8f865a0a51586ed513c1d57933a7"} Feb 20 07:05:36 crc kubenswrapper[5094]: I0220 07:05:36.493902 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" Feb 20 07:05:36 crc kubenswrapper[5094]: I0220 07:05:36.517006 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7" podStartSLOduration=33.755968694 podStartE2EDuration="36.516982898s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:32.965450931 +0000 UTC m=+1147.838077682" lastFinishedPulling="2026-02-20 07:05:35.726465165 +0000 UTC m=+1150.599091886" observedRunningTime="2026-02-20 07:05:36.514099829 +0000 UTC m=+1151.386726560" watchObservedRunningTime="2026-02-20 07:05:36.516982898 +0000 UTC m=+1151.389609609" Feb 20 07:05:36 crc kubenswrapper[5094]: I0220 07:05:36.560933 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" podStartSLOduration=34.500328411 podStartE2EDuration="36.56089751s" podCreationTimestamp="2026-02-20 07:05:00 +0000 UTC" firstStartedPulling="2026-02-20 07:05:33.661869031 +0000 UTC m=+1148.534495752" lastFinishedPulling="2026-02-20 07:05:35.7224381 +0000 UTC m=+1150.595064851" observedRunningTime="2026-02-20 07:05:36.552414776 +0000 UTC m=+1151.425041517" watchObservedRunningTime="2026-02-20 07:05:36.56089751 +0000 UTC m=+1151.433524261" Feb 20 
07:05:40 crc kubenswrapper[5094]: I0220 07:05:40.480274 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-k5dkn" Feb 20 07:05:40 crc kubenswrapper[5094]: I0220 07:05:40.498503 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-24cv7" Feb 20 07:05:40 crc kubenswrapper[5094]: I0220 07:05:40.533947 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987464f4-26vtn" Feb 20 07:05:40 crc kubenswrapper[5094]: I0220 07:05:40.535344 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-xjng5" Feb 20 07:05:40 crc kubenswrapper[5094]: I0220 07:05:40.594969 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-p689m" Feb 20 07:05:40 crc kubenswrapper[5094]: I0220 07:05:40.622975 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-mnd7v" Feb 20 07:05:40 crc kubenswrapper[5094]: I0220 07:05:40.967572 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-86bxl" Feb 20 07:05:40 crc kubenswrapper[5094]: I0220 07:05:40.994774 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-hfkff" Feb 20 07:05:41 crc kubenswrapper[5094]: I0220 07:05:41.263036 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-2ftdz" Feb 20 07:05:41 crc kubenswrapper[5094]: I0220 07:05:41.293617 5094 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-bd9tr" Feb 20 07:05:41 crc kubenswrapper[5094]: I0220 07:05:41.349859 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ct2h7" Feb 20 07:05:41 crc kubenswrapper[5094]: I0220 07:05:41.419832 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-dh48q" Feb 20 07:05:41 crc kubenswrapper[5094]: I0220 07:05:41.431648 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-cbtkd" Feb 20 07:05:41 crc kubenswrapper[5094]: I0220 07:05:41.550460 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7866795846-nll74" Feb 20 07:05:41 crc kubenswrapper[5094]: I0220 07:05:41.603607 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-lz57q" Feb 20 07:05:42 crc kubenswrapper[5094]: I0220 07:05:42.460833 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-nxtc7" Feb 20 07:05:43 crc kubenswrapper[5094]: I0220 07:05:43.160628 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8" Feb 20 07:05:43 crc kubenswrapper[5094]: I0220 07:05:43.474639 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-69ff7bc449-57z9v" Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.087791 5094 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-855cbc58c5-tz76m"] Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.089628 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-tz76m" Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.095243 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.095311 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.095552 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.095262 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-pv4gg" Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.103952 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-tz76m"] Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.191026 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmdmc\" (UniqueName: \"kubernetes.io/projected/9654de45-6750-4105-a4db-050ae521d91c-kube-api-access-vmdmc\") pod \"dnsmasq-dns-855cbc58c5-tz76m\" (UID: \"9654de45-6750-4105-a4db-050ae521d91c\") " pod="openstack/dnsmasq-dns-855cbc58c5-tz76m" Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.191551 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9654de45-6750-4105-a4db-050ae521d91c-config\") pod \"dnsmasq-dns-855cbc58c5-tz76m\" (UID: \"9654de45-6750-4105-a4db-050ae521d91c\") " pod="openstack/dnsmasq-dns-855cbc58c5-tz76m" Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.208674 5094 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-6fcf94d689-z44mn"] Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.210153 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-z44mn" Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.212190 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.223290 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-z44mn"] Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.293541 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc7th\" (UniqueName: \"kubernetes.io/projected/ac34a5f7-69bb-416d-994d-056f5f1513e8-kube-api-access-gc7th\") pod \"dnsmasq-dns-6fcf94d689-z44mn\" (UID: \"ac34a5f7-69bb-416d-994d-056f5f1513e8\") " pod="openstack/dnsmasq-dns-6fcf94d689-z44mn" Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.293658 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9654de45-6750-4105-a4db-050ae521d91c-config\") pod \"dnsmasq-dns-855cbc58c5-tz76m\" (UID: \"9654de45-6750-4105-a4db-050ae521d91c\") " pod="openstack/dnsmasq-dns-855cbc58c5-tz76m" Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.293717 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac34a5f7-69bb-416d-994d-056f5f1513e8-config\") pod \"dnsmasq-dns-6fcf94d689-z44mn\" (UID: \"ac34a5f7-69bb-416d-994d-056f5f1513e8\") " pod="openstack/dnsmasq-dns-6fcf94d689-z44mn" Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.293741 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac34a5f7-69bb-416d-994d-056f5f1513e8-dns-svc\") 
pod \"dnsmasq-dns-6fcf94d689-z44mn\" (UID: \"ac34a5f7-69bb-416d-994d-056f5f1513e8\") " pod="openstack/dnsmasq-dns-6fcf94d689-z44mn" Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.293777 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmdmc\" (UniqueName: \"kubernetes.io/projected/9654de45-6750-4105-a4db-050ae521d91c-kube-api-access-vmdmc\") pod \"dnsmasq-dns-855cbc58c5-tz76m\" (UID: \"9654de45-6750-4105-a4db-050ae521d91c\") " pod="openstack/dnsmasq-dns-855cbc58c5-tz76m" Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.294575 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9654de45-6750-4105-a4db-050ae521d91c-config\") pod \"dnsmasq-dns-855cbc58c5-tz76m\" (UID: \"9654de45-6750-4105-a4db-050ae521d91c\") " pod="openstack/dnsmasq-dns-855cbc58c5-tz76m" Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.312755 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmdmc\" (UniqueName: \"kubernetes.io/projected/9654de45-6750-4105-a4db-050ae521d91c-kube-api-access-vmdmc\") pod \"dnsmasq-dns-855cbc58c5-tz76m\" (UID: \"9654de45-6750-4105-a4db-050ae521d91c\") " pod="openstack/dnsmasq-dns-855cbc58c5-tz76m" Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.395719 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc7th\" (UniqueName: \"kubernetes.io/projected/ac34a5f7-69bb-416d-994d-056f5f1513e8-kube-api-access-gc7th\") pod \"dnsmasq-dns-6fcf94d689-z44mn\" (UID: \"ac34a5f7-69bb-416d-994d-056f5f1513e8\") " pod="openstack/dnsmasq-dns-6fcf94d689-z44mn" Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.395865 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac34a5f7-69bb-416d-994d-056f5f1513e8-config\") pod \"dnsmasq-dns-6fcf94d689-z44mn\" (UID: 
\"ac34a5f7-69bb-416d-994d-056f5f1513e8\") " pod="openstack/dnsmasq-dns-6fcf94d689-z44mn" Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.395907 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac34a5f7-69bb-416d-994d-056f5f1513e8-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-z44mn\" (UID: \"ac34a5f7-69bb-416d-994d-056f5f1513e8\") " pod="openstack/dnsmasq-dns-6fcf94d689-z44mn" Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.397112 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac34a5f7-69bb-416d-994d-056f5f1513e8-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-z44mn\" (UID: \"ac34a5f7-69bb-416d-994d-056f5f1513e8\") " pod="openstack/dnsmasq-dns-6fcf94d689-z44mn" Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.398999 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac34a5f7-69bb-416d-994d-056f5f1513e8-config\") pod \"dnsmasq-dns-6fcf94d689-z44mn\" (UID: \"ac34a5f7-69bb-416d-994d-056f5f1513e8\") " pod="openstack/dnsmasq-dns-6fcf94d689-z44mn" Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.407801 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-tz76m" Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.433991 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc7th\" (UniqueName: \"kubernetes.io/projected/ac34a5f7-69bb-416d-994d-056f5f1513e8-kube-api-access-gc7th\") pod \"dnsmasq-dns-6fcf94d689-z44mn\" (UID: \"ac34a5f7-69bb-416d-994d-056f5f1513e8\") " pod="openstack/dnsmasq-dns-6fcf94d689-z44mn" Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.535195 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-z44mn" Feb 20 07:05:59 crc kubenswrapper[5094]: I0220 07:05:59.896448 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-tz76m"] Feb 20 07:05:59 crc kubenswrapper[5094]: W0220 07:05:59.907869 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9654de45_6750_4105_a4db_050ae521d91c.slice/crio-4ca5e7f2f7ecf776f59bc6a0cb3299deb8526fcf123e1301653c3165cbcd47dd WatchSource:0}: Error finding container 4ca5e7f2f7ecf776f59bc6a0cb3299deb8526fcf123e1301653c3165cbcd47dd: Status 404 returned error can't find the container with id 4ca5e7f2f7ecf776f59bc6a0cb3299deb8526fcf123e1301653c3165cbcd47dd Feb 20 07:06:00 crc kubenswrapper[5094]: I0220 07:06:00.050052 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-z44mn"] Feb 20 07:06:00 crc kubenswrapper[5094]: I0220 07:06:00.732932 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fcf94d689-z44mn" event={"ID":"ac34a5f7-69bb-416d-994d-056f5f1513e8","Type":"ContainerStarted","Data":"349408517b5e79dc8415f7cc916f79548181c386e9ff1081d09f8f6e492cac5c"} Feb 20 07:06:00 crc kubenswrapper[5094]: I0220 07:06:00.735052 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-855cbc58c5-tz76m" event={"ID":"9654de45-6750-4105-a4db-050ae521d91c","Type":"ContainerStarted","Data":"4ca5e7f2f7ecf776f59bc6a0cb3299deb8526fcf123e1301653c3165cbcd47dd"} Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.159810 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-z44mn"] Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.179757 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d6b9fdb89-jpfm5"] Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.181078 5094 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5" Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.238035 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d6b9fdb89-jpfm5"] Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.347117 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-config\") pod \"dnsmasq-dns-6d6b9fdb89-jpfm5\" (UID: \"91df42cd-5b04-4d4c-862b-c9ccbb1b488d\") " pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5" Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.347197 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zn97\" (UniqueName: \"kubernetes.io/projected/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-kube-api-access-2zn97\") pod \"dnsmasq-dns-6d6b9fdb89-jpfm5\" (UID: \"91df42cd-5b04-4d4c-862b-c9ccbb1b488d\") " pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5" Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.348561 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-dns-svc\") pod \"dnsmasq-dns-6d6b9fdb89-jpfm5\" (UID: \"91df42cd-5b04-4d4c-862b-c9ccbb1b488d\") " pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5" Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.450668 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-config\") pod \"dnsmasq-dns-6d6b9fdb89-jpfm5\" (UID: \"91df42cd-5b04-4d4c-862b-c9ccbb1b488d\") " pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5" Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.450785 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zn97\" 
(UniqueName: \"kubernetes.io/projected/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-kube-api-access-2zn97\") pod \"dnsmasq-dns-6d6b9fdb89-jpfm5\" (UID: \"91df42cd-5b04-4d4c-862b-c9ccbb1b488d\") " pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5" Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.450838 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-dns-svc\") pod \"dnsmasq-dns-6d6b9fdb89-jpfm5\" (UID: \"91df42cd-5b04-4d4c-862b-c9ccbb1b488d\") " pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5" Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.453597 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-config\") pod \"dnsmasq-dns-6d6b9fdb89-jpfm5\" (UID: \"91df42cd-5b04-4d4c-862b-c9ccbb1b488d\") " pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5" Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.461549 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-dns-svc\") pod \"dnsmasq-dns-6d6b9fdb89-jpfm5\" (UID: \"91df42cd-5b04-4d4c-862b-c9ccbb1b488d\") " pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5" Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.477003 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zn97\" (UniqueName: \"kubernetes.io/projected/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-kube-api-access-2zn97\") pod \"dnsmasq-dns-6d6b9fdb89-jpfm5\" (UID: \"91df42cd-5b04-4d4c-862b-c9ccbb1b488d\") " pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5" Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.523545 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5" Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.777133 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-tz76m"] Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.808526 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-kfk8f"] Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.810179 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-kfk8f" Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.822886 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-kfk8f"] Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.932047 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d6b9fdb89-jpfm5"] Feb 20 07:06:01 crc kubenswrapper[5094]: W0220 07:06:01.947165 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91df42cd_5b04_4d4c_862b_c9ccbb1b488d.slice/crio-b6889d1465b6da017b2081be2bef0da1f6bef33f4ba683f9fc9daa54a5c436ed WatchSource:0}: Error finding container b6889d1465b6da017b2081be2bef0da1f6bef33f4ba683f9fc9daa54a5c436ed: Status 404 returned error can't find the container with id b6889d1465b6da017b2081be2bef0da1f6bef33f4ba683f9fc9daa54a5c436ed Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.960601 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e16affe7-2f3d-438d-98b8-deedcf70053c-dns-svc\") pod \"dnsmasq-dns-67ff45466c-kfk8f\" (UID: \"e16affe7-2f3d-438d-98b8-deedcf70053c\") " pod="openstack/dnsmasq-dns-67ff45466c-kfk8f" Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.961113 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-x746k\" (UniqueName: \"kubernetes.io/projected/e16affe7-2f3d-438d-98b8-deedcf70053c-kube-api-access-x746k\") pod \"dnsmasq-dns-67ff45466c-kfk8f\" (UID: \"e16affe7-2f3d-438d-98b8-deedcf70053c\") " pod="openstack/dnsmasq-dns-67ff45466c-kfk8f" Feb 20 07:06:01 crc kubenswrapper[5094]: I0220 07:06:01.961217 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e16affe7-2f3d-438d-98b8-deedcf70053c-config\") pod \"dnsmasq-dns-67ff45466c-kfk8f\" (UID: \"e16affe7-2f3d-438d-98b8-deedcf70053c\") " pod="openstack/dnsmasq-dns-67ff45466c-kfk8f" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.062685 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e16affe7-2f3d-438d-98b8-deedcf70053c-dns-svc\") pod \"dnsmasq-dns-67ff45466c-kfk8f\" (UID: \"e16affe7-2f3d-438d-98b8-deedcf70053c\") " pod="openstack/dnsmasq-dns-67ff45466c-kfk8f" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.062821 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x746k\" (UniqueName: \"kubernetes.io/projected/e16affe7-2f3d-438d-98b8-deedcf70053c-kube-api-access-x746k\") pod \"dnsmasq-dns-67ff45466c-kfk8f\" (UID: \"e16affe7-2f3d-438d-98b8-deedcf70053c\") " pod="openstack/dnsmasq-dns-67ff45466c-kfk8f" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.062848 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e16affe7-2f3d-438d-98b8-deedcf70053c-config\") pod \"dnsmasq-dns-67ff45466c-kfk8f\" (UID: \"e16affe7-2f3d-438d-98b8-deedcf70053c\") " pod="openstack/dnsmasq-dns-67ff45466c-kfk8f" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.063803 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e16affe7-2f3d-438d-98b8-deedcf70053c-config\") pod \"dnsmasq-dns-67ff45466c-kfk8f\" (UID: \"e16affe7-2f3d-438d-98b8-deedcf70053c\") " pod="openstack/dnsmasq-dns-67ff45466c-kfk8f" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.064387 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e16affe7-2f3d-438d-98b8-deedcf70053c-dns-svc\") pod \"dnsmasq-dns-67ff45466c-kfk8f\" (UID: \"e16affe7-2f3d-438d-98b8-deedcf70053c\") " pod="openstack/dnsmasq-dns-67ff45466c-kfk8f" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.084788 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x746k\" (UniqueName: \"kubernetes.io/projected/e16affe7-2f3d-438d-98b8-deedcf70053c-kube-api-access-x746k\") pod \"dnsmasq-dns-67ff45466c-kfk8f\" (UID: \"e16affe7-2f3d-438d-98b8-deedcf70053c\") " pod="openstack/dnsmasq-dns-67ff45466c-kfk8f" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.137294 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-kfk8f" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.323829 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.325689 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.329218 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.329394 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.330007 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.330189 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.330319 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.331941 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.332412 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.332689 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-2rc9x" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.468444 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.468500 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.468527 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.468558 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.468581 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a829c6b3-7069-4544-90dc-40ae83aba524-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.468607 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a829c6b3-7069-4544-90dc-40ae83aba524-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.468632 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.468658 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.468684 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.468730 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxfjr\" (UniqueName: \"kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-kube-api-access-fxfjr\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.468758 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.570822 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.570931 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.570991 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.571023 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.571054 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.571082 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a829c6b3-7069-4544-90dc-40ae83aba524-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.571119 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a829c6b3-7069-4544-90dc-40ae83aba524-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.571153 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.571183 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.571222 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.571267 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxfjr\" (UniqueName: \"kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-kube-api-access-fxfjr\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 
07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.571881 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.571940 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.572295 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.572626 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.572789 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.572807 5094 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.579108 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.580080 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a829c6b3-7069-4544-90dc-40ae83aba524-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.583821 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.588725 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a829c6b3-7069-4544-90dc-40ae83aba524-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.589931 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxfjr\" (UniqueName: \"kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-kube-api-access-fxfjr\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.607017 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.644344 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.673563 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-kfk8f"] Feb 20 07:06:02 crc kubenswrapper[5094]: W0220 07:06:02.717859 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode16affe7_2f3d_438d_98b8_deedcf70053c.slice/crio-5a6851df1ef68fb3b4857718de007871d9f5899041eb1b42abb44d6c0d12abe2 WatchSource:0}: Error finding container 5a6851df1ef68fb3b4857718de007871d9f5899041eb1b42abb44d6c0d12abe2: Status 404 returned error can't find the container with id 5a6851df1ef68fb3b4857718de007871d9f5899041eb1b42abb44d6c0d12abe2 Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.771586 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5" event={"ID":"91df42cd-5b04-4d4c-862b-c9ccbb1b488d","Type":"ContainerStarted","Data":"b6889d1465b6da017b2081be2bef0da1f6bef33f4ba683f9fc9daa54a5c436ed"} Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.776831 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-kfk8f" event={"ID":"e16affe7-2f3d-438d-98b8-deedcf70053c","Type":"ContainerStarted","Data":"5a6851df1ef68fb3b4857718de007871d9f5899041eb1b42abb44d6c0d12abe2"} Feb 20 07:06:02 crc 
kubenswrapper[5094]: I0220 07:06:02.904948 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.906794 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.911349 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.911635 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.911686 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.911729 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-9fv2z" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.911727 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.916808 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.917034 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 20 07:06:02 crc kubenswrapper[5094]: I0220 07:06:02.931184 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.086189 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " 
pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.086290 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.086328 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.086347 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.086372 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.086399 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-968ct\" (UniqueName: \"kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-kube-api-access-968ct\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc 
kubenswrapper[5094]: I0220 07:06:03.086443 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.086469 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.086494 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.086518 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.086536 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.187909 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.188144 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.188361 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.188387 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-968ct\" (UniqueName: \"kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-kube-api-access-968ct\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.188608 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.188649 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.188682 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.188727 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.188749 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.188787 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.188824 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" 
Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.189122 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.189210 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.189431 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.189470 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.190406 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.190973 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.196003 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.196300 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.197032 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.205670 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.213515 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-968ct\" (UniqueName: \"kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-kube-api-access-968ct\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 
20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.237285 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.241874 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-server-0\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: W0220 07:06:03.254053 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda829c6b3_7069_4544_90dc_40ae83aba524.slice/crio-85bfead9f46b6ccf4ed616ed7699233ce7fd6b5f310fe2c58dee08099b82f9b7 WatchSource:0}: Error finding container 85bfead9f46b6ccf4ed616ed7699233ce7fd6b5f310fe2c58dee08099b82f9b7: Status 404 returned error can't find the container with id 85bfead9f46b6ccf4ed616ed7699233ce7fd6b5f310fe2c58dee08099b82f9b7 Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.545395 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 20 07:06:03 crc kubenswrapper[5094]: I0220 07:06:03.792289 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a829c6b3-7069-4544-90dc-40ae83aba524","Type":"ContainerStarted","Data":"85bfead9f46b6ccf4ed616ed7699233ce7fd6b5f310fe2c58dee08099b82f9b7"} Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.100438 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 07:06:04 crc kubenswrapper[5094]: W0220 07:06:04.127223 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod219c74d6_9f45_4bf8_8c67_acdea3c0fab3.slice/crio-a54c1c15c6f9b215b75b1006c2e0a8430344b718fe1307994740a6fb6ec55a17 WatchSource:0}: Error finding container a54c1c15c6f9b215b75b1006c2e0a8430344b718fe1307994740a6fb6ec55a17: Status 404 returned error can't find the container with id a54c1c15c6f9b215b75b1006c2e0a8430344b718fe1307994740a6fb6ec55a17 Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.279909 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.281396 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.285002 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.287458 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-cp9vz" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.287472 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.287931 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.292514 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.302283 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.413437 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d3ab399-3fc6-47e1-995c-5e855c554e9e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.413511 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3ab399-3fc6-47e1-995c-5e855c554e9e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.413579 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-b5mtb\" (UniqueName: \"kubernetes.io/projected/3d3ab399-3fc6-47e1-995c-5e855c554e9e-kube-api-access-b5mtb\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.413674 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-kolla-config\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.413726 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3d3ab399-3fc6-47e1-995c-5e855c554e9e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.413963 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.414057 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-config-data-default\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.414149 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.525620 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.525681 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-config-data-default\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.525735 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.525784 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d3ab399-3fc6-47e1-995c-5e855c554e9e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.525819 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3ab399-3fc6-47e1-995c-5e855c554e9e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.525839 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5mtb\" (UniqueName: \"kubernetes.io/projected/3d3ab399-3fc6-47e1-995c-5e855c554e9e-kube-api-access-b5mtb\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.525898 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-kolla-config\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.525923 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3d3ab399-3fc6-47e1-995c-5e855c554e9e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.526184 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.527092 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3d3ab399-3fc6-47e1-995c-5e855c554e9e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc 
kubenswrapper[5094]: I0220 07:06:04.533797 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-kolla-config\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.534489 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-config-data-default\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.534939 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.537963 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d3ab399-3fc6-47e1-995c-5e855c554e9e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.539604 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3ab399-3fc6-47e1-995c-5e855c554e9e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.545173 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.573254 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5mtb\" (UniqueName: \"kubernetes.io/projected/3d3ab399-3fc6-47e1-995c-5e855c554e9e-kube-api-access-b5mtb\") pod \"openstack-galera-0\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.617562 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 20 07:06:04 crc kubenswrapper[5094]: I0220 07:06:04.830280 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"219c74d6-9f45-4bf8-8c67-acdea3c0fab3","Type":"ContainerStarted","Data":"a54c1c15c6f9b215b75b1006c2e0a8430344b718fe1307994740a6fb6ec55a17"} Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.615319 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.616936 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.621808 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.622079 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-bng96" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.622208 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.625117 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.631233 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.752002 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fss5r\" (UniqueName: \"kubernetes.io/projected/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-kube-api-access-fss5r\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.752112 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.752161 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.752197 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.752262 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.752350 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.752370 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.752424 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.859234 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.859342 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.859393 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.859420 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.859465 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.859493 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fss5r\" (UniqueName: \"kubernetes.io/projected/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-kube-api-access-fss5r\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.859560 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.859598 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.863178 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.863914 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-kolla-config\") pod 
\"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.870065 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.871376 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.872428 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.884651 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.891125 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fss5r\" (UniqueName: \"kubernetes.io/projected/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-kube-api-access-fss5r\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " 
pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.895812 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.899792 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.901631 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.901744 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.907311 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-vr5c7" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.907633 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.907685 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.940831 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-cell1-galera-0\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:05 crc kubenswrapper[5094]: I0220 07:06:05.961817 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 07:06:06.064908 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7dd0ff85-ae3a-4035-a096-fea5952b19a7-config-data\") pod \"memcached-0\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " pod="openstack/memcached-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 07:06:06.064958 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dd0ff85-ae3a-4035-a096-fea5952b19a7-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " pod="openstack/memcached-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 07:06:06.065019 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7dd0ff85-ae3a-4035-a096-fea5952b19a7-kolla-config\") pod \"memcached-0\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " pod="openstack/memcached-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 07:06:06.065085 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knph6\" (UniqueName: \"kubernetes.io/projected/7dd0ff85-ae3a-4035-a096-fea5952b19a7-kube-api-access-knph6\") pod \"memcached-0\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " pod="openstack/memcached-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 07:06:06.065118 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd0ff85-ae3a-4035-a096-fea5952b19a7-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " pod="openstack/memcached-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 
07:06:06.167727 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7dd0ff85-ae3a-4035-a096-fea5952b19a7-config-data\") pod \"memcached-0\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " pod="openstack/memcached-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 07:06:06.167779 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dd0ff85-ae3a-4035-a096-fea5952b19a7-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " pod="openstack/memcached-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 07:06:06.167838 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7dd0ff85-ae3a-4035-a096-fea5952b19a7-kolla-config\") pod \"memcached-0\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " pod="openstack/memcached-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 07:06:06.167914 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knph6\" (UniqueName: \"kubernetes.io/projected/7dd0ff85-ae3a-4035-a096-fea5952b19a7-kube-api-access-knph6\") pod \"memcached-0\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " pod="openstack/memcached-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 07:06:06.167943 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd0ff85-ae3a-4035-a096-fea5952b19a7-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " pod="openstack/memcached-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 07:06:06.170066 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7dd0ff85-ae3a-4035-a096-fea5952b19a7-kolla-config\") pod 
\"memcached-0\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " pod="openstack/memcached-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 07:06:06.171043 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7dd0ff85-ae3a-4035-a096-fea5952b19a7-config-data\") pod \"memcached-0\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " pod="openstack/memcached-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 07:06:06.171600 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd0ff85-ae3a-4035-a096-fea5952b19a7-combined-ca-bundle\") pod \"memcached-0\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " pod="openstack/memcached-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 07:06:06.176814 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dd0ff85-ae3a-4035-a096-fea5952b19a7-memcached-tls-certs\") pod \"memcached-0\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " pod="openstack/memcached-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 07:06:06.191966 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knph6\" (UniqueName: \"kubernetes.io/projected/7dd0ff85-ae3a-4035-a096-fea5952b19a7-kube-api-access-knph6\") pod \"memcached-0\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " pod="openstack/memcached-0" Feb 20 07:06:06 crc kubenswrapper[5094]: I0220 07:06:06.292093 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 20 07:06:08 crc kubenswrapper[5094]: I0220 07:06:08.152866 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 07:06:08 crc kubenswrapper[5094]: I0220 07:06:08.154234 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 07:06:08 crc kubenswrapper[5094]: I0220 07:06:08.159524 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-qqz9g" Feb 20 07:06:08 crc kubenswrapper[5094]: I0220 07:06:08.167989 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 07:06:08 crc kubenswrapper[5094]: I0220 07:06:08.305343 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7k8s\" (UniqueName: \"kubernetes.io/projected/715094df-6704-4332-b990-95d790fd5ff1-kube-api-access-b7k8s\") pod \"kube-state-metrics-0\" (UID: \"715094df-6704-4332-b990-95d790fd5ff1\") " pod="openstack/kube-state-metrics-0" Feb 20 07:06:08 crc kubenswrapper[5094]: I0220 07:06:08.406358 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7k8s\" (UniqueName: \"kubernetes.io/projected/715094df-6704-4332-b990-95d790fd5ff1-kube-api-access-b7k8s\") pod \"kube-state-metrics-0\" (UID: \"715094df-6704-4332-b990-95d790fd5ff1\") " pod="openstack/kube-state-metrics-0" Feb 20 07:06:08 crc kubenswrapper[5094]: I0220 07:06:08.434180 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7k8s\" (UniqueName: \"kubernetes.io/projected/715094df-6704-4332-b990-95d790fd5ff1-kube-api-access-b7k8s\") pod \"kube-state-metrics-0\" (UID: \"715094df-6704-4332-b990-95d790fd5ff1\") " pod="openstack/kube-state-metrics-0" Feb 20 07:06:08 crc kubenswrapper[5094]: I0220 07:06:08.475650 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.490287 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lvlr2"] Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.493175 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.496851 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.497132 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.502058 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-vpr9w" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.536249 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lvlr2"] Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.554803 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-tj42x"] Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.557883 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.567724 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tj42x"] Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.593776 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-run\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.594141 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ecb5d91-5ba1-457e-af42-0d78c8643250-ovn-controller-tls-certs\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.594267 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ecb5d91-5ba1-457e-af42-0d78c8643250-combined-ca-bundle\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.594379 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-run-ovn\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.594469 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5tkf\" (UniqueName: 
\"kubernetes.io/projected/8ecb5d91-5ba1-457e-af42-0d78c8643250-kube-api-access-s5tkf\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.594557 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-log-ovn\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.594983 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ecb5d91-5ba1-457e-af42-0d78c8643250-scripts\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.699265 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ecb5d91-5ba1-457e-af42-0d78c8643250-ovn-controller-tls-certs\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.699369 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-log\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.699413 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8ecb5d91-5ba1-457e-af42-0d78c8643250-combined-ca-bundle\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.699444 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-run-ovn\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.699469 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5tkf\" (UniqueName: \"kubernetes.io/projected/8ecb5d91-5ba1-457e-af42-0d78c8643250-kube-api-access-s5tkf\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.699495 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-log-ovn\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.699612 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ecb5d91-5ba1-457e-af42-0d78c8643250-scripts\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.699648 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-lib\") pod \"ovn-controller-ovs-tj42x\" (UID: 
\"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.699674 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-etc-ovs\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.699857 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-scripts\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.699878 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88866\" (UniqueName: \"kubernetes.io/projected/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-kube-api-access-88866\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.699906 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-run\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.699931 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-run\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " 
pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.701636 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-run\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.701820 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-log-ovn\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.708326 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-run-ovn\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.708359 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ecb5d91-5ba1-457e-af42-0d78c8643250-scripts\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.735992 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ecb5d91-5ba1-457e-af42-0d78c8643250-combined-ca-bundle\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.739317 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5tkf\" 
(UniqueName: \"kubernetes.io/projected/8ecb5d91-5ba1-457e-af42-0d78c8643250-kube-api-access-s5tkf\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.753353 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ecb5d91-5ba1-457e-af42-0d78c8643250-ovn-controller-tls-certs\") pod \"ovn-controller-lvlr2\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.801367 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-log\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.801494 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-lib\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.801513 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-etc-ovs\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.801560 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-scripts\") pod \"ovn-controller-ovs-tj42x\" (UID: 
\"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.801578 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88866\" (UniqueName: \"kubernetes.io/projected/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-kube-api-access-88866\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.801601 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-run\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.801730 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-run\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.801876 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-log\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.801898 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-lib\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.801930 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-etc-ovs\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.803822 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-scripts\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.818105 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.844577 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88866\" (UniqueName: \"kubernetes.io/projected/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-kube-api-access-88866\") pod \"ovn-controller-ovs-tj42x\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:06:11 crc kubenswrapper[5094]: I0220 07:06:11.924158 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.056635 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.058937 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.063911 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.064222 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-26v89" Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.064582 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.064951 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.072960 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.074134 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.209009 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.209065 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4rzz\" (UniqueName: \"kubernetes.io/projected/cadd011d-8dde-4346-8608-c5f74376204d-kube-api-access-s4rzz\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.209094 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/cadd011d-8dde-4346-8608-c5f74376204d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.209120 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.209139 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.209189 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.209209 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cadd011d-8dde-4346-8608-c5f74376204d-config\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.209232 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/cadd011d-8dde-4346-8608-c5f74376204d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.313027 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.313763 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cadd011d-8dde-4346-8608-c5f74376204d-config\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.313825 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cadd011d-8dde-4346-8608-c5f74376204d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.313893 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.313928 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4rzz\" (UniqueName: \"kubernetes.io/projected/cadd011d-8dde-4346-8608-c5f74376204d-kube-api-access-s4rzz\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 
07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.313967 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cadd011d-8dde-4346-8608-c5f74376204d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.314003 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.314023 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.314414 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cadd011d-8dde-4346-8608-c5f74376204d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.314944 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.318719 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/cadd011d-8dde-4346-8608-c5f74376204d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.318956 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cadd011d-8dde-4346-8608-c5f74376204d-config\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.322520 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.326543 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.339889 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4rzz\" (UniqueName: \"kubernetes.io/projected/cadd011d-8dde-4346-8608-c5f74376204d-kube-api-access-s4rzz\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.348798 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:12 crc 
kubenswrapper[5094]: I0220 07:06:12.351882 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:12 crc kubenswrapper[5094]: I0220 07:06:12.391906 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:14 crc kubenswrapper[5094]: I0220 07:06:14.883866 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.705875 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.709043 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.713640 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.714010 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.714051 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-9xcbp" Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.716800 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.719530 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.812537 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-config\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.812662 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.813963 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.815458 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.815501 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.815674 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55pw9\" (UniqueName: 
\"kubernetes.io/projected/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-kube-api-access-55pw9\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.815849 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.815986 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.917830 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.918045 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.918214 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.918286 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.918335 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55pw9\" (UniqueName: \"kubernetes.io/projected/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-kube-api-access-55pw9\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.918367 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.918387 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.918786 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 
07:06:15.918995 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-config\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.919217 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.919899 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-config\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.921534 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.926197 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.926659 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-metrics-certs-tls-certs\") pod 
\"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.926761 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.936038 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55pw9\" (UniqueName: \"kubernetes.io/projected/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-kube-api-access-55pw9\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:15 crc kubenswrapper[5094]: I0220 07:06:15.960300 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:16 crc kubenswrapper[5094]: I0220 07:06:16.030450 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:20 crc kubenswrapper[5094]: I0220 07:06:20.036179 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"30e79ba9-83fc-4246-9fb2-7136f6ae30a5","Type":"ContainerStarted","Data":"ab6b9e58f533ca8387a46ffe5e0cb304794c4450b59c803c80417c57e86e76ef"} Feb 20 07:06:20 crc kubenswrapper[5094]: E0220 07:06:20.615859 5094 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 20 07:06:20 crc kubenswrapper[5094]: E0220 07:06:20.616091 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x746k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-67ff45466c-kfk8f_openstack(e16affe7-2f3d-438d-98b8-deedcf70053c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 07:06:20 crc kubenswrapper[5094]: E0220 07:06:20.617282 5094 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-67ff45466c-kfk8f" podUID="e16affe7-2f3d-438d-98b8-deedcf70053c" Feb 20 07:06:20 crc kubenswrapper[5094]: E0220 07:06:20.631881 5094 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 20 07:06:20 crc kubenswrapper[5094]: E0220 07:06:20.632071 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2zn97,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6d6b9fdb89-jpfm5_openstack(91df42cd-5b04-4d4c-862b-c9ccbb1b488d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 07:06:20 crc kubenswrapper[5094]: E0220 07:06:20.633176 5094 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5" podUID="91df42cd-5b04-4d4c-862b-c9ccbb1b488d" Feb 20 07:06:20 crc kubenswrapper[5094]: E0220 07:06:20.637394 5094 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 20 07:06:20 crc kubenswrapper[5094]: E0220 07:06:20.637612 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gc7th,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6fcf94d689-z44mn_openstack(ac34a5f7-69bb-416d-994d-056f5f1513e8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 07:06:20 crc kubenswrapper[5094]: E0220 07:06:20.640025 5094 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6fcf94d689-z44mn" podUID="ac34a5f7-69bb-416d-994d-056f5f1513e8" Feb 20 07:06:20 crc kubenswrapper[5094]: E0220 07:06:20.762417 5094 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 20 07:06:20 crc kubenswrapper[5094]: E0220 07:06:20.762948 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vmdmc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-855cbc58c5-tz76m_openstack(9654de45-6750-4105-a4db-050ae521d91c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 07:06:20 crc kubenswrapper[5094]: E0220 07:06:20.764262 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-855cbc58c5-tz76m" podUID="9654de45-6750-4105-a4db-050ae521d91c" Feb 20 07:06:21 crc kubenswrapper[5094]: E0220 07:06:21.047298 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2\\\"\"" pod="openstack/dnsmasq-dns-67ff45466c-kfk8f" podUID="e16affe7-2f3d-438d-98b8-deedcf70053c" Feb 20 07:06:21 crc kubenswrapper[5094]: E0220 07:06:21.047352 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2\\\"\"" pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5" podUID="91df42cd-5b04-4d4c-862b-c9ccbb1b488d" Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.169126 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.204330 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 07:06:21 crc kubenswrapper[5094]: W0220 07:06:21.207343 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod715094df_6704_4332_b990_95d790fd5ff1.slice/crio-3204759b1dff0695d3d73b3d9357f50ef02ad54eb0c3291abc0e31ecef319da3 WatchSource:0}: Error finding container 3204759b1dff0695d3d73b3d9357f50ef02ad54eb0c3291abc0e31ecef319da3: Status 404 returned error can't find the container with id 3204759b1dff0695d3d73b3d9357f50ef02ad54eb0c3291abc0e31ecef319da3 Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.288224 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-tj42x"] Feb 20 
07:06:21 crc kubenswrapper[5094]: W0220 07:06:21.306687 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07969dc9_1a07_455c_b6c4_6b5f3bb23cb9.slice/crio-6ed1fedc1eeb0edbe644ddd0c1faadaf13e34c06761ccbb5563c487421983aa5 WatchSource:0}: Error finding container 6ed1fedc1eeb0edbe644ddd0c1faadaf13e34c06761ccbb5563c487421983aa5: Status 404 returned error can't find the container with id 6ed1fedc1eeb0edbe644ddd0c1faadaf13e34c06761ccbb5563c487421983aa5 Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.509493 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 20 07:06:21 crc kubenswrapper[5094]: W0220 07:06:21.514955 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcadd011d_8dde_4346_8608_c5f74376204d.slice/crio-066aae04f2be17dda3fe7b7198c86cbe95ffc64bcd807181570b9426d3daf04a WatchSource:0}: Error finding container 066aae04f2be17dda3fe7b7198c86cbe95ffc64bcd807181570b9426d3daf04a: Status 404 returned error can't find the container with id 066aae04f2be17dda3fe7b7198c86cbe95ffc64bcd807181570b9426d3daf04a Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.557311 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-tz76m" Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.569178 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lvlr2"] Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.582882 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-z44mn" Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.586668 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 20 07:06:21 crc kubenswrapper[5094]: W0220 07:06:21.600032 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ecb5d91_5ba1_457e_af42_0d78c8643250.slice/crio-af003a9a5a57c0d56da81db8c0252ac6e5ab00ee16e46167572891f004ac0dbf WatchSource:0}: Error finding container af003a9a5a57c0d56da81db8c0252ac6e5ab00ee16e46167572891f004ac0dbf: Status 404 returned error can't find the container with id af003a9a5a57c0d56da81db8c0252ac6e5ab00ee16e46167572891f004ac0dbf Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.671570 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac34a5f7-69bb-416d-994d-056f5f1513e8-dns-svc\") pod \"ac34a5f7-69bb-416d-994d-056f5f1513e8\" (UID: \"ac34a5f7-69bb-416d-994d-056f5f1513e8\") " Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.671882 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac34a5f7-69bb-416d-994d-056f5f1513e8-config\") pod \"ac34a5f7-69bb-416d-994d-056f5f1513e8\" (UID: \"ac34a5f7-69bb-416d-994d-056f5f1513e8\") " Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.672017 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gc7th\" (UniqueName: \"kubernetes.io/projected/ac34a5f7-69bb-416d-994d-056f5f1513e8-kube-api-access-gc7th\") pod \"ac34a5f7-69bb-416d-994d-056f5f1513e8\" (UID: \"ac34a5f7-69bb-416d-994d-056f5f1513e8\") " Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.672076 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9654de45-6750-4105-a4db-050ae521d91c-config\") pod \"9654de45-6750-4105-a4db-050ae521d91c\" (UID: \"9654de45-6750-4105-a4db-050ae521d91c\") " Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.672110 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmdmc\" (UniqueName: \"kubernetes.io/projected/9654de45-6750-4105-a4db-050ae521d91c-kube-api-access-vmdmc\") pod \"9654de45-6750-4105-a4db-050ae521d91c\" (UID: \"9654de45-6750-4105-a4db-050ae521d91c\") " Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.672587 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac34a5f7-69bb-416d-994d-056f5f1513e8-config" (OuterVolumeSpecName: "config") pod "ac34a5f7-69bb-416d-994d-056f5f1513e8" (UID: "ac34a5f7-69bb-416d-994d-056f5f1513e8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.672449 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac34a5f7-69bb-416d-994d-056f5f1513e8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ac34a5f7-69bb-416d-994d-056f5f1513e8" (UID: "ac34a5f7-69bb-416d-994d-056f5f1513e8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.672685 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9654de45-6750-4105-a4db-050ae521d91c-config" (OuterVolumeSpecName: "config") pod "9654de45-6750-4105-a4db-050ae521d91c" (UID: "9654de45-6750-4105-a4db-050ae521d91c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.679519 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9654de45-6750-4105-a4db-050ae521d91c-kube-api-access-vmdmc" (OuterVolumeSpecName: "kube-api-access-vmdmc") pod "9654de45-6750-4105-a4db-050ae521d91c" (UID: "9654de45-6750-4105-a4db-050ae521d91c"). InnerVolumeSpecName "kube-api-access-vmdmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.679589 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac34a5f7-69bb-416d-994d-056f5f1513e8-kube-api-access-gc7th" (OuterVolumeSpecName: "kube-api-access-gc7th") pod "ac34a5f7-69bb-416d-994d-056f5f1513e8" (UID: "ac34a5f7-69bb-416d-994d-056f5f1513e8"). InnerVolumeSpecName "kube-api-access-gc7th". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.774234 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmdmc\" (UniqueName: \"kubernetes.io/projected/9654de45-6750-4105-a4db-050ae521d91c-kube-api-access-vmdmc\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.774285 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ac34a5f7-69bb-416d-994d-056f5f1513e8-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.774298 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac34a5f7-69bb-416d-994d-056f5f1513e8-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.774313 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gc7th\" (UniqueName: 
\"kubernetes.io/projected/ac34a5f7-69bb-416d-994d-056f5f1513e8-kube-api-access-gc7th\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:21 crc kubenswrapper[5094]: I0220 07:06:21.774325 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9654de45-6750-4105-a4db-050ae521d91c-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:22 crc kubenswrapper[5094]: I0220 07:06:22.054165 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"7dd0ff85-ae3a-4035-a096-fea5952b19a7","Type":"ContainerStarted","Data":"c905b3c584b4bdb1a44662bb87e5389e8137126047bfc23039edbbaea024118a"} Feb 20 07:06:22 crc kubenswrapper[5094]: I0220 07:06:22.056748 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lvlr2" event={"ID":"8ecb5d91-5ba1-457e-af42-0d78c8643250","Type":"ContainerStarted","Data":"af003a9a5a57c0d56da81db8c0252ac6e5ab00ee16e46167572891f004ac0dbf"} Feb 20 07:06:22 crc kubenswrapper[5094]: I0220 07:06:22.058563 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"cadd011d-8dde-4346-8608-c5f74376204d","Type":"ContainerStarted","Data":"066aae04f2be17dda3fe7b7198c86cbe95ffc64bcd807181570b9426d3daf04a"} Feb 20 07:06:22 crc kubenswrapper[5094]: I0220 07:06:22.060472 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tj42x" event={"ID":"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9","Type":"ContainerStarted","Data":"6ed1fedc1eeb0edbe644ddd0c1faadaf13e34c06761ccbb5563c487421983aa5"} Feb 20 07:06:22 crc kubenswrapper[5094]: I0220 07:06:22.062067 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-z44mn" Feb 20 07:06:22 crc kubenswrapper[5094]: I0220 07:06:22.062071 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fcf94d689-z44mn" event={"ID":"ac34a5f7-69bb-416d-994d-056f5f1513e8","Type":"ContainerDied","Data":"349408517b5e79dc8415f7cc916f79548181c386e9ff1081d09f8f6e492cac5c"} Feb 20 07:06:22 crc kubenswrapper[5094]: I0220 07:06:22.066939 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-855cbc58c5-tz76m" event={"ID":"9654de45-6750-4105-a4db-050ae521d91c","Type":"ContainerDied","Data":"4ca5e7f2f7ecf776f59bc6a0cb3299deb8526fcf123e1301653c3165cbcd47dd"} Feb 20 07:06:22 crc kubenswrapper[5094]: I0220 07:06:22.066979 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-tz76m" Feb 20 07:06:22 crc kubenswrapper[5094]: I0220 07:06:22.069306 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"715094df-6704-4332-b990-95d790fd5ff1","Type":"ContainerStarted","Data":"3204759b1dff0695d3d73b3d9357f50ef02ad54eb0c3291abc0e31ecef319da3"} Feb 20 07:06:22 crc kubenswrapper[5094]: I0220 07:06:22.072098 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3d3ab399-3fc6-47e1-995c-5e855c554e9e","Type":"ContainerStarted","Data":"69690ca4b7abd1bc1955c808ad93fa95a3a579fa31419e9ead102d78d2680915"} Feb 20 07:06:22 crc kubenswrapper[5094]: I0220 07:06:22.114858 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-z44mn"] Feb 20 07:06:22 crc kubenswrapper[5094]: I0220 07:06:22.131163 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-z44mn"] Feb 20 07:06:22 crc kubenswrapper[5094]: I0220 07:06:22.148083 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-tz76m"] Feb 20 
07:06:22 crc kubenswrapper[5094]: I0220 07:06:22.155502 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-tz76m"] Feb 20 07:06:22 crc kubenswrapper[5094]: I0220 07:06:22.577047 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 20 07:06:23 crc kubenswrapper[5094]: I0220 07:06:23.102533 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a829c6b3-7069-4544-90dc-40ae83aba524","Type":"ContainerStarted","Data":"24c496b6fb0954ae89602f61cc9646541943f3eefe9945f4d983f81e20a69e1c"} Feb 20 07:06:23 crc kubenswrapper[5094]: I0220 07:06:23.105415 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"219c74d6-9f45-4bf8-8c67-acdea3c0fab3","Type":"ContainerStarted","Data":"0b471ed3a22d48304ba36ebcf4e6acbda6b19074d6e3a72832e1dda4c6f2f145"} Feb 20 07:06:23 crc kubenswrapper[5094]: W0220 07:06:23.524589 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3ec8857_5a33_44ea_bdd0_97b343adfc8a.slice/crio-c486e86d620d37448242e0209a1c7bf77c53c9f654c68f860c22b2ce1ff67ce9 WatchSource:0}: Error finding container c486e86d620d37448242e0209a1c7bf77c53c9f654c68f860c22b2ce1ff67ce9: Status 404 returned error can't find the container with id c486e86d620d37448242e0209a1c7bf77c53c9f654c68f860c22b2ce1ff67ce9 Feb 20 07:06:23 crc kubenswrapper[5094]: I0220 07:06:23.850678 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9654de45-6750-4105-a4db-050ae521d91c" path="/var/lib/kubelet/pods/9654de45-6750-4105-a4db-050ae521d91c/volumes" Feb 20 07:06:23 crc kubenswrapper[5094]: I0220 07:06:23.851210 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac34a5f7-69bb-416d-994d-056f5f1513e8" path="/var/lib/kubelet/pods/ac34a5f7-69bb-416d-994d-056f5f1513e8/volumes" Feb 20 07:06:24 crc 
kubenswrapper[5094]: I0220 07:06:24.115971 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d3ec8857-5a33-44ea-bdd0-97b343adfc8a","Type":"ContainerStarted","Data":"c486e86d620d37448242e0209a1c7bf77c53c9f654c68f860c22b2ce1ff67ce9"} Feb 20 07:06:30 crc kubenswrapper[5094]: I0220 07:06:30.195450 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"715094df-6704-4332-b990-95d790fd5ff1","Type":"ContainerStarted","Data":"ffe736a6fc24efeb2e9463249f14ea2fb642857b322f6c26497919b56fb7314a"} Feb 20 07:06:30 crc kubenswrapper[5094]: I0220 07:06:30.196170 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 20 07:06:30 crc kubenswrapper[5094]: I0220 07:06:30.198799 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3d3ab399-3fc6-47e1-995c-5e855c554e9e","Type":"ContainerStarted","Data":"c2fe8328d6470157d2d40189e9eccecbe7d439584278e384fea09cbd5a11eb25"} Feb 20 07:06:30 crc kubenswrapper[5094]: I0220 07:06:30.201505 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"7dd0ff85-ae3a-4035-a096-fea5952b19a7","Type":"ContainerStarted","Data":"631fbada038605f5c51cfb450102e200cb6c919e6e8732fe4d271c4b0f19cc19"} Feb 20 07:06:30 crc kubenswrapper[5094]: I0220 07:06:30.202056 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 20 07:06:30 crc kubenswrapper[5094]: I0220 07:06:30.203993 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d3ec8857-5a33-44ea-bdd0-97b343adfc8a","Type":"ContainerStarted","Data":"87a953c9ab5036126a76e97a66b02ef98a2f8720e10ee1bc11a373190ac13d0d"} Feb 20 07:06:30 crc kubenswrapper[5094]: I0220 07:06:30.205848 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"30e79ba9-83fc-4246-9fb2-7136f6ae30a5","Type":"ContainerStarted","Data":"1a298d17b5fd33f9f36c09a0339c08e7faff16875504e0c410ecbbdc1176b9c8"} Feb 20 07:06:30 crc kubenswrapper[5094]: I0220 07:06:30.211255 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lvlr2" event={"ID":"8ecb5d91-5ba1-457e-af42-0d78c8643250","Type":"ContainerStarted","Data":"4f0a991715f978e32d8d86416765ee44cb6bb02db7b4482e7725dce37e0de301"} Feb 20 07:06:30 crc kubenswrapper[5094]: I0220 07:06:30.219635 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=13.653726319 podStartE2EDuration="22.219611684s" podCreationTimestamp="2026-02-20 07:06:08 +0000 UTC" firstStartedPulling="2026-02-20 07:06:21.210328619 +0000 UTC m=+1196.082955330" lastFinishedPulling="2026-02-20 07:06:29.776213984 +0000 UTC m=+1204.648840695" observedRunningTime="2026-02-20 07:06:30.215932516 +0000 UTC m=+1205.088559227" watchObservedRunningTime="2026-02-20 07:06:30.219611684 +0000 UTC m=+1205.092238395" Feb 20 07:06:30 crc kubenswrapper[5094]: I0220 07:06:30.240041 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"cadd011d-8dde-4346-8608-c5f74376204d","Type":"ContainerStarted","Data":"333434623dc65ba599c492292fd799a78a2d1d5581438ea036aa6124a9583e68"} Feb 20 07:06:30 crc kubenswrapper[5094]: I0220 07:06:30.240738 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-lvlr2" podStartSLOduration=11.887803551 podStartE2EDuration="19.240683698s" podCreationTimestamp="2026-02-20 07:06:11 +0000 UTC" firstStartedPulling="2026-02-20 07:06:21.606231349 +0000 UTC m=+1196.478858060" lastFinishedPulling="2026-02-20 07:06:28.959111496 +0000 UTC m=+1203.831738207" observedRunningTime="2026-02-20 07:06:30.23616366 +0000 UTC m=+1205.108790371" watchObservedRunningTime="2026-02-20 07:06:30.240683698 +0000 UTC 
m=+1205.113310409" Feb 20 07:06:30 crc kubenswrapper[5094]: I0220 07:06:30.244447 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tj42x" event={"ID":"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9","Type":"ContainerStarted","Data":"35c773a054d35ff2b9950832498531c8e991fc4664138ffa116e29aba081b5bf"} Feb 20 07:06:30 crc kubenswrapper[5094]: I0220 07:06:30.320766 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=18.106807956 podStartE2EDuration="25.320746315s" podCreationTimestamp="2026-02-20 07:06:05 +0000 UTC" firstStartedPulling="2026-02-20 07:06:21.182155543 +0000 UTC m=+1196.054782254" lastFinishedPulling="2026-02-20 07:06:28.396093892 +0000 UTC m=+1203.268720613" observedRunningTime="2026-02-20 07:06:30.312891648 +0000 UTC m=+1205.185518359" watchObservedRunningTime="2026-02-20 07:06:30.320746315 +0000 UTC m=+1205.193373026" Feb 20 07:06:31 crc kubenswrapper[5094]: I0220 07:06:31.256180 5094 generic.go:334] "Generic (PLEG): container finished" podID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerID="35c773a054d35ff2b9950832498531c8e991fc4664138ffa116e29aba081b5bf" exitCode=0 Feb 20 07:06:31 crc kubenswrapper[5094]: I0220 07:06:31.257807 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tj42x" event={"ID":"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9","Type":"ContainerDied","Data":"35c773a054d35ff2b9950832498531c8e991fc4664138ffa116e29aba081b5bf"} Feb 20 07:06:31 crc kubenswrapper[5094]: I0220 07:06:31.258318 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-lvlr2" Feb 20 07:06:32 crc kubenswrapper[5094]: I0220 07:06:32.272034 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tj42x" event={"ID":"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9","Type":"ContainerStarted","Data":"ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d"} Feb 20 
07:06:33 crc kubenswrapper[5094]: I0220 07:06:33.285544 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"cadd011d-8dde-4346-8608-c5f74376204d","Type":"ContainerStarted","Data":"2a6097f5aaeab1082991a6d095181e3f753899eb178f0d829dd1ed8b74f20e47"} Feb 20 07:06:33 crc kubenswrapper[5094]: I0220 07:06:33.293241 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tj42x" event={"ID":"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9","Type":"ContainerStarted","Data":"381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a"} Feb 20 07:06:33 crc kubenswrapper[5094]: I0220 07:06:33.293547 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:06:33 crc kubenswrapper[5094]: I0220 07:06:33.296173 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d3ec8857-5a33-44ea-bdd0-97b343adfc8a","Type":"ContainerStarted","Data":"d8891902715ab819be070436bd6baa8c209b3026708544fcd16518b5902e4976"} Feb 20 07:06:33 crc kubenswrapper[5094]: I0220 07:06:33.322507 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=12.074221776 podStartE2EDuration="22.322484745s" podCreationTimestamp="2026-02-20 07:06:11 +0000 UTC" firstStartedPulling="2026-02-20 07:06:21.518086088 +0000 UTC m=+1196.390712799" lastFinishedPulling="2026-02-20 07:06:31.766349057 +0000 UTC m=+1206.638975768" observedRunningTime="2026-02-20 07:06:33.319900723 +0000 UTC m=+1208.192527514" watchObservedRunningTime="2026-02-20 07:06:33.322484745 +0000 UTC m=+1208.195111466" Feb 20 07:06:33 crc kubenswrapper[5094]: I0220 07:06:33.342935 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=11.093758263 podStartE2EDuration="19.342914775s" podCreationTimestamp="2026-02-20 07:06:14 +0000 UTC" 
firstStartedPulling="2026-02-20 07:06:23.526733714 +0000 UTC m=+1198.399360425" lastFinishedPulling="2026-02-20 07:06:31.775890226 +0000 UTC m=+1206.648516937" observedRunningTime="2026-02-20 07:06:33.342432283 +0000 UTC m=+1208.215058984" watchObservedRunningTime="2026-02-20 07:06:33.342914775 +0000 UTC m=+1208.215541486" Feb 20 07:06:33 crc kubenswrapper[5094]: I0220 07:06:33.374846 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-tj42x" podStartSLOduration=14.752109311 podStartE2EDuration="22.374815059s" podCreationTimestamp="2026-02-20 07:06:11 +0000 UTC" firstStartedPulling="2026-02-20 07:06:21.309635647 +0000 UTC m=+1196.182262358" lastFinishedPulling="2026-02-20 07:06:28.932341385 +0000 UTC m=+1203.804968106" observedRunningTime="2026-02-20 07:06:33.36403313 +0000 UTC m=+1208.236659841" watchObservedRunningTime="2026-02-20 07:06:33.374815059 +0000 UTC m=+1208.247441780" Feb 20 07:06:33 crc kubenswrapper[5094]: I0220 07:06:33.393679 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:33 crc kubenswrapper[5094]: I0220 07:06:33.439330 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.031210 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.104806 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.309953 5094 generic.go:334] "Generic (PLEG): container finished" podID="3d3ab399-3fc6-47e1-995c-5e855c554e9e" containerID="c2fe8328d6470157d2d40189e9eccecbe7d439584278e384fea09cbd5a11eb25" exitCode=0 Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.310113 5094 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/openstack-galera-0" event={"ID":"3d3ab399-3fc6-47e1-995c-5e855c554e9e","Type":"ContainerDied","Data":"c2fe8328d6470157d2d40189e9eccecbe7d439584278e384fea09cbd5a11eb25"} Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.313600 5094 generic.go:334] "Generic (PLEG): container finished" podID="30e79ba9-83fc-4246-9fb2-7136f6ae30a5" containerID="1a298d17b5fd33f9f36c09a0339c08e7faff16875504e0c410ecbbdc1176b9c8" exitCode=0 Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.313660 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"30e79ba9-83fc-4246-9fb2-7136f6ae30a5","Type":"ContainerDied","Data":"1a298d17b5fd33f9f36c09a0339c08e7faff16875504e0c410ecbbdc1176b9c8"} Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.315505 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.315557 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.315581 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.384645 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.705343 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-kfk8f"] Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.732814 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-gg2h9"] Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.734769 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.737333 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.786680 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-gg2h9"] Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.818936 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b9c664f7-sg5jw"] Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.823035 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.829571 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.864817 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b9c664f7-sg5jw"] Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.869961 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-ovsdbserver-sb\") pod \"dnsmasq-dns-5b9c664f7-sg5jw\" (UID: \"304622e2-8b43-4cc6-b17b-79a03bc74a58\") " pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.870090 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c67x\" (UniqueName: \"kubernetes.io/projected/304622e2-8b43-4cc6-b17b-79a03bc74a58-kube-api-access-4c67x\") pod \"dnsmasq-dns-5b9c664f7-sg5jw\" (UID: \"304622e2-8b43-4cc6-b17b-79a03bc74a58\") " pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.870139 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ca32118b-2e77-4484-b753-3467e1ba8df1-ovn-rundir\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.870162 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca32118b-2e77-4484-b753-3467e1ba8df1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.870211 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca32118b-2e77-4484-b753-3467e1ba8df1-combined-ca-bundle\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.870259 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6d2f\" (UniqueName: \"kubernetes.io/projected/ca32118b-2e77-4484-b753-3467e1ba8df1-kube-api-access-n6d2f\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.870333 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ca32118b-2e77-4484-b753-3467e1ba8df1-ovs-rundir\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc 
kubenswrapper[5094]: I0220 07:06:34.870385 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-config\") pod \"dnsmasq-dns-5b9c664f7-sg5jw\" (UID: \"304622e2-8b43-4cc6-b17b-79a03bc74a58\") " pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.870448 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca32118b-2e77-4484-b753-3467e1ba8df1-config\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.870491 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-dns-svc\") pod \"dnsmasq-dns-5b9c664f7-sg5jw\" (UID: \"304622e2-8b43-4cc6-b17b-79a03bc74a58\") " pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.972566 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c67x\" (UniqueName: \"kubernetes.io/projected/304622e2-8b43-4cc6-b17b-79a03bc74a58-kube-api-access-4c67x\") pod \"dnsmasq-dns-5b9c664f7-sg5jw\" (UID: \"304622e2-8b43-4cc6-b17b-79a03bc74a58\") " pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.972658 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ca32118b-2e77-4484-b753-3467e1ba8df1-ovn-rundir\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.972683 
5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca32118b-2e77-4484-b753-3467e1ba8df1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.972743 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca32118b-2e77-4484-b753-3467e1ba8df1-combined-ca-bundle\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.972816 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6d2f\" (UniqueName: \"kubernetes.io/projected/ca32118b-2e77-4484-b753-3467e1ba8df1-kube-api-access-n6d2f\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.972874 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ca32118b-2e77-4484-b753-3467e1ba8df1-ovs-rundir\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.972906 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-config\") pod \"dnsmasq-dns-5b9c664f7-sg5jw\" (UID: \"304622e2-8b43-4cc6-b17b-79a03bc74a58\") " pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.972963 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca32118b-2e77-4484-b753-3467e1ba8df1-config\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.973006 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-dns-svc\") pod \"dnsmasq-dns-5b9c664f7-sg5jw\" (UID: \"304622e2-8b43-4cc6-b17b-79a03bc74a58\") " pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.973048 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-ovsdbserver-sb\") pod \"dnsmasq-dns-5b9c664f7-sg5jw\" (UID: \"304622e2-8b43-4cc6-b17b-79a03bc74a58\") " pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.974974 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ca32118b-2e77-4484-b753-3467e1ba8df1-ovn-rundir\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.976092 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ca32118b-2e77-4484-b753-3467e1ba8df1-ovs-rundir\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.976652 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-dns-svc\") pod \"dnsmasq-dns-5b9c664f7-sg5jw\" (UID: \"304622e2-8b43-4cc6-b17b-79a03bc74a58\") " pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.977296 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-ovsdbserver-sb\") pod \"dnsmasq-dns-5b9c664f7-sg5jw\" (UID: \"304622e2-8b43-4cc6-b17b-79a03bc74a58\") " pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.977407 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-config\") pod \"dnsmasq-dns-5b9c664f7-sg5jw\" (UID: \"304622e2-8b43-4cc6-b17b-79a03bc74a58\") " pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.977593 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca32118b-2e77-4484-b753-3467e1ba8df1-config\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.984351 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca32118b-2e77-4484-b753-3467e1ba8df1-combined-ca-bundle\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.990180 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c67x\" (UniqueName: \"kubernetes.io/projected/304622e2-8b43-4cc6-b17b-79a03bc74a58-kube-api-access-4c67x\") pod \"dnsmasq-dns-5b9c664f7-sg5jw\" 
(UID: \"304622e2-8b43-4cc6-b17b-79a03bc74a58\") " pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" Feb 20 07:06:34 crc kubenswrapper[5094]: I0220 07:06:34.995475 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6d2f\" (UniqueName: \"kubernetes.io/projected/ca32118b-2e77-4484-b753-3467e1ba8df1-kube-api-access-n6d2f\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.002498 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca32118b-2e77-4484-b753-3467e1ba8df1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-gg2h9\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.108357 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d6b9fdb89-jpfm5"] Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.136015 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-wdrds"] Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.137528 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.137630 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.144074 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.161661 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-wdrds"] Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.177993 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-ovsdbserver-sb\") pod \"dnsmasq-dns-75b7bcc64f-wdrds\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.178093 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-config\") pod \"dnsmasq-dns-75b7bcc64f-wdrds\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.178114 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-dns-svc\") pod \"dnsmasq-dns-75b7bcc64f-wdrds\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.178339 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-ovsdbserver-nb\") pod \"dnsmasq-dns-75b7bcc64f-wdrds\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" 
Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.178444 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-928v8\" (UniqueName: \"kubernetes.io/projected/b40815a7-cffa-44ed-8acf-98261cc7e14c-kube-api-access-928v8\") pod \"dnsmasq-dns-75b7bcc64f-wdrds\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.184248 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.238024 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-kfk8f" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.284195 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e16affe7-2f3d-438d-98b8-deedcf70053c-config\") pod \"e16affe7-2f3d-438d-98b8-deedcf70053c\" (UID: \"e16affe7-2f3d-438d-98b8-deedcf70053c\") " Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.284306 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e16affe7-2f3d-438d-98b8-deedcf70053c-dns-svc\") pod \"e16affe7-2f3d-438d-98b8-deedcf70053c\" (UID: \"e16affe7-2f3d-438d-98b8-deedcf70053c\") " Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.284424 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x746k\" (UniqueName: \"kubernetes.io/projected/e16affe7-2f3d-438d-98b8-deedcf70053c-kube-api-access-x746k\") pod \"e16affe7-2f3d-438d-98b8-deedcf70053c\" (UID: \"e16affe7-2f3d-438d-98b8-deedcf70053c\") " Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.284735 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-ovsdbserver-nb\") pod \"dnsmasq-dns-75b7bcc64f-wdrds\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.284790 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-928v8\" (UniqueName: \"kubernetes.io/projected/b40815a7-cffa-44ed-8acf-98261cc7e14c-kube-api-access-928v8\") pod \"dnsmasq-dns-75b7bcc64f-wdrds\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.284897 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-ovsdbserver-sb\") pod \"dnsmasq-dns-75b7bcc64f-wdrds\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.284950 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-config\") pod \"dnsmasq-dns-75b7bcc64f-wdrds\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.284977 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-dns-svc\") pod \"dnsmasq-dns-75b7bcc64f-wdrds\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.286092 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-dns-svc\") pod \"dnsmasq-dns-75b7bcc64f-wdrds\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.286913 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e16affe7-2f3d-438d-98b8-deedcf70053c-config" (OuterVolumeSpecName: "config") pod "e16affe7-2f3d-438d-98b8-deedcf70053c" (UID: "e16affe7-2f3d-438d-98b8-deedcf70053c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.287302 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e16affe7-2f3d-438d-98b8-deedcf70053c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e16affe7-2f3d-438d-98b8-deedcf70053c" (UID: "e16affe7-2f3d-438d-98b8-deedcf70053c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.290390 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-ovsdbserver-nb\") pod \"dnsmasq-dns-75b7bcc64f-wdrds\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.290536 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-ovsdbserver-sb\") pod \"dnsmasq-dns-75b7bcc64f-wdrds\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.291445 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-config\") pod \"dnsmasq-dns-75b7bcc64f-wdrds\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.292436 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e16affe7-2f3d-438d-98b8-deedcf70053c-kube-api-access-x746k" (OuterVolumeSpecName: "kube-api-access-x746k") pod "e16affe7-2f3d-438d-98b8-deedcf70053c" (UID: "e16affe7-2f3d-438d-98b8-deedcf70053c"). InnerVolumeSpecName "kube-api-access-x746k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.388108 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e16affe7-2f3d-438d-98b8-deedcf70053c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.388138 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x746k\" (UniqueName: \"kubernetes.io/projected/e16affe7-2f3d-438d-98b8-deedcf70053c-kube-api-access-x746k\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.388151 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e16affe7-2f3d-438d-98b8-deedcf70053c-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.389578 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-928v8\" (UniqueName: \"kubernetes.io/projected/b40815a7-cffa-44ed-8acf-98261cc7e14c-kube-api-access-928v8\") pod \"dnsmasq-dns-75b7bcc64f-wdrds\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.409186 5094 generic.go:334] "Generic (PLEG): container finished" 
podID="91df42cd-5b04-4d4c-862b-c9ccbb1b488d" containerID="1b34af8db883238b2e612d791bb36ef95778df03bbf733644cb48e464b571a49" exitCode=0 Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.409289 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5" event={"ID":"91df42cd-5b04-4d4c-862b-c9ccbb1b488d","Type":"ContainerDied","Data":"1b34af8db883238b2e612d791bb36ef95778df03bbf733644cb48e464b571a49"} Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.466338 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3d3ab399-3fc6-47e1-995c-5e855c554e9e","Type":"ContainerStarted","Data":"b512a557484cc85ddc51b76d4e4921f265f1c6139b4d9b37aa3c645ee24c7d27"} Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.477389 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.495580 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-kfk8f" event={"ID":"e16affe7-2f3d-438d-98b8-deedcf70053c","Type":"ContainerDied","Data":"5a6851df1ef68fb3b4857718de007871d9f5899041eb1b42abb44d6c0d12abe2"} Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.495692 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-kfk8f" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.535600 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=24.973406419 podStartE2EDuration="32.535578277s" podCreationTimestamp="2026-02-20 07:06:03 +0000 UTC" firstStartedPulling="2026-02-20 07:06:21.595561745 +0000 UTC m=+1196.468188446" lastFinishedPulling="2026-02-20 07:06:29.157733563 +0000 UTC m=+1204.030360304" observedRunningTime="2026-02-20 07:06:35.514271417 +0000 UTC m=+1210.386898128" watchObservedRunningTime="2026-02-20 07:06:35.535578277 +0000 UTC m=+1210.408204988" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.544142 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"30e79ba9-83fc-4246-9fb2-7136f6ae30a5","Type":"ContainerStarted","Data":"f1ef94c055965af4c60a329875d73bb23ffc248862090105b8c3a3484d931c14"} Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.604253 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=22.278498907 podStartE2EDuration="31.604228191s" podCreationTimestamp="2026-02-20 07:06:04 +0000 UTC" firstStartedPulling="2026-02-20 07:06:19.606502078 +0000 UTC m=+1194.479128819" lastFinishedPulling="2026-02-20 07:06:28.932231402 +0000 UTC m=+1203.804858103" observedRunningTime="2026-02-20 07:06:35.59082104 +0000 UTC m=+1210.463447751" watchObservedRunningTime="2026-02-20 07:06:35.604228191 +0000 UTC m=+1210.476854902" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.666939 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.695789 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-kfk8f"] Feb 20 07:06:35 crc kubenswrapper[5094]: 
I0220 07:06:35.704156 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-kfk8f"] Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.710429 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b9c664f7-sg5jw"] Feb 20 07:06:35 crc kubenswrapper[5094]: W0220 07:06:35.737757 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod304622e2_8b43_4cc6_b17b_79a03bc74a58.slice/crio-31a848846b08ea21777cbda10f64b475f792939e4ec24c88cd9f5397a007043b WatchSource:0}: Error finding container 31a848846b08ea21777cbda10f64b475f792939e4ec24c88cd9f5397a007043b: Status 404 returned error can't find the container with id 31a848846b08ea21777cbda10f64b475f792939e4ec24c88cd9f5397a007043b Feb 20 07:06:35 crc kubenswrapper[5094]: E0220 07:06:35.810226 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode16affe7_2f3d_438d_98b8_deedcf70053c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode16affe7_2f3d_438d_98b8_deedcf70053c.slice/crio-5a6851df1ef68fb3b4857718de007871d9f5899041eb1b42abb44d6c0d12abe2\": RecentStats: unable to find data in memory cache]" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.867864 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e16affe7-2f3d-438d-98b8-deedcf70053c" path="/var/lib/kubelet/pods/e16affe7-2f3d-438d-98b8-deedcf70053c/volumes" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.959941 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.963864 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:35 crc 
kubenswrapper[5094]: I0220 07:06:35.963902 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.964031 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.970069 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.970726 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-4w8fv" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.971000 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 20 07:06:35 crc kubenswrapper[5094]: I0220 07:06:35.972287 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:35.999557 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5"
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.008789 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-gg2h9"]
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.029870 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.072606 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-wdrds"]
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.140576 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zn97\" (UniqueName: \"kubernetes.io/projected/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-kube-api-access-2zn97\") pod \"91df42cd-5b04-4d4c-862b-c9ccbb1b488d\" (UID: \"91df42cd-5b04-4d4c-862b-c9ccbb1b488d\") "
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.140954 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-dns-svc\") pod \"91df42cd-5b04-4d4c-862b-c9ccbb1b488d\" (UID: \"91df42cd-5b04-4d4c-862b-c9ccbb1b488d\") "
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.141253 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-config\") pod \"91df42cd-5b04-4d4c-862b-c9ccbb1b488d\" (UID: \"91df42cd-5b04-4d4c-862b-c9ccbb1b488d\") "
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.142031 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0"
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.142194 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0"
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.142334 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd92d75e-9882-4bb7-a41e-cab9777424e8-config\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0"
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.142444 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0"
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.142526 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cd92d75e-9882-4bb7-a41e-cab9777424e8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0"
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.142631 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fksb\" (UniqueName: \"kubernetes.io/projected/cd92d75e-9882-4bb7-a41e-cab9777424e8-kube-api-access-8fksb\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0"
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.142766 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd92d75e-9882-4bb7-a41e-cab9777424e8-scripts\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0"
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.145842 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-kube-api-access-2zn97" (OuterVolumeSpecName: "kube-api-access-2zn97") pod "91df42cd-5b04-4d4c-862b-c9ccbb1b488d" (UID: "91df42cd-5b04-4d4c-862b-c9ccbb1b488d"). InnerVolumeSpecName "kube-api-access-2zn97". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.160818 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-config" (OuterVolumeSpecName: "config") pod "91df42cd-5b04-4d4c-862b-c9ccbb1b488d" (UID: "91df42cd-5b04-4d4c-862b-c9ccbb1b488d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.164523 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "91df42cd-5b04-4d4c-862b-c9ccbb1b488d" (UID: "91df42cd-5b04-4d4c-862b-c9ccbb1b488d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.245219 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fksb\" (UniqueName: \"kubernetes.io/projected/cd92d75e-9882-4bb7-a41e-cab9777424e8-kube-api-access-8fksb\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0"
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.245542 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd92d75e-9882-4bb7-a41e-cab9777424e8-scripts\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0"
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.245783 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0"
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.245973 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0"
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.246145 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd92d75e-9882-4bb7-a41e-cab9777424e8-config\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0"
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.246318 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0"
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.246480 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cd92d75e-9882-4bb7-a41e-cab9777424e8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0"
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.247523 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.247779 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-config\") on node \"crc\" DevicePath \"\""
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.247916 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zn97\" (UniqueName: \"kubernetes.io/projected/91df42cd-5b04-4d4c-862b-c9ccbb1b488d-kube-api-access-2zn97\") on node \"crc\" DevicePath \"\""
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.247353 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cd92d75e-9882-4bb7-a41e-cab9777424e8-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0"
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.248590 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd92d75e-9882-4bb7-a41e-cab9777424e8-scripts\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0"
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.248758 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd92d75e-9882-4bb7-a41e-cab9777424e8-config\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0"
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.254890 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0"
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.255013 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0"
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.255372 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0"
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.267130 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fksb\" (UniqueName: \"kubernetes.io/projected/cd92d75e-9882-4bb7-a41e-cab9777424e8-kube-api-access-8fksb\") pod \"ovn-northd-0\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " pod="openstack/ovn-northd-0"
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.293852 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.307638 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.554954 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-gg2h9" event={"ID":"ca32118b-2e77-4484-b753-3467e1ba8df1","Type":"ContainerStarted","Data":"ca9c2e6bc34587959334740a47a8ea31e6c558cced5427d8002996ae38da9310"}
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.555339 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-gg2h9" event={"ID":"ca32118b-2e77-4484-b753-3467e1ba8df1","Type":"ContainerStarted","Data":"8220b8fae0a21f561557c1539a6ab409db96f4d4c24a493c8737608b37dc4bc1"}
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.557525 5094 generic.go:334] "Generic (PLEG): container finished" podID="b40815a7-cffa-44ed-8acf-98261cc7e14c" containerID="f3c82a94d6648ba5c9c19bfa7de7c6f84242900741fd3a27425da31ee3d8ec82" exitCode=0
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.557618 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" event={"ID":"b40815a7-cffa-44ed-8acf-98261cc7e14c","Type":"ContainerDied","Data":"f3c82a94d6648ba5c9c19bfa7de7c6f84242900741fd3a27425da31ee3d8ec82"}
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.557683 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" event={"ID":"b40815a7-cffa-44ed-8acf-98261cc7e14c","Type":"ContainerStarted","Data":"ee40b807024485305f67748b89b6f21b26eeabd4f3866126d5f1e66f00f01af7"}
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.559466 5094 generic.go:334] "Generic (PLEG): container finished" podID="304622e2-8b43-4cc6-b17b-79a03bc74a58" containerID="b7767b5d4adb615e7aa00736fc2736b07d8efff1a7193f90cf7bc360858c7390" exitCode=0
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.559676 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" event={"ID":"304622e2-8b43-4cc6-b17b-79a03bc74a58","Type":"ContainerDied","Data":"b7767b5d4adb615e7aa00736fc2736b07d8efff1a7193f90cf7bc360858c7390"}
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.559722 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" event={"ID":"304622e2-8b43-4cc6-b17b-79a03bc74a58","Type":"ContainerStarted","Data":"31a848846b08ea21777cbda10f64b475f792939e4ec24c88cd9f5397a007043b"}
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.563900 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5" event={"ID":"91df42cd-5b04-4d4c-862b-c9ccbb1b488d","Type":"ContainerDied","Data":"b6889d1465b6da017b2081be2bef0da1f6bef33f4ba683f9fc9daa54a5c436ed"}
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.564006 5094 scope.go:117] "RemoveContainer" containerID="1b34af8db883238b2e612d791bb36ef95778df03bbf733644cb48e464b571a49"
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.564015 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d6b9fdb89-jpfm5"
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.581502 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-gg2h9" podStartSLOduration=2.581467015 podStartE2EDuration="2.581467015s" podCreationTimestamp="2026-02-20 07:06:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:06:36.578773951 +0000 UTC m=+1211.451400662" watchObservedRunningTime="2026-02-20 07:06:36.581467015 +0000 UTC m=+1211.454093726"
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.728738 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d6b9fdb89-jpfm5"]
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.738225 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d6b9fdb89-jpfm5"]
Feb 20 07:06:36 crc kubenswrapper[5094]: I0220 07:06:36.829018 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Feb 20 07:06:36 crc kubenswrapper[5094]: W0220 07:06:36.837852 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd92d75e_9882_4bb7_a41e_cab9777424e8.slice/crio-152906df1c7c73cf9f0f0833ac0e10aba2f396ee2bbcbe89c7d223cc8c01b900 WatchSource:0}: Error finding container 152906df1c7c73cf9f0f0833ac0e10aba2f396ee2bbcbe89c7d223cc8c01b900: Status 404 returned error can't find the container with id 152906df1c7c73cf9f0f0833ac0e10aba2f396ee2bbcbe89c7d223cc8c01b900
Feb 20 07:06:37 crc kubenswrapper[5094]: I0220 07:06:37.579770 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cd92d75e-9882-4bb7-a41e-cab9777424e8","Type":"ContainerStarted","Data":"152906df1c7c73cf9f0f0833ac0e10aba2f396ee2bbcbe89c7d223cc8c01b900"}
Feb 20 07:06:37 crc kubenswrapper[5094]: I0220 07:06:37.585634 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" event={"ID":"b40815a7-cffa-44ed-8acf-98261cc7e14c","Type":"ContainerStarted","Data":"32e1079abe135198c0ec21ba4d5ead802416d98d3408a465053fe0c56b193faf"}
Feb 20 07:06:37 crc kubenswrapper[5094]: I0220 07:06:37.585998 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds"
Feb 20 07:06:37 crc kubenswrapper[5094]: I0220 07:06:37.588580 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" event={"ID":"304622e2-8b43-4cc6-b17b-79a03bc74a58","Type":"ContainerStarted","Data":"c21bda5d10e0bfeb59652d25722c1c88fa06556f9418d2ba2d4cd6cd8a9af72e"}
Feb 20 07:06:37 crc kubenswrapper[5094]: I0220 07:06:37.628817 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" podStartSLOduration=2.628792958 podStartE2EDuration="2.628792958s" podCreationTimestamp="2026-02-20 07:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:06:37.624539066 +0000 UTC m=+1212.497165787" watchObservedRunningTime="2026-02-20 07:06:37.628792958 +0000 UTC m=+1212.501419679"
Feb 20 07:06:37 crc kubenswrapper[5094]: I0220 07:06:37.656500 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" podStartSLOduration=3.656478061 podStartE2EDuration="3.656478061s" podCreationTimestamp="2026-02-20 07:06:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:06:37.651497382 +0000 UTC m=+1212.524124103" watchObservedRunningTime="2026-02-20 07:06:37.656478061 +0000 UTC m=+1212.529104762"
Feb 20 07:06:37 crc kubenswrapper[5094]: I0220 07:06:37.855430 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91df42cd-5b04-4d4c-862b-c9ccbb1b488d" path="/var/lib/kubelet/pods/91df42cd-5b04-4d4c-862b-c9ccbb1b488d/volumes"
Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.494405 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.497343 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b9c664f7-sg5jw"]
Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.575597 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-wdn2q"]
Feb 20 07:06:38 crc kubenswrapper[5094]: E0220 07:06:38.576070 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91df42cd-5b04-4d4c-862b-c9ccbb1b488d" containerName="init"
Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.576090 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="91df42cd-5b04-4d4c-862b-c9ccbb1b488d" containerName="init"
Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.576266 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="91df42cd-5b04-4d4c-862b-c9ccbb1b488d" containerName="init"
Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.577263 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-wdn2q"
Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.592346 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-ovsdbserver-nb\") pod \"dnsmasq-dns-689df5d84f-wdn2q\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " pod="openstack/dnsmasq-dns-689df5d84f-wdn2q"
Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.592419 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-config\") pod \"dnsmasq-dns-689df5d84f-wdn2q\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " pod="openstack/dnsmasq-dns-689df5d84f-wdn2q"
Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.592478 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzbc9\" (UniqueName: \"kubernetes.io/projected/53d83e89-d39a-4ed6-ab65-02820d089bec-kube-api-access-mzbc9\") pod \"dnsmasq-dns-689df5d84f-wdn2q\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " pod="openstack/dnsmasq-dns-689df5d84f-wdn2q"
Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.592501 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-dns-svc\") pod \"dnsmasq-dns-689df5d84f-wdn2q\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " pod="openstack/dnsmasq-dns-689df5d84f-wdn2q"
Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.592520 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-ovsdbserver-sb\") pod \"dnsmasq-dns-689df5d84f-wdn2q\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " pod="openstack/dnsmasq-dns-689df5d84f-wdn2q"
Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.603920 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-wdn2q"]
Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.607696 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cd92d75e-9882-4bb7-a41e-cab9777424e8","Type":"ContainerStarted","Data":"aae8dde1864d6aaeb77909eb96b990c39e162c1e7c99e123f9bcf832ed144feb"}
Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.607746 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw"
Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.607759 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cd92d75e-9882-4bb7-a41e-cab9777424e8","Type":"ContainerStarted","Data":"3f7c742aada7ed25bb815042dbf9602749ab67ca6990bc8a7e5d047b638c6680"}
Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.607770 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.694346 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-ovsdbserver-nb\") pod \"dnsmasq-dns-689df5d84f-wdn2q\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " pod="openstack/dnsmasq-dns-689df5d84f-wdn2q"
Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.694437 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-config\") pod \"dnsmasq-dns-689df5d84f-wdn2q\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " pod="openstack/dnsmasq-dns-689df5d84f-wdn2q"
Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.694554 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzbc9\" (UniqueName: \"kubernetes.io/projected/53d83e89-d39a-4ed6-ab65-02820d089bec-kube-api-access-mzbc9\") pod \"dnsmasq-dns-689df5d84f-wdn2q\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " pod="openstack/dnsmasq-dns-689df5d84f-wdn2q"
Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.694577 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-dns-svc\") pod \"dnsmasq-dns-689df5d84f-wdn2q\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " pod="openstack/dnsmasq-dns-689df5d84f-wdn2q"
Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.694599 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-ovsdbserver-sb\") pod \"dnsmasq-dns-689df5d84f-wdn2q\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " pod="openstack/dnsmasq-dns-689df5d84f-wdn2q"
Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.695626 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-config\") pod \"dnsmasq-dns-689df5d84f-wdn2q\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " pod="openstack/dnsmasq-dns-689df5d84f-wdn2q"
Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.695881 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-ovsdbserver-sb\") pod \"dnsmasq-dns-689df5d84f-wdn2q\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " pod="openstack/dnsmasq-dns-689df5d84f-wdn2q"
Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.696397 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-dns-svc\") pod \"dnsmasq-dns-689df5d84f-wdn2q\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " pod="openstack/dnsmasq-dns-689df5d84f-wdn2q"
Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.696557 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-ovsdbserver-nb\") pod \"dnsmasq-dns-689df5d84f-wdn2q\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " pod="openstack/dnsmasq-dns-689df5d84f-wdn2q"
Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.719553 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzbc9\" (UniqueName: \"kubernetes.io/projected/53d83e89-d39a-4ed6-ab65-02820d089bec-kube-api-access-mzbc9\") pod \"dnsmasq-dns-689df5d84f-wdn2q\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " pod="openstack/dnsmasq-dns-689df5d84f-wdn2q"
Feb 20 07:06:38 crc kubenswrapper[5094]: I0220 07:06:38.905765 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-wdn2q"
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.558181 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.57590325 podStartE2EDuration="4.558158075s" podCreationTimestamp="2026-02-20 07:06:35 +0000 UTC" firstStartedPulling="2026-02-20 07:06:36.862208368 +0000 UTC m=+1211.734835079" lastFinishedPulling="2026-02-20 07:06:37.844463183 +0000 UTC m=+1212.717089904" observedRunningTime="2026-02-20 07:06:38.640004676 +0000 UTC m=+1213.512631387" watchObservedRunningTime="2026-02-20 07:06:39.558158075 +0000 UTC m=+1214.430784786"
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.559416 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-wdn2q"]
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.620430 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" event={"ID":"53d83e89-d39a-4ed6-ab65-02820d089bec","Type":"ContainerStarted","Data":"4f29f1584725a78d33c79c69353a3c195206f10f8b9ed911fdd59866eb9d81be"}
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.620571 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" podUID="304622e2-8b43-4cc6-b17b-79a03bc74a58" containerName="dnsmasq-dns" containerID="cri-o://c21bda5d10e0bfeb59652d25722c1c88fa06556f9418d2ba2d4cd6cd8a9af72e" gracePeriod=10
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.689584 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.694985 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.698651 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-5btkp"
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.698784 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.698714 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.700260 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.720301 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0"
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.720494 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-lock\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0"
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.720587 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0"
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.720694 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-cache\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0"
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.721077 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0"
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.721294 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5g7x\" (UniqueName: \"kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-kube-api-access-n5g7x\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0"
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.752673 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.823787 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-lock\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0"
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.823846 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0"
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.823888 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-cache\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0"
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.823939 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0"
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.823982 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5g7x\" (UniqueName: \"kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-kube-api-access-n5g7x\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0"
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.824044 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0"
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.824494 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/swift-storage-0"
Feb 20 07:06:39 crc kubenswrapper[5094]: E0220 07:06:39.825044 5094 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 20 07:06:39 crc kubenswrapper[5094]: E0220 07:06:39.825090 5094 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 20 07:06:39 crc kubenswrapper[5094]: E0220 07:06:39.825166 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift podName:b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3 nodeName:}" failed. No retries permitted until 2026-02-20 07:06:40.325134429 +0000 UTC m=+1215.197761150 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift") pod "swift-storage-0" (UID: "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3") : configmap "swift-ring-files" not found
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.825646 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-lock\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0"
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.825878 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-cache\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0"
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.835418 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0"
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.847161 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5g7x\" (UniqueName: \"kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-kube-api-access-n5g7x\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0"
Feb 20 07:06:39 crc kubenswrapper[5094]: I0220 07:06:39.855424 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.262677 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-vpv24"]
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.263879 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vpv24"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.281489 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vpv24"]
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.301790 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.302623 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.302847 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.334373 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0"
Feb 20 07:06:40 crc kubenswrapper[5094]: E0220 07:06:40.334578 5094 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 20 07:06:40 crc kubenswrapper[5094]: E0220 07:06:40.334594 5094 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 20 07:06:40 crc kubenswrapper[5094]: E0220 07:06:40.334638 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift podName:b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3 nodeName:}" failed. No retries permitted until 2026-02-20 07:06:41.33462215 +0000 UTC m=+1216.207248861 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift") pod "swift-storage-0" (UID: "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3") : configmap "swift-ring-files" not found
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.436460 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-combined-ca-bundle\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.436612 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-swiftconf\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24"
Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.436659 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-dispersionconf\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24"
Feb 20
07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.436790 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-scripts\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24" Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.437093 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-etc-swift\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24" Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.437176 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-ring-data-devices\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24" Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.437285 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjgt6\" (UniqueName: \"kubernetes.io/projected/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-kube-api-access-cjgt6\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24" Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.540542 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-etc-swift\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24" Feb 20 07:06:40 crc 
kubenswrapper[5094]: I0220 07:06:40.540631 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-ring-data-devices\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24" Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.540680 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjgt6\" (UniqueName: \"kubernetes.io/projected/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-kube-api-access-cjgt6\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24" Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.540891 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-combined-ca-bundle\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24" Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.540965 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-swiftconf\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24" Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.541008 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-dispersionconf\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24" Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.541066 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-scripts\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24" Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.541325 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-etc-swift\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24" Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.541973 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-scripts\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24" Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.542458 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-ring-data-devices\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24" Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.547319 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-swiftconf\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24" Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.547773 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-dispersionconf\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24" Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.551310 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-combined-ca-bundle\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24" Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.558971 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjgt6\" (UniqueName: \"kubernetes.io/projected/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-kube-api-access-cjgt6\") pod \"swift-ring-rebalance-vpv24\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " pod="openstack/swift-ring-rebalance-vpv24" Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.617370 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vpv24" Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.628372 5094 generic.go:334] "Generic (PLEG): container finished" podID="304622e2-8b43-4cc6-b17b-79a03bc74a58" containerID="c21bda5d10e0bfeb59652d25722c1c88fa06556f9418d2ba2d4cd6cd8a9af72e" exitCode=0 Feb 20 07:06:40 crc kubenswrapper[5094]: I0220 07:06:40.628426 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" event={"ID":"304622e2-8b43-4cc6-b17b-79a03bc74a58","Type":"ContainerDied","Data":"c21bda5d10e0bfeb59652d25722c1c88fa06556f9418d2ba2d4cd6cd8a9af72e"} Feb 20 07:06:41 crc kubenswrapper[5094]: I0220 07:06:41.100221 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vpv24"] Feb 20 07:06:41 crc kubenswrapper[5094]: W0220 07:06:41.105965 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffd170be_0f58_4016_a451_5fb1f7fd9f1b.slice/crio-1204be4a8d2c968ca778e1391eab76ffdababc3edfb0acee83669813b41f3e1b WatchSource:0}: Error finding container 1204be4a8d2c968ca778e1391eab76ffdababc3edfb0acee83669813b41f3e1b: Status 404 returned error can't find the container with id 1204be4a8d2c968ca778e1391eab76ffdababc3edfb0acee83669813b41f3e1b Feb 20 07:06:41 crc kubenswrapper[5094]: I0220 07:06:41.361297 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0" Feb 20 07:06:41 crc kubenswrapper[5094]: E0220 07:06:41.362221 5094 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 20 07:06:41 crc kubenswrapper[5094]: E0220 07:06:41.362240 5094 projected.go:194] Error preparing data for projected volume 
etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 20 07:06:41 crc kubenswrapper[5094]: E0220 07:06:41.362531 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift podName:b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3 nodeName:}" failed. No retries permitted until 2026-02-20 07:06:43.362405545 +0000 UTC m=+1218.235032256 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift") pod "swift-storage-0" (UID: "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3") : configmap "swift-ring-files" not found Feb 20 07:06:41 crc kubenswrapper[5094]: I0220 07:06:41.639796 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vpv24" event={"ID":"ffd170be-0f58-4016-a451-5fb1f7fd9f1b","Type":"ContainerStarted","Data":"1204be4a8d2c968ca778e1391eab76ffdababc3edfb0acee83669813b41f3e1b"} Feb 20 07:06:43 crc kubenswrapper[5094]: I0220 07:06:43.443961 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0" Feb 20 07:06:43 crc kubenswrapper[5094]: E0220 07:06:43.444191 5094 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 20 07:06:43 crc kubenswrapper[5094]: E0220 07:06:43.444642 5094 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 20 07:06:43 crc kubenswrapper[5094]: E0220 07:06:43.444724 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift 
podName:b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3 nodeName:}" failed. No retries permitted until 2026-02-20 07:06:47.444689825 +0000 UTC m=+1222.317316536 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift") pod "swift-storage-0" (UID: "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3") : configmap "swift-ring-files" not found Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.306676 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.365777 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-ovsdbserver-sb\") pod \"304622e2-8b43-4cc6-b17b-79a03bc74a58\" (UID: \"304622e2-8b43-4cc6-b17b-79a03bc74a58\") " Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.365896 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-dns-svc\") pod \"304622e2-8b43-4cc6-b17b-79a03bc74a58\" (UID: \"304622e2-8b43-4cc6-b17b-79a03bc74a58\") " Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.365976 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-config\") pod \"304622e2-8b43-4cc6-b17b-79a03bc74a58\" (UID: \"304622e2-8b43-4cc6-b17b-79a03bc74a58\") " Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.366148 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4c67x\" (UniqueName: \"kubernetes.io/projected/304622e2-8b43-4cc6-b17b-79a03bc74a58-kube-api-access-4c67x\") pod \"304622e2-8b43-4cc6-b17b-79a03bc74a58\" (UID: 
\"304622e2-8b43-4cc6-b17b-79a03bc74a58\") " Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.388201 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/304622e2-8b43-4cc6-b17b-79a03bc74a58-kube-api-access-4c67x" (OuterVolumeSpecName: "kube-api-access-4c67x") pod "304622e2-8b43-4cc6-b17b-79a03bc74a58" (UID: "304622e2-8b43-4cc6-b17b-79a03bc74a58"). InnerVolumeSpecName "kube-api-access-4c67x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.421800 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "304622e2-8b43-4cc6-b17b-79a03bc74a58" (UID: "304622e2-8b43-4cc6-b17b-79a03bc74a58"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.439369 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "304622e2-8b43-4cc6-b17b-79a03bc74a58" (UID: "304622e2-8b43-4cc6-b17b-79a03bc74a58"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.454779 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-config" (OuterVolumeSpecName: "config") pod "304622e2-8b43-4cc6-b17b-79a03bc74a58" (UID: "304622e2-8b43-4cc6-b17b-79a03bc74a58"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.478976 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4c67x\" (UniqueName: \"kubernetes.io/projected/304622e2-8b43-4cc6-b17b-79a03bc74a58-kube-api-access-4c67x\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.479016 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.479026 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.479036 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/304622e2-8b43-4cc6-b17b-79a03bc74a58-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.618151 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.618211 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.638168 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.673809 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" event={"ID":"304622e2-8b43-4cc6-b17b-79a03bc74a58","Type":"ContainerDied","Data":"31a848846b08ea21777cbda10f64b475f792939e4ec24c88cd9f5397a007043b"} Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 
07:06:44.673882 5094 scope.go:117] "RemoveContainer" containerID="c21bda5d10e0bfeb59652d25722c1c88fa06556f9418d2ba2d4cd6cd8a9af72e" Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.674048 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b9c664f7-sg5jw" Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.693476 5094 generic.go:334] "Generic (PLEG): container finished" podID="53d83e89-d39a-4ed6-ab65-02820d089bec" containerID="d125587006c31c65f1eeb83ce252e5afe7d019516fe47b88f48976b518f4ec0b" exitCode=0 Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.693524 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" event={"ID":"53d83e89-d39a-4ed6-ab65-02820d089bec","Type":"ContainerDied","Data":"d125587006c31c65f1eeb83ce252e5afe7d019516fe47b88f48976b518f4ec0b"} Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.724078 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b9c664f7-sg5jw"] Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.725587 5094 scope.go:117] "RemoveContainer" containerID="b7767b5d4adb615e7aa00736fc2736b07d8efff1a7193f90cf7bc360858c7390" Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.730258 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.735464 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b9c664f7-sg5jw"] Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.740653 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 20 07:06:44 crc kubenswrapper[5094]: I0220 07:06:44.868782 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 20 07:06:45 crc kubenswrapper[5094]: I0220 07:06:45.479525 5094 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:45 crc kubenswrapper[5094]: I0220 07:06:45.707499 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" event={"ID":"53d83e89-d39a-4ed6-ab65-02820d089bec","Type":"ContainerStarted","Data":"a4bd321aef47fd2509dc250944b4a0aee8240092ee8ccb473ab0a8a33aee200d"} Feb 20 07:06:45 crc kubenswrapper[5094]: I0220 07:06:45.708906 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" Feb 20 07:06:45 crc kubenswrapper[5094]: I0220 07:06:45.730228 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" podStartSLOduration=7.730212941 podStartE2EDuration="7.730212941s" podCreationTimestamp="2026-02-20 07:06:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:06:45.726258126 +0000 UTC m=+1220.598884857" watchObservedRunningTime="2026-02-20 07:06:45.730212941 +0000 UTC m=+1220.602839652" Feb 20 07:06:45 crc kubenswrapper[5094]: I0220 07:06:45.858165 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="304622e2-8b43-4cc6-b17b-79a03bc74a58" path="/var/lib/kubelet/pods/304622e2-8b43-4cc6-b17b-79a03bc74a58/volumes" Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.568262 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-1fa0-account-create-update-8g9zk"] Feb 20 07:06:46 crc kubenswrapper[5094]: E0220 07:06:46.569384 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="304622e2-8b43-4cc6-b17b-79a03bc74a58" containerName="init" Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.569405 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="304622e2-8b43-4cc6-b17b-79a03bc74a58" containerName="init" Feb 20 07:06:46 crc 
kubenswrapper[5094]: E0220 07:06:46.569470 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="304622e2-8b43-4cc6-b17b-79a03bc74a58" containerName="dnsmasq-dns" Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.569481 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="304622e2-8b43-4cc6-b17b-79a03bc74a58" containerName="dnsmasq-dns" Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.569699 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="304622e2-8b43-4cc6-b17b-79a03bc74a58" containerName="dnsmasq-dns" Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.570380 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1fa0-account-create-update-8g9zk" Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.575264 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.585781 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1fa0-account-create-update-8g9zk"] Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.633181 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/772e2155-8d29-40de-8aff-5e42112e6171-operator-scripts\") pod \"glance-1fa0-account-create-update-8g9zk\" (UID: \"772e2155-8d29-40de-8aff-5e42112e6171\") " pod="openstack/glance-1fa0-account-create-update-8g9zk" Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.633649 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf4vt\" (UniqueName: \"kubernetes.io/projected/772e2155-8d29-40de-8aff-5e42112e6171-kube-api-access-rf4vt\") pod \"glance-1fa0-account-create-update-8g9zk\" (UID: \"772e2155-8d29-40de-8aff-5e42112e6171\") " pod="openstack/glance-1fa0-account-create-update-8g9zk" Feb 20 07:06:46 crc 
kubenswrapper[5094]: I0220 07:06:46.669327 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-z4m42"] Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.672372 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-z4m42" Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.693878 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-z4m42"] Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.734881 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km9jx\" (UniqueName: \"kubernetes.io/projected/a87399a2-42e4-4f46-b93c-cd4f25594a16-kube-api-access-km9jx\") pod \"glance-db-create-z4m42\" (UID: \"a87399a2-42e4-4f46-b93c-cd4f25594a16\") " pod="openstack/glance-db-create-z4m42" Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.734945 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a87399a2-42e4-4f46-b93c-cd4f25594a16-operator-scripts\") pod \"glance-db-create-z4m42\" (UID: \"a87399a2-42e4-4f46-b93c-cd4f25594a16\") " pod="openstack/glance-db-create-z4m42" Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.734978 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf4vt\" (UniqueName: \"kubernetes.io/projected/772e2155-8d29-40de-8aff-5e42112e6171-kube-api-access-rf4vt\") pod \"glance-1fa0-account-create-update-8g9zk\" (UID: \"772e2155-8d29-40de-8aff-5e42112e6171\") " pod="openstack/glance-1fa0-account-create-update-8g9zk" Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.735031 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/772e2155-8d29-40de-8aff-5e42112e6171-operator-scripts\") pod 
\"glance-1fa0-account-create-update-8g9zk\" (UID: \"772e2155-8d29-40de-8aff-5e42112e6171\") " pod="openstack/glance-1fa0-account-create-update-8g9zk" Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.736515 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/772e2155-8d29-40de-8aff-5e42112e6171-operator-scripts\") pod \"glance-1fa0-account-create-update-8g9zk\" (UID: \"772e2155-8d29-40de-8aff-5e42112e6171\") " pod="openstack/glance-1fa0-account-create-update-8g9zk" Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.759513 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf4vt\" (UniqueName: \"kubernetes.io/projected/772e2155-8d29-40de-8aff-5e42112e6171-kube-api-access-rf4vt\") pod \"glance-1fa0-account-create-update-8g9zk\" (UID: \"772e2155-8d29-40de-8aff-5e42112e6171\") " pod="openstack/glance-1fa0-account-create-update-8g9zk" Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.837594 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a87399a2-42e4-4f46-b93c-cd4f25594a16-operator-scripts\") pod \"glance-db-create-z4m42\" (UID: \"a87399a2-42e4-4f46-b93c-cd4f25594a16\") " pod="openstack/glance-db-create-z4m42" Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.837829 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km9jx\" (UniqueName: \"kubernetes.io/projected/a87399a2-42e4-4f46-b93c-cd4f25594a16-kube-api-access-km9jx\") pod \"glance-db-create-z4m42\" (UID: \"a87399a2-42e4-4f46-b93c-cd4f25594a16\") " pod="openstack/glance-db-create-z4m42" Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.838867 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a87399a2-42e4-4f46-b93c-cd4f25594a16-operator-scripts\") 
pod \"glance-db-create-z4m42\" (UID: \"a87399a2-42e4-4f46-b93c-cd4f25594a16\") " pod="openstack/glance-db-create-z4m42" Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.864966 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km9jx\" (UniqueName: \"kubernetes.io/projected/a87399a2-42e4-4f46-b93c-cd4f25594a16-kube-api-access-km9jx\") pod \"glance-db-create-z4m42\" (UID: \"a87399a2-42e4-4f46-b93c-cd4f25594a16\") " pod="openstack/glance-db-create-z4m42" Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.902406 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1fa0-account-create-update-8g9zk" Feb 20 07:06:46 crc kubenswrapper[5094]: I0220 07:06:46.993974 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-z4m42" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.217080 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-kbc28"] Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.218479 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-kbc28" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.233778 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-kbc28"] Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.248447 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqdrp\" (UniqueName: \"kubernetes.io/projected/a0e18d8b-2657-4e87-b6ca-009df89bbac8-kube-api-access-mqdrp\") pod \"keystone-db-create-kbc28\" (UID: \"a0e18d8b-2657-4e87-b6ca-009df89bbac8\") " pod="openstack/keystone-db-create-kbc28" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.248637 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0e18d8b-2657-4e87-b6ca-009df89bbac8-operator-scripts\") pod \"keystone-db-create-kbc28\" (UID: \"a0e18d8b-2657-4e87-b6ca-009df89bbac8\") " pod="openstack/keystone-db-create-kbc28" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.305820 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6cc2-account-create-update-vqjjf"] Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.307015 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6cc2-account-create-update-vqjjf" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.312539 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.315040 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6cc2-account-create-update-vqjjf"] Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.353726 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/876bc507-6cf2-466a-9cd3-6131a1cc590e-operator-scripts\") pod \"keystone-6cc2-account-create-update-vqjjf\" (UID: \"876bc507-6cf2-466a-9cd3-6131a1cc590e\") " pod="openstack/keystone-6cc2-account-create-update-vqjjf" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.353866 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqdrp\" (UniqueName: \"kubernetes.io/projected/a0e18d8b-2657-4e87-b6ca-009df89bbac8-kube-api-access-mqdrp\") pod \"keystone-db-create-kbc28\" (UID: \"a0e18d8b-2657-4e87-b6ca-009df89bbac8\") " pod="openstack/keystone-db-create-kbc28" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.353985 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0e18d8b-2657-4e87-b6ca-009df89bbac8-operator-scripts\") pod \"keystone-db-create-kbc28\" (UID: \"a0e18d8b-2657-4e87-b6ca-009df89bbac8\") " pod="openstack/keystone-db-create-kbc28" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.354049 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm2vj\" (UniqueName: \"kubernetes.io/projected/876bc507-6cf2-466a-9cd3-6131a1cc590e-kube-api-access-xm2vj\") pod \"keystone-6cc2-account-create-update-vqjjf\" (UID: 
\"876bc507-6cf2-466a-9cd3-6131a1cc590e\") " pod="openstack/keystone-6cc2-account-create-update-vqjjf" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.355671 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0e18d8b-2657-4e87-b6ca-009df89bbac8-operator-scripts\") pod \"keystone-db-create-kbc28\" (UID: \"a0e18d8b-2657-4e87-b6ca-009df89bbac8\") " pod="openstack/keystone-db-create-kbc28" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.393555 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqdrp\" (UniqueName: \"kubernetes.io/projected/a0e18d8b-2657-4e87-b6ca-009df89bbac8-kube-api-access-mqdrp\") pod \"keystone-db-create-kbc28\" (UID: \"a0e18d8b-2657-4e87-b6ca-009df89bbac8\") " pod="openstack/keystone-db-create-kbc28" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.455008 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-p4nhd"] Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.455880 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.455975 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/876bc507-6cf2-466a-9cd3-6131a1cc590e-operator-scripts\") pod \"keystone-6cc2-account-create-update-vqjjf\" (UID: \"876bc507-6cf2-466a-9cd3-6131a1cc590e\") " pod="openstack/keystone-6cc2-account-create-update-vqjjf" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.456116 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm2vj\" (UniqueName: 
\"kubernetes.io/projected/876bc507-6cf2-466a-9cd3-6131a1cc590e-kube-api-access-xm2vj\") pod \"keystone-6cc2-account-create-update-vqjjf\" (UID: \"876bc507-6cf2-466a-9cd3-6131a1cc590e\") " pod="openstack/keystone-6cc2-account-create-update-vqjjf" Feb 20 07:06:47 crc kubenswrapper[5094]: E0220 07:06:47.456140 5094 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 20 07:06:47 crc kubenswrapper[5094]: E0220 07:06:47.456504 5094 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 20 07:06:47 crc kubenswrapper[5094]: E0220 07:06:47.456560 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift podName:b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3 nodeName:}" failed. No retries permitted until 2026-02-20 07:06:55.456540985 +0000 UTC m=+1230.329167696 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift") pod "swift-storage-0" (UID: "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3") : configmap "swift-ring-files" not found Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.456689 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-p4nhd" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.457745 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/876bc507-6cf2-466a-9cd3-6131a1cc590e-operator-scripts\") pod \"keystone-6cc2-account-create-update-vqjjf\" (UID: \"876bc507-6cf2-466a-9cd3-6131a1cc590e\") " pod="openstack/keystone-6cc2-account-create-update-vqjjf" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.473644 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-p4nhd"] Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.483662 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm2vj\" (UniqueName: \"kubernetes.io/projected/876bc507-6cf2-466a-9cd3-6131a1cc590e-kube-api-access-xm2vj\") pod \"keystone-6cc2-account-create-update-vqjjf\" (UID: \"876bc507-6cf2-466a-9cd3-6131a1cc590e\") " pod="openstack/keystone-6cc2-account-create-update-vqjjf" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.548235 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-kbc28" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.559175 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/462ace9b-51c7-4cd0-850a-65d714c5f3b6-operator-scripts\") pod \"placement-db-create-p4nhd\" (UID: \"462ace9b-51c7-4cd0-850a-65d714c5f3b6\") " pod="openstack/placement-db-create-p4nhd" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.559253 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84jkk\" (UniqueName: \"kubernetes.io/projected/462ace9b-51c7-4cd0-850a-65d714c5f3b6-kube-api-access-84jkk\") pod \"placement-db-create-p4nhd\" (UID: \"462ace9b-51c7-4cd0-850a-65d714c5f3b6\") " pod="openstack/placement-db-create-p4nhd" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.588537 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6e9d-account-create-update-f9bgk"] Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.630572 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6e9d-account-create-update-f9bgk" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.630857 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6cc2-account-create-update-vqjjf" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.633956 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.638066 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6e9d-account-create-update-f9bgk"] Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.660846 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qh77\" (UniqueName: \"kubernetes.io/projected/20ff73f2-0b55-4d81-9342-92dbe47435f0-kube-api-access-6qh77\") pod \"placement-6e9d-account-create-update-f9bgk\" (UID: \"20ff73f2-0b55-4d81-9342-92dbe47435f0\") " pod="openstack/placement-6e9d-account-create-update-f9bgk" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.661093 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20ff73f2-0b55-4d81-9342-92dbe47435f0-operator-scripts\") pod \"placement-6e9d-account-create-update-f9bgk\" (UID: \"20ff73f2-0b55-4d81-9342-92dbe47435f0\") " pod="openstack/placement-6e9d-account-create-update-f9bgk" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.661223 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/462ace9b-51c7-4cd0-850a-65d714c5f3b6-operator-scripts\") pod \"placement-db-create-p4nhd\" (UID: \"462ace9b-51c7-4cd0-850a-65d714c5f3b6\") " pod="openstack/placement-db-create-p4nhd" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.661309 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84jkk\" (UniqueName: \"kubernetes.io/projected/462ace9b-51c7-4cd0-850a-65d714c5f3b6-kube-api-access-84jkk\") pod 
\"placement-db-create-p4nhd\" (UID: \"462ace9b-51c7-4cd0-850a-65d714c5f3b6\") " pod="openstack/placement-db-create-p4nhd" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.662561 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/462ace9b-51c7-4cd0-850a-65d714c5f3b6-operator-scripts\") pod \"placement-db-create-p4nhd\" (UID: \"462ace9b-51c7-4cd0-850a-65d714c5f3b6\") " pod="openstack/placement-db-create-p4nhd" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.684214 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84jkk\" (UniqueName: \"kubernetes.io/projected/462ace9b-51c7-4cd0-850a-65d714c5f3b6-kube-api-access-84jkk\") pod \"placement-db-create-p4nhd\" (UID: \"462ace9b-51c7-4cd0-850a-65d714c5f3b6\") " pod="openstack/placement-db-create-p4nhd" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.764139 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qh77\" (UniqueName: \"kubernetes.io/projected/20ff73f2-0b55-4d81-9342-92dbe47435f0-kube-api-access-6qh77\") pod \"placement-6e9d-account-create-update-f9bgk\" (UID: \"20ff73f2-0b55-4d81-9342-92dbe47435f0\") " pod="openstack/placement-6e9d-account-create-update-f9bgk" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.764296 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20ff73f2-0b55-4d81-9342-92dbe47435f0-operator-scripts\") pod \"placement-6e9d-account-create-update-f9bgk\" (UID: \"20ff73f2-0b55-4d81-9342-92dbe47435f0\") " pod="openstack/placement-6e9d-account-create-update-f9bgk" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.765789 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20ff73f2-0b55-4d81-9342-92dbe47435f0-operator-scripts\") pod 
\"placement-6e9d-account-create-update-f9bgk\" (UID: \"20ff73f2-0b55-4d81-9342-92dbe47435f0\") " pod="openstack/placement-6e9d-account-create-update-f9bgk" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.783681 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qh77\" (UniqueName: \"kubernetes.io/projected/20ff73f2-0b55-4d81-9342-92dbe47435f0-kube-api-access-6qh77\") pod \"placement-6e9d-account-create-update-f9bgk\" (UID: \"20ff73f2-0b55-4d81-9342-92dbe47435f0\") " pod="openstack/placement-6e9d-account-create-update-f9bgk" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.821635 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-p4nhd" Feb 20 07:06:47 crc kubenswrapper[5094]: I0220 07:06:47.963315 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6e9d-account-create-update-f9bgk" Feb 20 07:06:48 crc kubenswrapper[5094]: I0220 07:06:48.337775 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-z4m42"] Feb 20 07:06:48 crc kubenswrapper[5094]: I0220 07:06:48.491477 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-kbc28"] Feb 20 07:06:48 crc kubenswrapper[5094]: I0220 07:06:48.587822 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-p4nhd"] Feb 20 07:06:48 crc kubenswrapper[5094]: I0220 07:06:48.658542 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6cc2-account-create-update-vqjjf"] Feb 20 07:06:48 crc kubenswrapper[5094]: I0220 07:06:48.673800 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1fa0-account-create-update-8g9zk"] Feb 20 07:06:48 crc kubenswrapper[5094]: I0220 07:06:48.687943 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6e9d-account-create-update-f9bgk"] Feb 20 07:06:48 crc 
kubenswrapper[5094]: I0220 07:06:48.738057 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-kbc28" event={"ID":"a0e18d8b-2657-4e87-b6ca-009df89bbac8","Type":"ContainerStarted","Data":"9a416b443cda054982b69d39244868053fabefa912f2d78cab0a7899918d1ec1"} Feb 20 07:06:48 crc kubenswrapper[5094]: I0220 07:06:48.740601 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6cc2-account-create-update-vqjjf" event={"ID":"876bc507-6cf2-466a-9cd3-6131a1cc590e","Type":"ContainerStarted","Data":"d597bee2674c3a780daf20b3f5ea75594b44e3bee62e1bae78efb516888a6f5b"} Feb 20 07:06:48 crc kubenswrapper[5094]: I0220 07:06:48.742338 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vpv24" event={"ID":"ffd170be-0f58-4016-a451-5fb1f7fd9f1b","Type":"ContainerStarted","Data":"a9068c6d92101548d5c981bed911d9bb32d305a4eac5c47442863a89e8e65fe9"} Feb 20 07:06:48 crc kubenswrapper[5094]: I0220 07:06:48.744470 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-p4nhd" event={"ID":"462ace9b-51c7-4cd0-850a-65d714c5f3b6","Type":"ContainerStarted","Data":"466bc14b6e47a7f8a431b6bfab8a713c013e95d60e6dbe91267a47d164014dad"} Feb 20 07:06:48 crc kubenswrapper[5094]: I0220 07:06:48.746750 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1fa0-account-create-update-8g9zk" event={"ID":"772e2155-8d29-40de-8aff-5e42112e6171","Type":"ContainerStarted","Data":"0331760d548ee8233a915834188613dcd12684d1c1089583d261fa2fb26afee8"} Feb 20 07:06:48 crc kubenswrapper[5094]: I0220 07:06:48.749269 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6e9d-account-create-update-f9bgk" event={"ID":"20ff73f2-0b55-4d81-9342-92dbe47435f0","Type":"ContainerStarted","Data":"2f8481373beadef4696472d4f06094adaa3a02416567c1059a09c95ff4c7fc9d"} Feb 20 07:06:48 crc kubenswrapper[5094]: I0220 07:06:48.751730 5094 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-db-create-z4m42" event={"ID":"a87399a2-42e4-4f46-b93c-cd4f25594a16","Type":"ContainerStarted","Data":"75b382b7cacb24f534d4ec56b90d8492c32e928df430e4a3f769babf549e4aa5"} Feb 20 07:06:48 crc kubenswrapper[5094]: I0220 07:06:48.751785 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-z4m42" event={"ID":"a87399a2-42e4-4f46-b93c-cd4f25594a16","Type":"ContainerStarted","Data":"7d549be08c997e24ddc46194864b94c1ed01c341dc44d1d9faa29c3e8fd1f0b4"} Feb 20 07:06:48 crc kubenswrapper[5094]: I0220 07:06:48.766924 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-vpv24" podStartSLOduration=1.9196456 podStartE2EDuration="8.766901647s" podCreationTimestamp="2026-02-20 07:06:40 +0000 UTC" firstStartedPulling="2026-02-20 07:06:41.108659188 +0000 UTC m=+1215.981285889" lastFinishedPulling="2026-02-20 07:06:47.955915215 +0000 UTC m=+1222.828541936" observedRunningTime="2026-02-20 07:06:48.76201443 +0000 UTC m=+1223.634641141" watchObservedRunningTime="2026-02-20 07:06:48.766901647 +0000 UTC m=+1223.639528348" Feb 20 07:06:48 crc kubenswrapper[5094]: I0220 07:06:48.783540 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-z4m42" podStartSLOduration=2.783524175 podStartE2EDuration="2.783524175s" podCreationTimestamp="2026-02-20 07:06:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:06:48.77785242 +0000 UTC m=+1223.650479151" watchObservedRunningTime="2026-02-20 07:06:48.783524175 +0000 UTC m=+1223.656150886" Feb 20 07:06:49 crc kubenswrapper[5094]: I0220 07:06:49.775288 5094 generic.go:334] "Generic (PLEG): container finished" podID="462ace9b-51c7-4cd0-850a-65d714c5f3b6" containerID="2a24e00dad1ec7884b816153f26b2812e819c54c4c8093cf48001992ec89df96" exitCode=0 Feb 20 07:06:49 crc 
kubenswrapper[5094]: I0220 07:06:49.775426 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-p4nhd" event={"ID":"462ace9b-51c7-4cd0-850a-65d714c5f3b6","Type":"ContainerDied","Data":"2a24e00dad1ec7884b816153f26b2812e819c54c4c8093cf48001992ec89df96"} Feb 20 07:06:49 crc kubenswrapper[5094]: I0220 07:06:49.778861 5094 generic.go:334] "Generic (PLEG): container finished" podID="772e2155-8d29-40de-8aff-5e42112e6171" containerID="b37e6501ae2ac6b9c9a4901b1e8b894900b9c70d4214f260a2bb15e75fba5205" exitCode=0 Feb 20 07:06:49 crc kubenswrapper[5094]: I0220 07:06:49.778939 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1fa0-account-create-update-8g9zk" event={"ID":"772e2155-8d29-40de-8aff-5e42112e6171","Type":"ContainerDied","Data":"b37e6501ae2ac6b9c9a4901b1e8b894900b9c70d4214f260a2bb15e75fba5205"} Feb 20 07:06:49 crc kubenswrapper[5094]: I0220 07:06:49.782586 5094 generic.go:334] "Generic (PLEG): container finished" podID="20ff73f2-0b55-4d81-9342-92dbe47435f0" containerID="c8a3057121d16618bfdbd39860a04679adcd72a905e063ca9af153f1f199e6f2" exitCode=0 Feb 20 07:06:49 crc kubenswrapper[5094]: I0220 07:06:49.782922 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6e9d-account-create-update-f9bgk" event={"ID":"20ff73f2-0b55-4d81-9342-92dbe47435f0","Type":"ContainerDied","Data":"c8a3057121d16618bfdbd39860a04679adcd72a905e063ca9af153f1f199e6f2"} Feb 20 07:06:49 crc kubenswrapper[5094]: I0220 07:06:49.785845 5094 generic.go:334] "Generic (PLEG): container finished" podID="a87399a2-42e4-4f46-b93c-cd4f25594a16" containerID="75b382b7cacb24f534d4ec56b90d8492c32e928df430e4a3f769babf549e4aa5" exitCode=0 Feb 20 07:06:49 crc kubenswrapper[5094]: I0220 07:06:49.786111 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-z4m42" 
event={"ID":"a87399a2-42e4-4f46-b93c-cd4f25594a16","Type":"ContainerDied","Data":"75b382b7cacb24f534d4ec56b90d8492c32e928df430e4a3f769babf549e4aa5"} Feb 20 07:06:49 crc kubenswrapper[5094]: I0220 07:06:49.788341 5094 generic.go:334] "Generic (PLEG): container finished" podID="876bc507-6cf2-466a-9cd3-6131a1cc590e" containerID="5fbdea48cb9017b90d8f206860f008a7a92776227fe74ba390e642bcf9bceabc" exitCode=0 Feb 20 07:06:49 crc kubenswrapper[5094]: I0220 07:06:49.788568 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6cc2-account-create-update-vqjjf" event={"ID":"876bc507-6cf2-466a-9cd3-6131a1cc590e","Type":"ContainerDied","Data":"5fbdea48cb9017b90d8f206860f008a7a92776227fe74ba390e642bcf9bceabc"} Feb 20 07:06:49 crc kubenswrapper[5094]: I0220 07:06:49.791621 5094 generic.go:334] "Generic (PLEG): container finished" podID="a0e18d8b-2657-4e87-b6ca-009df89bbac8" containerID="b09eaa7d98442981b4e7ce37eedd93a2e3a6cd66a6970eb460a847a861e69caa" exitCode=0 Feb 20 07:06:49 crc kubenswrapper[5094]: I0220 07:06:49.791834 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-kbc28" event={"ID":"a0e18d8b-2657-4e87-b6ca-009df89bbac8","Type":"ContainerDied","Data":"b09eaa7d98442981b4e7ce37eedd93a2e3a6cd66a6970eb460a847a861e69caa"} Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.359535 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-z4m42" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.459047 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km9jx\" (UniqueName: \"kubernetes.io/projected/a87399a2-42e4-4f46-b93c-cd4f25594a16-kube-api-access-km9jx\") pod \"a87399a2-42e4-4f46-b93c-cd4f25594a16\" (UID: \"a87399a2-42e4-4f46-b93c-cd4f25594a16\") " Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.459319 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a87399a2-42e4-4f46-b93c-cd4f25594a16-operator-scripts\") pod \"a87399a2-42e4-4f46-b93c-cd4f25594a16\" (UID: \"a87399a2-42e4-4f46-b93c-cd4f25594a16\") " Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.460112 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a87399a2-42e4-4f46-b93c-cd4f25594a16-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a87399a2-42e4-4f46-b93c-cd4f25594a16" (UID: "a87399a2-42e4-4f46-b93c-cd4f25594a16"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.469606 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a87399a2-42e4-4f46-b93c-cd4f25594a16-kube-api-access-km9jx" (OuterVolumeSpecName: "kube-api-access-km9jx") pod "a87399a2-42e4-4f46-b93c-cd4f25594a16" (UID: "a87399a2-42e4-4f46-b93c-cd4f25594a16"). InnerVolumeSpecName "kube-api-access-km9jx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.562199 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a87399a2-42e4-4f46-b93c-cd4f25594a16-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.562239 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km9jx\" (UniqueName: \"kubernetes.io/projected/a87399a2-42e4-4f46-b93c-cd4f25594a16-kube-api-access-km9jx\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.565056 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-p4nhd" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.570411 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-kbc28" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.574938 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1fa0-account-create-update-8g9zk" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.581878 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6cc2-account-create-update-vqjjf" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.596417 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6e9d-account-create-update-f9bgk" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.671542 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qh77\" (UniqueName: \"kubernetes.io/projected/20ff73f2-0b55-4d81-9342-92dbe47435f0-kube-api-access-6qh77\") pod \"20ff73f2-0b55-4d81-9342-92dbe47435f0\" (UID: \"20ff73f2-0b55-4d81-9342-92dbe47435f0\") " Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.671639 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rf4vt\" (UniqueName: \"kubernetes.io/projected/772e2155-8d29-40de-8aff-5e42112e6171-kube-api-access-rf4vt\") pod \"772e2155-8d29-40de-8aff-5e42112e6171\" (UID: \"772e2155-8d29-40de-8aff-5e42112e6171\") " Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.671750 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqdrp\" (UniqueName: \"kubernetes.io/projected/a0e18d8b-2657-4e87-b6ca-009df89bbac8-kube-api-access-mqdrp\") pod \"a0e18d8b-2657-4e87-b6ca-009df89bbac8\" (UID: \"a0e18d8b-2657-4e87-b6ca-009df89bbac8\") " Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.671797 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/462ace9b-51c7-4cd0-850a-65d714c5f3b6-operator-scripts\") pod \"462ace9b-51c7-4cd0-850a-65d714c5f3b6\" (UID: \"462ace9b-51c7-4cd0-850a-65d714c5f3b6\") " Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.671819 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/772e2155-8d29-40de-8aff-5e42112e6171-operator-scripts\") pod \"772e2155-8d29-40de-8aff-5e42112e6171\" (UID: \"772e2155-8d29-40de-8aff-5e42112e6171\") " Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.671921 5094 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-xm2vj\" (UniqueName: \"kubernetes.io/projected/876bc507-6cf2-466a-9cd3-6131a1cc590e-kube-api-access-xm2vj\") pod \"876bc507-6cf2-466a-9cd3-6131a1cc590e\" (UID: \"876bc507-6cf2-466a-9cd3-6131a1cc590e\") " Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.671947 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84jkk\" (UniqueName: \"kubernetes.io/projected/462ace9b-51c7-4cd0-850a-65d714c5f3b6-kube-api-access-84jkk\") pod \"462ace9b-51c7-4cd0-850a-65d714c5f3b6\" (UID: \"462ace9b-51c7-4cd0-850a-65d714c5f3b6\") " Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.672017 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20ff73f2-0b55-4d81-9342-92dbe47435f0-operator-scripts\") pod \"20ff73f2-0b55-4d81-9342-92dbe47435f0\" (UID: \"20ff73f2-0b55-4d81-9342-92dbe47435f0\") " Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.672085 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0e18d8b-2657-4e87-b6ca-009df89bbac8-operator-scripts\") pod \"a0e18d8b-2657-4e87-b6ca-009df89bbac8\" (UID: \"a0e18d8b-2657-4e87-b6ca-009df89bbac8\") " Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.672111 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/876bc507-6cf2-466a-9cd3-6131a1cc590e-operator-scripts\") pod \"876bc507-6cf2-466a-9cd3-6131a1cc590e\" (UID: \"876bc507-6cf2-466a-9cd3-6131a1cc590e\") " Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.673600 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/772e2155-8d29-40de-8aff-5e42112e6171-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"772e2155-8d29-40de-8aff-5e42112e6171" (UID: "772e2155-8d29-40de-8aff-5e42112e6171"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.673961 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20ff73f2-0b55-4d81-9342-92dbe47435f0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "20ff73f2-0b55-4d81-9342-92dbe47435f0" (UID: "20ff73f2-0b55-4d81-9342-92dbe47435f0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.674157 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/462ace9b-51c7-4cd0-850a-65d714c5f3b6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "462ace9b-51c7-4cd0-850a-65d714c5f3b6" (UID: "462ace9b-51c7-4cd0-850a-65d714c5f3b6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.674491 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/876bc507-6cf2-466a-9cd3-6131a1cc590e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "876bc507-6cf2-466a-9cd3-6131a1cc590e" (UID: "876bc507-6cf2-466a-9cd3-6131a1cc590e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.675877 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0e18d8b-2657-4e87-b6ca-009df89bbac8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a0e18d8b-2657-4e87-b6ca-009df89bbac8" (UID: "a0e18d8b-2657-4e87-b6ca-009df89bbac8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.677571 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/462ace9b-51c7-4cd0-850a-65d714c5f3b6-kube-api-access-84jkk" (OuterVolumeSpecName: "kube-api-access-84jkk") pod "462ace9b-51c7-4cd0-850a-65d714c5f3b6" (UID: "462ace9b-51c7-4cd0-850a-65d714c5f3b6"). InnerVolumeSpecName "kube-api-access-84jkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.678482 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/772e2155-8d29-40de-8aff-5e42112e6171-kube-api-access-rf4vt" (OuterVolumeSpecName: "kube-api-access-rf4vt") pod "772e2155-8d29-40de-8aff-5e42112e6171" (UID: "772e2155-8d29-40de-8aff-5e42112e6171"). InnerVolumeSpecName "kube-api-access-rf4vt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.680623 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/876bc507-6cf2-466a-9cd3-6131a1cc590e-kube-api-access-xm2vj" (OuterVolumeSpecName: "kube-api-access-xm2vj") pod "876bc507-6cf2-466a-9cd3-6131a1cc590e" (UID: "876bc507-6cf2-466a-9cd3-6131a1cc590e"). InnerVolumeSpecName "kube-api-access-xm2vj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.682981 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20ff73f2-0b55-4d81-9342-92dbe47435f0-kube-api-access-6qh77" (OuterVolumeSpecName: "kube-api-access-6qh77") pod "20ff73f2-0b55-4d81-9342-92dbe47435f0" (UID: "20ff73f2-0b55-4d81-9342-92dbe47435f0"). InnerVolumeSpecName "kube-api-access-6qh77". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.685292 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0e18d8b-2657-4e87-b6ca-009df89bbac8-kube-api-access-mqdrp" (OuterVolumeSpecName: "kube-api-access-mqdrp") pod "a0e18d8b-2657-4e87-b6ca-009df89bbac8" (UID: "a0e18d8b-2657-4e87-b6ca-009df89bbac8"). InnerVolumeSpecName "kube-api-access-mqdrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.773786 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm2vj\" (UniqueName: \"kubernetes.io/projected/876bc507-6cf2-466a-9cd3-6131a1cc590e-kube-api-access-xm2vj\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.773831 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84jkk\" (UniqueName: \"kubernetes.io/projected/462ace9b-51c7-4cd0-850a-65d714c5f3b6-kube-api-access-84jkk\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.773865 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20ff73f2-0b55-4d81-9342-92dbe47435f0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.773877 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0e18d8b-2657-4e87-b6ca-009df89bbac8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.773888 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/876bc507-6cf2-466a-9cd3-6131a1cc590e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.773898 5094 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-6qh77\" (UniqueName: \"kubernetes.io/projected/20ff73f2-0b55-4d81-9342-92dbe47435f0-kube-api-access-6qh77\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.773907 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rf4vt\" (UniqueName: \"kubernetes.io/projected/772e2155-8d29-40de-8aff-5e42112e6171-kube-api-access-rf4vt\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.773918 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqdrp\" (UniqueName: \"kubernetes.io/projected/a0e18d8b-2657-4e87-b6ca-009df89bbac8-kube-api-access-mqdrp\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.773927 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/462ace9b-51c7-4cd0-850a-65d714c5f3b6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.773938 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/772e2155-8d29-40de-8aff-5e42112e6171-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.818602 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-p4nhd" event={"ID":"462ace9b-51c7-4cd0-850a-65d714c5f3b6","Type":"ContainerDied","Data":"466bc14b6e47a7f8a431b6bfab8a713c013e95d60e6dbe91267a47d164014dad"} Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.818671 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="466bc14b6e47a7f8a431b6bfab8a713c013e95d60e6dbe91267a47d164014dad" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.818775 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-p4nhd" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.825136 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1fa0-account-create-update-8g9zk" event={"ID":"772e2155-8d29-40de-8aff-5e42112e6171","Type":"ContainerDied","Data":"0331760d548ee8233a915834188613dcd12684d1c1089583d261fa2fb26afee8"} Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.825182 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0331760d548ee8233a915834188613dcd12684d1c1089583d261fa2fb26afee8" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.825239 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1fa0-account-create-update-8g9zk" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.830051 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6e9d-account-create-update-f9bgk" event={"ID":"20ff73f2-0b55-4d81-9342-92dbe47435f0","Type":"ContainerDied","Data":"2f8481373beadef4696472d4f06094adaa3a02416567c1059a09c95ff4c7fc9d"} Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.830179 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f8481373beadef4696472d4f06094adaa3a02416567c1059a09c95ff4c7fc9d" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.830310 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6e9d-account-create-update-f9bgk" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.836185 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-z4m42" event={"ID":"a87399a2-42e4-4f46-b93c-cd4f25594a16","Type":"ContainerDied","Data":"7d549be08c997e24ddc46194864b94c1ed01c341dc44d1d9faa29c3e8fd1f0b4"} Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.836281 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d549be08c997e24ddc46194864b94c1ed01c341dc44d1d9faa29c3e8fd1f0b4" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.836346 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-z4m42" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.842214 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-kbc28" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.847697 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6cc2-account-create-update-vqjjf" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.875787 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-kbc28" event={"ID":"a0e18d8b-2657-4e87-b6ca-009df89bbac8","Type":"ContainerDied","Data":"9a416b443cda054982b69d39244868053fabefa912f2d78cab0a7899918d1ec1"} Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.875838 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a416b443cda054982b69d39244868053fabefa912f2d78cab0a7899918d1ec1" Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.875850 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6cc2-account-create-update-vqjjf" event={"ID":"876bc507-6cf2-466a-9cd3-6131a1cc590e","Type":"ContainerDied","Data":"d597bee2674c3a780daf20b3f5ea75594b44e3bee62e1bae78efb516888a6f5b"} Feb 20 07:06:51 crc kubenswrapper[5094]: I0220 07:06:51.875865 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d597bee2674c3a780daf20b3f5ea75594b44e3bee62e1bae78efb516888a6f5b" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.244686 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-b2w5s"] Feb 20 07:06:53 crc kubenswrapper[5094]: E0220 07:06:53.245939 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a87399a2-42e4-4f46-b93c-cd4f25594a16" containerName="mariadb-database-create" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.245963 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a87399a2-42e4-4f46-b93c-cd4f25594a16" containerName="mariadb-database-create" Feb 20 07:06:53 crc kubenswrapper[5094]: E0220 07:06:53.246000 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="876bc507-6cf2-466a-9cd3-6131a1cc590e" containerName="mariadb-account-create-update" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 
07:06:53.246011 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="876bc507-6cf2-466a-9cd3-6131a1cc590e" containerName="mariadb-account-create-update" Feb 20 07:06:53 crc kubenswrapper[5094]: E0220 07:06:53.246026 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0e18d8b-2657-4e87-b6ca-009df89bbac8" containerName="mariadb-database-create" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.246035 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0e18d8b-2657-4e87-b6ca-009df89bbac8" containerName="mariadb-database-create" Feb 20 07:06:53 crc kubenswrapper[5094]: E0220 07:06:53.246045 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="772e2155-8d29-40de-8aff-5e42112e6171" containerName="mariadb-account-create-update" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.246054 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="772e2155-8d29-40de-8aff-5e42112e6171" containerName="mariadb-account-create-update" Feb 20 07:06:53 crc kubenswrapper[5094]: E0220 07:06:53.246072 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="462ace9b-51c7-4cd0-850a-65d714c5f3b6" containerName="mariadb-database-create" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.246080 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="462ace9b-51c7-4cd0-850a-65d714c5f3b6" containerName="mariadb-database-create" Feb 20 07:06:53 crc kubenswrapper[5094]: E0220 07:06:53.246094 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20ff73f2-0b55-4d81-9342-92dbe47435f0" containerName="mariadb-account-create-update" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.246102 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="20ff73f2-0b55-4d81-9342-92dbe47435f0" containerName="mariadb-account-create-update" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.246347 5094 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="876bc507-6cf2-466a-9cd3-6131a1cc590e" containerName="mariadb-account-create-update" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.246430 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="20ff73f2-0b55-4d81-9342-92dbe47435f0" containerName="mariadb-account-create-update" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.246451 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0e18d8b-2657-4e87-b6ca-009df89bbac8" containerName="mariadb-database-create" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.246462 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="462ace9b-51c7-4cd0-850a-65d714c5f3b6" containerName="mariadb-database-create" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.246476 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="772e2155-8d29-40de-8aff-5e42112e6171" containerName="mariadb-account-create-update" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.246490 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a87399a2-42e4-4f46-b93c-cd4f25594a16" containerName="mariadb-database-create" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.247275 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-b2w5s" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.252899 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.258792 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-b2w5s"] Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.317973 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdcld\" (UniqueName: \"kubernetes.io/projected/4a26e4e5-091b-4a9e-8a16-2dc535e85fae-kube-api-access-sdcld\") pod \"root-account-create-update-b2w5s\" (UID: \"4a26e4e5-091b-4a9e-8a16-2dc535e85fae\") " pod="openstack/root-account-create-update-b2w5s" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.318296 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a26e4e5-091b-4a9e-8a16-2dc535e85fae-operator-scripts\") pod \"root-account-create-update-b2w5s\" (UID: \"4a26e4e5-091b-4a9e-8a16-2dc535e85fae\") " pod="openstack/root-account-create-update-b2w5s" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.420942 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdcld\" (UniqueName: \"kubernetes.io/projected/4a26e4e5-091b-4a9e-8a16-2dc535e85fae-kube-api-access-sdcld\") pod \"root-account-create-update-b2w5s\" (UID: \"4a26e4e5-091b-4a9e-8a16-2dc535e85fae\") " pod="openstack/root-account-create-update-b2w5s" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.421679 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a26e4e5-091b-4a9e-8a16-2dc535e85fae-operator-scripts\") pod \"root-account-create-update-b2w5s\" (UID: 
\"4a26e4e5-091b-4a9e-8a16-2dc535e85fae\") " pod="openstack/root-account-create-update-b2w5s" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.423376 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a26e4e5-091b-4a9e-8a16-2dc535e85fae-operator-scripts\") pod \"root-account-create-update-b2w5s\" (UID: \"4a26e4e5-091b-4a9e-8a16-2dc535e85fae\") " pod="openstack/root-account-create-update-b2w5s" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.449977 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdcld\" (UniqueName: \"kubernetes.io/projected/4a26e4e5-091b-4a9e-8a16-2dc535e85fae-kube-api-access-sdcld\") pod \"root-account-create-update-b2w5s\" (UID: \"4a26e4e5-091b-4a9e-8a16-2dc535e85fae\") " pod="openstack/root-account-create-update-b2w5s" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.576551 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-b2w5s" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.908012 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.979066 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-wdrds"] Feb 20 07:06:53 crc kubenswrapper[5094]: I0220 07:06:53.979379 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" podUID="b40815a7-cffa-44ed-8acf-98261cc7e14c" containerName="dnsmasq-dns" containerID="cri-o://32e1079abe135198c0ec21ba4d5ead802416d98d3408a465053fe0c56b193faf" gracePeriod=10 Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.102815 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-b2w5s"] Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.458475 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.648440 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-dns-svc\") pod \"b40815a7-cffa-44ed-8acf-98261cc7e14c\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.648587 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-config\") pod \"b40815a7-cffa-44ed-8acf-98261cc7e14c\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.648681 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-ovsdbserver-sb\") pod \"b40815a7-cffa-44ed-8acf-98261cc7e14c\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.648818 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-928v8\" (UniqueName: \"kubernetes.io/projected/b40815a7-cffa-44ed-8acf-98261cc7e14c-kube-api-access-928v8\") pod \"b40815a7-cffa-44ed-8acf-98261cc7e14c\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.648868 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-ovsdbserver-nb\") pod \"b40815a7-cffa-44ed-8acf-98261cc7e14c\" (UID: \"b40815a7-cffa-44ed-8acf-98261cc7e14c\") " Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.665538 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b40815a7-cffa-44ed-8acf-98261cc7e14c-kube-api-access-928v8" (OuterVolumeSpecName: "kube-api-access-928v8") pod "b40815a7-cffa-44ed-8acf-98261cc7e14c" (UID: "b40815a7-cffa-44ed-8acf-98261cc7e14c"). InnerVolumeSpecName "kube-api-access-928v8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.698499 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b40815a7-cffa-44ed-8acf-98261cc7e14c" (UID: "b40815a7-cffa-44ed-8acf-98261cc7e14c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.702186 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b40815a7-cffa-44ed-8acf-98261cc7e14c" (UID: "b40815a7-cffa-44ed-8acf-98261cc7e14c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.703300 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b40815a7-cffa-44ed-8acf-98261cc7e14c" (UID: "b40815a7-cffa-44ed-8acf-98261cc7e14c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.709563 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-config" (OuterVolumeSpecName: "config") pod "b40815a7-cffa-44ed-8acf-98261cc7e14c" (UID: "b40815a7-cffa-44ed-8acf-98261cc7e14c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.751746 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.751784 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.751797 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.751815 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-928v8\" (UniqueName: \"kubernetes.io/projected/b40815a7-cffa-44ed-8acf-98261cc7e14c-kube-api-access-928v8\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.751827 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b40815a7-cffa-44ed-8acf-98261cc7e14c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.888783 5094 generic.go:334] "Generic (PLEG): container finished" podID="b40815a7-cffa-44ed-8acf-98261cc7e14c" containerID="32e1079abe135198c0ec21ba4d5ead802416d98d3408a465053fe0c56b193faf" exitCode=0 Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.889303 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.889145 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" event={"ID":"b40815a7-cffa-44ed-8acf-98261cc7e14c","Type":"ContainerDied","Data":"32e1079abe135198c0ec21ba4d5ead802416d98d3408a465053fe0c56b193faf"} Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.889414 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75b7bcc64f-wdrds" event={"ID":"b40815a7-cffa-44ed-8acf-98261cc7e14c","Type":"ContainerDied","Data":"ee40b807024485305f67748b89b6f21b26eeabd4f3866126d5f1e66f00f01af7"} Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.889458 5094 scope.go:117] "RemoveContainer" containerID="32e1079abe135198c0ec21ba4d5ead802416d98d3408a465053fe0c56b193faf" Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.891593 5094 generic.go:334] "Generic (PLEG): container finished" podID="a829c6b3-7069-4544-90dc-40ae83aba524" containerID="24c496b6fb0954ae89602f61cc9646541943f3eefe9945f4d983f81e20a69e1c" exitCode=0 Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.891669 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a829c6b3-7069-4544-90dc-40ae83aba524","Type":"ContainerDied","Data":"24c496b6fb0954ae89602f61cc9646541943f3eefe9945f4d983f81e20a69e1c"} Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.893611 5094 generic.go:334] "Generic (PLEG): container finished" podID="4a26e4e5-091b-4a9e-8a16-2dc535e85fae" containerID="b30b7402d8eb93fe27dd4eeb5df1c58c1d66056e0ef8f55ed4b6d91fb78c16c7" exitCode=0 Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.893728 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-b2w5s" 
event={"ID":"4a26e4e5-091b-4a9e-8a16-2dc535e85fae","Type":"ContainerDied","Data":"b30b7402d8eb93fe27dd4eeb5df1c58c1d66056e0ef8f55ed4b6d91fb78c16c7"} Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.893764 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-b2w5s" event={"ID":"4a26e4e5-091b-4a9e-8a16-2dc535e85fae","Type":"ContainerStarted","Data":"ca6817ae07db3acb9ded9895ffb37d6966aff2b9baa689ae6bec470103bb00b6"} Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.897168 5094 generic.go:334] "Generic (PLEG): container finished" podID="219c74d6-9f45-4bf8-8c67-acdea3c0fab3" containerID="0b471ed3a22d48304ba36ebcf4e6acbda6b19074d6e3a72832e1dda4c6f2f145" exitCode=0 Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.897209 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"219c74d6-9f45-4bf8-8c67-acdea3c0fab3","Type":"ContainerDied","Data":"0b471ed3a22d48304ba36ebcf4e6acbda6b19074d6e3a72832e1dda4c6f2f145"} Feb 20 07:06:54 crc kubenswrapper[5094]: I0220 07:06:54.933052 5094 scope.go:117] "RemoveContainer" containerID="f3c82a94d6648ba5c9c19bfa7de7c6f84242900741fd3a27425da31ee3d8ec82" Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.029893 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-wdrds"] Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.033571 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75b7bcc64f-wdrds"] Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.037342 5094 scope.go:117] "RemoveContainer" containerID="32e1079abe135198c0ec21ba4d5ead802416d98d3408a465053fe0c56b193faf" Feb 20 07:06:55 crc kubenswrapper[5094]: E0220 07:06:55.039681 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32e1079abe135198c0ec21ba4d5ead802416d98d3408a465053fe0c56b193faf\": container with ID starting 
with 32e1079abe135198c0ec21ba4d5ead802416d98d3408a465053fe0c56b193faf not found: ID does not exist" containerID="32e1079abe135198c0ec21ba4d5ead802416d98d3408a465053fe0c56b193faf" Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.039731 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32e1079abe135198c0ec21ba4d5ead802416d98d3408a465053fe0c56b193faf"} err="failed to get container status \"32e1079abe135198c0ec21ba4d5ead802416d98d3408a465053fe0c56b193faf\": rpc error: code = NotFound desc = could not find container \"32e1079abe135198c0ec21ba4d5ead802416d98d3408a465053fe0c56b193faf\": container with ID starting with 32e1079abe135198c0ec21ba4d5ead802416d98d3408a465053fe0c56b193faf not found: ID does not exist" Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.039755 5094 scope.go:117] "RemoveContainer" containerID="f3c82a94d6648ba5c9c19bfa7de7c6f84242900741fd3a27425da31ee3d8ec82" Feb 20 07:06:55 crc kubenswrapper[5094]: E0220 07:06:55.040146 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3c82a94d6648ba5c9c19bfa7de7c6f84242900741fd3a27425da31ee3d8ec82\": container with ID starting with f3c82a94d6648ba5c9c19bfa7de7c6f84242900741fd3a27425da31ee3d8ec82 not found: ID does not exist" containerID="f3c82a94d6648ba5c9c19bfa7de7c6f84242900741fd3a27425da31ee3d8ec82" Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.040166 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3c82a94d6648ba5c9c19bfa7de7c6f84242900741fd3a27425da31ee3d8ec82"} err="failed to get container status \"f3c82a94d6648ba5c9c19bfa7de7c6f84242900741fd3a27425da31ee3d8ec82\": rpc error: code = NotFound desc = could not find container \"f3c82a94d6648ba5c9c19bfa7de7c6f84242900741fd3a27425da31ee3d8ec82\": container with ID starting with f3c82a94d6648ba5c9c19bfa7de7c6f84242900741fd3a27425da31ee3d8ec82 not found: ID does 
not exist" Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.468100 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0" Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.475510 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift\") pod \"swift-storage-0\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " pod="openstack/swift-storage-0" Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.490437 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.858282 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b40815a7-cffa-44ed-8acf-98261cc7e14c" path="/var/lib/kubelet/pods/b40815a7-cffa-44ed-8acf-98261cc7e14c/volumes" Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.911235 5094 generic.go:334] "Generic (PLEG): container finished" podID="ffd170be-0f58-4016-a451-5fb1f7fd9f1b" containerID="a9068c6d92101548d5c981bed911d9bb32d305a4eac5c47442863a89e8e65fe9" exitCode=0 Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.911418 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vpv24" event={"ID":"ffd170be-0f58-4016-a451-5fb1f7fd9f1b","Type":"ContainerDied","Data":"a9068c6d92101548d5c981bed911d9bb32d305a4eac5c47442863a89e8e65fe9"} Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.915102 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"a829c6b3-7069-4544-90dc-40ae83aba524","Type":"ContainerStarted","Data":"bf7d170cd7c0f8170ef78ab632229324322185632d477a161a83826e71f489e8"} Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.915415 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.919364 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"219c74d6-9f45-4bf8-8c67-acdea3c0fab3","Type":"ContainerStarted","Data":"74e0d7c23ec3f1be5316db26c770a1e0ec492750a824549bed30f944a01c88b6"} Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.922127 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.964841 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.670213249 podStartE2EDuration="54.964812562s" podCreationTimestamp="2026-02-20 07:06:01 +0000 UTC" firstStartedPulling="2026-02-20 07:06:03.264813717 +0000 UTC m=+1178.137440428" lastFinishedPulling="2026-02-20 07:06:20.55941303 +0000 UTC m=+1195.432039741" observedRunningTime="2026-02-20 07:06:55.962158188 +0000 UTC m=+1230.834784929" watchObservedRunningTime="2026-02-20 07:06:55.964812562 +0000 UTC m=+1230.837439313" Feb 20 07:06:55 crc kubenswrapper[5094]: I0220 07:06:55.986668 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.484688855 podStartE2EDuration="54.986645254s" podCreationTimestamp="2026-02-20 07:06:01 +0000 UTC" firstStartedPulling="2026-02-20 07:06:04.131381231 +0000 UTC m=+1179.004007942" lastFinishedPulling="2026-02-20 07:06:20.63333763 +0000 UTC m=+1195.505964341" observedRunningTime="2026-02-20 07:06:55.98477082 +0000 UTC m=+1230.857397531" watchObservedRunningTime="2026-02-20 
07:06:55.986645254 +0000 UTC m=+1230.859271965" Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.153271 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.348891 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-b2w5s" Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.381515 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.487304 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a26e4e5-091b-4a9e-8a16-2dc535e85fae-operator-scripts\") pod \"4a26e4e5-091b-4a9e-8a16-2dc535e85fae\" (UID: \"4a26e4e5-091b-4a9e-8a16-2dc535e85fae\") " Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.487408 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdcld\" (UniqueName: \"kubernetes.io/projected/4a26e4e5-091b-4a9e-8a16-2dc535e85fae-kube-api-access-sdcld\") pod \"4a26e4e5-091b-4a9e-8a16-2dc535e85fae\" (UID: \"4a26e4e5-091b-4a9e-8a16-2dc535e85fae\") " Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.488058 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a26e4e5-091b-4a9e-8a16-2dc535e85fae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4a26e4e5-091b-4a9e-8a16-2dc535e85fae" (UID: "4a26e4e5-091b-4a9e-8a16-2dc535e85fae"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.488962 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a26e4e5-091b-4a9e-8a16-2dc535e85fae-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.496892 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a26e4e5-091b-4a9e-8a16-2dc535e85fae-kube-api-access-sdcld" (OuterVolumeSpecName: "kube-api-access-sdcld") pod "4a26e4e5-091b-4a9e-8a16-2dc535e85fae" (UID: "4a26e4e5-091b-4a9e-8a16-2dc535e85fae"). InnerVolumeSpecName "kube-api-access-sdcld". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.591226 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdcld\" (UniqueName: \"kubernetes.io/projected/4a26e4e5-091b-4a9e-8a16-2dc535e85fae-kube-api-access-sdcld\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.846946 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-d27ft"] Feb 20 07:06:56 crc kubenswrapper[5094]: E0220 07:06:56.847362 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b40815a7-cffa-44ed-8acf-98261cc7e14c" containerName="dnsmasq-dns" Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.847378 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b40815a7-cffa-44ed-8acf-98261cc7e14c" containerName="dnsmasq-dns" Feb 20 07:06:56 crc kubenswrapper[5094]: E0220 07:06:56.847394 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b40815a7-cffa-44ed-8acf-98261cc7e14c" containerName="init" Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.847401 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b40815a7-cffa-44ed-8acf-98261cc7e14c" containerName="init" Feb 20 
07:06:56 crc kubenswrapper[5094]: E0220 07:06:56.847413 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a26e4e5-091b-4a9e-8a16-2dc535e85fae" containerName="mariadb-account-create-update" Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.847419 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a26e4e5-091b-4a9e-8a16-2dc535e85fae" containerName="mariadb-account-create-update" Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.847593 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a26e4e5-091b-4a9e-8a16-2dc535e85fae" containerName="mariadb-account-create-update" Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.847612 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b40815a7-cffa-44ed-8acf-98261cc7e14c" containerName="dnsmasq-dns" Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.848256 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-d27ft" Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.851847 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.852079 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9q5bq" Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.859477 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-d27ft"] Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.932651 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"b318a2984af52699dbcc87bf8935047ececfd11a736630826d02b012b12ef5e4"} Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.944439 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-b2w5s" 
event={"ID":"4a26e4e5-091b-4a9e-8a16-2dc535e85fae","Type":"ContainerDied","Data":"ca6817ae07db3acb9ded9895ffb37d6966aff2b9baa689ae6bec470103bb00b6"} Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.944766 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-b2w5s" Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.944853 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca6817ae07db3acb9ded9895ffb37d6966aff2b9baa689ae6bec470103bb00b6" Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.997777 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-config-data\") pod \"glance-db-sync-d27ft\" (UID: \"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " pod="openstack/glance-db-sync-d27ft" Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.998133 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2sn5\" (UniqueName: \"kubernetes.io/projected/2538e2cc-781b-4c2a-b993-381e488fd5bb-kube-api-access-f2sn5\") pod \"glance-db-sync-d27ft\" (UID: \"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " pod="openstack/glance-db-sync-d27ft" Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.998276 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-combined-ca-bundle\") pod \"glance-db-sync-d27ft\" (UID: \"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " pod="openstack/glance-db-sync-d27ft" Feb 20 07:06:56 crc kubenswrapper[5094]: I0220 07:06:56.998448 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-db-sync-config-data\") pod \"glance-db-sync-d27ft\" (UID: \"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " pod="openstack/glance-db-sync-d27ft" Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.103838 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2sn5\" (UniqueName: \"kubernetes.io/projected/2538e2cc-781b-4c2a-b993-381e488fd5bb-kube-api-access-f2sn5\") pod \"glance-db-sync-d27ft\" (UID: \"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " pod="openstack/glance-db-sync-d27ft" Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.104413 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-combined-ca-bundle\") pod \"glance-db-sync-d27ft\" (UID: \"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " pod="openstack/glance-db-sync-d27ft" Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.104544 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-db-sync-config-data\") pod \"glance-db-sync-d27ft\" (UID: \"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " pod="openstack/glance-db-sync-d27ft" Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.104630 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-config-data\") pod \"glance-db-sync-d27ft\" (UID: \"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " pod="openstack/glance-db-sync-d27ft" Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.111726 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-db-sync-config-data\") pod \"glance-db-sync-d27ft\" (UID: 
\"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " pod="openstack/glance-db-sync-d27ft" Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.126590 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-config-data\") pod \"glance-db-sync-d27ft\" (UID: \"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " pod="openstack/glance-db-sync-d27ft" Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.128075 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-combined-ca-bundle\") pod \"glance-db-sync-d27ft\" (UID: \"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " pod="openstack/glance-db-sync-d27ft" Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.132551 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2sn5\" (UniqueName: \"kubernetes.io/projected/2538e2cc-781b-4c2a-b993-381e488fd5bb-kube-api-access-f2sn5\") pod \"glance-db-sync-d27ft\" (UID: \"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " pod="openstack/glance-db-sync-d27ft" Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.168401 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-d27ft" Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.379267 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vpv24" Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.510186 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-etc-swift\") pod \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.510636 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-combined-ca-bundle\") pod \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.510687 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-dispersionconf\") pod \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.510725 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-swiftconf\") pod \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.510837 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-ring-data-devices\") pod \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.510878 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjgt6\" 
(UniqueName: \"kubernetes.io/projected/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-kube-api-access-cjgt6\") pod \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.510974 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-scripts\") pod \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\" (UID: \"ffd170be-0f58-4016-a451-5fb1f7fd9f1b\") " Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.511262 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ffd170be-0f58-4016-a451-5fb1f7fd9f1b" (UID: "ffd170be-0f58-4016-a451-5fb1f7fd9f1b"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.511858 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ffd170be-0f58-4016-a451-5fb1f7fd9f1b" (UID: "ffd170be-0f58-4016-a451-5fb1f7fd9f1b"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.521035 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-kube-api-access-cjgt6" (OuterVolumeSpecName: "kube-api-access-cjgt6") pod "ffd170be-0f58-4016-a451-5fb1f7fd9f1b" (UID: "ffd170be-0f58-4016-a451-5fb1f7fd9f1b"). InnerVolumeSpecName "kube-api-access-cjgt6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.523955 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ffd170be-0f58-4016-a451-5fb1f7fd9f1b" (UID: "ffd170be-0f58-4016-a451-5fb1f7fd9f1b"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.581615 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-scripts" (OuterVolumeSpecName: "scripts") pod "ffd170be-0f58-4016-a451-5fb1f7fd9f1b" (UID: "ffd170be-0f58-4016-a451-5fb1f7fd9f1b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.591899 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ffd170be-0f58-4016-a451-5fb1f7fd9f1b" (UID: "ffd170be-0f58-4016-a451-5fb1f7fd9f1b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.602063 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ffd170be-0f58-4016-a451-5fb1f7fd9f1b" (UID: "ffd170be-0f58-4016-a451-5fb1f7fd9f1b"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.612968 5094 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.613000 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.613014 5094 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.613025 5094 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.613036 5094 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.613045 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjgt6\" (UniqueName: \"kubernetes.io/projected/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-kube-api-access-cjgt6\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.613057 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffd170be-0f58-4016-a451-5fb1f7fd9f1b-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.814891 5094 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-d27ft"] Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.955117 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"4eef6c37c6d679f44709f4c17ae202933f7640edde1e1c08cde85643a4c659b9"} Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.955659 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"77762b2b7b3381830ea1b52f38b5c149eed591367db41376905ffbc17a991c9c"} Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.957374 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vpv24" Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.957371 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vpv24" event={"ID":"ffd170be-0f58-4016-a451-5fb1f7fd9f1b","Type":"ContainerDied","Data":"1204be4a8d2c968ca778e1391eab76ffdababc3edfb0acee83669813b41f3e1b"} Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.957544 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1204be4a8d2c968ca778e1391eab76ffdababc3edfb0acee83669813b41f3e1b" Feb 20 07:06:57 crc kubenswrapper[5094]: I0220 07:06:57.958440 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-d27ft" event={"ID":"2538e2cc-781b-4c2a-b993-381e488fd5bb","Type":"ContainerStarted","Data":"178c70ca4808580e5184d1f4d0b6c895bc6f3ba07b300a63f8cb105cddd95d66"} Feb 20 07:06:58 crc kubenswrapper[5094]: I0220 07:06:58.979563 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"876ede354cfdccb940d2003cb9a673f9c21d14f390f0a6c414edee229821334b"} 
Feb 20 07:06:58 crc kubenswrapper[5094]: I0220 07:06:58.981213 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"87d22bfd10a9fea3abdf5984327defada78a003b41f7d6829b04b81bf140871a"} Feb 20 07:06:59 crc kubenswrapper[5094]: I0220 07:06:59.652360 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-b2w5s"] Feb 20 07:06:59 crc kubenswrapper[5094]: I0220 07:06:59.657879 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-b2w5s"] Feb 20 07:06:59 crc kubenswrapper[5094]: I0220 07:06:59.850657 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a26e4e5-091b-4a9e-8a16-2dc535e85fae" path="/var/lib/kubelet/pods/4a26e4e5-091b-4a9e-8a16-2dc535e85fae/volumes" Feb 20 07:06:59 crc kubenswrapper[5094]: I0220 07:06:59.991503 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"798e86534a684e476081f370e9263cf67a92cf7ab886af18bececee01f397813"} Feb 20 07:06:59 crc kubenswrapper[5094]: I0220 07:06:59.991548 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"13486685a923bf2693233900e7c25d8638639ebc4de0d1f20f27c486791e4f53"} Feb 20 07:07:01 crc kubenswrapper[5094]: I0220 07:07:01.006941 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"ecea14d9b3d89c636aa8573b40d00132c5867dc6539f43130d1e69748ccb2dd3"} Feb 20 07:07:01 crc kubenswrapper[5094]: I0220 07:07:01.007430 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"da22674644cab0e6f810eb224627362470513debe3726c5711ef469be1db004e"} Feb 20 07:07:01 crc kubenswrapper[5094]: I0220 07:07:01.889078 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-lvlr2" podUID="8ecb5d91-5ba1-457e-af42-0d78c8643250" containerName="ovn-controller" probeResult="failure" output=< Feb 20 07:07:01 crc kubenswrapper[5094]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 20 07:07:01 crc kubenswrapper[5094]: > Feb 20 07:07:02 crc kubenswrapper[5094]: I0220 07:07:02.053143 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"997b5f8501f8553cfdec1b566ef6a822ad0fc847b09fa07489c9cc4bdaa9307c"} Feb 20 07:07:02 crc kubenswrapper[5094]: I0220 07:07:02.053223 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"c948d686dc04dce290ff45631530d5f46650b2f0c7f924c52cff39ae5c18105d"} Feb 20 07:07:02 crc kubenswrapper[5094]: I0220 07:07:02.053251 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"b29a2c6ccb9cf15778496c1427d8d36bd92976f1a511632fe77d63af46238ab2"} Feb 20 07:07:02 crc kubenswrapper[5094]: I0220 07:07:02.053275 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"6b2e14e49e3a0c6e84ab2a157f1e34b73c7517b9c172442d8359cdf3a71c1601"} Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.078783 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"694f121dca03e8e25df5cfb69cf088e987e8cad432e283e4814356d5f0848991"} Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.079445 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"d95b17f09f786d48faa3531b254622503fd28358bef08fa635117e335717ec76"} Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.079460 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerStarted","Data":"ec9511e4e89788b48919a84e31eb85bf09ba8ef7f2fd3f486aacf3733fac08ce"} Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.122394 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.158802266 podStartE2EDuration="25.1223741s" podCreationTimestamp="2026-02-20 07:06:38 +0000 UTC" firstStartedPulling="2026-02-20 07:06:56.157199239 +0000 UTC m=+1231.029825950" lastFinishedPulling="2026-02-20 07:07:01.120771083 +0000 UTC m=+1235.993397784" observedRunningTime="2026-02-20 07:07:03.120964516 +0000 UTC m=+1237.993591227" watchObservedRunningTime="2026-02-20 07:07:03.1223741 +0000 UTC m=+1237.995000811" Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.453969 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-768666cd57-7ddwb"] Feb 20 07:07:03 crc kubenswrapper[5094]: E0220 07:07:03.454647 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffd170be-0f58-4016-a451-5fb1f7fd9f1b" containerName="swift-ring-rebalance" Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.454685 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffd170be-0f58-4016-a451-5fb1f7fd9f1b" containerName="swift-ring-rebalance" Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.455044 
5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffd170be-0f58-4016-a451-5fb1f7fd9f1b" containerName="swift-ring-rebalance" Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.456625 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-768666cd57-7ddwb" Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.461987 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.469262 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-768666cd57-7ddwb"] Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.532441 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-ovsdbserver-sb\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb" Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.532520 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-config\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb" Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.532553 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-ovsdbserver-nb\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb" Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.532577 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-dns-swift-storage-0\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb" Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.532646 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-dns-svc\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb" Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.532712 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzck5\" (UniqueName: \"kubernetes.io/projected/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-kube-api-access-jzck5\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb" Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.634611 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-ovsdbserver-sb\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb" Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.634729 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-config\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb" Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.634771 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-ovsdbserver-nb\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb" Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.634813 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-dns-swift-storage-0\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb" Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.634884 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-dns-svc\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb" Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.634909 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzck5\" (UniqueName: \"kubernetes.io/projected/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-kube-api-access-jzck5\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb" Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.636283 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-config\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb" Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.636302 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-dns-swift-storage-0\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb" Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.636361 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-ovsdbserver-sb\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb" Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.636411 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-dns-svc\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb" Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.636945 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-ovsdbserver-nb\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb" Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.664996 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzck5\" (UniqueName: \"kubernetes.io/projected/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-kube-api-access-jzck5\") pod \"dnsmasq-dns-768666cd57-7ddwb\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") " pod="openstack/dnsmasq-dns-768666cd57-7ddwb" Feb 20 07:07:03 crc kubenswrapper[5094]: I0220 07:07:03.798358 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-768666cd57-7ddwb" Feb 20 07:07:04 crc kubenswrapper[5094]: I0220 07:07:04.109055 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:07:04 crc kubenswrapper[5094]: I0220 07:07:04.109133 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:07:04 crc kubenswrapper[5094]: I0220 07:07:04.340155 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-768666cd57-7ddwb"] Feb 20 07:07:04 crc kubenswrapper[5094]: I0220 07:07:04.663852 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-b77jp"] Feb 20 07:07:04 crc kubenswrapper[5094]: I0220 07:07:04.664884 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-b77jp" Feb 20 07:07:04 crc kubenswrapper[5094]: I0220 07:07:04.667411 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 20 07:07:04 crc kubenswrapper[5094]: I0220 07:07:04.688652 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-b77jp"] Feb 20 07:07:04 crc kubenswrapper[5094]: I0220 07:07:04.767463 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5044f3da-a9aa-4f6e-b598-3b5e963f8731-operator-scripts\") pod \"root-account-create-update-b77jp\" (UID: \"5044f3da-a9aa-4f6e-b598-3b5e963f8731\") " pod="openstack/root-account-create-update-b77jp" Feb 20 07:07:04 crc kubenswrapper[5094]: I0220 07:07:04.767723 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hstcw\" (UniqueName: \"kubernetes.io/projected/5044f3da-a9aa-4f6e-b598-3b5e963f8731-kube-api-access-hstcw\") pod \"root-account-create-update-b77jp\" (UID: \"5044f3da-a9aa-4f6e-b598-3b5e963f8731\") " pod="openstack/root-account-create-update-b77jp" Feb 20 07:07:04 crc kubenswrapper[5094]: I0220 07:07:04.869753 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5044f3da-a9aa-4f6e-b598-3b5e963f8731-operator-scripts\") pod \"root-account-create-update-b77jp\" (UID: \"5044f3da-a9aa-4f6e-b598-3b5e963f8731\") " pod="openstack/root-account-create-update-b77jp" Feb 20 07:07:04 crc kubenswrapper[5094]: I0220 07:07:04.869828 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hstcw\" (UniqueName: \"kubernetes.io/projected/5044f3da-a9aa-4f6e-b598-3b5e963f8731-kube-api-access-hstcw\") pod \"root-account-create-update-b77jp\" (UID: 
\"5044f3da-a9aa-4f6e-b598-3b5e963f8731\") " pod="openstack/root-account-create-update-b77jp" Feb 20 07:07:04 crc kubenswrapper[5094]: I0220 07:07:04.871136 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5044f3da-a9aa-4f6e-b598-3b5e963f8731-operator-scripts\") pod \"root-account-create-update-b77jp\" (UID: \"5044f3da-a9aa-4f6e-b598-3b5e963f8731\") " pod="openstack/root-account-create-update-b77jp" Feb 20 07:07:04 crc kubenswrapper[5094]: I0220 07:07:04.889825 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hstcw\" (UniqueName: \"kubernetes.io/projected/5044f3da-a9aa-4f6e-b598-3b5e963f8731-kube-api-access-hstcw\") pod \"root-account-create-update-b77jp\" (UID: \"5044f3da-a9aa-4f6e-b598-3b5e963f8731\") " pod="openstack/root-account-create-update-b77jp" Feb 20 07:07:04 crc kubenswrapper[5094]: I0220 07:07:04.984388 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-b77jp" Feb 20 07:07:06 crc kubenswrapper[5094]: I0220 07:07:06.912416 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-lvlr2" podUID="8ecb5d91-5ba1-457e-af42-0d78c8643250" containerName="ovn-controller" probeResult="failure" output=< Feb 20 07:07:06 crc kubenswrapper[5094]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 20 07:07:06 crc kubenswrapper[5094]: > Feb 20 07:07:06 crc kubenswrapper[5094]: I0220 07:07:06.981187 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:07:06 crc kubenswrapper[5094]: I0220 07:07:06.981732 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.250455 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lvlr2-config-wmdb5"] Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.252774 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.255113 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.266159 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lvlr2-config-wmdb5"] Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.318473 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-run-ovn\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.318580 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-log-ovn\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.318643 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3a516b45-9a5d-4210-82b3-b07e7251ffad-additional-scripts\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.318666 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpdmb\" (UniqueName: \"kubernetes.io/projected/3a516b45-9a5d-4210-82b3-b07e7251ffad-kube-api-access-bpdmb\") pod 
\"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.318741 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-run\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.318778 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a516b45-9a5d-4210-82b3-b07e7251ffad-scripts\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.421020 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-log-ovn\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.421127 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3a516b45-9a5d-4210-82b3-b07e7251ffad-additional-scripts\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.421194 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpdmb\" (UniqueName: 
\"kubernetes.io/projected/3a516b45-9a5d-4210-82b3-b07e7251ffad-kube-api-access-bpdmb\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.421236 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-run\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.421259 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a516b45-9a5d-4210-82b3-b07e7251ffad-scripts\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.421307 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-run-ovn\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.421521 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-log-ovn\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.421613 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-run-ovn\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.421673 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-run\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.423038 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3a516b45-9a5d-4210-82b3-b07e7251ffad-additional-scripts\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.424154 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a516b45-9a5d-4210-82b3-b07e7251ffad-scripts\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.443795 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpdmb\" (UniqueName: \"kubernetes.io/projected/3a516b45-9a5d-4210-82b3-b07e7251ffad-kube-api-access-bpdmb\") pod \"ovn-controller-lvlr2-config-wmdb5\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:07 crc kubenswrapper[5094]: I0220 07:07:07.589657 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:11 crc kubenswrapper[5094]: W0220 07:07:11.675575 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa0f7da7_a9bd_4b03_b256_d05ba9323e70.slice/crio-a20fdfeb6eba82be36ac56d6e09083509aa1e18f2c3332e9e2fa4dc3127f38aa WatchSource:0}: Error finding container a20fdfeb6eba82be36ac56d6e09083509aa1e18f2c3332e9e2fa4dc3127f38aa: Status 404 returned error can't find the container with id a20fdfeb6eba82be36ac56d6e09083509aa1e18f2c3332e9e2fa4dc3127f38aa Feb 20 07:07:11 crc kubenswrapper[5094]: I0220 07:07:11.879752 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-lvlr2" podUID="8ecb5d91-5ba1-457e-af42-0d78c8643250" containerName="ovn-controller" probeResult="failure" output=< Feb 20 07:07:11 crc kubenswrapper[5094]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 20 07:07:11 crc kubenswrapper[5094]: > Feb 20 07:07:12 crc kubenswrapper[5094]: I0220 07:07:12.169752 5094 generic.go:334] "Generic (PLEG): container finished" podID="aa0f7da7-a9bd-4b03-b256-d05ba9323e70" containerID="7c8241aa612d986c2efd3e576b0082b1361568858d2a7098f35d783948c494f3" exitCode=0 Feb 20 07:07:12 crc kubenswrapper[5094]: I0220 07:07:12.169806 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768666cd57-7ddwb" event={"ID":"aa0f7da7-a9bd-4b03-b256-d05ba9323e70","Type":"ContainerDied","Data":"7c8241aa612d986c2efd3e576b0082b1361568858d2a7098f35d783948c494f3"} Feb 20 07:07:12 crc kubenswrapper[5094]: I0220 07:07:12.169837 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768666cd57-7ddwb" event={"ID":"aa0f7da7-a9bd-4b03-b256-d05ba9323e70","Type":"ContainerStarted","Data":"a20fdfeb6eba82be36ac56d6e09083509aa1e18f2c3332e9e2fa4dc3127f38aa"} Feb 20 07:07:12 crc kubenswrapper[5094]: I0220 
07:07:12.255593 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-b77jp"] Feb 20 07:07:12 crc kubenswrapper[5094]: I0220 07:07:12.263909 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lvlr2-config-wmdb5"] Feb 20 07:07:12 crc kubenswrapper[5094]: W0220 07:07:12.271120 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a516b45_9a5d_4210_82b3_b07e7251ffad.slice/crio-1ead8b8ccd74a6ebc513af1497b9a70035d11b41919035393542a1a8cea701c3 WatchSource:0}: Error finding container 1ead8b8ccd74a6ebc513af1497b9a70035d11b41919035393542a1a8cea701c3: Status 404 returned error can't find the container with id 1ead8b8ccd74a6ebc513af1497b9a70035d11b41919035393542a1a8cea701c3 Feb 20 07:07:12 crc kubenswrapper[5094]: W0220 07:07:12.274952 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5044f3da_a9aa_4f6e_b598_3b5e963f8731.slice/crio-1b450bac989c189372bce33651ac118a9439313a250ef763906a77f1ed376185 WatchSource:0}: Error finding container 1b450bac989c189372bce33651ac118a9439313a250ef763906a77f1ed376185: Status 404 returned error can't find the container with id 1b450bac989c189372bce33651ac118a9439313a250ef763906a77f1ed376185 Feb 20 07:07:12 crc kubenswrapper[5094]: I0220 07:07:12.647950 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:07:13 crc kubenswrapper[5094]: I0220 07:07:13.179827 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-d27ft" event={"ID":"2538e2cc-781b-4c2a-b993-381e488fd5bb","Type":"ContainerStarted","Data":"faa3a4c7f983e78d891220a937038d927c9ef5f317cdc8774b24154cf98f3466"} Feb 20 07:07:13 crc kubenswrapper[5094]: I0220 07:07:13.185498 5094 generic.go:334] "Generic (PLEG): container finished" 
podID="3a516b45-9a5d-4210-82b3-b07e7251ffad" containerID="ba5029a86f52015ae26ae9c4af241df191f71a5df81010d7bab393d3d450c913" exitCode=0 Feb 20 07:07:13 crc kubenswrapper[5094]: I0220 07:07:13.185639 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lvlr2-config-wmdb5" event={"ID":"3a516b45-9a5d-4210-82b3-b07e7251ffad","Type":"ContainerDied","Data":"ba5029a86f52015ae26ae9c4af241df191f71a5df81010d7bab393d3d450c913"} Feb 20 07:07:13 crc kubenswrapper[5094]: I0220 07:07:13.185673 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lvlr2-config-wmdb5" event={"ID":"3a516b45-9a5d-4210-82b3-b07e7251ffad","Type":"ContainerStarted","Data":"1ead8b8ccd74a6ebc513af1497b9a70035d11b41919035393542a1a8cea701c3"} Feb 20 07:07:13 crc kubenswrapper[5094]: I0220 07:07:13.189139 5094 generic.go:334] "Generic (PLEG): container finished" podID="5044f3da-a9aa-4f6e-b598-3b5e963f8731" containerID="59cc73fd6408558710efa6324658cf301b0cc15eed3c78c0c37707c5d008b54e" exitCode=0 Feb 20 07:07:13 crc kubenswrapper[5094]: I0220 07:07:13.189252 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-b77jp" event={"ID":"5044f3da-a9aa-4f6e-b598-3b5e963f8731","Type":"ContainerDied","Data":"59cc73fd6408558710efa6324658cf301b0cc15eed3c78c0c37707c5d008b54e"} Feb 20 07:07:13 crc kubenswrapper[5094]: I0220 07:07:13.189301 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-b77jp" event={"ID":"5044f3da-a9aa-4f6e-b598-3b5e963f8731","Type":"ContainerStarted","Data":"1b450bac989c189372bce33651ac118a9439313a250ef763906a77f1ed376185"} Feb 20 07:07:13 crc kubenswrapper[5094]: I0220 07:07:13.192612 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768666cd57-7ddwb" event={"ID":"aa0f7da7-a9bd-4b03-b256-d05ba9323e70","Type":"ContainerStarted","Data":"0ce2ed20e689517e7e40aa9a1d41aa5a5a7a220e9e24137c5b2fd77c89b217f6"} Feb 20 07:07:13 crc 
kubenswrapper[5094]: I0220 07:07:13.192978 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-768666cd57-7ddwb" Feb 20 07:07:13 crc kubenswrapper[5094]: I0220 07:07:13.213100 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-d27ft" podStartSLOduration=3.190560736 podStartE2EDuration="17.213059894s" podCreationTimestamp="2026-02-20 07:06:56 +0000 UTC" firstStartedPulling="2026-02-20 07:06:57.828585008 +0000 UTC m=+1232.701211719" lastFinishedPulling="2026-02-20 07:07:11.851084166 +0000 UTC m=+1246.723710877" observedRunningTime="2026-02-20 07:07:13.200621236 +0000 UTC m=+1248.073247977" watchObservedRunningTime="2026-02-20 07:07:13.213059894 +0000 UTC m=+1248.085686635" Feb 20 07:07:13 crc kubenswrapper[5094]: I0220 07:07:13.232294 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-768666cd57-7ddwb" podStartSLOduration=10.232261624 podStartE2EDuration="10.232261624s" podCreationTimestamp="2026-02-20 07:07:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:07:13.227689304 +0000 UTC m=+1248.100316025" watchObservedRunningTime="2026-02-20 07:07:13.232261624 +0000 UTC m=+1248.104888375" Feb 20 07:07:13 crc kubenswrapper[5094]: I0220 07:07:13.549950 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.612611 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.694831 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-b77jp" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.712445 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a516b45-9a5d-4210-82b3-b07e7251ffad-scripts\") pod \"3a516b45-9a5d-4210-82b3-b07e7251ffad\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.712519 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-run-ovn\") pod \"3a516b45-9a5d-4210-82b3-b07e7251ffad\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.712559 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-log-ovn\") pod \"3a516b45-9a5d-4210-82b3-b07e7251ffad\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.712606 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-run\") pod \"3a516b45-9a5d-4210-82b3-b07e7251ffad\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.712655 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpdmb\" (UniqueName: \"kubernetes.io/projected/3a516b45-9a5d-4210-82b3-b07e7251ffad-kube-api-access-bpdmb\") pod \"3a516b45-9a5d-4210-82b3-b07e7251ffad\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.712759 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/3a516b45-9a5d-4210-82b3-b07e7251ffad-additional-scripts\") pod \"3a516b45-9a5d-4210-82b3-b07e7251ffad\" (UID: \"3a516b45-9a5d-4210-82b3-b07e7251ffad\") " Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.713838 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "3a516b45-9a5d-4210-82b3-b07e7251ffad" (UID: "3a516b45-9a5d-4210-82b3-b07e7251ffad"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.713884 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "3a516b45-9a5d-4210-82b3-b07e7251ffad" (UID: "3a516b45-9a5d-4210-82b3-b07e7251ffad"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.713987 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-run" (OuterVolumeSpecName: "var-run") pod "3a516b45-9a5d-4210-82b3-b07e7251ffad" (UID: "3a516b45-9a5d-4210-82b3-b07e7251ffad"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.715063 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a516b45-9a5d-4210-82b3-b07e7251ffad-scripts" (OuterVolumeSpecName: "scripts") pod "3a516b45-9a5d-4210-82b3-b07e7251ffad" (UID: "3a516b45-9a5d-4210-82b3-b07e7251ffad"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.715146 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a516b45-9a5d-4210-82b3-b07e7251ffad-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "3a516b45-9a5d-4210-82b3-b07e7251ffad" (UID: "3a516b45-9a5d-4210-82b3-b07e7251ffad"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.736593 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a516b45-9a5d-4210-82b3-b07e7251ffad-kube-api-access-bpdmb" (OuterVolumeSpecName: "kube-api-access-bpdmb") pod "3a516b45-9a5d-4210-82b3-b07e7251ffad" (UID: "3a516b45-9a5d-4210-82b3-b07e7251ffad"). InnerVolumeSpecName "kube-api-access-bpdmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.814251 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5044f3da-a9aa-4f6e-b598-3b5e963f8731-operator-scripts\") pod \"5044f3da-a9aa-4f6e-b598-3b5e963f8731\" (UID: \"5044f3da-a9aa-4f6e-b598-3b5e963f8731\") " Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.814339 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hstcw\" (UniqueName: \"kubernetes.io/projected/5044f3da-a9aa-4f6e-b598-3b5e963f8731-kube-api-access-hstcw\") pod \"5044f3da-a9aa-4f6e-b598-3b5e963f8731\" (UID: \"5044f3da-a9aa-4f6e-b598-3b5e963f8731\") " Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.814817 5094 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/3a516b45-9a5d-4210-82b3-b07e7251ffad-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:14 crc 
kubenswrapper[5094]: I0220 07:07:14.814836 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a516b45-9a5d-4210-82b3-b07e7251ffad-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.814847 5094 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.814857 5094 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.814866 5094 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3a516b45-9a5d-4210-82b3-b07e7251ffad-var-run\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.814879 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpdmb\" (UniqueName: \"kubernetes.io/projected/3a516b45-9a5d-4210-82b3-b07e7251ffad-kube-api-access-bpdmb\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.815576 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5044f3da-a9aa-4f6e-b598-3b5e963f8731-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5044f3da-a9aa-4f6e-b598-3b5e963f8731" (UID: "5044f3da-a9aa-4f6e-b598-3b5e963f8731"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.821250 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5044f3da-a9aa-4f6e-b598-3b5e963f8731-kube-api-access-hstcw" (OuterVolumeSpecName: "kube-api-access-hstcw") pod "5044f3da-a9aa-4f6e-b598-3b5e963f8731" (UID: "5044f3da-a9aa-4f6e-b598-3b5e963f8731"). InnerVolumeSpecName "kube-api-access-hstcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.854220 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-7gn4d"] Feb 20 07:07:14 crc kubenswrapper[5094]: E0220 07:07:14.855379 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5044f3da-a9aa-4f6e-b598-3b5e963f8731" containerName="mariadb-account-create-update" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.855405 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5044f3da-a9aa-4f6e-b598-3b5e963f8731" containerName="mariadb-account-create-update" Feb 20 07:07:14 crc kubenswrapper[5094]: E0220 07:07:14.856080 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a516b45-9a5d-4210-82b3-b07e7251ffad" containerName="ovn-config" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.856131 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a516b45-9a5d-4210-82b3-b07e7251ffad" containerName="ovn-config" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.856520 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a516b45-9a5d-4210-82b3-b07e7251ffad" containerName="ovn-config" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.856547 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5044f3da-a9aa-4f6e-b598-3b5e963f8731" containerName="mariadb-account-create-update" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.857513 5094 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/cinder-db-create-7gn4d" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.870827 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-7gn4d"] Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.916695 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d59abb8-e7c7-404f-8f03-13d2167bea54-operator-scripts\") pod \"cinder-db-create-7gn4d\" (UID: \"3d59abb8-e7c7-404f-8f03-13d2167bea54\") " pod="openstack/cinder-db-create-7gn4d" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.916770 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvlpt\" (UniqueName: \"kubernetes.io/projected/3d59abb8-e7c7-404f-8f03-13d2167bea54-kube-api-access-gvlpt\") pod \"cinder-db-create-7gn4d\" (UID: \"3d59abb8-e7c7-404f-8f03-13d2167bea54\") " pod="openstack/cinder-db-create-7gn4d" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.916877 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5044f3da-a9aa-4f6e-b598-3b5e963f8731-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.916893 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hstcw\" (UniqueName: \"kubernetes.io/projected/5044f3da-a9aa-4f6e-b598-3b5e963f8731-kube-api-access-hstcw\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.955542 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8083-account-create-update-wxrzd"] Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.956726 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8083-account-create-update-wxrzd" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.960014 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 20 07:07:14 crc kubenswrapper[5094]: I0220 07:07:14.971810 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8083-account-create-update-wxrzd"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.018503 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d59abb8-e7c7-404f-8f03-13d2167bea54-operator-scripts\") pod \"cinder-db-create-7gn4d\" (UID: \"3d59abb8-e7c7-404f-8f03-13d2167bea54\") " pod="openstack/cinder-db-create-7gn4d" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.018567 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvlpt\" (UniqueName: \"kubernetes.io/projected/3d59abb8-e7c7-404f-8f03-13d2167bea54-kube-api-access-gvlpt\") pod \"cinder-db-create-7gn4d\" (UID: \"3d59abb8-e7c7-404f-8f03-13d2167bea54\") " pod="openstack/cinder-db-create-7gn4d" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.018652 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23a44809-2f91-4dbe-80ed-733390b037d8-operator-scripts\") pod \"cinder-8083-account-create-update-wxrzd\" (UID: \"23a44809-2f91-4dbe-80ed-733390b037d8\") " pod="openstack/cinder-8083-account-create-update-wxrzd" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.018777 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrtl2\" (UniqueName: \"kubernetes.io/projected/23a44809-2f91-4dbe-80ed-733390b037d8-kube-api-access-vrtl2\") pod \"cinder-8083-account-create-update-wxrzd\" (UID: 
\"23a44809-2f91-4dbe-80ed-733390b037d8\") " pod="openstack/cinder-8083-account-create-update-wxrzd" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.019439 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d59abb8-e7c7-404f-8f03-13d2167bea54-operator-scripts\") pod \"cinder-db-create-7gn4d\" (UID: \"3d59abb8-e7c7-404f-8f03-13d2167bea54\") " pod="openstack/cinder-db-create-7gn4d" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.046267 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-qvr99"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.047546 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qvr99" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.055905 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvlpt\" (UniqueName: \"kubernetes.io/projected/3d59abb8-e7c7-404f-8f03-13d2167bea54-kube-api-access-gvlpt\") pod \"cinder-db-create-7gn4d\" (UID: \"3d59abb8-e7c7-404f-8f03-13d2167bea54\") " pod="openstack/cinder-db-create-7gn4d" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.062939 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-qvr99"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.105751 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-plbtm"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.106871 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-plbtm" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.109372 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.109596 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qgvjt" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.109850 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.109980 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.120737 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrtl2\" (UniqueName: \"kubernetes.io/projected/23a44809-2f91-4dbe-80ed-733390b037d8-kube-api-access-vrtl2\") pod \"cinder-8083-account-create-update-wxrzd\" (UID: \"23a44809-2f91-4dbe-80ed-733390b037d8\") " pod="openstack/cinder-8083-account-create-update-wxrzd" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.120818 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c0f5daa-28f1-412d-8749-5b11f6b8f26d-operator-scripts\") pod \"barbican-db-create-qvr99\" (UID: \"5c0f5daa-28f1-412d-8749-5b11f6b8f26d\") " pod="openstack/barbican-db-create-qvr99" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.120877 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23a44809-2f91-4dbe-80ed-733390b037d8-operator-scripts\") pod \"cinder-8083-account-create-update-wxrzd\" (UID: \"23a44809-2f91-4dbe-80ed-733390b037d8\") " pod="openstack/cinder-8083-account-create-update-wxrzd" Feb 20 07:07:15 crc 
kubenswrapper[5094]: I0220 07:07:15.120951 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxrcx\" (UniqueName: \"kubernetes.io/projected/5c0f5daa-28f1-412d-8749-5b11f6b8f26d-kube-api-access-zxrcx\") pod \"barbican-db-create-qvr99\" (UID: \"5c0f5daa-28f1-412d-8749-5b11f6b8f26d\") " pod="openstack/barbican-db-create-qvr99" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.121767 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23a44809-2f91-4dbe-80ed-733390b037d8-operator-scripts\") pod \"cinder-8083-account-create-update-wxrzd\" (UID: \"23a44809-2f91-4dbe-80ed-733390b037d8\") " pod="openstack/cinder-8083-account-create-update-wxrzd" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.129809 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-plbtm"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.150890 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrtl2\" (UniqueName: \"kubernetes.io/projected/23a44809-2f91-4dbe-80ed-733390b037d8-kube-api-access-vrtl2\") pod \"cinder-8083-account-create-update-wxrzd\" (UID: \"23a44809-2f91-4dbe-80ed-733390b037d8\") " pod="openstack/cinder-8083-account-create-update-wxrzd" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.150994 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-c179-account-create-update-sst4m"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.186163 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c179-account-create-update-sst4m" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.188100 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-7gn4d" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.189358 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.215652 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c179-account-create-update-sst4m"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.223933 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/403a4371-09f4-4206-8d60-5b970d7e4faf-combined-ca-bundle\") pod \"keystone-db-sync-plbtm\" (UID: \"403a4371-09f4-4206-8d60-5b970d7e4faf\") " pod="openstack/keystone-db-sync-plbtm" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.224018 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxrcx\" (UniqueName: \"kubernetes.io/projected/5c0f5daa-28f1-412d-8749-5b11f6b8f26d-kube-api-access-zxrcx\") pod \"barbican-db-create-qvr99\" (UID: \"5c0f5daa-28f1-412d-8749-5b11f6b8f26d\") " pod="openstack/barbican-db-create-qvr99" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.224046 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99x5n\" (UniqueName: \"kubernetes.io/projected/75a27624-eac7-47c9-9f3b-98604d88fb3a-kube-api-access-99x5n\") pod \"barbican-c179-account-create-update-sst4m\" (UID: \"75a27624-eac7-47c9-9f3b-98604d88fb3a\") " pod="openstack/barbican-c179-account-create-update-sst4m" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.224072 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d67lh\" (UniqueName: \"kubernetes.io/projected/403a4371-09f4-4206-8d60-5b970d7e4faf-kube-api-access-d67lh\") pod \"keystone-db-sync-plbtm\" (UID: 
\"403a4371-09f4-4206-8d60-5b970d7e4faf\") " pod="openstack/keystone-db-sync-plbtm" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.224132 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75a27624-eac7-47c9-9f3b-98604d88fb3a-operator-scripts\") pod \"barbican-c179-account-create-update-sst4m\" (UID: \"75a27624-eac7-47c9-9f3b-98604d88fb3a\") " pod="openstack/barbican-c179-account-create-update-sst4m" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.224156 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c0f5daa-28f1-412d-8749-5b11f6b8f26d-operator-scripts\") pod \"barbican-db-create-qvr99\" (UID: \"5c0f5daa-28f1-412d-8749-5b11f6b8f26d\") " pod="openstack/barbican-db-create-qvr99" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.224195 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/403a4371-09f4-4206-8d60-5b970d7e4faf-config-data\") pod \"keystone-db-sync-plbtm\" (UID: \"403a4371-09f4-4206-8d60-5b970d7e4faf\") " pod="openstack/keystone-db-sync-plbtm" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.225167 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c0f5daa-28f1-412d-8749-5b11f6b8f26d-operator-scripts\") pod \"barbican-db-create-qvr99\" (UID: \"5c0f5daa-28f1-412d-8749-5b11f6b8f26d\") " pod="openstack/barbican-db-create-qvr99" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.228188 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-b77jp" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.230271 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-b77jp" event={"ID":"5044f3da-a9aa-4f6e-b598-3b5e963f8731","Type":"ContainerDied","Data":"1b450bac989c189372bce33651ac118a9439313a250ef763906a77f1ed376185"} Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.230338 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b450bac989c189372bce33651ac118a9439313a250ef763906a77f1ed376185" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.237034 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lvlr2-config-wmdb5" event={"ID":"3a516b45-9a5d-4210-82b3-b07e7251ffad","Type":"ContainerDied","Data":"1ead8b8ccd74a6ebc513af1497b9a70035d11b41919035393542a1a8cea701c3"} Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.237081 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ead8b8ccd74a6ebc513af1497b9a70035d11b41919035393542a1a8cea701c3" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.237167 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lvlr2-config-wmdb5" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.247732 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxrcx\" (UniqueName: \"kubernetes.io/projected/5c0f5daa-28f1-412d-8749-5b11f6b8f26d-kube-api-access-zxrcx\") pod \"barbican-db-create-qvr99\" (UID: \"5c0f5daa-28f1-412d-8749-5b11f6b8f26d\") " pod="openstack/barbican-db-create-qvr99" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.250089 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-qqgpn"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.251747 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-qqgpn" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.258725 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-qqgpn"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.274922 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8083-account-create-update-wxrzd" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.325674 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/403a4371-09f4-4206-8d60-5b970d7e4faf-combined-ca-bundle\") pod \"keystone-db-sync-plbtm\" (UID: \"403a4371-09f4-4206-8d60-5b970d7e4faf\") " pod="openstack/keystone-db-sync-plbtm" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.325754 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/317d32d8-9ad2-4bd1-87f4-745e3157c713-operator-scripts\") pod \"neutron-db-create-qqgpn\" (UID: \"317d32d8-9ad2-4bd1-87f4-745e3157c713\") " pod="openstack/neutron-db-create-qqgpn" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.325805 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99x5n\" (UniqueName: \"kubernetes.io/projected/75a27624-eac7-47c9-9f3b-98604d88fb3a-kube-api-access-99x5n\") pod \"barbican-c179-account-create-update-sst4m\" (UID: \"75a27624-eac7-47c9-9f3b-98604d88fb3a\") " pod="openstack/barbican-c179-account-create-update-sst4m" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.325840 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d67lh\" (UniqueName: \"kubernetes.io/projected/403a4371-09f4-4206-8d60-5b970d7e4faf-kube-api-access-d67lh\") pod \"keystone-db-sync-plbtm\" (UID: \"403a4371-09f4-4206-8d60-5b970d7e4faf\") " 
pod="openstack/keystone-db-sync-plbtm" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.325901 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75a27624-eac7-47c9-9f3b-98604d88fb3a-operator-scripts\") pod \"barbican-c179-account-create-update-sst4m\" (UID: \"75a27624-eac7-47c9-9f3b-98604d88fb3a\") " pod="openstack/barbican-c179-account-create-update-sst4m" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.325917 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnz6l\" (UniqueName: \"kubernetes.io/projected/317d32d8-9ad2-4bd1-87f4-745e3157c713-kube-api-access-rnz6l\") pod \"neutron-db-create-qqgpn\" (UID: \"317d32d8-9ad2-4bd1-87f4-745e3157c713\") " pod="openstack/neutron-db-create-qqgpn" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.325960 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/403a4371-09f4-4206-8d60-5b970d7e4faf-config-data\") pod \"keystone-db-sync-plbtm\" (UID: \"403a4371-09f4-4206-8d60-5b970d7e4faf\") " pod="openstack/keystone-db-sync-plbtm" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.327299 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75a27624-eac7-47c9-9f3b-98604d88fb3a-operator-scripts\") pod \"barbican-c179-account-create-update-sst4m\" (UID: \"75a27624-eac7-47c9-9f3b-98604d88fb3a\") " pod="openstack/barbican-c179-account-create-update-sst4m" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.329888 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/403a4371-09f4-4206-8d60-5b970d7e4faf-config-data\") pod \"keystone-db-sync-plbtm\" (UID: \"403a4371-09f4-4206-8d60-5b970d7e4faf\") " 
pod="openstack/keystone-db-sync-plbtm" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.331286 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/403a4371-09f4-4206-8d60-5b970d7e4faf-combined-ca-bundle\") pod \"keystone-db-sync-plbtm\" (UID: \"403a4371-09f4-4206-8d60-5b970d7e4faf\") " pod="openstack/keystone-db-sync-plbtm" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.360805 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d67lh\" (UniqueName: \"kubernetes.io/projected/403a4371-09f4-4206-8d60-5b970d7e4faf-kube-api-access-d67lh\") pod \"keystone-db-sync-plbtm\" (UID: \"403a4371-09f4-4206-8d60-5b970d7e4faf\") " pod="openstack/keystone-db-sync-plbtm" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.366116 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qvr99" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.371487 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99x5n\" (UniqueName: \"kubernetes.io/projected/75a27624-eac7-47c9-9f3b-98604d88fb3a-kube-api-access-99x5n\") pod \"barbican-c179-account-create-update-sst4m\" (UID: \"75a27624-eac7-47c9-9f3b-98604d88fb3a\") " pod="openstack/barbican-c179-account-create-update-sst4m" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.382228 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-abcd-account-create-update-bwsmr"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.385989 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-abcd-account-create-update-bwsmr" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.390568 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.412557 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-abcd-account-create-update-bwsmr"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.422185 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-plbtm" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.427688 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ztfs\" (UniqueName: \"kubernetes.io/projected/c4920eee-8485-4faa-892c-893c6466a90c-kube-api-access-4ztfs\") pod \"neutron-abcd-account-create-update-bwsmr\" (UID: \"c4920eee-8485-4faa-892c-893c6466a90c\") " pod="openstack/neutron-abcd-account-create-update-bwsmr" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.427751 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4920eee-8485-4faa-892c-893c6466a90c-operator-scripts\") pod \"neutron-abcd-account-create-update-bwsmr\" (UID: \"c4920eee-8485-4faa-892c-893c6466a90c\") " pod="openstack/neutron-abcd-account-create-update-bwsmr" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.427807 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnz6l\" (UniqueName: \"kubernetes.io/projected/317d32d8-9ad2-4bd1-87f4-745e3157c713-kube-api-access-rnz6l\") pod \"neutron-db-create-qqgpn\" (UID: \"317d32d8-9ad2-4bd1-87f4-745e3157c713\") " pod="openstack/neutron-db-create-qqgpn" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.427878 5094 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/317d32d8-9ad2-4bd1-87f4-745e3157c713-operator-scripts\") pod \"neutron-db-create-qqgpn\" (UID: \"317d32d8-9ad2-4bd1-87f4-745e3157c713\") " pod="openstack/neutron-db-create-qqgpn" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.428551 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/317d32d8-9ad2-4bd1-87f4-745e3157c713-operator-scripts\") pod \"neutron-db-create-qqgpn\" (UID: \"317d32d8-9ad2-4bd1-87f4-745e3157c713\") " pod="openstack/neutron-db-create-qqgpn" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.450321 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnz6l\" (UniqueName: \"kubernetes.io/projected/317d32d8-9ad2-4bd1-87f4-745e3157c713-kube-api-access-rnz6l\") pod \"neutron-db-create-qqgpn\" (UID: \"317d32d8-9ad2-4bd1-87f4-745e3157c713\") " pod="openstack/neutron-db-create-qqgpn" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.530305 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ztfs\" (UniqueName: \"kubernetes.io/projected/c4920eee-8485-4faa-892c-893c6466a90c-kube-api-access-4ztfs\") pod \"neutron-abcd-account-create-update-bwsmr\" (UID: \"c4920eee-8485-4faa-892c-893c6466a90c\") " pod="openstack/neutron-abcd-account-create-update-bwsmr" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.530420 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4920eee-8485-4faa-892c-893c6466a90c-operator-scripts\") pod \"neutron-abcd-account-create-update-bwsmr\" (UID: \"c4920eee-8485-4faa-892c-893c6466a90c\") " pod="openstack/neutron-abcd-account-create-update-bwsmr" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.531467 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4920eee-8485-4faa-892c-893c6466a90c-operator-scripts\") pod \"neutron-abcd-account-create-update-bwsmr\" (UID: \"c4920eee-8485-4faa-892c-893c6466a90c\") " pod="openstack/neutron-abcd-account-create-update-bwsmr" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.531551 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c179-account-create-update-sst4m" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.558479 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ztfs\" (UniqueName: \"kubernetes.io/projected/c4920eee-8485-4faa-892c-893c6466a90c-kube-api-access-4ztfs\") pod \"neutron-abcd-account-create-update-bwsmr\" (UID: \"c4920eee-8485-4faa-892c-893c6466a90c\") " pod="openstack/neutron-abcd-account-create-update-bwsmr" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.579737 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qqgpn" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.724164 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-abcd-account-create-update-bwsmr" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.816116 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-lvlr2-config-wmdb5"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.828256 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-lvlr2-config-wmdb5"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.857835 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a516b45-9a5d-4210-82b3-b07e7251ffad" path="/var/lib/kubelet/pods/3a516b45-9a5d-4210-82b3-b07e7251ffad/volumes" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.858640 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-7gn4d"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.947239 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-lvlr2-config-wsgcd"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.948783 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lvlr2-config-wsgcd" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.956934 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.972329 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lvlr2-config-wsgcd"] Feb 20 07:07:15 crc kubenswrapper[5094]: I0220 07:07:15.986616 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8083-account-create-update-wxrzd"] Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.044957 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb7kw\" (UniqueName: \"kubernetes.io/projected/00e11219-ecb3-45ab-8303-265d85ff4c3a-kube-api-access-jb7kw\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd" Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.045025 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00e11219-ecb3-45ab-8303-265d85ff4c3a-scripts\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd" Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.045092 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-run\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd" Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.045122 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-log-ovn\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd" Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.045177 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/00e11219-ecb3-45ab-8303-265d85ff4c3a-additional-scripts\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd" Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.045210 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-run-ovn\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd" Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.095905 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-qvr99"] Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.147558 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-run-ovn\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd" Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.147654 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb7kw\" (UniqueName: \"kubernetes.io/projected/00e11219-ecb3-45ab-8303-265d85ff4c3a-kube-api-access-jb7kw\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: 
\"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd" Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.147689 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00e11219-ecb3-45ab-8303-265d85ff4c3a-scripts\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd" Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.147749 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-run\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd" Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.147776 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-log-ovn\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd" Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.147824 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/00e11219-ecb3-45ab-8303-265d85ff4c3a-additional-scripts\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd" Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.148682 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/00e11219-ecb3-45ab-8303-265d85ff4c3a-additional-scripts\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: 
\"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd" Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.149469 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-run\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd" Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.149517 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-log-ovn\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd" Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.149555 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-run-ovn\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd" Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.150890 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00e11219-ecb3-45ab-8303-265d85ff4c3a-scripts\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " pod="openstack/ovn-controller-lvlr2-config-wsgcd" Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.172831 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb7kw\" (UniqueName: \"kubernetes.io/projected/00e11219-ecb3-45ab-8303-265d85ff4c3a-kube-api-access-jb7kw\") pod \"ovn-controller-lvlr2-config-wsgcd\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " 
pod="openstack/ovn-controller-lvlr2-config-wsgcd" Feb 20 07:07:16 crc kubenswrapper[5094]: W0220 07:07:16.205370 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod403a4371_09f4_4206_8d60_5b970d7e4faf.slice/crio-73c249f43927a9b87fab7f1f7c42f5fbb83b2f89d2a99cdbf8b2852eeafd7f98 WatchSource:0}: Error finding container 73c249f43927a9b87fab7f1f7c42f5fbb83b2f89d2a99cdbf8b2852eeafd7f98: Status 404 returned error can't find the container with id 73c249f43927a9b87fab7f1f7c42f5fbb83b2f89d2a99cdbf8b2852eeafd7f98 Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.205723 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-plbtm"] Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.239527 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-abcd-account-create-update-bwsmr"] Feb 20 07:07:16 crc kubenswrapper[5094]: W0220 07:07:16.243660 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4920eee_8485_4faa_892c_893c6466a90c.slice/crio-d53251aa996c65ff3c2062ffb8b3114ee2a3d8ca7e99e1d83319a526e535bd78 WatchSource:0}: Error finding container d53251aa996c65ff3c2062ffb8b3114ee2a3d8ca7e99e1d83319a526e535bd78: Status 404 returned error can't find the container with id d53251aa996c65ff3c2062ffb8b3114ee2a3d8ca7e99e1d83319a526e535bd78 Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.253297 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-plbtm" event={"ID":"403a4371-09f4-4206-8d60-5b970d7e4faf","Type":"ContainerStarted","Data":"73c249f43927a9b87fab7f1f7c42f5fbb83b2f89d2a99cdbf8b2852eeafd7f98"} Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.257460 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8083-account-create-update-wxrzd" 
event={"ID":"23a44809-2f91-4dbe-80ed-733390b037d8","Type":"ContainerStarted","Data":"07473375b1389dea5ef061588ec041b7a1dc6c39e615f25c439adbc0f20ff01b"} Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.259759 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qvr99" event={"ID":"5c0f5daa-28f1-412d-8749-5b11f6b8f26d","Type":"ContainerStarted","Data":"98316dffcd57fbd3d1bd86b87cf77b8f71de4c56ee7fbdaff700987546968d91"} Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.261404 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7gn4d" event={"ID":"3d59abb8-e7c7-404f-8f03-13d2167bea54","Type":"ContainerStarted","Data":"627e60d9c9677f6a945a638d36a6e55c653ad3df560faf93d76ad3fae2820334"} Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.261430 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7gn4d" event={"ID":"3d59abb8-e7c7-404f-8f03-13d2167bea54","Type":"ContainerStarted","Data":"6c0583540eb15668853e966ba4c6edb87e4fe6d9a3115b84ce82c03eaf755780"} Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.279249 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-7gn4d" podStartSLOduration=2.279225177 podStartE2EDuration="2.279225177s" podCreationTimestamp="2026-02-20 07:07:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:07:16.273078649 +0000 UTC m=+1251.145705360" watchObservedRunningTime="2026-02-20 07:07:16.279225177 +0000 UTC m=+1251.151851888" Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.287899 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lvlr2-config-wsgcd" Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.372647 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c179-account-create-update-sst4m"] Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.383920 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-qqgpn"] Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.871213 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-lvlr2" Feb 20 07:07:16 crc kubenswrapper[5094]: E0220 07:07:16.906432 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23a44809_2f91_4dbe_80ed_733390b037d8.slice/crio-conmon-652be6b8a19337e61493b053c834d7ff10c694e4819ba0dbe010139f1aba0575.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23a44809_2f91_4dbe_80ed_733390b037d8.slice/crio-652be6b8a19337e61493b053c834d7ff10c694e4819ba0dbe010139f1aba0575.scope\": RecentStats: unable to find data in memory cache]" Feb 20 07:07:16 crc kubenswrapper[5094]: I0220 07:07:16.960438 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-lvlr2-config-wsgcd"] Feb 20 07:07:17 crc kubenswrapper[5094]: W0220 07:07:17.020835 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00e11219_ecb3_45ab_8303_265d85ff4c3a.slice/crio-a83d0092220037210ddac94356a5643eedd8b0154c0d0c4581dd1f1cab42ebc4 WatchSource:0}: Error finding container a83d0092220037210ddac94356a5643eedd8b0154c0d0c4581dd1f1cab42ebc4: Status 404 returned error can't find the container with id a83d0092220037210ddac94356a5643eedd8b0154c0d0c4581dd1f1cab42ebc4 Feb 20 07:07:17 crc kubenswrapper[5094]: 
I0220 07:07:17.271103 5094 generic.go:334] "Generic (PLEG): container finished" podID="75a27624-eac7-47c9-9f3b-98604d88fb3a" containerID="378b26e1e0650ae576632665d611910465c17369e442435b9765cd97f7bbf4b7" exitCode=0 Feb 20 07:07:17 crc kubenswrapper[5094]: I0220 07:07:17.271180 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c179-account-create-update-sst4m" event={"ID":"75a27624-eac7-47c9-9f3b-98604d88fb3a","Type":"ContainerDied","Data":"378b26e1e0650ae576632665d611910465c17369e442435b9765cd97f7bbf4b7"} Feb 20 07:07:17 crc kubenswrapper[5094]: I0220 07:07:17.271214 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c179-account-create-update-sst4m" event={"ID":"75a27624-eac7-47c9-9f3b-98604d88fb3a","Type":"ContainerStarted","Data":"524ab1f80a7901ac7889c8ffcacb992647f00b7dc710ad8fede539905cd58f26"} Feb 20 07:07:17 crc kubenswrapper[5094]: I0220 07:07:17.273154 5094 generic.go:334] "Generic (PLEG): container finished" podID="3d59abb8-e7c7-404f-8f03-13d2167bea54" containerID="627e60d9c9677f6a945a638d36a6e55c653ad3df560faf93d76ad3fae2820334" exitCode=0 Feb 20 07:07:17 crc kubenswrapper[5094]: I0220 07:07:17.273195 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7gn4d" event={"ID":"3d59abb8-e7c7-404f-8f03-13d2167bea54","Type":"ContainerDied","Data":"627e60d9c9677f6a945a638d36a6e55c653ad3df560faf93d76ad3fae2820334"} Feb 20 07:07:17 crc kubenswrapper[5094]: I0220 07:07:17.274592 5094 generic.go:334] "Generic (PLEG): container finished" podID="317d32d8-9ad2-4bd1-87f4-745e3157c713" containerID="50b02908599fab0b56ac49b8dfc4de2ac6a680f5927a195974880e894fd05f07" exitCode=0 Feb 20 07:07:17 crc kubenswrapper[5094]: I0220 07:07:17.274634 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qqgpn" event={"ID":"317d32d8-9ad2-4bd1-87f4-745e3157c713","Type":"ContainerDied","Data":"50b02908599fab0b56ac49b8dfc4de2ac6a680f5927a195974880e894fd05f07"} Feb 20 
07:07:17 crc kubenswrapper[5094]: I0220 07:07:17.274650 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qqgpn" event={"ID":"317d32d8-9ad2-4bd1-87f4-745e3157c713","Type":"ContainerStarted","Data":"33c2a7ece5b891e28740e85e198816b9f0507288fb0d7eaec7b58b5561691817"} Feb 20 07:07:17 crc kubenswrapper[5094]: I0220 07:07:17.276509 5094 generic.go:334] "Generic (PLEG): container finished" podID="23a44809-2f91-4dbe-80ed-733390b037d8" containerID="652be6b8a19337e61493b053c834d7ff10c694e4819ba0dbe010139f1aba0575" exitCode=0 Feb 20 07:07:17 crc kubenswrapper[5094]: I0220 07:07:17.276550 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8083-account-create-update-wxrzd" event={"ID":"23a44809-2f91-4dbe-80ed-733390b037d8","Type":"ContainerDied","Data":"652be6b8a19337e61493b053c834d7ff10c694e4819ba0dbe010139f1aba0575"} Feb 20 07:07:17 crc kubenswrapper[5094]: I0220 07:07:17.278020 5094 generic.go:334] "Generic (PLEG): container finished" podID="c4920eee-8485-4faa-892c-893c6466a90c" containerID="484f3fb839183cec10038487f86ef12f28aad48e989d27e0f371b4836997c9c1" exitCode=0 Feb 20 07:07:17 crc kubenswrapper[5094]: I0220 07:07:17.278054 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-abcd-account-create-update-bwsmr" event={"ID":"c4920eee-8485-4faa-892c-893c6466a90c","Type":"ContainerDied","Data":"484f3fb839183cec10038487f86ef12f28aad48e989d27e0f371b4836997c9c1"} Feb 20 07:07:17 crc kubenswrapper[5094]: I0220 07:07:17.278068 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-abcd-account-create-update-bwsmr" event={"ID":"c4920eee-8485-4faa-892c-893c6466a90c","Type":"ContainerStarted","Data":"d53251aa996c65ff3c2062ffb8b3114ee2a3d8ca7e99e1d83319a526e535bd78"} Feb 20 07:07:17 crc kubenswrapper[5094]: I0220 07:07:17.281216 5094 generic.go:334] "Generic (PLEG): container finished" podID="5c0f5daa-28f1-412d-8749-5b11f6b8f26d" 
containerID="cef36671b09afd9d82ea9087a220ba378848b8caf63bdb34c5ff82372929ee6f" exitCode=0 Feb 20 07:07:17 crc kubenswrapper[5094]: I0220 07:07:17.281260 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qvr99" event={"ID":"5c0f5daa-28f1-412d-8749-5b11f6b8f26d","Type":"ContainerDied","Data":"cef36671b09afd9d82ea9087a220ba378848b8caf63bdb34c5ff82372929ee6f"} Feb 20 07:07:17 crc kubenswrapper[5094]: I0220 07:07:17.283439 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lvlr2-config-wsgcd" event={"ID":"00e11219-ecb3-45ab-8303-265d85ff4c3a","Type":"ContainerStarted","Data":"a83d0092220037210ddac94356a5643eedd8b0154c0d0c4581dd1f1cab42ebc4"} Feb 20 07:07:18 crc kubenswrapper[5094]: I0220 07:07:18.295980 5094 generic.go:334] "Generic (PLEG): container finished" podID="00e11219-ecb3-45ab-8303-265d85ff4c3a" containerID="d9a07c98406e23d72c5a2bc3d04e8964b30bc89dab757f6e64abbd3de62c1272" exitCode=0 Feb 20 07:07:18 crc kubenswrapper[5094]: I0220 07:07:18.296092 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lvlr2-config-wsgcd" event={"ID":"00e11219-ecb3-45ab-8303-265d85ff4c3a","Type":"ContainerDied","Data":"d9a07c98406e23d72c5a2bc3d04e8964b30bc89dab757f6e64abbd3de62c1272"} Feb 20 07:07:18 crc kubenswrapper[5094]: I0220 07:07:18.799902 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-768666cd57-7ddwb" Feb 20 07:07:18 crc kubenswrapper[5094]: I0220 07:07:18.852036 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-wdn2q"] Feb 20 07:07:18 crc kubenswrapper[5094]: I0220 07:07:18.852292 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" podUID="53d83e89-d39a-4ed6-ab65-02820d089bec" containerName="dnsmasq-dns" containerID="cri-o://a4bd321aef47fd2509dc250944b4a0aee8240092ee8ccb473ab0a8a33aee200d" gracePeriod=10 Feb 
20 07:07:18 crc kubenswrapper[5094]: I0220 07:07:18.906172 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" podUID="53d83e89-d39a-4ed6-ab65-02820d089bec" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: connect: connection refused" Feb 20 07:07:19 crc kubenswrapper[5094]: I0220 07:07:19.309127 5094 generic.go:334] "Generic (PLEG): container finished" podID="53d83e89-d39a-4ed6-ab65-02820d089bec" containerID="a4bd321aef47fd2509dc250944b4a0aee8240092ee8ccb473ab0a8a33aee200d" exitCode=0 Feb 20 07:07:19 crc kubenswrapper[5094]: I0220 07:07:19.309192 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" event={"ID":"53d83e89-d39a-4ed6-ab65-02820d089bec","Type":"ContainerDied","Data":"a4bd321aef47fd2509dc250944b4a0aee8240092ee8ccb473ab0a8a33aee200d"} Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.339441 5094 generic.go:334] "Generic (PLEG): container finished" podID="2538e2cc-781b-4c2a-b993-381e488fd5bb" containerID="faa3a4c7f983e78d891220a937038d927c9ef5f317cdc8774b24154cf98f3466" exitCode=0 Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.339528 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-d27ft" event={"ID":"2538e2cc-781b-4c2a-b993-381e488fd5bb","Type":"ContainerDied","Data":"faa3a4c7f983e78d891220a937038d927c9ef5f317cdc8774b24154cf98f3466"} Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.348011 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qvr99" event={"ID":"5c0f5daa-28f1-412d-8749-5b11f6b8f26d","Type":"ContainerDied","Data":"98316dffcd57fbd3d1bd86b87cf77b8f71de4c56ee7fbdaff700987546968d91"} Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.348089 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98316dffcd57fbd3d1bd86b87cf77b8f71de4c56ee7fbdaff700987546968d91" Feb 20 07:07:21 crc 
kubenswrapper[5094]: I0220 07:07:21.622762 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qvr99" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.630575 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-7gn4d" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.636326 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qqgpn" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.654110 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-abcd-account-create-update-bwsmr" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.690345 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c179-account-create-update-sst4m" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.702251 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c0f5daa-28f1-412d-8749-5b11f6b8f26d-operator-scripts\") pod \"5c0f5daa-28f1-412d-8749-5b11f6b8f26d\" (UID: \"5c0f5daa-28f1-412d-8749-5b11f6b8f26d\") " Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.702313 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4920eee-8485-4faa-892c-893c6466a90c-operator-scripts\") pod \"c4920eee-8485-4faa-892c-893c6466a90c\" (UID: \"c4920eee-8485-4faa-892c-893c6466a90c\") " Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.702384 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ztfs\" (UniqueName: \"kubernetes.io/projected/c4920eee-8485-4faa-892c-893c6466a90c-kube-api-access-4ztfs\") pod \"c4920eee-8485-4faa-892c-893c6466a90c\" (UID: 
\"c4920eee-8485-4faa-892c-893c6466a90c\") " Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.702422 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnz6l\" (UniqueName: \"kubernetes.io/projected/317d32d8-9ad2-4bd1-87f4-745e3157c713-kube-api-access-rnz6l\") pod \"317d32d8-9ad2-4bd1-87f4-745e3157c713\" (UID: \"317d32d8-9ad2-4bd1-87f4-745e3157c713\") " Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.702444 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/317d32d8-9ad2-4bd1-87f4-745e3157c713-operator-scripts\") pod \"317d32d8-9ad2-4bd1-87f4-745e3157c713\" (UID: \"317d32d8-9ad2-4bd1-87f4-745e3157c713\") " Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.702489 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxrcx\" (UniqueName: \"kubernetes.io/projected/5c0f5daa-28f1-412d-8749-5b11f6b8f26d-kube-api-access-zxrcx\") pod \"5c0f5daa-28f1-412d-8749-5b11f6b8f26d\" (UID: \"5c0f5daa-28f1-412d-8749-5b11f6b8f26d\") " Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.702556 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d59abb8-e7c7-404f-8f03-13d2167bea54-operator-scripts\") pod \"3d59abb8-e7c7-404f-8f03-13d2167bea54\" (UID: \"3d59abb8-e7c7-404f-8f03-13d2167bea54\") " Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.702684 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvlpt\" (UniqueName: \"kubernetes.io/projected/3d59abb8-e7c7-404f-8f03-13d2167bea54-kube-api-access-gvlpt\") pod \"3d59abb8-e7c7-404f-8f03-13d2167bea54\" (UID: \"3d59abb8-e7c7-404f-8f03-13d2167bea54\") " Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.703643 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/c4920eee-8485-4faa-892c-893c6466a90c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c4920eee-8485-4faa-892c-893c6466a90c" (UID: "c4920eee-8485-4faa-892c-893c6466a90c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.703966 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c0f5daa-28f1-412d-8749-5b11f6b8f26d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5c0f5daa-28f1-412d-8749-5b11f6b8f26d" (UID: "5c0f5daa-28f1-412d-8749-5b11f6b8f26d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.704093 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4920eee-8485-4faa-892c-893c6466a90c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.704546 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/317d32d8-9ad2-4bd1-87f4-745e3157c713-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "317d32d8-9ad2-4bd1-87f4-745e3157c713" (UID: "317d32d8-9ad2-4bd1-87f4-745e3157c713"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.704673 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d59abb8-e7c7-404f-8f03-13d2167bea54-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3d59abb8-e7c7-404f-8f03-13d2167bea54" (UID: "3d59abb8-e7c7-404f-8f03-13d2167bea54"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.709912 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4920eee-8485-4faa-892c-893c6466a90c-kube-api-access-4ztfs" (OuterVolumeSpecName: "kube-api-access-4ztfs") pod "c4920eee-8485-4faa-892c-893c6466a90c" (UID: "c4920eee-8485-4faa-892c-893c6466a90c"). InnerVolumeSpecName "kube-api-access-4ztfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.709953 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d59abb8-e7c7-404f-8f03-13d2167bea54-kube-api-access-gvlpt" (OuterVolumeSpecName: "kube-api-access-gvlpt") pod "3d59abb8-e7c7-404f-8f03-13d2167bea54" (UID: "3d59abb8-e7c7-404f-8f03-13d2167bea54"). InnerVolumeSpecName "kube-api-access-gvlpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.711184 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8083-account-create-update-wxrzd" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.712346 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/317d32d8-9ad2-4bd1-87f4-745e3157c713-kube-api-access-rnz6l" (OuterVolumeSpecName: "kube-api-access-rnz6l") pod "317d32d8-9ad2-4bd1-87f4-745e3157c713" (UID: "317d32d8-9ad2-4bd1-87f4-745e3157c713"). InnerVolumeSpecName "kube-api-access-rnz6l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.722131 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c0f5daa-28f1-412d-8749-5b11f6b8f26d-kube-api-access-zxrcx" (OuterVolumeSpecName: "kube-api-access-zxrcx") pod "5c0f5daa-28f1-412d-8749-5b11f6b8f26d" (UID: "5c0f5daa-28f1-412d-8749-5b11f6b8f26d"). InnerVolumeSpecName "kube-api-access-zxrcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.735068 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lvlr2-config-wsgcd" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.740361 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.804952 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-config\") pod \"53d83e89-d39a-4ed6-ab65-02820d089bec\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805001 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-run\") pod \"00e11219-ecb3-45ab-8303-265d85ff4c3a\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805077 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99x5n\" (UniqueName: \"kubernetes.io/projected/75a27624-eac7-47c9-9f3b-98604d88fb3a-kube-api-access-99x5n\") pod \"75a27624-eac7-47c9-9f3b-98604d88fb3a\" (UID: \"75a27624-eac7-47c9-9f3b-98604d88fb3a\") " Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 
07:07:21.805144 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-ovsdbserver-nb\") pod \"53d83e89-d39a-4ed6-ab65-02820d089bec\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805171 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-dns-svc\") pod \"53d83e89-d39a-4ed6-ab65-02820d089bec\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805225 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb7kw\" (UniqueName: \"kubernetes.io/projected/00e11219-ecb3-45ab-8303-265d85ff4c3a-kube-api-access-jb7kw\") pod \"00e11219-ecb3-45ab-8303-265d85ff4c3a\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805259 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzbc9\" (UniqueName: \"kubernetes.io/projected/53d83e89-d39a-4ed6-ab65-02820d089bec-kube-api-access-mzbc9\") pod \"53d83e89-d39a-4ed6-ab65-02820d089bec\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805288 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-ovsdbserver-sb\") pod \"53d83e89-d39a-4ed6-ab65-02820d089bec\" (UID: \"53d83e89-d39a-4ed6-ab65-02820d089bec\") " Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805288 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-run" (OuterVolumeSpecName: "var-run") pod 
"00e11219-ecb3-45ab-8303-265d85ff4c3a" (UID: "00e11219-ecb3-45ab-8303-265d85ff4c3a"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805334 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/00e11219-ecb3-45ab-8303-265d85ff4c3a-additional-scripts\") pod \"00e11219-ecb3-45ab-8303-265d85ff4c3a\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805404 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrtl2\" (UniqueName: \"kubernetes.io/projected/23a44809-2f91-4dbe-80ed-733390b037d8-kube-api-access-vrtl2\") pod \"23a44809-2f91-4dbe-80ed-733390b037d8\" (UID: \"23a44809-2f91-4dbe-80ed-733390b037d8\") " Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805452 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23a44809-2f91-4dbe-80ed-733390b037d8-operator-scripts\") pod \"23a44809-2f91-4dbe-80ed-733390b037d8\" (UID: \"23a44809-2f91-4dbe-80ed-733390b037d8\") " Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805480 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00e11219-ecb3-45ab-8303-265d85ff4c3a-scripts\") pod \"00e11219-ecb3-45ab-8303-265d85ff4c3a\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805497 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-log-ovn\") pod \"00e11219-ecb3-45ab-8303-265d85ff4c3a\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 
07:07:21.805516 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75a27624-eac7-47c9-9f3b-98604d88fb3a-operator-scripts\") pod \"75a27624-eac7-47c9-9f3b-98604d88fb3a\" (UID: \"75a27624-eac7-47c9-9f3b-98604d88fb3a\") " Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805548 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-run-ovn\") pod \"00e11219-ecb3-45ab-8303-265d85ff4c3a\" (UID: \"00e11219-ecb3-45ab-8303-265d85ff4c3a\") " Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805850 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "00e11219-ecb3-45ab-8303-265d85ff4c3a" (UID: "00e11219-ecb3-45ab-8303-265d85ff4c3a"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805927 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ztfs\" (UniqueName: \"kubernetes.io/projected/c4920eee-8485-4faa-892c-893c6466a90c-kube-api-access-4ztfs\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805941 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnz6l\" (UniqueName: \"kubernetes.io/projected/317d32d8-9ad2-4bd1-87f4-745e3157c713-kube-api-access-rnz6l\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805951 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/317d32d8-9ad2-4bd1-87f4-745e3157c713-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805963 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxrcx\" (UniqueName: \"kubernetes.io/projected/5c0f5daa-28f1-412d-8749-5b11f6b8f26d-kube-api-access-zxrcx\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805973 5094 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-run\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805985 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d59abb8-e7c7-404f-8f03-13d2167bea54-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.805994 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvlpt\" (UniqueName: \"kubernetes.io/projected/3d59abb8-e7c7-404f-8f03-13d2167bea54-kube-api-access-gvlpt\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 
crc kubenswrapper[5094]: I0220 07:07:21.806004 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5c0f5daa-28f1-412d-8749-5b11f6b8f26d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.806846 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23a44809-2f91-4dbe-80ed-733390b037d8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "23a44809-2f91-4dbe-80ed-733390b037d8" (UID: "23a44809-2f91-4dbe-80ed-733390b037d8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.806913 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "00e11219-ecb3-45ab-8303-265d85ff4c3a" (UID: "00e11219-ecb3-45ab-8303-265d85ff4c3a"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.806999 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00e11219-ecb3-45ab-8303-265d85ff4c3a-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "00e11219-ecb3-45ab-8303-265d85ff4c3a" (UID: "00e11219-ecb3-45ab-8303-265d85ff4c3a"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.807825 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00e11219-ecb3-45ab-8303-265d85ff4c3a-scripts" (OuterVolumeSpecName: "scripts") pod "00e11219-ecb3-45ab-8303-265d85ff4c3a" (UID: "00e11219-ecb3-45ab-8303-265d85ff4c3a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.808408 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75a27624-eac7-47c9-9f3b-98604d88fb3a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "75a27624-eac7-47c9-9f3b-98604d88fb3a" (UID: "75a27624-eac7-47c9-9f3b-98604d88fb3a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.810713 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23a44809-2f91-4dbe-80ed-733390b037d8-kube-api-access-vrtl2" (OuterVolumeSpecName: "kube-api-access-vrtl2") pod "23a44809-2f91-4dbe-80ed-733390b037d8" (UID: "23a44809-2f91-4dbe-80ed-733390b037d8"). InnerVolumeSpecName "kube-api-access-vrtl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.813535 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53d83e89-d39a-4ed6-ab65-02820d089bec-kube-api-access-mzbc9" (OuterVolumeSpecName: "kube-api-access-mzbc9") pod "53d83e89-d39a-4ed6-ab65-02820d089bec" (UID: "53d83e89-d39a-4ed6-ab65-02820d089bec"). InnerVolumeSpecName "kube-api-access-mzbc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.813768 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75a27624-eac7-47c9-9f3b-98604d88fb3a-kube-api-access-99x5n" (OuterVolumeSpecName: "kube-api-access-99x5n") pod "75a27624-eac7-47c9-9f3b-98604d88fb3a" (UID: "75a27624-eac7-47c9-9f3b-98604d88fb3a"). InnerVolumeSpecName "kube-api-access-99x5n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.816523 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00e11219-ecb3-45ab-8303-265d85ff4c3a-kube-api-access-jb7kw" (OuterVolumeSpecName: "kube-api-access-jb7kw") pod "00e11219-ecb3-45ab-8303-265d85ff4c3a" (UID: "00e11219-ecb3-45ab-8303-265d85ff4c3a"). InnerVolumeSpecName "kube-api-access-jb7kw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.858475 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "53d83e89-d39a-4ed6-ab65-02820d089bec" (UID: "53d83e89-d39a-4ed6-ab65-02820d089bec"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.859683 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "53d83e89-d39a-4ed6-ab65-02820d089bec" (UID: "53d83e89-d39a-4ed6-ab65-02820d089bec"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.884154 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-config" (OuterVolumeSpecName: "config") pod "53d83e89-d39a-4ed6-ab65-02820d089bec" (UID: "53d83e89-d39a-4ed6-ab65-02820d089bec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.890117 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "53d83e89-d39a-4ed6-ab65-02820d089bec" (UID: "53d83e89-d39a-4ed6-ab65-02820d089bec"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.908663 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.908693 5094 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/00e11219-ecb3-45ab-8303-265d85ff4c3a-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.908728 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrtl2\" (UniqueName: \"kubernetes.io/projected/23a44809-2f91-4dbe-80ed-733390b037d8-kube-api-access-vrtl2\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.908745 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23a44809-2f91-4dbe-80ed-733390b037d8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.908759 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00e11219-ecb3-45ab-8303-265d85ff4c3a-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.908775 5094 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.908790 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75a27624-eac7-47c9-9f3b-98604d88fb3a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.908806 5094 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/00e11219-ecb3-45ab-8303-265d85ff4c3a-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.908819 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.908831 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99x5n\" (UniqueName: \"kubernetes.io/projected/75a27624-eac7-47c9-9f3b-98604d88fb3a-kube-api-access-99x5n\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.908844 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.908857 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53d83e89-d39a-4ed6-ab65-02820d089bec-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 crc kubenswrapper[5094]: I0220 07:07:21.908870 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb7kw\" (UniqueName: \"kubernetes.io/projected/00e11219-ecb3-45ab-8303-265d85ff4c3a-kube-api-access-jb7kw\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:21 
crc kubenswrapper[5094]: I0220 07:07:21.908889 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzbc9\" (UniqueName: \"kubernetes.io/projected/53d83e89-d39a-4ed6-ab65-02820d089bec-kube-api-access-mzbc9\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.386770 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-plbtm" event={"ID":"403a4371-09f4-4206-8d60-5b970d7e4faf","Type":"ContainerStarted","Data":"f18af370a6085a561175edbcf861dea18a9b31989555b80d968ac0b83530959d"} Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.392026 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8083-account-create-update-wxrzd" event={"ID":"23a44809-2f91-4dbe-80ed-733390b037d8","Type":"ContainerDied","Data":"07473375b1389dea5ef061588ec041b7a1dc6c39e615f25c439adbc0f20ff01b"} Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.392092 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07473375b1389dea5ef061588ec041b7a1dc6c39e615f25c439adbc0f20ff01b" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.395871 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8083-account-create-update-wxrzd" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.415193 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-abcd-account-create-update-bwsmr" event={"ID":"c4920eee-8485-4faa-892c-893c6466a90c","Type":"ContainerDied","Data":"d53251aa996c65ff3c2062ffb8b3114ee2a3d8ca7e99e1d83319a526e535bd78"} Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.415262 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d53251aa996c65ff3c2062ffb8b3114ee2a3d8ca7e99e1d83319a526e535bd78" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.415413 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-abcd-account-create-update-bwsmr" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.418377 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-plbtm" podStartSLOduration=2.179566529 podStartE2EDuration="7.418350324s" podCreationTimestamp="2026-02-20 07:07:15 +0000 UTC" firstStartedPulling="2026-02-20 07:07:16.208940953 +0000 UTC m=+1251.081567664" lastFinishedPulling="2026-02-20 07:07:21.447724758 +0000 UTC m=+1256.320351459" observedRunningTime="2026-02-20 07:07:22.415917016 +0000 UTC m=+1257.288543737" watchObservedRunningTime="2026-02-20 07:07:22.418350324 +0000 UTC m=+1257.290977045" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.424191 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" event={"ID":"53d83e89-d39a-4ed6-ab65-02820d089bec","Type":"ContainerDied","Data":"4f29f1584725a78d33c79c69353a3c195206f10f8b9ed911fdd59866eb9d81be"} Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.424226 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-689df5d84f-wdn2q" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.424298 5094 scope.go:117] "RemoveContainer" containerID="a4bd321aef47fd2509dc250944b4a0aee8240092ee8ccb473ab0a8a33aee200d" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.438756 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lvlr2-config-wsgcd" event={"ID":"00e11219-ecb3-45ab-8303-265d85ff4c3a","Type":"ContainerDied","Data":"a83d0092220037210ddac94356a5643eedd8b0154c0d0c4581dd1f1cab42ebc4"} Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.438799 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a83d0092220037210ddac94356a5643eedd8b0154c0d0c4581dd1f1cab42ebc4" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.438888 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-lvlr2-config-wsgcd" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.441330 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c179-account-create-update-sst4m" event={"ID":"75a27624-eac7-47c9-9f3b-98604d88fb3a","Type":"ContainerDied","Data":"524ab1f80a7901ac7889c8ffcacb992647f00b7dc710ad8fede539905cd58f26"} Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.441350 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="524ab1f80a7901ac7889c8ffcacb992647f00b7dc710ad8fede539905cd58f26" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.441395 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-c179-account-create-update-sst4m" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.444134 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-7gn4d" event={"ID":"3d59abb8-e7c7-404f-8f03-13d2167bea54","Type":"ContainerDied","Data":"6c0583540eb15668853e966ba4c6edb87e4fe6d9a3115b84ce82c03eaf755780"} Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.444159 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c0583540eb15668853e966ba4c6edb87e4fe6d9a3115b84ce82c03eaf755780" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.444302 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-7gn4d" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.447830 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-qqgpn" event={"ID":"317d32d8-9ad2-4bd1-87f4-745e3157c713","Type":"ContainerDied","Data":"33c2a7ece5b891e28740e85e198816b9f0507288fb0d7eaec7b58b5561691817"} Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.447974 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33c2a7ece5b891e28740e85e198816b9f0507288fb0d7eaec7b58b5561691817" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.447940 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-qqgpn" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.447878 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-qvr99" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.470980 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-wdn2q"] Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.478462 5094 scope.go:117] "RemoveContainer" containerID="d125587006c31c65f1eeb83ce252e5afe7d019516fe47b88f48976b518f4ec0b" Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.479236 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-689df5d84f-wdn2q"] Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.826110 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-lvlr2-config-wsgcd"] Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.842237 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-lvlr2-config-wsgcd"] Feb 20 07:07:22 crc kubenswrapper[5094]: I0220 07:07:22.935124 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-d27ft" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.036642 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-combined-ca-bundle\") pod \"2538e2cc-781b-4c2a-b993-381e488fd5bb\" (UID: \"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.036862 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2sn5\" (UniqueName: \"kubernetes.io/projected/2538e2cc-781b-4c2a-b993-381e488fd5bb-kube-api-access-f2sn5\") pod \"2538e2cc-781b-4c2a-b993-381e488fd5bb\" (UID: \"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.037147 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-config-data\") pod \"2538e2cc-781b-4c2a-b993-381e488fd5bb\" (UID: \"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.037267 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-db-sync-config-data\") pod \"2538e2cc-781b-4c2a-b993-381e488fd5bb\" (UID: \"2538e2cc-781b-4c2a-b993-381e488fd5bb\") " Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.050947 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2538e2cc-781b-4c2a-b993-381e488fd5bb" (UID: "2538e2cc-781b-4c2a-b993-381e488fd5bb"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.051061 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2538e2cc-781b-4c2a-b993-381e488fd5bb-kube-api-access-f2sn5" (OuterVolumeSpecName: "kube-api-access-f2sn5") pod "2538e2cc-781b-4c2a-b993-381e488fd5bb" (UID: "2538e2cc-781b-4c2a-b993-381e488fd5bb"). InnerVolumeSpecName "kube-api-access-f2sn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.069007 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2538e2cc-781b-4c2a-b993-381e488fd5bb" (UID: "2538e2cc-781b-4c2a-b993-381e488fd5bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.095183 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-config-data" (OuterVolumeSpecName: "config-data") pod "2538e2cc-781b-4c2a-b993-381e488fd5bb" (UID: "2538e2cc-781b-4c2a-b993-381e488fd5bb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.140595 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.140632 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2sn5\" (UniqueName: \"kubernetes.io/projected/2538e2cc-781b-4c2a-b993-381e488fd5bb-kube-api-access-f2sn5\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.140646 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.140661 5094 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2538e2cc-781b-4c2a-b993-381e488fd5bb-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.460819 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-d27ft" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.460895 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-d27ft" event={"ID":"2538e2cc-781b-4c2a-b993-381e488fd5bb","Type":"ContainerDied","Data":"178c70ca4808580e5184d1f4d0b6c895bc6f3ba07b300a63f8cb105cddd95d66"} Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.460974 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="178c70ca4808580e5184d1f4d0b6c895bc6f3ba07b300a63f8cb105cddd95d66" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.859854 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00e11219-ecb3-45ab-8303-265d85ff4c3a" path="/var/lib/kubelet/pods/00e11219-ecb3-45ab-8303-265d85ff4c3a/volumes" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.861724 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53d83e89-d39a-4ed6-ab65-02820d089bec" path="/var/lib/kubelet/pods/53d83e89-d39a-4ed6-ab65-02820d089bec/volumes" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.904959 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68677f88c9-kpbpx"] Feb 20 07:07:23 crc kubenswrapper[5094]: E0220 07:07:23.905345 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c0f5daa-28f1-412d-8749-5b11f6b8f26d" containerName="mariadb-database-create" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905368 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c0f5daa-28f1-412d-8749-5b11f6b8f26d" containerName="mariadb-database-create" Feb 20 07:07:23 crc kubenswrapper[5094]: E0220 07:07:23.905382 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75a27624-eac7-47c9-9f3b-98604d88fb3a" containerName="mariadb-account-create-update" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905392 5094 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="75a27624-eac7-47c9-9f3b-98604d88fb3a" containerName="mariadb-account-create-update" Feb 20 07:07:23 crc kubenswrapper[5094]: E0220 07:07:23.905402 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4920eee-8485-4faa-892c-893c6466a90c" containerName="mariadb-account-create-update" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905408 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4920eee-8485-4faa-892c-893c6466a90c" containerName="mariadb-account-create-update" Feb 20 07:07:23 crc kubenswrapper[5094]: E0220 07:07:23.905431 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="317d32d8-9ad2-4bd1-87f4-745e3157c713" containerName="mariadb-database-create" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905437 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="317d32d8-9ad2-4bd1-87f4-745e3157c713" containerName="mariadb-database-create" Feb 20 07:07:23 crc kubenswrapper[5094]: E0220 07:07:23.905451 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53d83e89-d39a-4ed6-ab65-02820d089bec" containerName="init" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905458 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="53d83e89-d39a-4ed6-ab65-02820d089bec" containerName="init" Feb 20 07:07:23 crc kubenswrapper[5094]: E0220 07:07:23.905470 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00e11219-ecb3-45ab-8303-265d85ff4c3a" containerName="ovn-config" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905477 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="00e11219-ecb3-45ab-8303-265d85ff4c3a" containerName="ovn-config" Feb 20 07:07:23 crc kubenswrapper[5094]: E0220 07:07:23.905486 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53d83e89-d39a-4ed6-ab65-02820d089bec" containerName="dnsmasq-dns" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905493 5094 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="53d83e89-d39a-4ed6-ab65-02820d089bec" containerName="dnsmasq-dns" Feb 20 07:07:23 crc kubenswrapper[5094]: E0220 07:07:23.905508 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d59abb8-e7c7-404f-8f03-13d2167bea54" containerName="mariadb-database-create" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905514 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d59abb8-e7c7-404f-8f03-13d2167bea54" containerName="mariadb-database-create" Feb 20 07:07:23 crc kubenswrapper[5094]: E0220 07:07:23.905524 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23a44809-2f91-4dbe-80ed-733390b037d8" containerName="mariadb-account-create-update" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905530 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="23a44809-2f91-4dbe-80ed-733390b037d8" containerName="mariadb-account-create-update" Feb 20 07:07:23 crc kubenswrapper[5094]: E0220 07:07:23.905538 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2538e2cc-781b-4c2a-b993-381e488fd5bb" containerName="glance-db-sync" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905544 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="2538e2cc-781b-4c2a-b993-381e488fd5bb" containerName="glance-db-sync" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905687 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c0f5daa-28f1-412d-8749-5b11f6b8f26d" containerName="mariadb-database-create" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905715 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4920eee-8485-4faa-892c-893c6466a90c" containerName="mariadb-account-create-update" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905727 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="75a27624-eac7-47c9-9f3b-98604d88fb3a" containerName="mariadb-account-create-update" Feb 20 07:07:23 crc 
kubenswrapper[5094]: I0220 07:07:23.905735 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="53d83e89-d39a-4ed6-ab65-02820d089bec" containerName="dnsmasq-dns" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905744 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="2538e2cc-781b-4c2a-b993-381e488fd5bb" containerName="glance-db-sync" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905752 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d59abb8-e7c7-404f-8f03-13d2167bea54" containerName="mariadb-database-create" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905760 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="00e11219-ecb3-45ab-8303-265d85ff4c3a" containerName="ovn-config" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905773 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="317d32d8-9ad2-4bd1-87f4-745e3157c713" containerName="mariadb-database-create" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.905786 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="23a44809-2f91-4dbe-80ed-733390b037d8" containerName="mariadb-account-create-update" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.906687 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.925263 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68677f88c9-kpbpx"] Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.960956 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-dns-swift-storage-0\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.961052 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl684\" (UniqueName: \"kubernetes.io/projected/887446b0-f238-4ff4-82dd-a903299a0105-kube-api-access-vl684\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.961108 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-ovsdbserver-sb\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.961161 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-config\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.961184 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-ovsdbserver-nb\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:23 crc kubenswrapper[5094]: I0220 07:07:23.961205 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-dns-svc\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:24 crc kubenswrapper[5094]: I0220 07:07:24.063098 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl684\" (UniqueName: \"kubernetes.io/projected/887446b0-f238-4ff4-82dd-a903299a0105-kube-api-access-vl684\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:24 crc kubenswrapper[5094]: I0220 07:07:24.063176 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-ovsdbserver-sb\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:24 crc kubenswrapper[5094]: I0220 07:07:24.063227 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-config\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:24 crc kubenswrapper[5094]: I0220 07:07:24.063248 5094 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-ovsdbserver-nb\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:24 crc kubenswrapper[5094]: I0220 07:07:24.063275 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-dns-svc\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:24 crc kubenswrapper[5094]: I0220 07:07:24.063319 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-dns-swift-storage-0\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:24 crc kubenswrapper[5094]: I0220 07:07:24.064519 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-dns-swift-storage-0\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:24 crc kubenswrapper[5094]: I0220 07:07:24.064677 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-ovsdbserver-nb\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:24 crc kubenswrapper[5094]: I0220 07:07:24.064929 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-config\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:24 crc kubenswrapper[5094]: I0220 07:07:24.065098 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-dns-svc\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:24 crc kubenswrapper[5094]: I0220 07:07:24.065106 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-ovsdbserver-sb\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:24 crc kubenswrapper[5094]: I0220 07:07:24.093159 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl684\" (UniqueName: \"kubernetes.io/projected/887446b0-f238-4ff4-82dd-a903299a0105-kube-api-access-vl684\") pod \"dnsmasq-dns-68677f88c9-kpbpx\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:24 crc kubenswrapper[5094]: I0220 07:07:24.227055 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:24 crc kubenswrapper[5094]: I0220 07:07:24.736378 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68677f88c9-kpbpx"] Feb 20 07:07:24 crc kubenswrapper[5094]: W0220 07:07:24.747886 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod887446b0_f238_4ff4_82dd_a903299a0105.slice/crio-d86a37ef604c4bf81c2fdfca35e77934bf681730a45dd0766d8017147ff2766f WatchSource:0}: Error finding container d86a37ef604c4bf81c2fdfca35e77934bf681730a45dd0766d8017147ff2766f: Status 404 returned error can't find the container with id d86a37ef604c4bf81c2fdfca35e77934bf681730a45dd0766d8017147ff2766f Feb 20 07:07:25 crc kubenswrapper[5094]: I0220 07:07:25.484164 5094 generic.go:334] "Generic (PLEG): container finished" podID="887446b0-f238-4ff4-82dd-a903299a0105" containerID="febef95577b2b18c2f7a29013235d1029fd43ecb02762ac3e41f6e76448932f2" exitCode=0 Feb 20 07:07:25 crc kubenswrapper[5094]: I0220 07:07:25.484315 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" event={"ID":"887446b0-f238-4ff4-82dd-a903299a0105","Type":"ContainerDied","Data":"febef95577b2b18c2f7a29013235d1029fd43ecb02762ac3e41f6e76448932f2"} Feb 20 07:07:25 crc kubenswrapper[5094]: I0220 07:07:25.484794 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" event={"ID":"887446b0-f238-4ff4-82dd-a903299a0105","Type":"ContainerStarted","Data":"d86a37ef604c4bf81c2fdfca35e77934bf681730a45dd0766d8017147ff2766f"} Feb 20 07:07:25 crc kubenswrapper[5094]: I0220 07:07:25.487238 5094 generic.go:334] "Generic (PLEG): container finished" podID="403a4371-09f4-4206-8d60-5b970d7e4faf" containerID="f18af370a6085a561175edbcf861dea18a9b31989555b80d968ac0b83530959d" exitCode=0 Feb 20 07:07:25 crc kubenswrapper[5094]: I0220 07:07:25.487285 5094 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-plbtm" event={"ID":"403a4371-09f4-4206-8d60-5b970d7e4faf","Type":"ContainerDied","Data":"f18af370a6085a561175edbcf861dea18a9b31989555b80d968ac0b83530959d"} Feb 20 07:07:26 crc kubenswrapper[5094]: I0220 07:07:26.499303 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" event={"ID":"887446b0-f238-4ff4-82dd-a903299a0105","Type":"ContainerStarted","Data":"60d68fcdef2499d06a7eba5248abfa54f3596133e4fe35ebcfc6def68e018ca6"} Feb 20 07:07:26 crc kubenswrapper[5094]: I0220 07:07:26.537582 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" podStartSLOduration=3.537546655 podStartE2EDuration="3.537546655s" podCreationTimestamp="2026-02-20 07:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:07:26.525665641 +0000 UTC m=+1261.398292352" watchObservedRunningTime="2026-02-20 07:07:26.537546655 +0000 UTC m=+1261.410173366" Feb 20 07:07:26 crc kubenswrapper[5094]: I0220 07:07:26.879065 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-plbtm" Feb 20 07:07:26 crc kubenswrapper[5094]: I0220 07:07:26.923380 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d67lh\" (UniqueName: \"kubernetes.io/projected/403a4371-09f4-4206-8d60-5b970d7e4faf-kube-api-access-d67lh\") pod \"403a4371-09f4-4206-8d60-5b970d7e4faf\" (UID: \"403a4371-09f4-4206-8d60-5b970d7e4faf\") " Feb 20 07:07:26 crc kubenswrapper[5094]: I0220 07:07:26.923583 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/403a4371-09f4-4206-8d60-5b970d7e4faf-combined-ca-bundle\") pod \"403a4371-09f4-4206-8d60-5b970d7e4faf\" (UID: \"403a4371-09f4-4206-8d60-5b970d7e4faf\") " Feb 20 07:07:26 crc kubenswrapper[5094]: I0220 07:07:26.923825 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/403a4371-09f4-4206-8d60-5b970d7e4faf-config-data\") pod \"403a4371-09f4-4206-8d60-5b970d7e4faf\" (UID: \"403a4371-09f4-4206-8d60-5b970d7e4faf\") " Feb 20 07:07:26 crc kubenswrapper[5094]: I0220 07:07:26.931116 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/403a4371-09f4-4206-8d60-5b970d7e4faf-kube-api-access-d67lh" (OuterVolumeSpecName: "kube-api-access-d67lh") pod "403a4371-09f4-4206-8d60-5b970d7e4faf" (UID: "403a4371-09f4-4206-8d60-5b970d7e4faf"). InnerVolumeSpecName "kube-api-access-d67lh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:07:26 crc kubenswrapper[5094]: I0220 07:07:26.977039 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/403a4371-09f4-4206-8d60-5b970d7e4faf-config-data" (OuterVolumeSpecName: "config-data") pod "403a4371-09f4-4206-8d60-5b970d7e4faf" (UID: "403a4371-09f4-4206-8d60-5b970d7e4faf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:07:26 crc kubenswrapper[5094]: I0220 07:07:26.984675 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/403a4371-09f4-4206-8d60-5b970d7e4faf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "403a4371-09f4-4206-8d60-5b970d7e4faf" (UID: "403a4371-09f4-4206-8d60-5b970d7e4faf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.031558 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d67lh\" (UniqueName: \"kubernetes.io/projected/403a4371-09f4-4206-8d60-5b970d7e4faf-kube-api-access-d67lh\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.031605 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/403a4371-09f4-4206-8d60-5b970d7e4faf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.031616 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/403a4371-09f4-4206-8d60-5b970d7e4faf-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.513580 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-plbtm" event={"ID":"403a4371-09f4-4206-8d60-5b970d7e4faf","Type":"ContainerDied","Data":"73c249f43927a9b87fab7f1f7c42f5fbb83b2f89d2a99cdbf8b2852eeafd7f98"} Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.514220 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73c249f43927a9b87fab7f1f7c42f5fbb83b2f89d2a99cdbf8b2852eeafd7f98" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.514277 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.513662 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-plbtm" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.858430 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68677f88c9-kpbpx"] Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.905967 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d67cdfc8f-nngtz"] Feb 20 07:07:27 crc kubenswrapper[5094]: E0220 07:07:27.906405 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="403a4371-09f4-4206-8d60-5b970d7e4faf" containerName="keystone-db-sync" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.906425 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="403a4371-09f4-4206-8d60-5b970d7e4faf" containerName="keystone-db-sync" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.906623 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="403a4371-09f4-4206-8d60-5b970d7e4faf" containerName="keystone-db-sync" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.907525 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.952636 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-b78rr"] Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.953885 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.954438 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-dns-svc\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.954584 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-ovsdbserver-sb\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.954679 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-config\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.954825 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-ovsdbserver-nb\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.954963 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-dns-swift-storage-0\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: 
\"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.955036 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5mfl\" (UniqueName: \"kubernetes.io/projected/e1a92386-f07a-4845-9a5d-231a4c498d3f-kube-api-access-j5mfl\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.964887 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qgvjt" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.965110 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.965257 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.965419 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.971173 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 20 07:07:27 crc kubenswrapper[5094]: I0220 07:07:27.979466 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d67cdfc8f-nngtz"] Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.000628 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-b78rr"] Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.057065 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-scripts\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " 
pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.057349 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-config-data\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.057614 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-dns-swift-storage-0\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.057677 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5mfl\" (UniqueName: \"kubernetes.io/projected/e1a92386-f07a-4845-9a5d-231a4c498d3f-kube-api-access-j5mfl\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.057842 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-combined-ca-bundle\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.058176 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-dns-svc\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " 
pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.058241 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-fernet-keys\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.058274 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-ovsdbserver-sb\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.058334 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-config\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.058355 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25g6b\" (UniqueName: \"kubernetes.io/projected/d63a9457-c57a-4979-bd28-ee982250b13c-kube-api-access-25g6b\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.058673 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-ovsdbserver-nb\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " 
pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.058742 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-credential-keys\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.058939 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-dns-swift-storage-0\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.059728 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-ovsdbserver-sb\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.060309 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-dns-svc\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.060930 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-ovsdbserver-nb\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:28 crc 
kubenswrapper[5094]: I0220 07:07:28.061119 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-config\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.114678 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5mfl\" (UniqueName: \"kubernetes.io/projected/e1a92386-f07a-4845-9a5d-231a4c498d3f-kube-api-access-j5mfl\") pod \"dnsmasq-dns-7d67cdfc8f-nngtz\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.161325 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-scripts\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.161376 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-config-data\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.161427 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-combined-ca-bundle\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.161471 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-fernet-keys\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.161498 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25g6b\" (UniqueName: \"kubernetes.io/projected/d63a9457-c57a-4979-bd28-ee982250b13c-kube-api-access-25g6b\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.161546 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-credential-keys\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.195219 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-combined-ca-bundle\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.203627 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-credential-keys\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.207000 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25g6b\" (UniqueName: 
\"kubernetes.io/projected/d63a9457-c57a-4979-bd28-ee982250b13c-kube-api-access-25g6b\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.220524 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-scripts\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.221175 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-fernet-keys\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.222840 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-config-data\") pod \"keystone-bootstrap-b78rr\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.233314 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.248057 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-t7hr7"] Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.249434 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-t7hr7" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.257273 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.257536 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.257662 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-f4gh4" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.283622 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-dnm22"] Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.285031 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-dnm22" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.305722 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.310488 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.311985 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-zk895" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.312146 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.333258 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-dnm22"] Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.371695 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf24d\" (UniqueName: \"kubernetes.io/projected/15583b83-ce22-4b0b-9566-0e056b07c0d7-kube-api-access-qf24d\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.371768 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncgbz\" (UniqueName: \"kubernetes.io/projected/ffc4926a-ede6-4124-ac91-c9912ffa8a23-kube-api-access-ncgbz\") pod \"neutron-db-sync-dnm22\" (UID: \"ffc4926a-ede6-4124-ac91-c9912ffa8a23\") " pod="openstack/neutron-db-sync-dnm22" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.371817 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ffc4926a-ede6-4124-ac91-c9912ffa8a23-config\") pod \"neutron-db-sync-dnm22\" (UID: \"ffc4926a-ede6-4124-ac91-c9912ffa8a23\") " pod="openstack/neutron-db-sync-dnm22" Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.381859 5094 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-t7hr7"]
Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.382335 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffc4926a-ede6-4124-ac91-c9912ffa8a23-combined-ca-bundle\") pod \"neutron-db-sync-dnm22\" (UID: \"ffc4926a-ede6-4124-ac91-c9912ffa8a23\") " pod="openstack/neutron-db-sync-dnm22"
Feb 20 07:07:28 crc kubenswrapper[5094]: I0220 07:07:28.382407 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-config-data\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.382496 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-db-sync-config-data\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.382528 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/15583b83-ce22-4b0b-9566-0e056b07c0d7-etc-machine-id\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.382552 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-combined-ca-bundle\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.382594 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-scripts\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.454747 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-jsvf2"]
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.456494 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jsvf2"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.459832 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.463689 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-fq797"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.463926 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.484420 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffc4926a-ede6-4124-ac91-c9912ffa8a23-combined-ca-bundle\") pod \"neutron-db-sync-dnm22\" (UID: \"ffc4926a-ede6-4124-ac91-c9912ffa8a23\") " pod="openstack/neutron-db-sync-dnm22"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.484461 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-config-data\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.484500 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-db-sync-config-data\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.484540 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/15583b83-ce22-4b0b-9566-0e056b07c0d7-etc-machine-id\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.484561 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-combined-ca-bundle\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.484583 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-scripts\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.484625 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf24d\" (UniqueName: \"kubernetes.io/projected/15583b83-ce22-4b0b-9566-0e056b07c0d7-kube-api-access-qf24d\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.484650 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncgbz\" (UniqueName: \"kubernetes.io/projected/ffc4926a-ede6-4124-ac91-c9912ffa8a23-kube-api-access-ncgbz\") pod \"neutron-db-sync-dnm22\" (UID: \"ffc4926a-ede6-4124-ac91-c9912ffa8a23\") " pod="openstack/neutron-db-sync-dnm22"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.484686 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ffc4926a-ede6-4124-ac91-c9912ffa8a23-config\") pod \"neutron-db-sync-dnm22\" (UID: \"ffc4926a-ede6-4124-ac91-c9912ffa8a23\") " pod="openstack/neutron-db-sync-dnm22"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.485450 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/15583b83-ce22-4b0b-9566-0e056b07c0d7-etc-machine-id\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.495257 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jsvf2"]
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.497526 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ffc4926a-ede6-4124-ac91-c9912ffa8a23-config\") pod \"neutron-db-sync-dnm22\" (UID: \"ffc4926a-ede6-4124-ac91-c9912ffa8a23\") " pod="openstack/neutron-db-sync-dnm22"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.504307 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-config-data\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.522432 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-scripts\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.523333 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-db-sync-config-data\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.523927 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffc4926a-ede6-4124-ac91-c9912ffa8a23-combined-ca-bundle\") pod \"neutron-db-sync-dnm22\" (UID: \"ffc4926a-ede6-4124-ac91-c9912ffa8a23\") " pod="openstack/neutron-db-sync-dnm22"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.524117 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-combined-ca-bundle\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.527562 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncgbz\" (UniqueName: \"kubernetes.io/projected/ffc4926a-ede6-4124-ac91-c9912ffa8a23-kube-api-access-ncgbz\") pod \"neutron-db-sync-dnm22\" (UID: \"ffc4926a-ede6-4124-ac91-c9912ffa8a23\") " pod="openstack/neutron-db-sync-dnm22"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.528377 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf24d\" (UniqueName: \"kubernetes.io/projected/15583b83-ce22-4b0b-9566-0e056b07c0d7-kube-api-access-qf24d\") pod \"cinder-db-sync-t7hr7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " pod="openstack/cinder-db-sync-t7hr7"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.542534 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-fvmwf"]
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.544416 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-fvmwf"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.557074 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-z4bpk"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.557445 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.585933 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqrpz\" (UniqueName: \"kubernetes.io/projected/fd4e0644-4339-45bf-a919-0de0551c5baa-kube-api-access-fqrpz\") pod \"placement-db-sync-jsvf2\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " pod="openstack/placement-db-sync-jsvf2"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.586531 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-config-data\") pod \"placement-db-sync-jsvf2\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " pod="openstack/placement-db-sync-jsvf2"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.586554 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bft6b\" (UniqueName: \"kubernetes.io/projected/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-kube-api-access-bft6b\") pod \"barbican-db-sync-fvmwf\" (UID: \"d6e6aec3-87a9-4f8a-b640-313ab241ec6f\") " pod="openstack/barbican-db-sync-fvmwf"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.586585 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd4e0644-4339-45bf-a919-0de0551c5baa-logs\") pod \"placement-db-sync-jsvf2\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " pod="openstack/placement-db-sync-jsvf2"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.586652 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-scripts\") pod \"placement-db-sync-jsvf2\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " pod="openstack/placement-db-sync-jsvf2"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.586681 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-combined-ca-bundle\") pod \"placement-db-sync-jsvf2\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " pod="openstack/placement-db-sync-jsvf2"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.586789 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-db-sync-config-data\") pod \"barbican-db-sync-fvmwf\" (UID: \"d6e6aec3-87a9-4f8a-b640-313ab241ec6f\") " pod="openstack/barbican-db-sync-fvmwf"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.586830 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-combined-ca-bundle\") pod \"barbican-db-sync-fvmwf\" (UID: \"d6e6aec3-87a9-4f8a-b640-313ab241ec6f\") " pod="openstack/barbican-db-sync-fvmwf"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.594263 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-t7hr7"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.594882 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-fvmwf"]
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.609975 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d67cdfc8f-nngtz"]
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.621982 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.627348 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.632854 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.633142 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.678224 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.693175 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67dccc895-wgf2f"]
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.694136 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-scripts\") pod \"placement-db-sync-jsvf2\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " pod="openstack/placement-db-sync-jsvf2"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.694356 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-combined-ca-bundle\") pod \"placement-db-sync-jsvf2\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " pod="openstack/placement-db-sync-jsvf2"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.694454 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-config-data\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.694533 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-db-sync-config-data\") pod \"barbican-db-sync-fvmwf\" (UID: \"d6e6aec3-87a9-4f8a-b640-313ab241ec6f\") " pod="openstack/barbican-db-sync-fvmwf"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.695210 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.695308 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-combined-ca-bundle\") pod \"barbican-db-sync-fvmwf\" (UID: \"d6e6aec3-87a9-4f8a-b640-313ab241ec6f\") " pod="openstack/barbican-db-sync-fvmwf"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.695371 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19d2d34d-f935-40e4-a27a-a382c7634da2-log-httpd\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.695448 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-scripts\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.695947 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqrpz\" (UniqueName: \"kubernetes.io/projected/fd4e0644-4339-45bf-a919-0de0551c5baa-kube-api-access-fqrpz\") pod \"placement-db-sync-jsvf2\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " pod="openstack/placement-db-sync-jsvf2"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.696091 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-config-data\") pod \"placement-db-sync-jsvf2\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " pod="openstack/placement-db-sync-jsvf2"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.696263 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bft6b\" (UniqueName: \"kubernetes.io/projected/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-kube-api-access-bft6b\") pod \"barbican-db-sync-fvmwf\" (UID: \"d6e6aec3-87a9-4f8a-b640-313ab241ec6f\") " pod="openstack/barbican-db-sync-fvmwf"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.696377 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgl49\" (UniqueName: \"kubernetes.io/projected/19d2d34d-f935-40e4-a27a-a382c7634da2-kube-api-access-jgl49\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.696439 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd4e0644-4339-45bf-a919-0de0551c5baa-logs\") pod \"placement-db-sync-jsvf2\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " pod="openstack/placement-db-sync-jsvf2"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.696522 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19d2d34d-f935-40e4-a27a-a382c7634da2-run-httpd\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.696617 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.697321 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67dccc895-wgf2f"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.698813 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd4e0644-4339-45bf-a919-0de0551c5baa-logs\") pod \"placement-db-sync-jsvf2\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " pod="openstack/placement-db-sync-jsvf2"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.703802 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-db-sync-config-data\") pod \"barbican-db-sync-fvmwf\" (UID: \"d6e6aec3-87a9-4f8a-b640-313ab241ec6f\") " pod="openstack/barbican-db-sync-fvmwf"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.707829 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-combined-ca-bundle\") pod \"barbican-db-sync-fvmwf\" (UID: \"d6e6aec3-87a9-4f8a-b640-313ab241ec6f\") " pod="openstack/barbican-db-sync-fvmwf"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.714163 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-combined-ca-bundle\") pod \"placement-db-sync-jsvf2\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " pod="openstack/placement-db-sync-jsvf2"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.717213 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-scripts\") pod \"placement-db-sync-jsvf2\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " pod="openstack/placement-db-sync-jsvf2"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.719131 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqrpz\" (UniqueName: \"kubernetes.io/projected/fd4e0644-4339-45bf-a919-0de0551c5baa-kube-api-access-fqrpz\") pod \"placement-db-sync-jsvf2\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " pod="openstack/placement-db-sync-jsvf2"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.719537 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-config-data\") pod \"placement-db-sync-jsvf2\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " pod="openstack/placement-db-sync-jsvf2"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.722223 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bft6b\" (UniqueName: \"kubernetes.io/projected/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-kube-api-access-bft6b\") pod \"barbican-db-sync-fvmwf\" (UID: \"d6e6aec3-87a9-4f8a-b640-313ab241ec6f\") " pod="openstack/barbican-db-sync-fvmwf"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.732041 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67dccc895-wgf2f"]
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.745093 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-dnm22"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.797060 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jsvf2"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.801431 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-dns-swift-storage-0\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.801469 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-config\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.801524 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgl49\" (UniqueName: \"kubernetes.io/projected/19d2d34d-f935-40e4-a27a-a382c7634da2-kube-api-access-jgl49\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.801563 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpr25\" (UniqueName: \"kubernetes.io/projected/53cdb905-b22d-4849-ae24-6baa2838be39-kube-api-access-xpr25\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.801589 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19d2d34d-f935-40e4-a27a-a382c7634da2-run-httpd\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.801618 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.801657 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-ovsdbserver-nb\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.801693 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-config-data\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.801740 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-dns-svc\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.801767 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.801788 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-ovsdbserver-sb\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.801816 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19d2d34d-f935-40e4-a27a-a382c7634da2-log-httpd\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.802211 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19d2d34d-f935-40e4-a27a-a382c7634da2-run-httpd\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.803833 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-scripts\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.806819 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19d2d34d-f935-40e4-a27a-a382c7634da2-log-httpd\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.807949 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.808043 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.810273 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-config-data\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.810583 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-scripts\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.826587 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgl49\" (UniqueName: \"kubernetes.io/projected/19d2d34d-f935-40e4-a27a-a382c7634da2-kube-api-access-jgl49\") pod \"ceilometer-0\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " pod="openstack/ceilometer-0"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.885673 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-fvmwf"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.905353 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-ovsdbserver-nb\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.905419 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-dns-svc\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.905492 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-ovsdbserver-sb\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.907819 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-dns-swift-storage-0\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.907846 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-config\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.907983 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpr25\" (UniqueName: \"kubernetes.io/projected/53cdb905-b22d-4849-ae24-6baa2838be39-kube-api-access-xpr25\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.909747 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-ovsdbserver-nb\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.910018 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-dns-svc\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.910718 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-dns-swift-storage-0\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.910727 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-config\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.911053 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-ovsdbserver-sb\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.925315 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpr25\" (UniqueName: \"kubernetes.io/projected/53cdb905-b22d-4849-ae24-6baa2838be39-kube-api-access-xpr25\") pod \"dnsmasq-dns-67dccc895-wgf2f\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " pod="openstack/dnsmasq-dns-67dccc895-wgf2f"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:28.976934 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.026413 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67dccc895-wgf2f"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.180957 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.183872 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.190175 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.190446 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.191433 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.191478 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9q5bq"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.194322 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.317345 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.317426 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/090c8378-96fa-4223-8b6d-b98fa179046a-logs\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0"
Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.317460 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.317525 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-scripts\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.317541 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/090c8378-96fa-4223-8b6d-b98fa179046a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.317585 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.317607 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x4w6\" (UniqueName: \"kubernetes.io/projected/090c8378-96fa-4223-8b6d-b98fa179046a-kube-api-access-8x4w6\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.317684 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.358430 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.360434 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.370167 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.370556 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.382534 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.422443 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x4w6\" (UniqueName: \"kubernetes.io/projected/090c8378-96fa-4223-8b6d-b98fa179046a-kube-api-access-8x4w6\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.422735 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.422822 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.422924 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.422997 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cbff870-795c-4622-90e7-e06559b6d884-logs\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.425079 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.425246 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4cbff870-795c-4622-90e7-e06559b6d884-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.425354 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/090c8378-96fa-4223-8b6d-b98fa179046a-logs\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.425431 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.425461 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-config-data\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.425686 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmlgf\" (UniqueName: \"kubernetes.io/projected/4cbff870-795c-4622-90e7-e06559b6d884-kube-api-access-zmlgf\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.425970 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-scripts\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.426018 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/090c8378-96fa-4223-8b6d-b98fa179046a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.426080 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.426107 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.426215 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.426233 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/090c8378-96fa-4223-8b6d-b98fa179046a-logs\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.426783 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/090c8378-96fa-4223-8b6d-b98fa179046a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.426841 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.431054 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.431722 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-scripts\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.432735 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-config-data\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.443270 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x4w6\" (UniqueName: 
\"kubernetes.io/projected/090c8378-96fa-4223-8b6d-b98fa179046a-kube-api-access-8x4w6\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.447318 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.458475 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.511386 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.528206 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.528321 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmlgf\" (UniqueName: \"kubernetes.io/projected/4cbff870-795c-4622-90e7-e06559b6d884-kube-api-access-zmlgf\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.528385 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.528424 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.528488 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 
20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.528524 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.528549 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cbff870-795c-4622-90e7-e06559b6d884-logs\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.528612 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4cbff870-795c-4622-90e7-e06559b6d884-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.529265 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4cbff870-795c-4622-90e7-e06559b6d884-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.530103 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cbff870-795c-4622-90e7-e06559b6d884-logs\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.530254 5094 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.533520 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d67cdfc8f-nngtz"] Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.534141 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.534463 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.536612 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.537432 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 
07:07:29 crc kubenswrapper[5094]: W0220 07:07:29.548301 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1a92386_f07a_4845_9a5d_231a4c498d3f.slice/crio-268dd0a406ca72b786257f7ababdacbdbf60e8cb46f0c9eb76e1118a867dcccb WatchSource:0}: Error finding container 268dd0a406ca72b786257f7ababdacbdbf60e8cb46f0c9eb76e1118a867dcccb: Status 404 returned error can't find the container with id 268dd0a406ca72b786257f7ababdacbdbf60e8cb46f0c9eb76e1118a867dcccb Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.552406 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmlgf\" (UniqueName: \"kubernetes.io/projected/4cbff870-795c-4622-90e7-e06559b6d884-kube-api-access-zmlgf\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.601910 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.613579 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" event={"ID":"e1a92386-f07a-4845-9a5d-231a4c498d3f","Type":"ContainerStarted","Data":"268dd0a406ca72b786257f7ababdacbdbf60e8cb46f0c9eb76e1118a867dcccb"} Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.613906 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" podUID="887446b0-f238-4ff4-82dd-a903299a0105" containerName="dnsmasq-dns" containerID="cri-o://60d68fcdef2499d06a7eba5248abfa54f3596133e4fe35ebcfc6def68e018ca6" gracePeriod=10 Feb 20 07:07:29 crc 
kubenswrapper[5094]: I0220 07:07:29.686657 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.861878 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-dnm22"] Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.888419 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-t7hr7"] Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.906726 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.915105 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67dccc895-wgf2f"] Feb 20 07:07:29 crc kubenswrapper[5094]: W0220 07:07:29.921405 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffc4926a_ede6_4124_ac91_c9912ffa8a23.slice/crio-003ab580a28b4b49925e83b0991967f99365515d76c15aa90bdb08c5b298df56 WatchSource:0}: Error finding container 003ab580a28b4b49925e83b0991967f99365515d76c15aa90bdb08c5b298df56: Status 404 returned error can't find the container with id 003ab580a28b4b49925e83b0991967f99365515d76c15aa90bdb08c5b298df56 Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.929029 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-fvmwf"] Feb 20 07:07:29 crc kubenswrapper[5094]: I0220 07:07:29.956744 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jsvf2"] Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.030649 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-b78rr"] Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.209561 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 07:07:30 crc 
kubenswrapper[5094]: I0220 07:07:30.497375 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.543634 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.564386 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-ovsdbserver-nb\") pod \"887446b0-f238-4ff4-82dd-a903299a0105\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.564932 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vl684\" (UniqueName: \"kubernetes.io/projected/887446b0-f238-4ff4-82dd-a903299a0105-kube-api-access-vl684\") pod \"887446b0-f238-4ff4-82dd-a903299a0105\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.565088 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-config\") pod \"887446b0-f238-4ff4-82dd-a903299a0105\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.565219 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-dns-svc\") pod \"887446b0-f238-4ff4-82dd-a903299a0105\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.565451 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-dns-swift-storage-0\") pod \"887446b0-f238-4ff4-82dd-a903299a0105\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.565619 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-ovsdbserver-sb\") pod \"887446b0-f238-4ff4-82dd-a903299a0105\" (UID: \"887446b0-f238-4ff4-82dd-a903299a0105\") " Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.574977 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/887446b0-f238-4ff4-82dd-a903299a0105-kube-api-access-vl684" (OuterVolumeSpecName: "kube-api-access-vl684") pod "887446b0-f238-4ff4-82dd-a903299a0105" (UID: "887446b0-f238-4ff4-82dd-a903299a0105"). InnerVolumeSpecName "kube-api-access-vl684". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.671350 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vl684\" (UniqueName: \"kubernetes.io/projected/887446b0-f238-4ff4-82dd-a903299a0105-kube-api-access-vl684\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.676055 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4cbff870-795c-4622-90e7-e06559b6d884","Type":"ContainerStarted","Data":"49eddfb76f5127d9790e86db4511eac3b1b4ddb83b95cfa7b7d41b4e2cbe2a5f"} Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.680339 5094 generic.go:334] "Generic (PLEG): container finished" podID="887446b0-f238-4ff4-82dd-a903299a0105" containerID="60d68fcdef2499d06a7eba5248abfa54f3596133e4fe35ebcfc6def68e018ca6" exitCode=0 Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.680456 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.680488 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" event={"ID":"887446b0-f238-4ff4-82dd-a903299a0105","Type":"ContainerDied","Data":"60d68fcdef2499d06a7eba5248abfa54f3596133e4fe35ebcfc6def68e018ca6"} Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.680529 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68677f88c9-kpbpx" event={"ID":"887446b0-f238-4ff4-82dd-a903299a0105","Type":"ContainerDied","Data":"d86a37ef604c4bf81c2fdfca35e77934bf681730a45dd0766d8017147ff2766f"} Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.680552 5094 scope.go:117] "RemoveContainer" containerID="60d68fcdef2499d06a7eba5248abfa54f3596133e4fe35ebcfc6def68e018ca6" Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.686519 5094 generic.go:334] "Generic (PLEG): container finished" podID="e1a92386-f07a-4845-9a5d-231a4c498d3f" containerID="fdd5dbb72ced24f8fe5963a32bf7fcb2da6a991b9087dbbc5d4a24d4daaa0e56" exitCode=0 Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.686597 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" event={"ID":"e1a92386-f07a-4845-9a5d-231a4c498d3f","Type":"ContainerDied","Data":"fdd5dbb72ced24f8fe5963a32bf7fcb2da6a991b9087dbbc5d4a24d4daaa0e56"} Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.690451 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"090c8378-96fa-4223-8b6d-b98fa179046a","Type":"ContainerStarted","Data":"f1fff81d621e7a98189b82c44089d8bbf94b1c86c3b0de4e3b7aae8e0f0b3931"} Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.692361 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dccc895-wgf2f" 
event={"ID":"53cdb905-b22d-4849-ae24-6baa2838be39","Type":"ContainerStarted","Data":"db75b8edda289ee932be6a52e71a112b82ffd485c1dcf3a3df8f8b8577f5dd7d"} Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.694050 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-t7hr7" event={"ID":"15583b83-ce22-4b0b-9566-0e056b07c0d7","Type":"ContainerStarted","Data":"7bade8250a47c4de537369bc3b59be47a7a82eab789fb96f42d111f476483273"} Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.696423 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jsvf2" event={"ID":"fd4e0644-4339-45bf-a919-0de0551c5baa","Type":"ContainerStarted","Data":"21f4e9dd4335713cc70e8a920496faa98eb7767c86615d81c2e49ffa01bf7858"} Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.699832 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19d2d34d-f935-40e4-a27a-a382c7634da2","Type":"ContainerStarted","Data":"7d9f8b3046d52c477cb1f1e73376c263067dfb9d891c04aea7389d4a8f986dbe"} Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.706084 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fvmwf" event={"ID":"d6e6aec3-87a9-4f8a-b640-313ab241ec6f","Type":"ContainerStarted","Data":"6e88acd54dbbf562acdafbd54ac1a987deab1124d294a1cf214bd932d6b05497"} Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.710274 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b78rr" event={"ID":"d63a9457-c57a-4979-bd28-ee982250b13c","Type":"ContainerStarted","Data":"1b8f196598eb71bb63b0c9673a673bcdbc971ba9e6d708b6476a724a6e47b813"} Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.740546 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dnm22" event={"ID":"ffc4926a-ede6-4124-ac91-c9912ffa8a23","Type":"ContainerStarted","Data":"003ab580a28b4b49925e83b0991967f99365515d76c15aa90bdb08c5b298df56"} Feb 
20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.748058 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "887446b0-f238-4ff4-82dd-a903299a0105" (UID: "887446b0-f238-4ff4-82dd-a903299a0105"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.800168 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "887446b0-f238-4ff4-82dd-a903299a0105" (UID: "887446b0-f238-4ff4-82dd-a903299a0105"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.801541 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-config" (OuterVolumeSpecName: "config") pod "887446b0-f238-4ff4-82dd-a903299a0105" (UID: "887446b0-f238-4ff4-82dd-a903299a0105"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.809571 5094 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.809622 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.809635 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.820443 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.835621 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "887446b0-f238-4ff4-82dd-a903299a0105" (UID: "887446b0-f238-4ff4-82dd-a903299a0105"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.837408 5094 scope.go:117] "RemoveContainer" containerID="febef95577b2b18c2f7a29013235d1029fd43ecb02762ac3e41f6e76448932f2" Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.901132 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.913144 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.915653 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.961032 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "887446b0-f238-4ff4-82dd-a903299a0105" (UID: "887446b0-f238-4ff4-82dd-a903299a0105"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.973955 5094 scope.go:117] "RemoveContainer" containerID="60d68fcdef2499d06a7eba5248abfa54f3596133e4fe35ebcfc6def68e018ca6" Feb 20 07:07:30 crc kubenswrapper[5094]: E0220 07:07:30.975871 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60d68fcdef2499d06a7eba5248abfa54f3596133e4fe35ebcfc6def68e018ca6\": container with ID starting with 60d68fcdef2499d06a7eba5248abfa54f3596133e4fe35ebcfc6def68e018ca6 not found: ID does not exist" containerID="60d68fcdef2499d06a7eba5248abfa54f3596133e4fe35ebcfc6def68e018ca6" Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.975966 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60d68fcdef2499d06a7eba5248abfa54f3596133e4fe35ebcfc6def68e018ca6"} err="failed to get container status \"60d68fcdef2499d06a7eba5248abfa54f3596133e4fe35ebcfc6def68e018ca6\": rpc error: code = NotFound desc = could not find container \"60d68fcdef2499d06a7eba5248abfa54f3596133e4fe35ebcfc6def68e018ca6\": container with ID starting with 60d68fcdef2499d06a7eba5248abfa54f3596133e4fe35ebcfc6def68e018ca6 not found: ID does not exist" Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.975998 5094 scope.go:117] "RemoveContainer" containerID="febef95577b2b18c2f7a29013235d1029fd43ecb02762ac3e41f6e76448932f2" Feb 20 07:07:30 crc kubenswrapper[5094]: E0220 07:07:30.991582 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"febef95577b2b18c2f7a29013235d1029fd43ecb02762ac3e41f6e76448932f2\": container with ID starting with febef95577b2b18c2f7a29013235d1029fd43ecb02762ac3e41f6e76448932f2 not found: ID does not exist" containerID="febef95577b2b18c2f7a29013235d1029fd43ecb02762ac3e41f6e76448932f2" Feb 20 07:07:30 crc kubenswrapper[5094]: I0220 07:07:30.991631 
5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"febef95577b2b18c2f7a29013235d1029fd43ecb02762ac3e41f6e76448932f2"} err="failed to get container status \"febef95577b2b18c2f7a29013235d1029fd43ecb02762ac3e41f6e76448932f2\": rpc error: code = NotFound desc = could not find container \"febef95577b2b18c2f7a29013235d1029fd43ecb02762ac3e41f6e76448932f2\": container with ID starting with febef95577b2b18c2f7a29013235d1029fd43ecb02762ac3e41f6e76448932f2 not found: ID does not exist" Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.015196 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/887446b0-f238-4ff4-82dd-a903299a0105-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.109588 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68677f88c9-kpbpx"] Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.146202 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68677f88c9-kpbpx"] Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.356680 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.423312 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-dns-swift-storage-0\") pod \"e1a92386-f07a-4845-9a5d-231a4c498d3f\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.423351 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-ovsdbserver-nb\") pod \"e1a92386-f07a-4845-9a5d-231a4c498d3f\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.423390 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-config\") pod \"e1a92386-f07a-4845-9a5d-231a4c498d3f\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.423414 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-dns-svc\") pod \"e1a92386-f07a-4845-9a5d-231a4c498d3f\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.423481 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5mfl\" (UniqueName: \"kubernetes.io/projected/e1a92386-f07a-4845-9a5d-231a4c498d3f-kube-api-access-j5mfl\") pod \"e1a92386-f07a-4845-9a5d-231a4c498d3f\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.423551 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-ovsdbserver-sb\") pod \"e1a92386-f07a-4845-9a5d-231a4c498d3f\" (UID: \"e1a92386-f07a-4845-9a5d-231a4c498d3f\") " Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.438315 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1a92386-f07a-4845-9a5d-231a4c498d3f-kube-api-access-j5mfl" (OuterVolumeSpecName: "kube-api-access-j5mfl") pod "e1a92386-f07a-4845-9a5d-231a4c498d3f" (UID: "e1a92386-f07a-4845-9a5d-231a4c498d3f"). InnerVolumeSpecName "kube-api-access-j5mfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.452677 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-config" (OuterVolumeSpecName: "config") pod "e1a92386-f07a-4845-9a5d-231a4c498d3f" (UID: "e1a92386-f07a-4845-9a5d-231a4c498d3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.461600 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e1a92386-f07a-4845-9a5d-231a4c498d3f" (UID: "e1a92386-f07a-4845-9a5d-231a4c498d3f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.461759 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e1a92386-f07a-4845-9a5d-231a4c498d3f" (UID: "e1a92386-f07a-4845-9a5d-231a4c498d3f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.464481 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e1a92386-f07a-4845-9a5d-231a4c498d3f" (UID: "e1a92386-f07a-4845-9a5d-231a4c498d3f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.464642 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e1a92386-f07a-4845-9a5d-231a4c498d3f" (UID: "e1a92386-f07a-4845-9a5d-231a4c498d3f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.524556 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5mfl\" (UniqueName: \"kubernetes.io/projected/e1a92386-f07a-4845-9a5d-231a4c498d3f-kube-api-access-j5mfl\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.524594 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.524607 5094 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.524621 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-ovsdbserver-nb\") on node \"crc\" 
DevicePath \"\"" Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.524654 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.524732 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1a92386-f07a-4845-9a5d-231a4c498d3f-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.775014 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b78rr" event={"ID":"d63a9457-c57a-4979-bd28-ee982250b13c","Type":"ContainerStarted","Data":"12ead5ac0eb56e9815a9bc9d890ea3167a7072689f354e4d8ffab5e114525ab7"} Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.783274 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dnm22" event={"ID":"ffc4926a-ede6-4124-ac91-c9912ffa8a23","Type":"ContainerStarted","Data":"bec04cc90bea00c99a6cbeb35313f8c3b75e668698426b61e7cbceb44b686553"} Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.789240 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"090c8378-96fa-4223-8b6d-b98fa179046a","Type":"ContainerStarted","Data":"8bc694cf60dc2751dc7ddd6a90ad6a89e8b2d9feaeaec62df46315e77617467e"} Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.793653 5094 generic.go:334] "Generic (PLEG): container finished" podID="53cdb905-b22d-4849-ae24-6baa2838be39" containerID="135ff79dc99efbd620593401c9d7a73c61d546032c68fab2fd094efa64fa4d62" exitCode=0 Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.793768 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dccc895-wgf2f" 
event={"ID":"53cdb905-b22d-4849-ae24-6baa2838be39","Type":"ContainerDied","Data":"135ff79dc99efbd620593401c9d7a73c61d546032c68fab2fd094efa64fa4d62"} Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.828364 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-b78rr" podStartSLOduration=4.828338176 podStartE2EDuration="4.828338176s" podCreationTimestamp="2026-02-20 07:07:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:07:31.800051758 +0000 UTC m=+1266.672678459" watchObservedRunningTime="2026-02-20 07:07:31.828338176 +0000 UTC m=+1266.700964887" Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.842158 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.859411 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-dnm22" podStartSLOduration=3.859384079 podStartE2EDuration="3.859384079s" podCreationTimestamp="2026-02-20 07:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:07:31.849368829 +0000 UTC m=+1266.721995550" watchObservedRunningTime="2026-02-20 07:07:31.859384079 +0000 UTC m=+1266.732010790" Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.894754 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="887446b0-f238-4ff4-82dd-a903299a0105" path="/var/lib/kubelet/pods/887446b0-f238-4ff4-82dd-a903299a0105/volumes" Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.895579 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d67cdfc8f-nngtz" 
event={"ID":"e1a92386-f07a-4845-9a5d-231a4c498d3f","Type":"ContainerDied","Data":"268dd0a406ca72b786257f7ababdacbdbf60e8cb46f0c9eb76e1118a867dcccb"} Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.895635 5094 scope.go:117] "RemoveContainer" containerID="fdd5dbb72ced24f8fe5963a32bf7fcb2da6a991b9087dbbc5d4a24d4daaa0e56" Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.967695 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d67cdfc8f-nngtz"] Feb 20 07:07:31 crc kubenswrapper[5094]: I0220 07:07:31.975138 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d67cdfc8f-nngtz"] Feb 20 07:07:32 crc kubenswrapper[5094]: I0220 07:07:32.875509 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4cbff870-795c-4622-90e7-e06559b6d884","Type":"ContainerStarted","Data":"2d6d1c9bef646cdf5d3cdcc1b6a5be793e730ee75c92f1c11a51822f455ce37d"} Feb 20 07:07:32 crc kubenswrapper[5094]: I0220 07:07:32.884797 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"090c8378-96fa-4223-8b6d-b98fa179046a","Type":"ContainerStarted","Data":"c31d8324706f8243f00fc950b22649416caea675f5ff2715ed5cde10bb50f051"} Feb 20 07:07:32 crc kubenswrapper[5094]: I0220 07:07:32.885019 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="090c8378-96fa-4223-8b6d-b98fa179046a" containerName="glance-log" containerID="cri-o://8bc694cf60dc2751dc7ddd6a90ad6a89e8b2d9feaeaec62df46315e77617467e" gracePeriod=30 Feb 20 07:07:32 crc kubenswrapper[5094]: I0220 07:07:32.885716 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="090c8378-96fa-4223-8b6d-b98fa179046a" containerName="glance-httpd" containerID="cri-o://c31d8324706f8243f00fc950b22649416caea675f5ff2715ed5cde10bb50f051" 
gracePeriod=30 Feb 20 07:07:32 crc kubenswrapper[5094]: I0220 07:07:32.893060 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dccc895-wgf2f" event={"ID":"53cdb905-b22d-4849-ae24-6baa2838be39","Type":"ContainerStarted","Data":"886db898799d64a40ec0fd6c2e2b8f27c2c4edd60adf4a15ed59da9eda2fda51"} Feb 20 07:07:32 crc kubenswrapper[5094]: I0220 07:07:32.893101 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67dccc895-wgf2f" Feb 20 07:07:32 crc kubenswrapper[5094]: I0220 07:07:32.909528 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.909503159 podStartE2EDuration="4.909503159s" podCreationTimestamp="2026-02-20 07:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:07:32.903941205 +0000 UTC m=+1267.776567916" watchObservedRunningTime="2026-02-20 07:07:32.909503159 +0000 UTC m=+1267.782129870" Feb 20 07:07:32 crc kubenswrapper[5094]: I0220 07:07:32.943398 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67dccc895-wgf2f" podStartSLOduration=4.943373159 podStartE2EDuration="4.943373159s" podCreationTimestamp="2026-02-20 07:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:07:32.933661617 +0000 UTC m=+1267.806288328" watchObservedRunningTime="2026-02-20 07:07:32.943373159 +0000 UTC m=+1267.815999870" Feb 20 07:07:33 crc kubenswrapper[5094]: I0220 07:07:33.853904 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1a92386-f07a-4845-9a5d-231a4c498d3f" path="/var/lib/kubelet/pods/e1a92386-f07a-4845-9a5d-231a4c498d3f/volumes" Feb 20 07:07:33 crc kubenswrapper[5094]: I0220 07:07:33.921834 5094 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"090c8378-96fa-4223-8b6d-b98fa179046a","Type":"ContainerDied","Data":"c31d8324706f8243f00fc950b22649416caea675f5ff2715ed5cde10bb50f051"} Feb 20 07:07:33 crc kubenswrapper[5094]: I0220 07:07:33.921690 5094 generic.go:334] "Generic (PLEG): container finished" podID="090c8378-96fa-4223-8b6d-b98fa179046a" containerID="c31d8324706f8243f00fc950b22649416caea675f5ff2715ed5cde10bb50f051" exitCode=0 Feb 20 07:07:33 crc kubenswrapper[5094]: I0220 07:07:33.922022 5094 generic.go:334] "Generic (PLEG): container finished" podID="090c8378-96fa-4223-8b6d-b98fa179046a" containerID="8bc694cf60dc2751dc7ddd6a90ad6a89e8b2d9feaeaec62df46315e77617467e" exitCode=143 Feb 20 07:07:33 crc kubenswrapper[5094]: I0220 07:07:33.922113 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"090c8378-96fa-4223-8b6d-b98fa179046a","Type":"ContainerDied","Data":"8bc694cf60dc2751dc7ddd6a90ad6a89e8b2d9feaeaec62df46315e77617467e"} Feb 20 07:07:33 crc kubenswrapper[5094]: I0220 07:07:33.939732 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4cbff870-795c-4622-90e7-e06559b6d884","Type":"ContainerStarted","Data":"70fcfaee70dbdb9d905be4dee063c5e65bbf885dee16b38909df107b4aa714d8"} Feb 20 07:07:33 crc kubenswrapper[5094]: I0220 07:07:33.939963 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4cbff870-795c-4622-90e7-e06559b6d884" containerName="glance-log" containerID="cri-o://2d6d1c9bef646cdf5d3cdcc1b6a5be793e730ee75c92f1c11a51822f455ce37d" gracePeriod=30 Feb 20 07:07:33 crc kubenswrapper[5094]: I0220 07:07:33.940112 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4cbff870-795c-4622-90e7-e06559b6d884" containerName="glance-httpd" 
containerID="cri-o://70fcfaee70dbdb9d905be4dee063c5e65bbf885dee16b38909df107b4aa714d8" gracePeriod=30 Feb 20 07:07:33 crc kubenswrapper[5094]: I0220 07:07:33.975768 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.975742975 podStartE2EDuration="5.975742975s" podCreationTimestamp="2026-02-20 07:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:07:33.965098439 +0000 UTC m=+1268.837725150" watchObservedRunningTime="2026-02-20 07:07:33.975742975 +0000 UTC m=+1268.848369686" Feb 20 07:07:34 crc kubenswrapper[5094]: I0220 07:07:34.106519 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:07:34 crc kubenswrapper[5094]: I0220 07:07:34.106614 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:07:34 crc kubenswrapper[5094]: I0220 07:07:34.954586 5094 generic.go:334] "Generic (PLEG): container finished" podID="4cbff870-795c-4622-90e7-e06559b6d884" containerID="70fcfaee70dbdb9d905be4dee063c5e65bbf885dee16b38909df107b4aa714d8" exitCode=0 Feb 20 07:07:34 crc kubenswrapper[5094]: I0220 07:07:34.955092 5094 generic.go:334] "Generic (PLEG): container finished" podID="4cbff870-795c-4622-90e7-e06559b6d884" containerID="2d6d1c9bef646cdf5d3cdcc1b6a5be793e730ee75c92f1c11a51822f455ce37d" exitCode=143 Feb 20 07:07:34 crc kubenswrapper[5094]: I0220 07:07:34.954669 
5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4cbff870-795c-4622-90e7-e06559b6d884","Type":"ContainerDied","Data":"70fcfaee70dbdb9d905be4dee063c5e65bbf885dee16b38909df107b4aa714d8"} Feb 20 07:07:34 crc kubenswrapper[5094]: I0220 07:07:34.955164 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4cbff870-795c-4622-90e7-e06559b6d884","Type":"ContainerDied","Data":"2d6d1c9bef646cdf5d3cdcc1b6a5be793e730ee75c92f1c11a51822f455ce37d"} Feb 20 07:07:34 crc kubenswrapper[5094]: I0220 07:07:34.957928 5094 generic.go:334] "Generic (PLEG): container finished" podID="d63a9457-c57a-4979-bd28-ee982250b13c" containerID="12ead5ac0eb56e9815a9bc9d890ea3167a7072689f354e4d8ffab5e114525ab7" exitCode=0 Feb 20 07:07:34 crc kubenswrapper[5094]: I0220 07:07:34.957968 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b78rr" event={"ID":"d63a9457-c57a-4979-bd28-ee982250b13c","Type":"ContainerDied","Data":"12ead5ac0eb56e9815a9bc9d890ea3167a7072689f354e4d8ffab5e114525ab7"} Feb 20 07:07:38 crc kubenswrapper[5094]: I0220 07:07:38.910617 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.002854 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b78rr" event={"ID":"d63a9457-c57a-4979-bd28-ee982250b13c","Type":"ContainerDied","Data":"1b8f196598eb71bb63b0c9673a673bcdbc971ba9e6d708b6476a724a6e47b813"} Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.002915 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b8f196598eb71bb63b0c9673a673bcdbc971ba9e6d708b6476a724a6e47b813" Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.002928 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-b78rr" Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.028930 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67dccc895-wgf2f" Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.068520 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-config-data\") pod \"d63a9457-c57a-4979-bd28-ee982250b13c\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.068670 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-combined-ca-bundle\") pod \"d63a9457-c57a-4979-bd28-ee982250b13c\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.068745 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-credential-keys\") pod \"d63a9457-c57a-4979-bd28-ee982250b13c\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.068861 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25g6b\" (UniqueName: \"kubernetes.io/projected/d63a9457-c57a-4979-bd28-ee982250b13c-kube-api-access-25g6b\") pod \"d63a9457-c57a-4979-bd28-ee982250b13c\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.069024 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-scripts\") pod \"d63a9457-c57a-4979-bd28-ee982250b13c\" (UID: 
\"d63a9457-c57a-4979-bd28-ee982250b13c\") " Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.069173 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-fernet-keys\") pod \"d63a9457-c57a-4979-bd28-ee982250b13c\" (UID: \"d63a9457-c57a-4979-bd28-ee982250b13c\") " Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.080121 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "d63a9457-c57a-4979-bd28-ee982250b13c" (UID: "d63a9457-c57a-4979-bd28-ee982250b13c"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.102434 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d63a9457-c57a-4979-bd28-ee982250b13c" (UID: "d63a9457-c57a-4979-bd28-ee982250b13c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.128697 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-scripts" (OuterVolumeSpecName: "scripts") pod "d63a9457-c57a-4979-bd28-ee982250b13c" (UID: "d63a9457-c57a-4979-bd28-ee982250b13c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.132128 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-config-data" (OuterVolumeSpecName: "config-data") pod "d63a9457-c57a-4979-bd28-ee982250b13c" (UID: "d63a9457-c57a-4979-bd28-ee982250b13c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.144156 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d63a9457-c57a-4979-bd28-ee982250b13c-kube-api-access-25g6b" (OuterVolumeSpecName: "kube-api-access-25g6b") pod "d63a9457-c57a-4979-bd28-ee982250b13c" (UID: "d63a9457-c57a-4979-bd28-ee982250b13c"). InnerVolumeSpecName "kube-api-access-25g6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.150143 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-768666cd57-7ddwb"] Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.150410 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-768666cd57-7ddwb" podUID="aa0f7da7-a9bd-4b03-b256-d05ba9323e70" containerName="dnsmasq-dns" containerID="cri-o://0ce2ed20e689517e7e40aa9a1d41aa5a5a7a220e9e24137c5b2fd77c89b217f6" gracePeriod=10 Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.160394 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d63a9457-c57a-4979-bd28-ee982250b13c" (UID: "d63a9457-c57a-4979-bd28-ee982250b13c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.171322 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.171364 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.171378 5094 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.171387 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25g6b\" (UniqueName: \"kubernetes.io/projected/d63a9457-c57a-4979-bd28-ee982250b13c-kube-api-access-25g6b\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.171396 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:39 crc kubenswrapper[5094]: I0220 07:07:39.171403 5094 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d63a9457-c57a-4979-bd28-ee982250b13c-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.040426 5094 generic.go:334] "Generic (PLEG): container finished" podID="aa0f7da7-a9bd-4b03-b256-d05ba9323e70" containerID="0ce2ed20e689517e7e40aa9a1d41aa5a5a7a220e9e24137c5b2fd77c89b217f6" exitCode=0 Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.040493 5094 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-768666cd57-7ddwb" event={"ID":"aa0f7da7-a9bd-4b03-b256-d05ba9323e70","Type":"ContainerDied","Data":"0ce2ed20e689517e7e40aa9a1d41aa5a5a7a220e9e24137c5b2fd77c89b217f6"} Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.080586 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-b78rr"] Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.087078 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-b78rr"] Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.187202 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xnj8t"] Feb 20 07:07:40 crc kubenswrapper[5094]: E0220 07:07:40.187840 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d63a9457-c57a-4979-bd28-ee982250b13c" containerName="keystone-bootstrap" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.187862 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d63a9457-c57a-4979-bd28-ee982250b13c" containerName="keystone-bootstrap" Feb 20 07:07:40 crc kubenswrapper[5094]: E0220 07:07:40.188042 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="887446b0-f238-4ff4-82dd-a903299a0105" containerName="dnsmasq-dns" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.188053 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="887446b0-f238-4ff4-82dd-a903299a0105" containerName="dnsmasq-dns" Feb 20 07:07:40 crc kubenswrapper[5094]: E0220 07:07:40.188094 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="887446b0-f238-4ff4-82dd-a903299a0105" containerName="init" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.188106 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="887446b0-f238-4ff4-82dd-a903299a0105" containerName="init" Feb 20 07:07:40 crc kubenswrapper[5094]: E0220 07:07:40.188123 5094 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e1a92386-f07a-4845-9a5d-231a4c498d3f" containerName="init" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.188131 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1a92386-f07a-4845-9a5d-231a4c498d3f" containerName="init" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.188344 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d63a9457-c57a-4979-bd28-ee982250b13c" containerName="keystone-bootstrap" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.188364 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="887446b0-f238-4ff4-82dd-a903299a0105" containerName="dnsmasq-dns" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.188374 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1a92386-f07a-4845-9a5d-231a4c498d3f" containerName="init" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.192077 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.196882 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.196983 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.196885 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.197377 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.197521 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qgvjt" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.218504 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xnj8t"] 
Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.301677 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-scripts\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.301781 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgbzs\" (UniqueName: \"kubernetes.io/projected/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-kube-api-access-vgbzs\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.301812 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-config-data\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.301868 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-fernet-keys\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.301885 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-credential-keys\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc 
kubenswrapper[5094]: I0220 07:07:40.301935 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-combined-ca-bundle\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.403773 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-combined-ca-bundle\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.403934 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-scripts\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.403988 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgbzs\" (UniqueName: \"kubernetes.io/projected/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-kube-api-access-vgbzs\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.404014 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-config-data\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.404082 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-fernet-keys\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.404105 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-credential-keys\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.415151 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-combined-ca-bundle\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.415146 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-scripts\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.415241 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-config-data\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.415429 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-fernet-keys\") pod 
\"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.416137 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-credential-keys\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.428122 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgbzs\" (UniqueName: \"kubernetes.io/projected/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-kube-api-access-vgbzs\") pod \"keystone-bootstrap-xnj8t\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:40 crc kubenswrapper[5094]: I0220 07:07:40.517444 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:07:41 crc kubenswrapper[5094]: I0220 07:07:41.856306 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d63a9457-c57a-4979-bd28-ee982250b13c" path="/var/lib/kubelet/pods/d63a9457-c57a-4979-bd28-ee982250b13c/volumes" Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.962608 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.982596 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.995378 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-config-data\") pod \"4cbff870-795c-4622-90e7-e06559b6d884\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.995529 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-combined-ca-bundle\") pod \"4cbff870-795c-4622-90e7-e06559b6d884\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.995642 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"4cbff870-795c-4622-90e7-e06559b6d884\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.995684 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/090c8378-96fa-4223-8b6d-b98fa179046a-httpd-run\") pod \"090c8378-96fa-4223-8b6d-b98fa179046a\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.995751 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-public-tls-certs\") pod \"090c8378-96fa-4223-8b6d-b98fa179046a\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.995870 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmlgf\" (UniqueName: 
\"kubernetes.io/projected/4cbff870-795c-4622-90e7-e06559b6d884-kube-api-access-zmlgf\") pod \"4cbff870-795c-4622-90e7-e06559b6d884\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.995907 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-internal-tls-certs\") pod \"4cbff870-795c-4622-90e7-e06559b6d884\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.995961 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-scripts\") pod \"090c8378-96fa-4223-8b6d-b98fa179046a\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.996081 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4cbff870-795c-4622-90e7-e06559b6d884-httpd-run\") pod \"4cbff870-795c-4622-90e7-e06559b6d884\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.996126 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-scripts\") pod \"4cbff870-795c-4622-90e7-e06559b6d884\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.996188 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"090c8378-96fa-4223-8b6d-b98fa179046a\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.996240 5094 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-combined-ca-bundle\") pod \"090c8378-96fa-4223-8b6d-b98fa179046a\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.996279 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x4w6\" (UniqueName: \"kubernetes.io/projected/090c8378-96fa-4223-8b6d-b98fa179046a-kube-api-access-8x4w6\") pod \"090c8378-96fa-4223-8b6d-b98fa179046a\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.996361 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/090c8378-96fa-4223-8b6d-b98fa179046a-logs\") pod \"090c8378-96fa-4223-8b6d-b98fa179046a\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.996406 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-config-data\") pod \"090c8378-96fa-4223-8b6d-b98fa179046a\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " Feb 20 07:07:42 crc kubenswrapper[5094]: I0220 07:07:42.996435 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cbff870-795c-4622-90e7-e06559b6d884-logs\") pod \"4cbff870-795c-4622-90e7-e06559b6d884\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.003586 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cbff870-795c-4622-90e7-e06559b6d884-logs" (OuterVolumeSpecName: "logs") pod "4cbff870-795c-4622-90e7-e06559b6d884" (UID: "4cbff870-795c-4622-90e7-e06559b6d884"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.007353 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cbff870-795c-4622-90e7-e06559b6d884-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4cbff870-795c-4622-90e7-e06559b6d884" (UID: "4cbff870-795c-4622-90e7-e06559b6d884"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.007378 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/090c8378-96fa-4223-8b6d-b98fa179046a-logs" (OuterVolumeSpecName: "logs") pod "090c8378-96fa-4223-8b6d-b98fa179046a" (UID: "090c8378-96fa-4223-8b6d-b98fa179046a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.009554 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "4cbff870-795c-4622-90e7-e06559b6d884" (UID: "4cbff870-795c-4622-90e7-e06559b6d884"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.010475 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "090c8378-96fa-4223-8b6d-b98fa179046a" (UID: "090c8378-96fa-4223-8b6d-b98fa179046a"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.028516 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-scripts" (OuterVolumeSpecName: "scripts") pod "090c8378-96fa-4223-8b6d-b98fa179046a" (UID: "090c8378-96fa-4223-8b6d-b98fa179046a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.031181 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-scripts" (OuterVolumeSpecName: "scripts") pod "4cbff870-795c-4622-90e7-e06559b6d884" (UID: "4cbff870-795c-4622-90e7-e06559b6d884"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.035515 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/090c8378-96fa-4223-8b6d-b98fa179046a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "090c8378-96fa-4223-8b6d-b98fa179046a" (UID: "090c8378-96fa-4223-8b6d-b98fa179046a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.040242 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/090c8378-96fa-4223-8b6d-b98fa179046a-kube-api-access-8x4w6" (OuterVolumeSpecName: "kube-api-access-8x4w6") pod "090c8378-96fa-4223-8b6d-b98fa179046a" (UID: "090c8378-96fa-4223-8b6d-b98fa179046a"). InnerVolumeSpecName "kube-api-access-8x4w6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.108647 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "090c8378-96fa-4223-8b6d-b98fa179046a" (UID: "090c8378-96fa-4223-8b6d-b98fa179046a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.122573 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cbff870-795c-4622-90e7-e06559b6d884-kube-api-access-zmlgf" (OuterVolumeSpecName: "kube-api-access-zmlgf") pod "4cbff870-795c-4622-90e7-e06559b6d884" (UID: "4cbff870-795c-4622-90e7-e06559b6d884"). InnerVolumeSpecName "kube-api-access-zmlgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.123089 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4cbff870-795c-4622-90e7-e06559b6d884" (UID: "4cbff870-795c-4622-90e7-e06559b6d884"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.123295 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmlgf\" (UniqueName: \"kubernetes.io/projected/4cbff870-795c-4622-90e7-e06559b6d884-kube-api-access-zmlgf\") pod \"4cbff870-795c-4622-90e7-e06559b6d884\" (UID: \"4cbff870-795c-4622-90e7-e06559b6d884\") " Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.123449 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-combined-ca-bundle\") pod \"090c8378-96fa-4223-8b6d-b98fa179046a\" (UID: \"090c8378-96fa-4223-8b6d-b98fa179046a\") " Feb 20 07:07:43 crc kubenswrapper[5094]: W0220 07:07:43.123505 5094 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/4cbff870-795c-4622-90e7-e06559b6d884/volumes/kubernetes.io~projected/kube-api-access-zmlgf Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.123608 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cbff870-795c-4622-90e7-e06559b6d884-kube-api-access-zmlgf" (OuterVolumeSpecName: "kube-api-access-zmlgf") pod "4cbff870-795c-4622-90e7-e06559b6d884" (UID: "4cbff870-795c-4622-90e7-e06559b6d884"). InnerVolumeSpecName "kube-api-access-zmlgf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: W0220 07:07:43.123737 5094 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/090c8378-96fa-4223-8b6d-b98fa179046a/volumes/kubernetes.io~secret/combined-ca-bundle Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.123756 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "090c8378-96fa-4223-8b6d-b98fa179046a" (UID: "090c8378-96fa-4223-8b6d-b98fa179046a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.124549 5094 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4cbff870-795c-4622-90e7-e06559b6d884-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.124577 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.124592 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.124606 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8x4w6\" (UniqueName: \"kubernetes.io/projected/090c8378-96fa-4223-8b6d-b98fa179046a-kube-api-access-8x4w6\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.124641 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.124656 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/090c8378-96fa-4223-8b6d-b98fa179046a-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.124795 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4cbff870-795c-4622-90e7-e06559b6d884-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.124813 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.124864 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.124925 5094 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/090c8378-96fa-4223-8b6d-b98fa179046a-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.124976 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmlgf\" (UniqueName: \"kubernetes.io/projected/4cbff870-795c-4622-90e7-e06559b6d884-kube-api-access-zmlgf\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.124992 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.132094 5094 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/glance-default-external-api-0" event={"ID":"090c8378-96fa-4223-8b6d-b98fa179046a","Type":"ContainerDied","Data":"f1fff81d621e7a98189b82c44089d8bbf94b1c86c3b0de4e3b7aae8e0f0b3931"} Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.132146 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.132283 5094 scope.go:117] "RemoveContainer" containerID="c31d8324706f8243f00fc950b22649416caea675f5ff2715ed5cde10bb50f051" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.137387 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4cbff870-795c-4622-90e7-e06559b6d884","Type":"ContainerDied","Data":"49eddfb76f5127d9790e86db4511eac3b1b4ddb83b95cfa7b7d41b4e2cbe2a5f"} Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.137530 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.143270 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-config-data" (OuterVolumeSpecName: "config-data") pod "4cbff870-795c-4622-90e7-e06559b6d884" (UID: "4cbff870-795c-4622-90e7-e06559b6d884"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.147984 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4cbff870-795c-4622-90e7-e06559b6d884" (UID: "4cbff870-795c-4622-90e7-e06559b6d884"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.148069 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-config-data" (OuterVolumeSpecName: "config-data") pod "090c8378-96fa-4223-8b6d-b98fa179046a" (UID: "090c8378-96fa-4223-8b6d-b98fa179046a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.155046 5094 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.163735 5094 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.178974 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "090c8378-96fa-4223-8b6d-b98fa179046a" (UID: "090c8378-96fa-4223-8b6d-b98fa179046a"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.227437 5094 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.227478 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.227496 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.227508 5094 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.227519 5094 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/090c8378-96fa-4223-8b6d-b98fa179046a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.227531 5094 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4cbff870-795c-4622-90e7-e06559b6d884-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.477908 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.487498 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.500861 5094 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.509955 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.528935 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 07:07:43 crc kubenswrapper[5094]: E0220 07:07:43.529429 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cbff870-795c-4622-90e7-e06559b6d884" containerName="glance-httpd" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.529448 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cbff870-795c-4622-90e7-e06559b6d884" containerName="glance-httpd" Feb 20 07:07:43 crc kubenswrapper[5094]: E0220 07:07:43.529466 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="090c8378-96fa-4223-8b6d-b98fa179046a" containerName="glance-httpd" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.529474 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="090c8378-96fa-4223-8b6d-b98fa179046a" containerName="glance-httpd" Feb 20 07:07:43 crc kubenswrapper[5094]: E0220 07:07:43.529487 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="090c8378-96fa-4223-8b6d-b98fa179046a" containerName="glance-log" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.529495 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="090c8378-96fa-4223-8b6d-b98fa179046a" containerName="glance-log" Feb 20 07:07:43 crc kubenswrapper[5094]: E0220 07:07:43.529514 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cbff870-795c-4622-90e7-e06559b6d884" containerName="glance-log" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.529520 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cbff870-795c-4622-90e7-e06559b6d884" containerName="glance-log" Feb 20 07:07:43 crc 
kubenswrapper[5094]: I0220 07:07:43.530177 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cbff870-795c-4622-90e7-e06559b6d884" containerName="glance-log" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.530196 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cbff870-795c-4622-90e7-e06559b6d884" containerName="glance-httpd" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.530214 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="090c8378-96fa-4223-8b6d-b98fa179046a" containerName="glance-log" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.530225 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="090c8378-96fa-4223-8b6d-b98fa179046a" containerName="glance-httpd" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.531265 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.534276 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9q5bq" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.535426 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.535537 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.535612 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.552933 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.555022 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.557281 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.557438 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.561835 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.570505 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.635328 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-scripts\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.635378 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.635411 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-config-data\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 
07:07:43.635502 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-logs\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.635531 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.635560 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.635616 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.635642 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzzp4\" (UniqueName: \"kubernetes.io/projected/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-kube-api-access-wzzp4\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:43 crc 
kubenswrapper[5094]: I0220 07:07:43.736928 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.736985 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.737024 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-scripts\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.737045 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.737064 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-config-data\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.737087 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.737105 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-logs\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.737162 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-logs\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.737223 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.737249 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.737596 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for 
volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.737682 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.737769 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-logs\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.739920 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.740016 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.740035 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.740079 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzzp4\" (UniqueName: \"kubernetes.io/projected/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-kube-api-access-wzzp4\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.740111 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkqb7\" (UniqueName: \"kubernetes.io/projected/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-kube-api-access-pkqb7\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.740180 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.741359 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.743286 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-scripts\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.750023 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-config-data\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.757118 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.758671 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzzp4\" (UniqueName: \"kubernetes.io/projected/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-kube-api-access-wzzp4\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.763672 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " pod="openstack/glance-default-external-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.799111 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-768666cd57-7ddwb" podUID="aa0f7da7-a9bd-4b03-b256-d05ba9323e70" containerName="dnsmasq-dns" probeResult="failure" output="dial 
tcp 10.217.0.123:5353: connect: connection refused" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.841364 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.841434 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkqb7\" (UniqueName: \"kubernetes.io/projected/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-kube-api-access-pkqb7\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.841980 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.842027 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.842051 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0" Feb 20 
07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.842089 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.842107 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-logs\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.842135 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.843853 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.843925 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-logs\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.844331 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for 
volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.848868 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.853384 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.855995 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.856143 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.867515 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkqb7\" (UniqueName: 
\"kubernetes.io/projected/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-kube-api-access-pkqb7\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.872559 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="090c8378-96fa-4223-8b6d-b98fa179046a" path="/var/lib/kubelet/pods/090c8378-96fa-4223-8b6d-b98fa179046a/volumes" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.875610 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cbff870-795c-4622-90e7-e06559b6d884" path="/var/lib/kubelet/pods/4cbff870-795c-4622-90e7-e06559b6d884/volumes" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.892210 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 07:07:43 crc kubenswrapper[5094]: I0220 07:07:43.898450 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:07:44 crc kubenswrapper[5094]: I0220 07:07:44.201850 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 07:07:48 crc kubenswrapper[5094]: I0220 07:07:48.799695 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-768666cd57-7ddwb" podUID="aa0f7da7-a9bd-4b03-b256-d05ba9323e70" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.123:5353: connect: connection refused" Feb 20 07:07:49 crc kubenswrapper[5094]: I0220 07:07:49.201568 5094 generic.go:334] "Generic (PLEG): container finished" podID="ffc4926a-ede6-4124-ac91-c9912ffa8a23" containerID="bec04cc90bea00c99a6cbeb35313f8c3b75e668698426b61e7cbceb44b686553" exitCode=0 Feb 20 07:07:49 crc kubenswrapper[5094]: I0220 07:07:49.201613 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dnm22" event={"ID":"ffc4926a-ede6-4124-ac91-c9912ffa8a23","Type":"ContainerDied","Data":"bec04cc90bea00c99a6cbeb35313f8c3b75e668698426b61e7cbceb44b686553"} Feb 20 07:07:52 crc kubenswrapper[5094]: I0220 07:07:52.517915 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-dnm22"
Feb 20 07:07:52 crc kubenswrapper[5094]: I0220 07:07:52.659056 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncgbz\" (UniqueName: \"kubernetes.io/projected/ffc4926a-ede6-4124-ac91-c9912ffa8a23-kube-api-access-ncgbz\") pod \"ffc4926a-ede6-4124-ac91-c9912ffa8a23\" (UID: \"ffc4926a-ede6-4124-ac91-c9912ffa8a23\") "
Feb 20 07:07:52 crc kubenswrapper[5094]: I0220 07:07:52.659452 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ffc4926a-ede6-4124-ac91-c9912ffa8a23-config\") pod \"ffc4926a-ede6-4124-ac91-c9912ffa8a23\" (UID: \"ffc4926a-ede6-4124-ac91-c9912ffa8a23\") "
Feb 20 07:07:52 crc kubenswrapper[5094]: I0220 07:07:52.659556 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffc4926a-ede6-4124-ac91-c9912ffa8a23-combined-ca-bundle\") pod \"ffc4926a-ede6-4124-ac91-c9912ffa8a23\" (UID: \"ffc4926a-ede6-4124-ac91-c9912ffa8a23\") "
Feb 20 07:07:52 crc kubenswrapper[5094]: I0220 07:07:52.674244 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffc4926a-ede6-4124-ac91-c9912ffa8a23-kube-api-access-ncgbz" (OuterVolumeSpecName: "kube-api-access-ncgbz") pod "ffc4926a-ede6-4124-ac91-c9912ffa8a23" (UID: "ffc4926a-ede6-4124-ac91-c9912ffa8a23"). InnerVolumeSpecName "kube-api-access-ncgbz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:07:52 crc kubenswrapper[5094]: I0220 07:07:52.697266 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffc4926a-ede6-4124-ac91-c9912ffa8a23-config" (OuterVolumeSpecName: "config") pod "ffc4926a-ede6-4124-ac91-c9912ffa8a23" (UID: "ffc4926a-ede6-4124-ac91-c9912ffa8a23"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:07:52 crc kubenswrapper[5094]: I0220 07:07:52.697829 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffc4926a-ede6-4124-ac91-c9912ffa8a23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ffc4926a-ede6-4124-ac91-c9912ffa8a23" (UID: "ffc4926a-ede6-4124-ac91-c9912ffa8a23"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:07:52 crc kubenswrapper[5094]: I0220 07:07:52.762983 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncgbz\" (UniqueName: \"kubernetes.io/projected/ffc4926a-ede6-4124-ac91-c9912ffa8a23-kube-api-access-ncgbz\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:52 crc kubenswrapper[5094]: I0220 07:07:52.763033 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ffc4926a-ede6-4124-ac91-c9912ffa8a23-config\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:52 crc kubenswrapper[5094]: I0220 07:07:52.763051 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffc4926a-ede6-4124-ac91-c9912ffa8a23-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.241786 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dnm22" event={"ID":"ffc4926a-ede6-4124-ac91-c9912ffa8a23","Type":"ContainerDied","Data":"003ab580a28b4b49925e83b0991967f99365515d76c15aa90bdb08c5b298df56"}
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.242250 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="003ab580a28b4b49925e83b0991967f99365515d76c15aa90bdb08c5b298df56"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.241884 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-dnm22"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.818136 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-db5c97f8f-4sj4d"]
Feb 20 07:07:53 crc kubenswrapper[5094]: E0220 07:07:53.818558 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffc4926a-ede6-4124-ac91-c9912ffa8a23" containerName="neutron-db-sync"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.818571 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffc4926a-ede6-4124-ac91-c9912ffa8a23" containerName="neutron-db-sync"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.818789 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffc4926a-ede6-4124-ac91-c9912ffa8a23" containerName="neutron-db-sync"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.819938 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.856901 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-db5c97f8f-4sj4d"]
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.875815 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5ddb8575b6-4wznv"]
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.877417 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5ddb8575b6-4wznv"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.884936 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.885232 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.885926 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-zk895"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.886069 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.895028 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5ddb8575b6-4wznv"]
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.990792 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-ovndb-tls-certs\") pod \"neutron-5ddb8575b6-4wznv\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " pod="openstack/neutron-5ddb8575b6-4wznv"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.990839 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-dns-swift-storage-0\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.990864 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-httpd-config\") pod \"neutron-5ddb8575b6-4wznv\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " pod="openstack/neutron-5ddb8575b6-4wznv"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.990888 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-config\") pod \"neutron-5ddb8575b6-4wznv\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " pod="openstack/neutron-5ddb8575b6-4wznv"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.990937 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6zcn\" (UniqueName: \"kubernetes.io/projected/2c274cd0-4938-48fa-8534-409a3070299f-kube-api-access-b6zcn\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.990959 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-ovsdbserver-nb\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.991217 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-ovsdbserver-sb\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.991401 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-combined-ca-bundle\") pod \"neutron-5ddb8575b6-4wznv\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " pod="openstack/neutron-5ddb8575b6-4wznv"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.991497 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-dns-svc\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.991533 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-config\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d"
Feb 20 07:07:53 crc kubenswrapper[5094]: I0220 07:07:53.991716 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5fx4\" (UniqueName: \"kubernetes.io/projected/c5f4dc72-bb77-44c4-8058-4939958d7a48-kube-api-access-t5fx4\") pod \"neutron-5ddb8575b6-4wznv\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " pod="openstack/neutron-5ddb8575b6-4wznv"
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.094096 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6zcn\" (UniqueName: \"kubernetes.io/projected/2c274cd0-4938-48fa-8534-409a3070299f-kube-api-access-b6zcn\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d"
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.094177 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-ovsdbserver-nb\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d"
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.094250 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-ovsdbserver-sb\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d"
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.094298 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-combined-ca-bundle\") pod \"neutron-5ddb8575b6-4wznv\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " pod="openstack/neutron-5ddb8575b6-4wznv"
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.094336 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-dns-svc\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d"
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.094360 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-config\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d"
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.094422 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5fx4\" (UniqueName: \"kubernetes.io/projected/c5f4dc72-bb77-44c4-8058-4939958d7a48-kube-api-access-t5fx4\") pod \"neutron-5ddb8575b6-4wznv\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " pod="openstack/neutron-5ddb8575b6-4wznv"
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.094474 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-ovndb-tls-certs\") pod \"neutron-5ddb8575b6-4wznv\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " pod="openstack/neutron-5ddb8575b6-4wznv"
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.094500 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-dns-swift-storage-0\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d"
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.094530 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-httpd-config\") pod \"neutron-5ddb8575b6-4wznv\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " pod="openstack/neutron-5ddb8575b6-4wznv"
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.094558 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-config\") pod \"neutron-5ddb8575b6-4wznv\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " pod="openstack/neutron-5ddb8575b6-4wznv"
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.095576 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-ovsdbserver-nb\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d"
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.096161 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-dns-svc\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d"
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.097746 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-dns-swift-storage-0\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d"
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.098010 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-ovsdbserver-sb\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d"
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.098177 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-config\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d"
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.105633 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-config\") pod \"neutron-5ddb8575b6-4wznv\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " pod="openstack/neutron-5ddb8575b6-4wznv"
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.105626 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-ovndb-tls-certs\") pod \"neutron-5ddb8575b6-4wznv\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " pod="openstack/neutron-5ddb8575b6-4wznv"
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.108685 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-combined-ca-bundle\") pod \"neutron-5ddb8575b6-4wznv\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " pod="openstack/neutron-5ddb8575b6-4wznv"
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.115491 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-httpd-config\") pod \"neutron-5ddb8575b6-4wznv\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " pod="openstack/neutron-5ddb8575b6-4wznv"
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.116622 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6zcn\" (UniqueName: \"kubernetes.io/projected/2c274cd0-4938-48fa-8534-409a3070299f-kube-api-access-b6zcn\") pod \"dnsmasq-dns-db5c97f8f-4sj4d\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d"
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.118515 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5fx4\" (UniqueName: \"kubernetes.io/projected/c5f4dc72-bb77-44c4-8058-4939958d7a48-kube-api-access-t5fx4\") pod \"neutron-5ddb8575b6-4wznv\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " pod="openstack/neutron-5ddb8575b6-4wznv"
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.158413 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d"
Feb 20 07:07:54 crc kubenswrapper[5094]: E0220 07:07:54.195162 5094 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b"
Feb 20 07:07:54 crc kubenswrapper[5094]: E0220 07:07:54.195464 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qf24d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-t7hr7_openstack(15583b83-ce22-4b0b-9566-0e056b07c0d7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 20 07:07:54 crc kubenswrapper[5094]: E0220 07:07:54.197363 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-t7hr7" podUID="15583b83-ce22-4b0b-9566-0e056b07c0d7"
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.211529 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5ddb8575b6-4wznv"
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.236960 5094 scope.go:117] "RemoveContainer" containerID="8bc694cf60dc2751dc7ddd6a90ad6a89e8b2d9feaeaec62df46315e77617467e"
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.262101 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-768666cd57-7ddwb" event={"ID":"aa0f7da7-a9bd-4b03-b256-d05ba9323e70","Type":"ContainerDied","Data":"a20fdfeb6eba82be36ac56d6e09083509aa1e18f2c3332e9e2fa4dc3127f38aa"}
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.262158 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a20fdfeb6eba82be36ac56d6e09083509aa1e18f2c3332e9e2fa4dc3127f38aa"
Feb 20 07:07:54 crc kubenswrapper[5094]: E0220 07:07:54.266775 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b\\\"\"" pod="openstack/cinder-db-sync-t7hr7" podUID="15583b83-ce22-4b0b-9566-0e056b07c0d7"
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.360964 5094 scope.go:117] "RemoveContainer" containerID="70fcfaee70dbdb9d905be4dee063c5e65bbf885dee16b38909df107b4aa714d8"
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.376931 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-768666cd57-7ddwb"
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.503426 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-dns-swift-storage-0\") pod \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") "
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.503540 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzck5\" (UniqueName: \"kubernetes.io/projected/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-kube-api-access-jzck5\") pod \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") "
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.503689 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-config\") pod \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") "
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.503746 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-ovsdbserver-nb\") pod \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") "
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.503782 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-ovsdbserver-sb\") pod \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") "
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.503795 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-dns-svc\") pod \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\" (UID: \"aa0f7da7-a9bd-4b03-b256-d05ba9323e70\") "
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.533439 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-kube-api-access-jzck5" (OuterVolumeSpecName: "kube-api-access-jzck5") pod "aa0f7da7-a9bd-4b03-b256-d05ba9323e70" (UID: "aa0f7da7-a9bd-4b03-b256-d05ba9323e70"). InnerVolumeSpecName "kube-api-access-jzck5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.574893 5094 scope.go:117] "RemoveContainer" containerID="2d6d1c9bef646cdf5d3cdcc1b6a5be793e730ee75c92f1c11a51822f455ce37d"
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.575008 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aa0f7da7-a9bd-4b03-b256-d05ba9323e70" (UID: "aa0f7da7-a9bd-4b03-b256-d05ba9323e70"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.606307 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.606344 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzck5\" (UniqueName: \"kubernetes.io/projected/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-kube-api-access-jzck5\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.677440 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aa0f7da7-a9bd-4b03-b256-d05ba9323e70" (UID: "aa0f7da7-a9bd-4b03-b256-d05ba9323e70"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.708906 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.759030 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aa0f7da7-a9bd-4b03-b256-d05ba9323e70" (UID: "aa0f7da7-a9bd-4b03-b256-d05ba9323e70"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.762916 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xnj8t"]
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.791666 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aa0f7da7-a9bd-4b03-b256-d05ba9323e70" (UID: "aa0f7da7-a9bd-4b03-b256-d05ba9323e70"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.810828 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.810882 5094 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.816313 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-config" (OuterVolumeSpecName: "config") pod "aa0f7da7-a9bd-4b03-b256-d05ba9323e70" (UID: "aa0f7da7-a9bd-4b03-b256-d05ba9323e70"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.914963 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa0f7da7-a9bd-4b03-b256-d05ba9323e70-config\") on node \"crc\" DevicePath \"\""
Feb 20 07:07:54 crc kubenswrapper[5094]: I0220 07:07:54.916937 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.042277 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-db5c97f8f-4sj4d"]
Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.175386 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.332033 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" event={"ID":"2c274cd0-4938-48fa-8534-409a3070299f","Type":"ContainerStarted","Data":"f4c855da6dfadfbfaa6485b438038ef55afdb9b012e53b5119257649b545af32"}
Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.365667 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1b5ca0fd-39ce-46da-8a15-cf0d7265e060","Type":"ContainerStarted","Data":"d752ec9568b97c7a9a1e0ea7c10ce0973a08508f8beb84b53fb4fc5635c2706f"}
Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.385549 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b","Type":"ContainerStarted","Data":"c37272ac6f9e924740c6d7aa103c2e64f6efab2b196866681821b669409f2ee4"}
Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.395134 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jsvf2" 
event={"ID":"fd4e0644-4339-45bf-a919-0de0551c5baa","Type":"ContainerStarted","Data":"8c0a9ed7e7a708519d24ca670cb89abe814680e00b43aecd0756535bb04fcacc"}
Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.401940 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19d2d34d-f935-40e4-a27a-a382c7634da2","Type":"ContainerStarted","Data":"154c237a8cda216b92583d85eef501f14b59d72a35964b0b4be6a9a7f8415f61"}
Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.418154 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xnj8t" event={"ID":"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b","Type":"ContainerStarted","Data":"d8a84b3a4264179ed1969ed49bb2b66b1b9094412f8430af00f5618a35872170"}
Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.418218 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xnj8t" event={"ID":"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b","Type":"ContainerStarted","Data":"3471a54fd91f31962c620ac3ac75855c348f402dfa6489d764abc8274de2fc01"}
Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.428163 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fvmwf" event={"ID":"d6e6aec3-87a9-4f8a-b640-313ab241ec6f","Type":"ContainerStarted","Data":"8e52dfc26f0d8eee3b5547850bbe5713b909b7c78a2c9a2bf1d9cded5250f6d8"}
Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.432370 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-jsvf2" podStartSLOduration=3.179852414 podStartE2EDuration="27.432347303s" podCreationTimestamp="2026-02-20 07:07:28 +0000 UTC" firstStartedPulling="2026-02-20 07:07:29.949929809 +0000 UTC m=+1264.822556520" lastFinishedPulling="2026-02-20 07:07:54.202424698 +0000 UTC m=+1289.075051409" observedRunningTime="2026-02-20 07:07:55.419552867 +0000 UTC m=+1290.292179578" watchObservedRunningTime="2026-02-20 07:07:55.432347303 +0000 UTC m=+1290.304974014"
Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.434825 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-768666cd57-7ddwb"
Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.452534 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xnj8t" podStartSLOduration=15.452513347 podStartE2EDuration="15.452513347s" podCreationTimestamp="2026-02-20 07:07:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:07:55.447934177 +0000 UTC m=+1290.320560888" watchObservedRunningTime="2026-02-20 07:07:55.452513347 +0000 UTC m=+1290.325140058"
Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.488608 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-fvmwf" podStartSLOduration=3.217929856 podStartE2EDuration="27.48858445s" podCreationTimestamp="2026-02-20 07:07:28 +0000 UTC" firstStartedPulling="2026-02-20 07:07:29.943872004 +0000 UTC m=+1264.816498715" lastFinishedPulling="2026-02-20 07:07:54.214526598 +0000 UTC m=+1289.087153309" observedRunningTime="2026-02-20 07:07:55.478052118 +0000 UTC m=+1290.350678829" watchObservedRunningTime="2026-02-20 07:07:55.48858445 +0000 UTC m=+1290.361211161"
Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.514314 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-768666cd57-7ddwb"]
Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.521584 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-768666cd57-7ddwb"]
Feb 20 07:07:55 crc kubenswrapper[5094]: I0220 07:07:55.862265 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa0f7da7-a9bd-4b03-b256-d05ba9323e70" path="/var/lib/kubelet/pods/aa0f7da7-a9bd-4b03-b256-d05ba9323e70/volumes"
Feb 20 07:07:56 crc kubenswrapper[5094]: I0220 07:07:56.159981 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5ddb8575b6-4wznv"]
Feb 20 07:07:56 crc kubenswrapper[5094]: I0220 07:07:56.478094 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5ddb8575b6-4wznv" event={"ID":"c5f4dc72-bb77-44c4-8058-4939958d7a48","Type":"ContainerStarted","Data":"7216118bc6764d988378bcc95f70afbc24e44597d2724806d33cbd64bb7f1c0b"}
Feb 20 07:07:56 crc kubenswrapper[5094]: I0220 07:07:56.495511 5094 generic.go:334] "Generic (PLEG): container finished" podID="2c274cd0-4938-48fa-8534-409a3070299f" containerID="ce1978c29ea807736776e1aab75d72153c9ee3dd68aa8d95e4d850a505683bff" exitCode=0
Feb 20 07:07:56 crc kubenswrapper[5094]: I0220 07:07:56.495596 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" event={"ID":"2c274cd0-4938-48fa-8534-409a3070299f","Type":"ContainerDied","Data":"ce1978c29ea807736776e1aab75d72153c9ee3dd68aa8d95e4d850a505683bff"}
Feb 20 07:07:56 crc kubenswrapper[5094]: I0220 07:07:56.503476 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1b5ca0fd-39ce-46da-8a15-cf0d7265e060","Type":"ContainerStarted","Data":"3903f2d2312370e437aca364ecb299fb413db5a66b7bc53bb3fb92764eec4ca7"}
Feb 20 07:07:56 crc kubenswrapper[5094]: I0220 07:07:56.512768 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b","Type":"ContainerStarted","Data":"d030747f97a82fa85fa58357fe602c22cf85f61f284bbee33cd22316a4d560ea"}
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.349825 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7d8645fb77-xprwl"]
Feb 20 07:07:57 crc kubenswrapper[5094]: E0220 07:07:57.351104 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa0f7da7-a9bd-4b03-b256-d05ba9323e70" containerName="init"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.351120 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa0f7da7-a9bd-4b03-b256-d05ba9323e70" containerName="init"
Feb 20 07:07:57 crc kubenswrapper[5094]: E0220 07:07:57.351132 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa0f7da7-a9bd-4b03-b256-d05ba9323e70" containerName="dnsmasq-dns"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.351138 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa0f7da7-a9bd-4b03-b256-d05ba9323e70" containerName="dnsmasq-dns"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.351330 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa0f7da7-a9bd-4b03-b256-d05ba9323e70" containerName="dnsmasq-dns"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.360593 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7d8645fb77-xprwl"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.363475 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7d8645fb77-xprwl"]
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.367195 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.367560 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.484277 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-config\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl"
Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.484339 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-public-tls-certs\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl" Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.484374 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-httpd-config\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl" Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.484400 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-internal-tls-certs\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl" Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.484489 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-ovndb-tls-certs\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl" Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.484529 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-combined-ca-bundle\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl" Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.484580 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7krch\" (UniqueName: \"kubernetes.io/projected/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-kube-api-access-7krch\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl" Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.522986 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" event={"ID":"2c274cd0-4938-48fa-8534-409a3070299f","Type":"ContainerStarted","Data":"00d31f7caea8f28d87993a4c9ffad14c255acc53e525b3a8aedf619b6fe7c988"} Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.525076 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1b5ca0fd-39ce-46da-8a15-cf0d7265e060","Type":"ContainerStarted","Data":"763800845ad8e2fe479385b37abae471ee069ec13ca28f866a8628fcda5f7859"} Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.534065 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b","Type":"ContainerStarted","Data":"f8a1428170b952f6757cb59aec0fc05c990728916f1e69bbff78756ee58aca18"} Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.545003 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5ddb8575b6-4wznv" event={"ID":"c5f4dc72-bb77-44c4-8058-4939958d7a48","Type":"ContainerStarted","Data":"419f643d1ccf43ee276ce35ac60b7269f52b21d76747dcf57ea44f551a26690a"} Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.545115 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5ddb8575b6-4wznv" event={"ID":"c5f4dc72-bb77-44c4-8058-4939958d7a48","Type":"ContainerStarted","Data":"eb7da5e5d31261eb9e520b471e20dcda54e4d83a9a60405577b7def9033da931"} Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.547047 5094 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/neutron-5ddb8575b6-4wznv" Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.578525 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=14.578496202 podStartE2EDuration="14.578496202s" podCreationTimestamp="2026-02-20 07:07:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:07:57.561238709 +0000 UTC m=+1292.433865440" watchObservedRunningTime="2026-02-20 07:07:57.578496202 +0000 UTC m=+1292.451122913" Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.587465 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-combined-ca-bundle\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl" Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.587533 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7krch\" (UniqueName: \"kubernetes.io/projected/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-kube-api-access-7krch\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl" Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.590067 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-config\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl" Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.593376 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-public-tls-certs\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl" Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.595585 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-internal-tls-certs\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl" Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.595625 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-httpd-config\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl" Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.595811 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-ovndb-tls-certs\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl" Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.598084 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-config\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl" Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.603108 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-internal-tls-certs\") pod \"neutron-7d8645fb77-xprwl\" (UID: 
\"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl" Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.605291 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-httpd-config\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl" Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.610411 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-ovndb-tls-certs\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl" Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.612114 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=14.612087887 podStartE2EDuration="14.612087887s" podCreationTimestamp="2026-02-20 07:07:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:07:57.586383362 +0000 UTC m=+1292.459010073" watchObservedRunningTime="2026-02-20 07:07:57.612087887 +0000 UTC m=+1292.484714598" Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.613203 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-combined-ca-bundle\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl" Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.613474 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-public-tls-certs\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl" Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.623558 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7krch\" (UniqueName: \"kubernetes.io/projected/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-kube-api-access-7krch\") pod \"neutron-7d8645fb77-xprwl\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " pod="openstack/neutron-7d8645fb77-xprwl" Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.631454 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5ddb8575b6-4wznv" podStartSLOduration=4.63142755 podStartE2EDuration="4.63142755s" podCreationTimestamp="2026-02-20 07:07:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:07:57.627528727 +0000 UTC m=+1292.500155448" watchObservedRunningTime="2026-02-20 07:07:57.63142755 +0000 UTC m=+1292.504054261" Feb 20 07:07:57 crc kubenswrapper[5094]: I0220 07:07:57.736071 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7d8645fb77-xprwl" Feb 20 07:07:58 crc kubenswrapper[5094]: I0220 07:07:58.357166 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7d8645fb77-xprwl"] Feb 20 07:07:58 crc kubenswrapper[5094]: I0220 07:07:58.562361 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19d2d34d-f935-40e4-a27a-a382c7634da2","Type":"ContainerStarted","Data":"8fbd0e1a5aa799bf4144c98e297964162b94c7e1f11e3c3574fd101cc616268d"} Feb 20 07:07:58 crc kubenswrapper[5094]: I0220 07:07:58.570547 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d8645fb77-xprwl" event={"ID":"6240f946-dbc4-4fdb-b831-23e76bfe2ebc","Type":"ContainerStarted","Data":"b9098b5dbcb409320185fc3b697229991f7ab044a2834551fda65cf47f38a5d4"} Feb 20 07:07:58 crc kubenswrapper[5094]: I0220 07:07:58.570887 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" Feb 20 07:07:58 crc kubenswrapper[5094]: I0220 07:07:58.605401 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" podStartSLOduration=5.605373466 podStartE2EDuration="5.605373466s" podCreationTimestamp="2026-02-20 07:07:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:07:58.588799439 +0000 UTC m=+1293.461426150" watchObservedRunningTime="2026-02-20 07:07:58.605373466 +0000 UTC m=+1293.478000187" Feb 20 07:07:58 crc kubenswrapper[5094]: I0220 07:07:58.803651 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-768666cd57-7ddwb" podUID="aa0f7da7-a9bd-4b03-b256-d05ba9323e70" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.123:5353: i/o timeout" Feb 20 07:07:59 crc kubenswrapper[5094]: I0220 07:07:59.584215 5094 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/neutron-7d8645fb77-xprwl" event={"ID":"6240f946-dbc4-4fdb-b831-23e76bfe2ebc","Type":"ContainerStarted","Data":"be6f73969b02eaf2a0704385ffcaff6d176cc3adc9262bf72bf4d87acc597ac5"} Feb 20 07:07:59 crc kubenswrapper[5094]: I0220 07:07:59.584669 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7d8645fb77-xprwl" Feb 20 07:07:59 crc kubenswrapper[5094]: I0220 07:07:59.584683 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d8645fb77-xprwl" event={"ID":"6240f946-dbc4-4fdb-b831-23e76bfe2ebc","Type":"ContainerStarted","Data":"a55515cfe2e873dc49fe2e0829229193d635d2147a43cdfaca8db2114d963017"} Feb 20 07:07:59 crc kubenswrapper[5094]: I0220 07:07:59.586170 5094 generic.go:334] "Generic (PLEG): container finished" podID="fd4e0644-4339-45bf-a919-0de0551c5baa" containerID="8c0a9ed7e7a708519d24ca670cb89abe814680e00b43aecd0756535bb04fcacc" exitCode=0 Feb 20 07:07:59 crc kubenswrapper[5094]: I0220 07:07:59.586209 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jsvf2" event={"ID":"fd4e0644-4339-45bf-a919-0de0551c5baa","Type":"ContainerDied","Data":"8c0a9ed7e7a708519d24ca670cb89abe814680e00b43aecd0756535bb04fcacc"} Feb 20 07:07:59 crc kubenswrapper[5094]: I0220 07:07:59.604293 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7d8645fb77-xprwl" podStartSLOduration=2.6042665879999998 podStartE2EDuration="2.604266588s" podCreationTimestamp="2026-02-20 07:07:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:07:59.601905401 +0000 UTC m=+1294.474532112" watchObservedRunningTime="2026-02-20 07:07:59.604266588 +0000 UTC m=+1294.476893299" Feb 20 07:08:00 crc kubenswrapper[5094]: I0220 07:08:00.601141 5094 generic.go:334] "Generic (PLEG): container finished" podID="0dd5a2da-23e3-4bbd-a211-b3f527e4e28b" 
containerID="d8a84b3a4264179ed1969ed49bb2b66b1b9094412f8430af00f5618a35872170" exitCode=0 Feb 20 07:08:00 crc kubenswrapper[5094]: I0220 07:08:00.601234 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xnj8t" event={"ID":"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b","Type":"ContainerDied","Data":"d8a84b3a4264179ed1969ed49bb2b66b1b9094412f8430af00f5618a35872170"} Feb 20 07:08:00 crc kubenswrapper[5094]: I0220 07:08:00.604801 5094 generic.go:334] "Generic (PLEG): container finished" podID="d6e6aec3-87a9-4f8a-b640-313ab241ec6f" containerID="8e52dfc26f0d8eee3b5547850bbe5713b909b7c78a2c9a2bf1d9cded5250f6d8" exitCode=0 Feb 20 07:08:00 crc kubenswrapper[5094]: I0220 07:08:00.604984 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fvmwf" event={"ID":"d6e6aec3-87a9-4f8a-b640-313ab241ec6f","Type":"ContainerDied","Data":"8e52dfc26f0d8eee3b5547850bbe5713b909b7c78a2c9a2bf1d9cded5250f6d8"} Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.357541 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-fvmwf" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.366288 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jsvf2" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.371592 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.531328 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-combined-ca-bundle\") pod \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.531400 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqrpz\" (UniqueName: \"kubernetes.io/projected/fd4e0644-4339-45bf-a919-0de0551c5baa-kube-api-access-fqrpz\") pod \"fd4e0644-4339-45bf-a919-0de0551c5baa\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.531467 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-config-data\") pod \"fd4e0644-4339-45bf-a919-0de0551c5baa\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.531485 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-scripts\") pod \"fd4e0644-4339-45bf-a919-0de0551c5baa\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.531508 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-fernet-keys\") pod \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.531537 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-db-sync-config-data\") pod \"d6e6aec3-87a9-4f8a-b640-313ab241ec6f\" (UID: \"d6e6aec3-87a9-4f8a-b640-313ab241ec6f\") " Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.531624 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgbzs\" (UniqueName: \"kubernetes.io/projected/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-kube-api-access-vgbzs\") pod \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.531735 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd4e0644-4339-45bf-a919-0de0551c5baa-logs\") pod \"fd4e0644-4339-45bf-a919-0de0551c5baa\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.531811 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-combined-ca-bundle\") pod \"d6e6aec3-87a9-4f8a-b640-313ab241ec6f\" (UID: \"d6e6aec3-87a9-4f8a-b640-313ab241ec6f\") " Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.531885 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-credential-keys\") pod \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.531993 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-combined-ca-bundle\") pod \"fd4e0644-4339-45bf-a919-0de0551c5baa\" (UID: \"fd4e0644-4339-45bf-a919-0de0551c5baa\") " Feb 20 07:08:02 crc 
kubenswrapper[5094]: I0220 07:08:02.532042 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bft6b\" (UniqueName: \"kubernetes.io/projected/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-kube-api-access-bft6b\") pod \"d6e6aec3-87a9-4f8a-b640-313ab241ec6f\" (UID: \"d6e6aec3-87a9-4f8a-b640-313ab241ec6f\") " Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.532063 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-config-data\") pod \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.532111 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-scripts\") pod \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\" (UID: \"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b\") " Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.533228 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd4e0644-4339-45bf-a919-0de0551c5baa-logs" (OuterVolumeSpecName: "logs") pod "fd4e0644-4339-45bf-a919-0de0551c5baa" (UID: "fd4e0644-4339-45bf-a919-0de0551c5baa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.538162 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd4e0644-4339-45bf-a919-0de0551c5baa-kube-api-access-fqrpz" (OuterVolumeSpecName: "kube-api-access-fqrpz") pod "fd4e0644-4339-45bf-a919-0de0551c5baa" (UID: "fd4e0644-4339-45bf-a919-0de0551c5baa"). InnerVolumeSpecName "kube-api-access-fqrpz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.540157 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-kube-api-access-bft6b" (OuterVolumeSpecName: "kube-api-access-bft6b") pod "d6e6aec3-87a9-4f8a-b640-313ab241ec6f" (UID: "d6e6aec3-87a9-4f8a-b640-313ab241ec6f"). InnerVolumeSpecName "kube-api-access-bft6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.540560 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-scripts" (OuterVolumeSpecName: "scripts") pod "fd4e0644-4339-45bf-a919-0de0551c5baa" (UID: "fd4e0644-4339-45bf-a919-0de0551c5baa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.540592 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0dd5a2da-23e3-4bbd-a211-b3f527e4e28b" (UID: "0dd5a2da-23e3-4bbd-a211-b3f527e4e28b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.540664 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d6e6aec3-87a9-4f8a-b640-313ab241ec6f" (UID: "d6e6aec3-87a9-4f8a-b640-313ab241ec6f"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.541021 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-scripts" (OuterVolumeSpecName: "scripts") pod "0dd5a2da-23e3-4bbd-a211-b3f527e4e28b" (UID: "0dd5a2da-23e3-4bbd-a211-b3f527e4e28b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.541031 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-kube-api-access-vgbzs" (OuterVolumeSpecName: "kube-api-access-vgbzs") pod "0dd5a2da-23e3-4bbd-a211-b3f527e4e28b" (UID: "0dd5a2da-23e3-4bbd-a211-b3f527e4e28b"). InnerVolumeSpecName "kube-api-access-vgbzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.555036 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0dd5a2da-23e3-4bbd-a211-b3f527e4e28b" (UID: "0dd5a2da-23e3-4bbd-a211-b3f527e4e28b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.571114 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-config-data" (OuterVolumeSpecName: "config-data") pod "fd4e0644-4339-45bf-a919-0de0551c5baa" (UID: "fd4e0644-4339-45bf-a919-0de0551c5baa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.575981 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0dd5a2da-23e3-4bbd-a211-b3f527e4e28b" (UID: "0dd5a2da-23e3-4bbd-a211-b3f527e4e28b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.579117 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-config-data" (OuterVolumeSpecName: "config-data") pod "0dd5a2da-23e3-4bbd-a211-b3f527e4e28b" (UID: "0dd5a2da-23e3-4bbd-a211-b3f527e4e28b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.584657 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd4e0644-4339-45bf-a919-0de0551c5baa" (UID: "fd4e0644-4339-45bf-a919-0de0551c5baa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.606845 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6e6aec3-87a9-4f8a-b640-313ab241ec6f" (UID: "d6e6aec3-87a9-4f8a-b640-313ab241ec6f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.633541 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xnj8t" event={"ID":"0dd5a2da-23e3-4bbd-a211-b3f527e4e28b","Type":"ContainerDied","Data":"3471a54fd91f31962c620ac3ac75855c348f402dfa6489d764abc8274de2fc01"} Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.633595 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3471a54fd91f31962c620ac3ac75855c348f402dfa6489d764abc8274de2fc01" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.634757 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xnj8t" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.637928 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.637962 5094 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.637975 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.637988 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bft6b\" (UniqueName: \"kubernetes.io/projected/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-kube-api-access-bft6b\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.638003 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.638015 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.638027 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.638038 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqrpz\" (UniqueName: \"kubernetes.io/projected/fd4e0644-4339-45bf-a919-0de0551c5baa-kube-api-access-fqrpz\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.638051 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.638062 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd4e0644-4339-45bf-a919-0de0551c5baa-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.638076 5094 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.638091 5094 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d6e6aec3-87a9-4f8a-b640-313ab241ec6f-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:02 crc 
kubenswrapper[5094]: I0220 07:08:02.638103 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgbzs\" (UniqueName: \"kubernetes.io/projected/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b-kube-api-access-vgbzs\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.638114 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd4e0644-4339-45bf-a919-0de0551c5baa-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.638779 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fvmwf" event={"ID":"d6e6aec3-87a9-4f8a-b640-313ab241ec6f","Type":"ContainerDied","Data":"6e88acd54dbbf562acdafbd54ac1a987deab1124d294a1cf214bd932d6b05497"} Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.638829 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e88acd54dbbf562acdafbd54ac1a987deab1124d294a1cf214bd932d6b05497" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.638903 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-fvmwf" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.648932 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jsvf2" event={"ID":"fd4e0644-4339-45bf-a919-0de0551c5baa","Type":"ContainerDied","Data":"21f4e9dd4335713cc70e8a920496faa98eb7767c86615d81c2e49ffa01bf7858"} Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.648970 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21f4e9dd4335713cc70e8a920496faa98eb7767c86615d81c2e49ffa01bf7858" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.649043 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-jsvf2" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.728219 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-55468cd684-wv6dn"] Feb 20 07:08:02 crc kubenswrapper[5094]: E0220 07:08:02.728772 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dd5a2da-23e3-4bbd-a211-b3f527e4e28b" containerName="keystone-bootstrap" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.728790 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dd5a2da-23e3-4bbd-a211-b3f527e4e28b" containerName="keystone-bootstrap" Feb 20 07:08:02 crc kubenswrapper[5094]: E0220 07:08:02.728832 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e6aec3-87a9-4f8a-b640-313ab241ec6f" containerName="barbican-db-sync" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.728844 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e6aec3-87a9-4f8a-b640-313ab241ec6f" containerName="barbican-db-sync" Feb 20 07:08:02 crc kubenswrapper[5094]: E0220 07:08:02.728854 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd4e0644-4339-45bf-a919-0de0551c5baa" containerName="placement-db-sync" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.728860 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd4e0644-4339-45bf-a919-0de0551c5baa" containerName="placement-db-sync" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.729058 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e6aec3-87a9-4f8a-b640-313ab241ec6f" containerName="barbican-db-sync" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.729105 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd4e0644-4339-45bf-a919-0de0551c5baa" containerName="placement-db-sync" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.729130 5094 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0dd5a2da-23e3-4bbd-a211-b3f527e4e28b" containerName="keystone-bootstrap" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.729905 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.732506 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.734689 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.735146 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-qgvjt" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.735330 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.735478 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.735722 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.738318 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-55468cd684-wv6dn"] Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.846962 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-config-data\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.847030 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-combined-ca-bundle\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.847200 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-internal-tls-certs\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.847244 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-fernet-keys\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.847285 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dw5l\" (UniqueName: \"kubernetes.io/projected/732b4015-53b2-4422-b7d1-12b65f6e0c92-kube-api-access-7dw5l\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.847350 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-scripts\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.847397 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-public-tls-certs\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.847425 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-credential-keys\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.949420 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-internal-tls-certs\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.949498 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-fernet-keys\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.949529 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dw5l\" (UniqueName: \"kubernetes.io/projected/732b4015-53b2-4422-b7d1-12b65f6e0c92-kube-api-access-7dw5l\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.949580 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-scripts\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.949613 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-public-tls-certs\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.949636 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-credential-keys\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.949684 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-config-data\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.949728 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-combined-ca-bundle\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.971490 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-credential-keys\") pod 
\"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.971762 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-public-tls-certs\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.976552 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-combined-ca-bundle\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.976933 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-internal-tls-certs\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.977348 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-fernet-keys\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.979653 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-scripts\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 
crc kubenswrapper[5094]: I0220 07:08:02.982483 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-config-data\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:02 crc kubenswrapper[5094]: I0220 07:08:02.995385 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dw5l\" (UniqueName: \"kubernetes.io/projected/732b4015-53b2-4422-b7d1-12b65f6e0c92-kube-api-access-7dw5l\") pod \"keystone-55468cd684-wv6dn\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.025725 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7bf7749bf9-ggfvf"] Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.029546 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.063089 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.072859 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7bf7749bf9-ggfvf"] Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.073069 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-z4bpk" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.073152 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.140320 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.156499 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-config-data-custom\") pod \"barbican-worker-7bf7749bf9-ggfvf\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.156572 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-config-data\") pod \"barbican-worker-7bf7749bf9-ggfvf\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.156650 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-combined-ca-bundle\") pod \"barbican-worker-7bf7749bf9-ggfvf\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.156688 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mknrc\" (UniqueName: \"kubernetes.io/projected/038d4354-a929-4f2d-9633-4cd7dadd4523-kube-api-access-mknrc\") pod \"barbican-worker-7bf7749bf9-ggfvf\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.156778 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/038d4354-a929-4f2d-9633-4cd7dadd4523-logs\") pod \"barbican-worker-7bf7749bf9-ggfvf\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.201799 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-548b4f9548-p65sn"] Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.220927 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-548b4f9548-p65sn"] Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.221075 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.234632 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.258055 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mknrc\" (UniqueName: \"kubernetes.io/projected/038d4354-a929-4f2d-9633-4cd7dadd4523-kube-api-access-mknrc\") pod \"barbican-worker-7bf7749bf9-ggfvf\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.258147 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/038d4354-a929-4f2d-9633-4cd7dadd4523-logs\") pod \"barbican-worker-7bf7749bf9-ggfvf\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.258204 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-config-data-custom\") 
pod \"barbican-worker-7bf7749bf9-ggfvf\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.258225 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-config-data\") pod \"barbican-worker-7bf7749bf9-ggfvf\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.258282 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-combined-ca-bundle\") pod \"barbican-worker-7bf7749bf9-ggfvf\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.260534 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/038d4354-a929-4f2d-9633-4cd7dadd4523-logs\") pod \"barbican-worker-7bf7749bf9-ggfvf\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.272685 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-config-data-custom\") pod \"barbican-worker-7bf7749bf9-ggfvf\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.296838 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7df9984bd9-6txsf"] Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.298394 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7df9984bd9-6txsf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.298568 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-combined-ca-bundle\") pod \"barbican-worker-7bf7749bf9-ggfvf\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.306583 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-config-data\") pod \"barbican-worker-7bf7749bf9-ggfvf\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.311173 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-db5c97f8f-4sj4d"] Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.315103 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" podUID="2c274cd0-4938-48fa-8534-409a3070299f" containerName="dnsmasq-dns" containerID="cri-o://00d31f7caea8f28d87993a4c9ffad14c255acc53e525b3a8aedf619b6fe7c988" gracePeriod=10 Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.322676 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.330286 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mknrc\" (UniqueName: \"kubernetes.io/projected/038d4354-a929-4f2d-9633-4cd7dadd4523-kube-api-access-mknrc\") pod \"barbican-worker-7bf7749bf9-ggfvf\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 
07:08:03.345151 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7df9984bd9-6txsf"] Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.363180 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-combined-ca-bundle\") pod \"barbican-keystone-listener-548b4f9548-p65sn\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.363270 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-config-data-custom\") pod \"barbican-keystone-listener-548b4f9548-p65sn\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.363313 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/795db1e7-56e9-4ff2-91f1-b3589603d82c-logs\") pod \"barbican-keystone-listener-548b4f9548-p65sn\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.363422 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65k5m\" (UniqueName: \"kubernetes.io/projected/795db1e7-56e9-4ff2-91f1-b3589603d82c-kube-api-access-65k5m\") pod \"barbican-keystone-listener-548b4f9548-p65sn\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.363458 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-config-data\") pod \"barbican-keystone-listener-548b4f9548-p65sn\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.363671 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.472298 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmx2z\" (UniqueName: \"kubernetes.io/projected/92877559-6960-4dbf-890a-fb563f4b0bf8-kube-api-access-fmx2z\") pod \"barbican-worker-7df9984bd9-6txsf\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " pod="openstack/barbican-worker-7df9984bd9-6txsf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.472356 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-config-data\") pod \"barbican-worker-7df9984bd9-6txsf\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " pod="openstack/barbican-worker-7df9984bd9-6txsf" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.472388 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-config-data-custom\") pod \"barbican-keystone-listener-548b4f9548-p65sn\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.472427 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/795db1e7-56e9-4ff2-91f1-b3589603d82c-logs\") pod \"barbican-keystone-listener-548b4f9548-p65sn\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " pod="openstack/barbican-keystone-listener-548b4f9548-p65sn"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.472464 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-config-data-custom\") pod \"barbican-worker-7df9984bd9-6txsf\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " pod="openstack/barbican-worker-7df9984bd9-6txsf"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.472489 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-combined-ca-bundle\") pod \"barbican-worker-7df9984bd9-6txsf\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " pod="openstack/barbican-worker-7df9984bd9-6txsf"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.472547 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65k5m\" (UniqueName: \"kubernetes.io/projected/795db1e7-56e9-4ff2-91f1-b3589603d82c-kube-api-access-65k5m\") pod \"barbican-keystone-listener-548b4f9548-p65sn\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " pod="openstack/barbican-keystone-listener-548b4f9548-p65sn"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.472577 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-config-data\") pod \"barbican-keystone-listener-548b4f9548-p65sn\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " pod="openstack/barbican-keystone-listener-548b4f9548-p65sn"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.472594 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92877559-6960-4dbf-890a-fb563f4b0bf8-logs\") pod \"barbican-worker-7df9984bd9-6txsf\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " pod="openstack/barbican-worker-7df9984bd9-6txsf"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.472631 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-combined-ca-bundle\") pod \"barbican-keystone-listener-548b4f9548-p65sn\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " pod="openstack/barbican-keystone-listener-548b4f9548-p65sn"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.476817 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/795db1e7-56e9-4ff2-91f1-b3589603d82c-logs\") pod \"barbican-keystone-listener-548b4f9548-p65sn\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " pod="openstack/barbican-keystone-listener-548b4f9548-p65sn"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.500542 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-combined-ca-bundle\") pod \"barbican-keystone-listener-548b4f9548-p65sn\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " pod="openstack/barbican-keystone-listener-548b4f9548-p65sn"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.507503 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-config-data-custom\") pod \"barbican-keystone-listener-548b4f9548-p65sn\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " pod="openstack/barbican-keystone-listener-548b4f9548-p65sn"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.525813 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-config-data\") pod \"barbican-keystone-listener-548b4f9548-p65sn\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " pod="openstack/barbican-keystone-listener-548b4f9548-p65sn"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.561917 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65k5m\" (UniqueName: \"kubernetes.io/projected/795db1e7-56e9-4ff2-91f1-b3589603d82c-kube-api-access-65k5m\") pod \"barbican-keystone-listener-548b4f9548-p65sn\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " pod="openstack/barbican-keystone-listener-548b4f9548-p65sn"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.562473 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-548b4f9548-p65sn"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.575101 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92877559-6960-4dbf-890a-fb563f4b0bf8-logs\") pod \"barbican-worker-7df9984bd9-6txsf\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " pod="openstack/barbican-worker-7df9984bd9-6txsf"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.575206 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmx2z\" (UniqueName: \"kubernetes.io/projected/92877559-6960-4dbf-890a-fb563f4b0bf8-kube-api-access-fmx2z\") pod \"barbican-worker-7df9984bd9-6txsf\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " pod="openstack/barbican-worker-7df9984bd9-6txsf"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.575232 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-config-data\") pod \"barbican-worker-7df9984bd9-6txsf\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " pod="openstack/barbican-worker-7df9984bd9-6txsf"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.575280 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-config-data-custom\") pod \"barbican-worker-7df9984bd9-6txsf\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " pod="openstack/barbican-worker-7df9984bd9-6txsf"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.575304 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-combined-ca-bundle\") pod \"barbican-worker-7df9984bd9-6txsf\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " pod="openstack/barbican-worker-7df9984bd9-6txsf"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.576896 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92877559-6960-4dbf-890a-fb563f4b0bf8-logs\") pod \"barbican-worker-7df9984bd9-6txsf\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " pod="openstack/barbican-worker-7df9984bd9-6txsf"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.600865 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-config-data-custom\") pod \"barbican-worker-7df9984bd9-6txsf\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " pod="openstack/barbican-worker-7df9984bd9-6txsf"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.602190 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-config-data\") pod \"barbican-worker-7df9984bd9-6txsf\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " pod="openstack/barbican-worker-7df9984bd9-6txsf"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.602539 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-combined-ca-bundle\") pod \"barbican-worker-7df9984bd9-6txsf\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " pod="openstack/barbican-worker-7df9984bd9-6txsf"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.621079 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7c9ffdc684-v56nc"]
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.623238 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.670754 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9d49dd75f-pdb4g"]
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.672372 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.694139 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7c9ffdc684-v56nc"]
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.713071 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmx2z\" (UniqueName: \"kubernetes.io/projected/92877559-6960-4dbf-890a-fb563f4b0bf8-kube-api-access-fmx2z\") pod \"barbican-worker-7df9984bd9-6txsf\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " pod="openstack/barbican-worker-7df9984bd9-6txsf"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.737292 5094 generic.go:334] "Generic (PLEG): container finished" podID="2c274cd0-4938-48fa-8534-409a3070299f" containerID="00d31f7caea8f28d87993a4c9ffad14c255acc53e525b3a8aedf619b6fe7c988" exitCode=0
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.737384 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" event={"ID":"2c274cd0-4938-48fa-8534-409a3070299f","Type":"ContainerDied","Data":"00d31f7caea8f28d87993a4c9ffad14c255acc53e525b3a8aedf619b6fe7c988"}
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.739818 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7df9984bd9-6txsf"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.752670 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9d49dd75f-pdb4g"]
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.786091 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-9d777b794-txl9q"]
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.786614 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-dns-swift-storage-0\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.786682 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-ovsdbserver-nb\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.786718 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-ovsdbserver-sb\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.786739 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-config-data-custom\") pod \"barbican-keystone-listener-7c9ffdc684-v56nc\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.786758 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-logs\") pod \"barbican-keystone-listener-7c9ffdc684-v56nc\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.786784 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-config-data\") pod \"barbican-keystone-listener-7c9ffdc684-v56nc\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.786806 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x72mx\" (UniqueName: \"kubernetes.io/projected/b2a1712c-5268-4203-bab0-c427e96b217b-kube-api-access-x72mx\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.786848 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-config\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.786893 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-dns-svc\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.786920 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bf7m\" (UniqueName: \"kubernetes.io/projected/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-kube-api-access-9bf7m\") pod \"barbican-keystone-listener-7c9ffdc684-v56nc\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.786941 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-combined-ca-bundle\") pod \"barbican-keystone-listener-7c9ffdc684-v56nc\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.787991 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.798104 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.873801 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19d2d34d-f935-40e4-a27a-a382c7634da2","Type":"ContainerStarted","Data":"735b44943bdf895ffeb305a8a9b07202be3d3f21805babf681c05f7b0d65ad7e"}
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.873913 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-9d777b794-txl9q"]
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.893341 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.893386 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.895722 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-dns-svc\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.906349 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-dns-svc\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.906476 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-config-data\") pod \"barbican-api-9d777b794-txl9q\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.906569 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bf7m\" (UniqueName: \"kubernetes.io/projected/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-kube-api-access-9bf7m\") pod \"barbican-keystone-listener-7c9ffdc684-v56nc\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.906629 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-combined-ca-bundle\") pod \"barbican-keystone-listener-7c9ffdc684-v56nc\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.906825 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-dns-swift-storage-0\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.906858 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-combined-ca-bundle\") pod \"barbican-api-9d777b794-txl9q\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.906924 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9b56f38-2467-4518-bdc3-d6ee665987da-logs\") pod \"barbican-api-9d777b794-txl9q\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.906991 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-ovsdbserver-nb\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.907009 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-ovsdbserver-sb\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.907029 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-config-data-custom\") pod \"barbican-keystone-listener-7c9ffdc684-v56nc\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.907046 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-logs\") pod \"barbican-keystone-listener-7c9ffdc684-v56nc\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.907096 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-config-data\") pod \"barbican-keystone-listener-7c9ffdc684-v56nc\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.907137 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twrlp\" (UniqueName: \"kubernetes.io/projected/e9b56f38-2467-4518-bdc3-d6ee665987da-kube-api-access-twrlp\") pod \"barbican-api-9d777b794-txl9q\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.907166 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x72mx\" (UniqueName: \"kubernetes.io/projected/b2a1712c-5268-4203-bab0-c427e96b217b-kube-api-access-x72mx\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.907233 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-config-data-custom\") pod \"barbican-api-9d777b794-txl9q\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.907310 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-config\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.908292 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-config\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.917899 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-logs\") pod \"barbican-keystone-listener-7c9ffdc684-v56nc\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.919219 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-dns-swift-storage-0\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.920353 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-ovsdbserver-nb\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.921406 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-combined-ca-bundle\") pod \"barbican-keystone-listener-7c9ffdc684-v56nc\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.921864 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-ovsdbserver-sb\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.924855 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-config-data-custom\") pod \"barbican-keystone-listener-7c9ffdc684-v56nc\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.957727 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-config-data\") pod \"barbican-keystone-listener-7c9ffdc684-v56nc\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.974819 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.975367 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x72mx\" (UniqueName: \"kubernetes.io/projected/b2a1712c-5268-4203-bab0-c427e96b217b-kube-api-access-x72mx\") pod \"dnsmasq-dns-9d49dd75f-pdb4g\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.976063 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bf7m\" (UniqueName: \"kubernetes.io/projected/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-kube-api-access-9bf7m\") pod \"barbican-keystone-listener-7c9ffdc684-v56nc\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.992467 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5b8f9d577d-pgn2k"]
Feb 20 07:08:03 crc kubenswrapper[5094]: I0220 07:08:03.996780 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5b8f9d577d-pgn2k"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.006902 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.007351 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.008205 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.009184 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.012538 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-fq797"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.016628 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-combined-ca-bundle\") pod \"barbican-api-9d777b794-txl9q\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.016743 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9b56f38-2467-4518-bdc3-d6ee665987da-logs\") pod \"barbican-api-9d777b794-txl9q\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.016829 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twrlp\" (UniqueName: \"kubernetes.io/projected/e9b56f38-2467-4518-bdc3-d6ee665987da-kube-api-access-twrlp\") pod \"barbican-api-9d777b794-txl9q\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.016917 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-config-data-custom\") pod \"barbican-api-9d777b794-txl9q\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.017059 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-config-data\") pod \"barbican-api-9d777b794-txl9q\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.019226 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9b56f38-2467-4518-bdc3-d6ee665987da-logs\") pod \"barbican-api-9d777b794-txl9q\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.021960 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-combined-ca-bundle\") pod \"barbican-api-9d777b794-txl9q\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.026872 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.029691 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-config-data-custom\") pod \"barbican-api-9d777b794-txl9q\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.032656 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.040725 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-config-data\") pod \"barbican-api-9d777b794-txl9q\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.056476 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.057081 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5b8f9d577d-pgn2k"]
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.079857 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twrlp\" (UniqueName: \"kubernetes.io/projected/e9b56f38-2467-4518-bdc3-d6ee665987da-kube-api-access-twrlp\") pod \"barbican-api-9d777b794-txl9q\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.106998 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.107087 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.107152 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.108327 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b476ab5b54b82460e3be1d827bfff187825879d93c9cc19cc5170f59943c3ef7"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.108387 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://b476ab5b54b82460e3be1d827bfff187825879d93c9cc19cc5170f59943c3ef7" gracePeriod=600
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.124669 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b90c9110-e4fd-461b-ad2c-a58ff01921d8-logs\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.124791 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-internal-tls-certs\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.124822 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-public-tls-certs\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.124868 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-combined-ca-bundle\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.124903 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxpk6\" (UniqueName: \"kubernetes.io/projected/b90c9110-e4fd-461b-ad2c-a58ff01921d8-kube-api-access-qxpk6\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.124961 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-config-data\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.124980 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-scripts\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.149601 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-9d777b794-txl9q"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.170163 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" podUID="2c274cd0-4938-48fa-8534-409a3070299f" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.148:5353: connect: connection refused"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.202600 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.203011 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.233525 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxpk6\" (UniqueName: \"kubernetes.io/projected/b90c9110-e4fd-461b-ad2c-a58ff01921d8-kube-api-access-qxpk6\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.233601 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-config-data\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.233624 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-scripts\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k"
Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.246342 5094
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-scripts\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.251270 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b90c9110-e4fd-461b-ad2c-a58ff01921d8-logs\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.251423 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-internal-tls-certs\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.251466 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-public-tls-certs\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.251555 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-combined-ca-bundle\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.258610 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b90c9110-e4fd-461b-ad2c-a58ff01921d8-logs\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.261059 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-config-data\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.274555 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-internal-tls-certs\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.274680 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-public-tls-certs\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.287999 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-combined-ca-bundle\") pod \"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.301920 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxpk6\" (UniqueName: \"kubernetes.io/projected/b90c9110-e4fd-461b-ad2c-a58ff01921d8-kube-api-access-qxpk6\") pod 
\"placement-5b8f9d577d-pgn2k\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.313867 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.347168 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.404632 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.497979 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-55468cd684-wv6dn"] Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.550368 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7bf7749bf9-ggfvf"] Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.744714 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.840939 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7df9984bd9-6txsf"] Feb 20 07:08:04 crc kubenswrapper[5094]: W0220 07:08:04.860566 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod795db1e7_56e9_4ff2_91f1_b3589603d82c.slice/crio-fb04a41995c63d48690668a46b722c642ec1c8a54ddc3f182df807e2e50bd027 WatchSource:0}: Error finding container fb04a41995c63d48690668a46b722c642ec1c8a54ddc3f182df807e2e50bd027: Status 404 returned error can't find the container with id fb04a41995c63d48690668a46b722c642ec1c8a54ddc3f182df807e2e50bd027 Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.870218 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-ovsdbserver-sb\") pod \"2c274cd0-4938-48fa-8534-409a3070299f\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.870288 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-dns-svc\") pod \"2c274cd0-4938-48fa-8534-409a3070299f\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.870333 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-config\") pod \"2c274cd0-4938-48fa-8534-409a3070299f\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.870515 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-dns-swift-storage-0\") pod \"2c274cd0-4938-48fa-8534-409a3070299f\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.871106 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-ovsdbserver-nb\") pod \"2c274cd0-4938-48fa-8534-409a3070299f\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.871246 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6zcn\" (UniqueName: \"kubernetes.io/projected/2c274cd0-4938-48fa-8534-409a3070299f-kube-api-access-b6zcn\") pod \"2c274cd0-4938-48fa-8534-409a3070299f\" (UID: \"2c274cd0-4938-48fa-8534-409a3070299f\") " Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.876759 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-548b4f9548-p65sn"] Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.887953 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c274cd0-4938-48fa-8534-409a3070299f-kube-api-access-b6zcn" (OuterVolumeSpecName: "kube-api-access-b6zcn") pod "2c274cd0-4938-48fa-8534-409a3070299f" (UID: "2c274cd0-4938-48fa-8534-409a3070299f"). InnerVolumeSpecName "kube-api-access-b6zcn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.896094 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" event={"ID":"2c274cd0-4938-48fa-8534-409a3070299f","Type":"ContainerDied","Data":"f4c855da6dfadfbfaa6485b438038ef55afdb9b012e53b5119257649b545af32"} Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.896155 5094 scope.go:117] "RemoveContainer" containerID="00d31f7caea8f28d87993a4c9ffad14c255acc53e525b3a8aedf619b6fe7c988" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.896361 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-db5c97f8f-4sj4d" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.915364 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-55468cd684-wv6dn" event={"ID":"732b4015-53b2-4422-b7d1-12b65f6e0c92","Type":"ContainerStarted","Data":"0e6de3c16ee5f3004f5d74169204f09eb1abb0191c70b76d1a09ff44b2f07e6d"} Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.981492 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="b476ab5b54b82460e3be1d827bfff187825879d93c9cc19cc5170f59943c3ef7" exitCode=0 Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.982383 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2c274cd0-4938-48fa-8534-409a3070299f" (UID: "2c274cd0-4938-48fa-8534-409a3070299f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.984827 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2c274cd0-4938-48fa-8534-409a3070299f" (UID: "2c274cd0-4938-48fa-8534-409a3070299f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.985795 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"b476ab5b54b82460e3be1d827bfff187825879d93c9cc19cc5170f59943c3ef7"} Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.985882 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310"} Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.985908 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7bf7749bf9-ggfvf" event={"ID":"038d4354-a929-4f2d-9633-4cd7dadd4523","Type":"ContainerStarted","Data":"775f84fe6aa5df6496627179d444588e580571d5d2f0a7733afea33c0965498e"} Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.986232 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.988049 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.988085 5094 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.988268 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6zcn\" (UniqueName: \"kubernetes.io/projected/2c274cd0-4938-48fa-8534-409a3070299f-kube-api-access-b6zcn\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.988634 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:04 crc kubenswrapper[5094]: I0220 07:08:04.989104 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 20 07:08:05 crc kubenswrapper[5094]: I0220 07:08:05.003269 5094 scope.go:117] "RemoveContainer" containerID="ce1978c29ea807736776e1aab75d72153c9ee3dd68aa8d95e4d850a505683bff" Feb 20 07:08:05 crc kubenswrapper[5094]: I0220 07:08:05.026697 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 20 07:08:05 crc kubenswrapper[5094]: I0220 07:08:05.026822 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9d49dd75f-pdb4g"] Feb 20 07:08:05 crc kubenswrapper[5094]: I0220 07:08:05.031594 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-9d777b794-txl9q"] Feb 20 07:08:05 crc kubenswrapper[5094]: I0220 07:08:05.091613 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7c9ffdc684-v56nc"] Feb 20 07:08:05 crc kubenswrapper[5094]: I0220 07:08:05.163449 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5b8f9d577d-pgn2k"] Feb 20 07:08:05 crc kubenswrapper[5094]: I0220 07:08:05.353825 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-config" (OuterVolumeSpecName: "config") pod "2c274cd0-4938-48fa-8534-409a3070299f" (UID: "2c274cd0-4938-48fa-8534-409a3070299f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:05 crc kubenswrapper[5094]: I0220 07:08:05.396846 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2c274cd0-4938-48fa-8534-409a3070299f" (UID: "2c274cd0-4938-48fa-8534-409a3070299f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:05 crc kubenswrapper[5094]: I0220 07:08:05.401723 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:05 crc kubenswrapper[5094]: I0220 07:08:05.401749 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:05 crc kubenswrapper[5094]: I0220 07:08:05.424235 5094 scope.go:117] "RemoveContainer" containerID="2fef18a95cc33ac92b0e7f9a9dd8176f054fb1a2db35c994afe1e4273592eb9d" Feb 20 07:08:05 crc kubenswrapper[5094]: I0220 07:08:05.564546 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2c274cd0-4938-48fa-8534-409a3070299f" (UID: "2c274cd0-4938-48fa-8534-409a3070299f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:05 crc kubenswrapper[5094]: I0220 07:08:05.632781 5094 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2c274cd0-4938-48fa-8534-409a3070299f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:05 crc kubenswrapper[5094]: I0220 07:08:05.891806 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-db5c97f8f-4sj4d"] Feb 20 07:08:05 crc kubenswrapper[5094]: I0220 07:08:05.891848 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-db5c97f8f-4sj4d"] Feb 20 07:08:06 crc kubenswrapper[5094]: I0220 07:08:06.106597 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g" event={"ID":"b2a1712c-5268-4203-bab0-c427e96b217b","Type":"ContainerStarted","Data":"0dddba45fa8488a7532914b19dd0a9c232300eec4679520cb1b28149a6920d2e"} Feb 20 07:08:06 crc kubenswrapper[5094]: I0220 07:08:06.141872 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7df9984bd9-6txsf" event={"ID":"92877559-6960-4dbf-890a-fb563f4b0bf8","Type":"ContainerStarted","Data":"5c342dad28df34f1d8d92f5a04877af4cb07a57675de3adb965ed98cfe8eaa77"} Feb 20 07:08:06 crc kubenswrapper[5094]: I0220 07:08:06.169484 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc" event={"ID":"5d9f1f40-92cc-4f19-9f3b-49651f56bffb","Type":"ContainerStarted","Data":"9f163abc1efd183ebd2809a660db1c44ccc0e92e53e74d9ce4dfa48299f86759"} Feb 20 07:08:06 crc kubenswrapper[5094]: I0220 07:08:06.182666 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5b8f9d577d-pgn2k" event={"ID":"b90c9110-e4fd-461b-ad2c-a58ff01921d8","Type":"ContainerStarted","Data":"b019c482612055cc0918048f8a12a69d96710169b8244b6ca81050099107cecc"} Feb 20 07:08:06 crc kubenswrapper[5094]: I0220 07:08:06.189913 
5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-55468cd684-wv6dn" event={"ID":"732b4015-53b2-4422-b7d1-12b65f6e0c92","Type":"ContainerStarted","Data":"d6784972d1434d152307a7464021eb59a6a508fa634f71c81ae53ac06db7b051"} Feb 20 07:08:06 crc kubenswrapper[5094]: I0220 07:08:06.190365 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:06 crc kubenswrapper[5094]: I0220 07:08:06.191566 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9d777b794-txl9q" event={"ID":"e9b56f38-2467-4518-bdc3-d6ee665987da","Type":"ContainerStarted","Data":"62bcfccc8f6311f78f2ee50f1468178552942ffa51a8e7c08d763e6569fd3de7"} Feb 20 07:08:06 crc kubenswrapper[5094]: I0220 07:08:06.214828 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" event={"ID":"795db1e7-56e9-4ff2-91f1-b3589603d82c","Type":"ContainerStarted","Data":"fb04a41995c63d48690668a46b722c642ec1c8a54ddc3f182df807e2e50bd027"} Feb 20 07:08:06 crc kubenswrapper[5094]: I0220 07:08:06.262644 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-55468cd684-wv6dn" podStartSLOduration=4.26262226 podStartE2EDuration="4.26262226s" podCreationTimestamp="2026-02-20 07:08:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:08:06.241637317 +0000 UTC m=+1301.114264028" watchObservedRunningTime="2026-02-20 07:08:06.26262226 +0000 UTC m=+1301.135248961" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.251845 5094 generic.go:334] "Generic (PLEG): container finished" podID="b2a1712c-5268-4203-bab0-c427e96b217b" containerID="548f803cb834d46aa79471e7e46c2cf5bba78c70a36499567ada0307a507be4e" exitCode=0 Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.252582 5094 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g" event={"ID":"b2a1712c-5268-4203-bab0-c427e96b217b","Type":"ContainerDied","Data":"548f803cb834d46aa79471e7e46c2cf5bba78c70a36499567ada0307a507be4e"} Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.252629 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g" event={"ID":"b2a1712c-5268-4203-bab0-c427e96b217b","Type":"ContainerStarted","Data":"a7f1752e88d647cbe7b50383d081ac516e62abd971b333187ad49b37e7117299"} Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.254240 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.257896 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5b8f9d577d-pgn2k" event={"ID":"b90c9110-e4fd-461b-ad2c-a58ff01921d8","Type":"ContainerStarted","Data":"d1984bf7ea49205773f404c2e26dd668c4e64dc0f98adc5a3b111f4efeded4ec"} Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.257955 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5b8f9d577d-pgn2k" event={"ID":"b90c9110-e4fd-461b-ad2c-a58ff01921d8","Type":"ContainerStarted","Data":"91947cfb40227973fecb8ecdcb4c5ccb4aacad2780a5f8683eda8de6a99c2b2e"} Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.259044 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.259073 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.263112 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9d777b794-txl9q" event={"ID":"e9b56f38-2467-4518-bdc3-d6ee665987da","Type":"ContainerStarted","Data":"da39043890404f802fa5323fb43efc928dbb515543671fe7e4daa4998c4f6b42"} Feb 20 07:08:07 
crc kubenswrapper[5094]: I0220 07:08:07.263184 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9d777b794-txl9q" event={"ID":"e9b56f38-2467-4518-bdc3-d6ee665987da","Type":"ContainerStarted","Data":"3416645d606b6c11c1a4b3c83d2ebc2d75fd169ec7df3d3e76572e9a7ca20cfb"} Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.263430 5094 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.263450 5094 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.263428 5094 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.263609 5094 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.263973 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-9d777b794-txl9q" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.263990 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-9d777b794-txl9q" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.276270 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g" podStartSLOduration=4.276250356 podStartE2EDuration="4.276250356s" podCreationTimestamp="2026-02-20 07:08:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:08:07.273254104 +0000 UTC m=+1302.145880815" watchObservedRunningTime="2026-02-20 07:08:07.276250356 +0000 UTC m=+1302.148877067" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.348542 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5b8f9d577d-pgn2k" 
podStartSLOduration=4.3485177870000005 podStartE2EDuration="4.348517787s" podCreationTimestamp="2026-02-20 07:08:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:08:07.316182212 +0000 UTC m=+1302.188808923" watchObservedRunningTime="2026-02-20 07:08:07.348517787 +0000 UTC m=+1302.221144498" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.353742 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-9d777b794-txl9q" podStartSLOduration=4.351694483 podStartE2EDuration="4.351694483s" podCreationTimestamp="2026-02-20 07:08:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:08:07.35158937 +0000 UTC m=+1302.224216071" watchObservedRunningTime="2026-02-20 07:08:07.351694483 +0000 UTC m=+1302.224321194" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.571659 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-79d9bcd9d4-9jrtt"] Feb 20 07:08:07 crc kubenswrapper[5094]: E0220 07:08:07.572057 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c274cd0-4938-48fa-8534-409a3070299f" containerName="init" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.572075 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c274cd0-4938-48fa-8534-409a3070299f" containerName="init" Feb 20 07:08:07 crc kubenswrapper[5094]: E0220 07:08:07.572102 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c274cd0-4938-48fa-8534-409a3070299f" containerName="dnsmasq-dns" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.572109 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c274cd0-4938-48fa-8534-409a3070299f" containerName="dnsmasq-dns" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.572314 5094 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="2c274cd0-4938-48fa-8534-409a3070299f" containerName="dnsmasq-dns" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.573256 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.575727 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.575729 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.598759 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-79d9bcd9d4-9jrtt"] Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.666083 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-combined-ca-bundle\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.666179 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-config-data-custom\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.666204 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/908e2706-d24f-41c9-b481-4c0d5415c5ca-logs\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt" 
Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.666247 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ns5d\" (UniqueName: \"kubernetes.io/projected/908e2706-d24f-41c9-b481-4c0d5415c5ca-kube-api-access-4ns5d\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.666316 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-public-tls-certs\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.666342 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-internal-tls-certs\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.666362 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-config-data\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.768518 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-combined-ca-bundle\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") 
" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.768607 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-config-data-custom\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.768638 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/908e2706-d24f-41c9-b481-4c0d5415c5ca-logs\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.768687 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ns5d\" (UniqueName: \"kubernetes.io/projected/908e2706-d24f-41c9-b481-4c0d5415c5ca-kube-api-access-4ns5d\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.768769 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-public-tls-certs\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.768797 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-internal-tls-certs\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " 
pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.768825 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-config-data\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.769295 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/908e2706-d24f-41c9-b481-4c0d5415c5ca-logs\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.781744 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-config-data-custom\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.786363 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-public-tls-certs\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.800439 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-combined-ca-bundle\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 
07:08:07.801436 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ns5d\" (UniqueName: \"kubernetes.io/projected/908e2706-d24f-41c9-b481-4c0d5415c5ca-kube-api-access-4ns5d\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.807482 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-internal-tls-certs\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.808178 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-config-data\") pod \"barbican-api-79d9bcd9d4-9jrtt\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.880179 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c274cd0-4938-48fa-8534-409a3070299f" path="/var/lib/kubelet/pods/2c274cd0-4938-48fa-8534-409a3070299f/volumes" Feb 20 07:08:07 crc kubenswrapper[5094]: I0220 07:08:07.896003 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:08:08 crc kubenswrapper[5094]: I0220 07:08:08.304286 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 20 07:08:08 crc kubenswrapper[5094]: I0220 07:08:08.306838 5094 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 07:08:08 crc kubenswrapper[5094]: I0220 07:08:08.349831 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 20 07:08:08 crc kubenswrapper[5094]: I0220 07:08:08.350961 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 20 07:08:08 crc kubenswrapper[5094]: I0220 07:08:08.351117 5094 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 07:08:08 crc kubenswrapper[5094]: I0220 07:08:08.357406 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 20 07:08:09 crc kubenswrapper[5094]: I0220 07:08:09.333657 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-79d9bcd9d4-9jrtt"] Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.330825 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc" event={"ID":"5d9f1f40-92cc-4f19-9f3b-49651f56bffb","Type":"ContainerStarted","Data":"b35c284d78ecbe2f2a62876408599b6c0ef8f8ad565bfbbe236f98afa60d9b08"} Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.331656 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc" event={"ID":"5d9f1f40-92cc-4f19-9f3b-49651f56bffb","Type":"ContainerStarted","Data":"c53dec831e54b024a954907e297fb4a37cd3b545f865b7a580f6bd56abb0a90d"} Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.335748 5094 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7df9984bd9-6txsf" event={"ID":"92877559-6960-4dbf-890a-fb563f4b0bf8","Type":"ContainerStarted","Data":"d397c0695b1183c0c759491ae377b37ffc66032936b01b69dbc1bd7fedbcab31"} Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.335792 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7df9984bd9-6txsf" event={"ID":"92877559-6960-4dbf-890a-fb563f4b0bf8","Type":"ContainerStarted","Data":"7fc72402d88effbe5b7f66ca244892cffc5d212981c680da190e5ee72f72b92a"} Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.349177 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-t7hr7" event={"ID":"15583b83-ce22-4b0b-9566-0e056b07c0d7","Type":"ContainerStarted","Data":"2e821859400e46f67b2a4c22980c8831875df8a723826e86fdac889d47a73a82"} Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.371678 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc" podStartSLOduration=3.705876657 podStartE2EDuration="7.371651189s" podCreationTimestamp="2026-02-20 07:08:03 +0000 UTC" firstStartedPulling="2026-02-20 07:08:05.146845379 +0000 UTC m=+1300.019472090" lastFinishedPulling="2026-02-20 07:08:08.812619911 +0000 UTC m=+1303.685246622" observedRunningTime="2026-02-20 07:08:10.355188114 +0000 UTC m=+1305.227814825" watchObservedRunningTime="2026-02-20 07:08:10.371651189 +0000 UTC m=+1305.244277890" Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.373142 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" event={"ID":"795db1e7-56e9-4ff2-91f1-b3589603d82c","Type":"ContainerStarted","Data":"31f1d9012990fa973db844a448a0432ed8205633a414c007cfd82b7a9d651223"} Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.373235 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" event={"ID":"795db1e7-56e9-4ff2-91f1-b3589603d82c","Type":"ContainerStarted","Data":"23d5d2a1c804a074977e389a7dd43487a8931e8cf7c459c46b6f0f8448361c8d"} Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.385584 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7bf7749bf9-ggfvf" event={"ID":"038d4354-a929-4f2d-9633-4cd7dadd4523","Type":"ContainerStarted","Data":"2667db03c56b602dde1dd590504590facc01acd7cc000fb0e9dbfb81bb5c4b28"} Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.385612 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7bf7749bf9-ggfvf" event={"ID":"038d4354-a929-4f2d-9633-4cd7dadd4523","Type":"ContainerStarted","Data":"7333f8096289116ee06bfc72c67bb4306feb2e24c57c1db54024c78d8103de1a"} Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.390330 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-t7hr7" podStartSLOduration=3.50325182 podStartE2EDuration="42.390305136s" podCreationTimestamp="2026-02-20 07:07:28 +0000 UTC" firstStartedPulling="2026-02-20 07:07:29.943488145 +0000 UTC m=+1264.816114856" lastFinishedPulling="2026-02-20 07:08:08.830541461 +0000 UTC m=+1303.703168172" observedRunningTime="2026-02-20 07:08:10.378076383 +0000 UTC m=+1305.250703094" watchObservedRunningTime="2026-02-20 07:08:10.390305136 +0000 UTC m=+1305.262931847" Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.413654 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-548b4f9548-p65sn"] Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.418430 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7df9984bd9-6txsf" podStartSLOduration=3.520126607 podStartE2EDuration="7.418405809s" podCreationTimestamp="2026-02-20 07:08:03 +0000 UTC" firstStartedPulling="2026-02-20 
07:08:04.90432345 +0000 UTC m=+1299.776950171" lastFinishedPulling="2026-02-20 07:08:08.802602672 +0000 UTC m=+1303.675229373" observedRunningTime="2026-02-20 07:08:10.416904493 +0000 UTC m=+1305.289531204" watchObservedRunningTime="2026-02-20 07:08:10.418405809 +0000 UTC m=+1305.291032520" Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.419190 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" event={"ID":"908e2706-d24f-41c9-b481-4c0d5415c5ca","Type":"ContainerStarted","Data":"459bc2f2d89a8c0984a77dac5551d18970f5268790591ad15bb4ecd73c5d3e57"} Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.419256 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" event={"ID":"908e2706-d24f-41c9-b481-4c0d5415c5ca","Type":"ContainerStarted","Data":"106993ad972a41a70b0e11997dac58bd4e6ab90384569399045d1bedeaba95e2"} Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.420262 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.420287 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.485573 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" podStartSLOduration=3.55608003 podStartE2EDuration="7.485548877s" podCreationTimestamp="2026-02-20 07:08:03 +0000 UTC" firstStartedPulling="2026-02-20 07:08:04.882833286 +0000 UTC m=+1299.755459997" lastFinishedPulling="2026-02-20 07:08:08.812302133 +0000 UTC m=+1303.684928844" observedRunningTime="2026-02-20 07:08:10.446086831 +0000 UTC m=+1305.318713532" watchObservedRunningTime="2026-02-20 07:08:10.485548877 +0000 UTC m=+1305.358175588" Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.493266 5094 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7bf7749bf9-ggfvf"] Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.505303 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7bf7749bf9-ggfvf" podStartSLOduration=4.2396663310000005 podStartE2EDuration="8.505274339s" podCreationTimestamp="2026-02-20 07:08:02 +0000 UTC" firstStartedPulling="2026-02-20 07:08:04.546738177 +0000 UTC m=+1299.419364888" lastFinishedPulling="2026-02-20 07:08:08.812346185 +0000 UTC m=+1303.684972896" observedRunningTime="2026-02-20 07:08:10.472313879 +0000 UTC m=+1305.344940590" watchObservedRunningTime="2026-02-20 07:08:10.505274339 +0000 UTC m=+1305.377901050" Feb 20 07:08:10 crc kubenswrapper[5094]: I0220 07:08:10.523159 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" podStartSLOduration=3.523137046 podStartE2EDuration="3.523137046s" podCreationTimestamp="2026-02-20 07:08:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:08:10.50908631 +0000 UTC m=+1305.381713021" watchObservedRunningTime="2026-02-20 07:08:10.523137046 +0000 UTC m=+1305.395763757" Feb 20 07:08:11 crc kubenswrapper[5094]: I0220 07:08:11.432535 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" event={"ID":"908e2706-d24f-41c9-b481-4c0d5415c5ca","Type":"ContainerStarted","Data":"a9359fe0f2ca547fd93214e43c994a9c0538ad96741f69a95262122bb7dc4d7b"} Feb 20 07:08:12 crc kubenswrapper[5094]: I0220 07:08:12.447559 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7bf7749bf9-ggfvf" podUID="038d4354-a929-4f2d-9633-4cd7dadd4523" containerName="barbican-worker-log" containerID="cri-o://7333f8096289116ee06bfc72c67bb4306feb2e24c57c1db54024c78d8103de1a" gracePeriod=30 Feb 
20 07:08:12 crc kubenswrapper[5094]: I0220 07:08:12.447904 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7bf7749bf9-ggfvf" podUID="038d4354-a929-4f2d-9633-4cd7dadd4523" containerName="barbican-worker" containerID="cri-o://2667db03c56b602dde1dd590504590facc01acd7cc000fb0e9dbfb81bb5c4b28" gracePeriod=30 Feb 20 07:08:12 crc kubenswrapper[5094]: I0220 07:08:12.448318 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" podUID="795db1e7-56e9-4ff2-91f1-b3589603d82c" containerName="barbican-keystone-listener-log" containerID="cri-o://23d5d2a1c804a074977e389a7dd43487a8931e8cf7c459c46b6f0f8448361c8d" gracePeriod=30 Feb 20 07:08:12 crc kubenswrapper[5094]: I0220 07:08:12.449071 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" podUID="795db1e7-56e9-4ff2-91f1-b3589603d82c" containerName="barbican-keystone-listener" containerID="cri-o://31f1d9012990fa973db844a448a0432ed8205633a414c007cfd82b7a9d651223" gracePeriod=30 Feb 20 07:08:13 crc kubenswrapper[5094]: I0220 07:08:13.461434 5094 generic.go:334] "Generic (PLEG): container finished" podID="795db1e7-56e9-4ff2-91f1-b3589603d82c" containerID="31f1d9012990fa973db844a448a0432ed8205633a414c007cfd82b7a9d651223" exitCode=0 Feb 20 07:08:13 crc kubenswrapper[5094]: I0220 07:08:13.461478 5094 generic.go:334] "Generic (PLEG): container finished" podID="795db1e7-56e9-4ff2-91f1-b3589603d82c" containerID="23d5d2a1c804a074977e389a7dd43487a8931e8cf7c459c46b6f0f8448361c8d" exitCode=143 Feb 20 07:08:13 crc kubenswrapper[5094]: I0220 07:08:13.461572 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" event={"ID":"795db1e7-56e9-4ff2-91f1-b3589603d82c","Type":"ContainerDied","Data":"31f1d9012990fa973db844a448a0432ed8205633a414c007cfd82b7a9d651223"} Feb 20 
07:08:13 crc kubenswrapper[5094]: I0220 07:08:13.461651 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" event={"ID":"795db1e7-56e9-4ff2-91f1-b3589603d82c","Type":"ContainerDied","Data":"23d5d2a1c804a074977e389a7dd43487a8931e8cf7c459c46b6f0f8448361c8d"} Feb 20 07:08:13 crc kubenswrapper[5094]: I0220 07:08:13.465143 5094 generic.go:334] "Generic (PLEG): container finished" podID="038d4354-a929-4f2d-9633-4cd7dadd4523" containerID="2667db03c56b602dde1dd590504590facc01acd7cc000fb0e9dbfb81bb5c4b28" exitCode=0 Feb 20 07:08:13 crc kubenswrapper[5094]: I0220 07:08:13.465211 5094 generic.go:334] "Generic (PLEG): container finished" podID="038d4354-a929-4f2d-9633-4cd7dadd4523" containerID="7333f8096289116ee06bfc72c67bb4306feb2e24c57c1db54024c78d8103de1a" exitCode=143 Feb 20 07:08:13 crc kubenswrapper[5094]: I0220 07:08:13.465232 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7bf7749bf9-ggfvf" event={"ID":"038d4354-a929-4f2d-9633-4cd7dadd4523","Type":"ContainerDied","Data":"2667db03c56b602dde1dd590504590facc01acd7cc000fb0e9dbfb81bb5c4b28"} Feb 20 07:08:13 crc kubenswrapper[5094]: I0220 07:08:13.465312 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7bf7749bf9-ggfvf" event={"ID":"038d4354-a929-4f2d-9633-4cd7dadd4523","Type":"ContainerDied","Data":"7333f8096289116ee06bfc72c67bb4306feb2e24c57c1db54024c78d8103de1a"} Feb 20 07:08:14 crc kubenswrapper[5094]: I0220 07:08:14.059011 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g" Feb 20 07:08:14 crc kubenswrapper[5094]: I0220 07:08:14.155360 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67dccc895-wgf2f"] Feb 20 07:08:14 crc kubenswrapper[5094]: I0220 07:08:14.156733 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67dccc895-wgf2f" 
podUID="53cdb905-b22d-4849-ae24-6baa2838be39" containerName="dnsmasq-dns" containerID="cri-o://886db898799d64a40ec0fd6c2e2b8f27c2c4edd60adf4a15ed59da9eda2fda51" gracePeriod=10 Feb 20 07:08:14 crc kubenswrapper[5094]: I0220 07:08:14.485640 5094 generic.go:334] "Generic (PLEG): container finished" podID="53cdb905-b22d-4849-ae24-6baa2838be39" containerID="886db898799d64a40ec0fd6c2e2b8f27c2c4edd60adf4a15ed59da9eda2fda51" exitCode=0 Feb 20 07:08:14 crc kubenswrapper[5094]: I0220 07:08:14.485690 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dccc895-wgf2f" event={"ID":"53cdb905-b22d-4849-ae24-6baa2838be39","Type":"ContainerDied","Data":"886db898799d64a40ec0fd6c2e2b8f27c2c4edd60adf4a15ed59da9eda2fda51"} Feb 20 07:08:15 crc kubenswrapper[5094]: I0220 07:08:15.496388 5094 generic.go:334] "Generic (PLEG): container finished" podID="15583b83-ce22-4b0b-9566-0e056b07c0d7" containerID="2e821859400e46f67b2a4c22980c8831875df8a723826e86fdac889d47a73a82" exitCode=0 Feb 20 07:08:15 crc kubenswrapper[5094]: I0220 07:08:15.496522 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-t7hr7" event={"ID":"15583b83-ce22-4b0b-9566-0e056b07c0d7","Type":"ContainerDied","Data":"2e821859400e46f67b2a4c22980c8831875df8a723826e86fdac889d47a73a82"} Feb 20 07:08:15 crc kubenswrapper[5094]: I0220 07:08:15.685336 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-9d777b794-txl9q" Feb 20 07:08:15 crc kubenswrapper[5094]: I0220 07:08:15.773902 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-9d777b794-txl9q" Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.521191 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-t7hr7" event={"ID":"15583b83-ce22-4b0b-9566-0e056b07c0d7","Type":"ContainerDied","Data":"7bade8250a47c4de537369bc3b59be47a7a82eab789fb96f42d111f476483273"} Feb 20 07:08:17 crc 
kubenswrapper[5094]: I0220 07:08:17.521737 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bade8250a47c4de537369bc3b59be47a7a82eab789fb96f42d111f476483273" Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.528815 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-t7hr7" Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.651464 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67dccc895-wgf2f" Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.725683 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-db-sync-config-data\") pod \"15583b83-ce22-4b0b-9566-0e056b07c0d7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.725842 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qf24d\" (UniqueName: \"kubernetes.io/projected/15583b83-ce22-4b0b-9566-0e056b07c0d7-kube-api-access-qf24d\") pod \"15583b83-ce22-4b0b-9566-0e056b07c0d7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.725884 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/15583b83-ce22-4b0b-9566-0e056b07c0d7-etc-machine-id\") pod \"15583b83-ce22-4b0b-9566-0e056b07c0d7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.725956 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-config-data\") pod \"15583b83-ce22-4b0b-9566-0e056b07c0d7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " 
Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.726008 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-scripts\") pod \"15583b83-ce22-4b0b-9566-0e056b07c0d7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.726066 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-combined-ca-bundle\") pod \"15583b83-ce22-4b0b-9566-0e056b07c0d7\" (UID: \"15583b83-ce22-4b0b-9566-0e056b07c0d7\") " Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.731171 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/15583b83-ce22-4b0b-9566-0e056b07c0d7-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "15583b83-ce22-4b0b-9566-0e056b07c0d7" (UID: "15583b83-ce22-4b0b-9566-0e056b07c0d7"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.744033 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-scripts" (OuterVolumeSpecName: "scripts") pod "15583b83-ce22-4b0b-9566-0e056b07c0d7" (UID: "15583b83-ce22-4b0b-9566-0e056b07c0d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.753841 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15583b83-ce22-4b0b-9566-0e056b07c0d7-kube-api-access-qf24d" (OuterVolumeSpecName: "kube-api-access-qf24d") pod "15583b83-ce22-4b0b-9566-0e056b07c0d7" (UID: "15583b83-ce22-4b0b-9566-0e056b07c0d7"). InnerVolumeSpecName "kube-api-access-qf24d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.762925 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "15583b83-ce22-4b0b-9566-0e056b07c0d7" (UID: "15583b83-ce22-4b0b-9566-0e056b07c0d7"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.771922 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15583b83-ce22-4b0b-9566-0e056b07c0d7" (UID: "15583b83-ce22-4b0b-9566-0e056b07c0d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.811751 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-config-data" (OuterVolumeSpecName: "config-data") pod "15583b83-ce22-4b0b-9566-0e056b07c0d7" (UID: "15583b83-ce22-4b0b-9566-0e056b07c0d7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.827715 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-dns-svc\") pod \"53cdb905-b22d-4849-ae24-6baa2838be39\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.827863 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-config\") pod \"53cdb905-b22d-4849-ae24-6baa2838be39\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.827923 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-ovsdbserver-sb\") pod \"53cdb905-b22d-4849-ae24-6baa2838be39\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.827948 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-dns-swift-storage-0\") pod \"53cdb905-b22d-4849-ae24-6baa2838be39\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.828047 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-ovsdbserver-nb\") pod \"53cdb905-b22d-4849-ae24-6baa2838be39\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.828099 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpr25\" (UniqueName: 
\"kubernetes.io/projected/53cdb905-b22d-4849-ae24-6baa2838be39-kube-api-access-xpr25\") pod \"53cdb905-b22d-4849-ae24-6baa2838be39\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.828561 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qf24d\" (UniqueName: \"kubernetes.io/projected/15583b83-ce22-4b0b-9566-0e056b07c0d7-kube-api-access-qf24d\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.828589 5094 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/15583b83-ce22-4b0b-9566-0e056b07c0d7-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.828601 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.828612 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.828621 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.828630 5094 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/15583b83-ce22-4b0b-9566-0e056b07c0d7-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.863034 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/53cdb905-b22d-4849-ae24-6baa2838be39-kube-api-access-xpr25" (OuterVolumeSpecName: "kube-api-access-xpr25") pod "53cdb905-b22d-4849-ae24-6baa2838be39" (UID: "53cdb905-b22d-4849-ae24-6baa2838be39"). InnerVolumeSpecName "kube-api-access-xpr25". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.877758 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "53cdb905-b22d-4849-ae24-6baa2838be39" (UID: "53cdb905-b22d-4849-ae24-6baa2838be39"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.884218 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-config" (OuterVolumeSpecName: "config") pod "53cdb905-b22d-4849-ae24-6baa2838be39" (UID: "53cdb905-b22d-4849-ae24-6baa2838be39"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.925432 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.928393 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "53cdb905-b22d-4849-ae24-6baa2838be39" (UID: "53cdb905-b22d-4849-ae24-6baa2838be39"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.929009 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "53cdb905-b22d-4849-ae24-6baa2838be39" (UID: "53cdb905-b22d-4849-ae24-6baa2838be39"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.929561 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-dns-svc\") pod \"53cdb905-b22d-4849-ae24-6baa2838be39\" (UID: \"53cdb905-b22d-4849-ae24-6baa2838be39\") " Feb 20 07:08:17 crc kubenswrapper[5094]: W0220 07:08:17.929918 5094 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/53cdb905-b22d-4849-ae24-6baa2838be39/volumes/kubernetes.io~configmap/dns-svc Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.929949 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "53cdb905-b22d-4849-ae24-6baa2838be39" (UID: "53cdb905-b22d-4849-ae24-6baa2838be39"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.930167 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.930187 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.930199 5094 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.930208 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpr25\" (UniqueName: \"kubernetes.io/projected/53cdb905-b22d-4849-ae24-6baa2838be39-kube-api-access-xpr25\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.930217 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.944734 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" Feb 20 07:08:17 crc kubenswrapper[5094]: I0220 07:08:17.991294 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "53cdb905-b22d-4849-ae24-6baa2838be39" (UID: "53cdb905-b22d-4849-ae24-6baa2838be39"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.031066 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-config-data-custom\") pod \"038d4354-a929-4f2d-9633-4cd7dadd4523\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.031212 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-config-data\") pod \"038d4354-a929-4f2d-9633-4cd7dadd4523\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.031250 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-combined-ca-bundle\") pod \"038d4354-a929-4f2d-9633-4cd7dadd4523\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.031279 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mknrc\" (UniqueName: \"kubernetes.io/projected/038d4354-a929-4f2d-9633-4cd7dadd4523-kube-api-access-mknrc\") pod \"038d4354-a929-4f2d-9633-4cd7dadd4523\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.031443 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/038d4354-a929-4f2d-9633-4cd7dadd4523-logs\") pod \"038d4354-a929-4f2d-9633-4cd7dadd4523\" (UID: \"038d4354-a929-4f2d-9633-4cd7dadd4523\") " Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.031899 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/53cdb905-b22d-4849-ae24-6baa2838be39-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.032042 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/038d4354-a929-4f2d-9633-4cd7dadd4523-logs" (OuterVolumeSpecName: "logs") pod "038d4354-a929-4f2d-9633-4cd7dadd4523" (UID: "038d4354-a929-4f2d-9633-4cd7dadd4523"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.037597 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "038d4354-a929-4f2d-9633-4cd7dadd4523" (UID: "038d4354-a929-4f2d-9633-4cd7dadd4523"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.037990 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/038d4354-a929-4f2d-9633-4cd7dadd4523-kube-api-access-mknrc" (OuterVolumeSpecName: "kube-api-access-mknrc") pod "038d4354-a929-4f2d-9633-4cd7dadd4523" (UID: "038d4354-a929-4f2d-9633-4cd7dadd4523"). InnerVolumeSpecName "kube-api-access-mknrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.080012 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-config-data" (OuterVolumeSpecName: "config-data") pod "038d4354-a929-4f2d-9633-4cd7dadd4523" (UID: "038d4354-a929-4f2d-9633-4cd7dadd4523"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.080596 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "038d4354-a929-4f2d-9633-4cd7dadd4523" (UID: "038d4354-a929-4f2d-9633-4cd7dadd4523"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.132725 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65k5m\" (UniqueName: \"kubernetes.io/projected/795db1e7-56e9-4ff2-91f1-b3589603d82c-kube-api-access-65k5m\") pod \"795db1e7-56e9-4ff2-91f1-b3589603d82c\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.133050 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/795db1e7-56e9-4ff2-91f1-b3589603d82c-logs\") pod \"795db1e7-56e9-4ff2-91f1-b3589603d82c\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.133181 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-config-data-custom\") pod \"795db1e7-56e9-4ff2-91f1-b3589603d82c\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.133385 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-config-data\") pod \"795db1e7-56e9-4ff2-91f1-b3589603d82c\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.133588 5094 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-combined-ca-bundle\") pod \"795db1e7-56e9-4ff2-91f1-b3589603d82c\" (UID: \"795db1e7-56e9-4ff2-91f1-b3589603d82c\") " Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.134182 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/038d4354-a929-4f2d-9633-4cd7dadd4523-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.134282 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.134361 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.134417 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/038d4354-a929-4f2d-9633-4cd7dadd4523-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.134472 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mknrc\" (UniqueName: \"kubernetes.io/projected/038d4354-a929-4f2d-9633-4cd7dadd4523-kube-api-access-mknrc\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.134168 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/795db1e7-56e9-4ff2-91f1-b3589603d82c-logs" (OuterVolumeSpecName: "logs") pod "795db1e7-56e9-4ff2-91f1-b3589603d82c" (UID: "795db1e7-56e9-4ff2-91f1-b3589603d82c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.136161 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/795db1e7-56e9-4ff2-91f1-b3589603d82c-kube-api-access-65k5m" (OuterVolumeSpecName: "kube-api-access-65k5m") pod "795db1e7-56e9-4ff2-91f1-b3589603d82c" (UID: "795db1e7-56e9-4ff2-91f1-b3589603d82c"). InnerVolumeSpecName "kube-api-access-65k5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.136883 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "795db1e7-56e9-4ff2-91f1-b3589603d82c" (UID: "795db1e7-56e9-4ff2-91f1-b3589603d82c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.166516 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "795db1e7-56e9-4ff2-91f1-b3589603d82c" (UID: "795db1e7-56e9-4ff2-91f1-b3589603d82c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.195797 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-config-data" (OuterVolumeSpecName: "config-data") pod "795db1e7-56e9-4ff2-91f1-b3589603d82c" (UID: "795db1e7-56e9-4ff2-91f1-b3589603d82c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.236288 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.236330 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.236343 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65k5m\" (UniqueName: \"kubernetes.io/projected/795db1e7-56e9-4ff2-91f1-b3589603d82c-kube-api-access-65k5m\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.236353 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/795db1e7-56e9-4ff2-91f1-b3589603d82c-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.236362 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/795db1e7-56e9-4ff2-91f1-b3589603d82c-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.530814 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7bf7749bf9-ggfvf" event={"ID":"038d4354-a929-4f2d-9633-4cd7dadd4523","Type":"ContainerDied","Data":"775f84fe6aa5df6496627179d444588e580571d5d2f0a7733afea33c0965498e"} Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.530891 5094 scope.go:117] "RemoveContainer" containerID="2667db03c56b602dde1dd590504590facc01acd7cc000fb0e9dbfb81bb5c4b28" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.530902 5094 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/barbican-worker-7bf7749bf9-ggfvf" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.532717 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dccc895-wgf2f" event={"ID":"53cdb905-b22d-4849-ae24-6baa2838be39","Type":"ContainerDied","Data":"db75b8edda289ee932be6a52e71a112b82ffd485c1dcf3a3df8f8b8577f5dd7d"} Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.532823 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67dccc895-wgf2f" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.542624 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19d2d34d-f935-40e4-a27a-a382c7634da2","Type":"ContainerStarted","Data":"515e3e41bcc23544014d7b617f381b5649aaa881e54eee2c7379b43ae15d6516"} Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.542840 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="ceilometer-central-agent" containerID="cri-o://154c237a8cda216b92583d85eef501f14b59d72a35964b0b4be6a9a7f8415f61" gracePeriod=30 Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.543149 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.543454 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="proxy-httpd" containerID="cri-o://515e3e41bcc23544014d7b617f381b5649aaa881e54eee2c7379b43ae15d6516" gracePeriod=30 Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.543529 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="sg-core" 
containerID="cri-o://735b44943bdf895ffeb305a8a9b07202be3d3f21805babf681c05f7b0d65ad7e" gracePeriod=30 Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.543571 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="ceilometer-notification-agent" containerID="cri-o://8fbd0e1a5aa799bf4144c98e297964162b94c7e1f11e3c3574fd101cc616268d" gracePeriod=30 Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.568168 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-t7hr7" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.569641 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.572941 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-548b4f9548-p65sn" event={"ID":"795db1e7-56e9-4ff2-91f1-b3589603d82c","Type":"ContainerDied","Data":"fb04a41995c63d48690668a46b722c642ec1c8a54ddc3f182df807e2e50bd027"} Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.581461 5094 scope.go:117] "RemoveContainer" containerID="7333f8096289116ee06bfc72c67bb4306feb2e24c57c1db54024c78d8103de1a" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.601774 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.060106877 podStartE2EDuration="50.601747823s" podCreationTimestamp="2026-02-20 07:07:28 +0000 UTC" firstStartedPulling="2026-02-20 07:07:29.872673959 +0000 UTC m=+1264.745300670" lastFinishedPulling="2026-02-20 07:08:17.414314905 +0000 UTC m=+1312.286941616" observedRunningTime="2026-02-20 07:08:18.577588445 +0000 UTC m=+1313.450215156" watchObservedRunningTime="2026-02-20 07:08:18.601747823 +0000 UTC m=+1313.474374534" Feb 20 07:08:18 crc 
kubenswrapper[5094]: I0220 07:08:18.634211 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67dccc895-wgf2f"] Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.642746 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67dccc895-wgf2f"] Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.652793 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7bf7749bf9-ggfvf"] Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.657694 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-7bf7749bf9-ggfvf"] Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.672786 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-548b4f9548-p65sn"] Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.682506 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-548b4f9548-p65sn"] Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.790282 5094 scope.go:117] "RemoveContainer" containerID="886db898799d64a40ec0fd6c2e2b8f27c2c4edd60adf4a15ed59da9eda2fda51" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.860273 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 07:08:18 crc kubenswrapper[5094]: E0220 07:08:18.860732 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="795db1e7-56e9-4ff2-91f1-b3589603d82c" containerName="barbican-keystone-listener" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.860746 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="795db1e7-56e9-4ff2-91f1-b3589603d82c" containerName="barbican-keystone-listener" Feb 20 07:08:18 crc kubenswrapper[5094]: E0220 07:08:18.860777 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53cdb905-b22d-4849-ae24-6baa2838be39" containerName="init" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 
07:08:18.860783 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="53cdb905-b22d-4849-ae24-6baa2838be39" containerName="init" Feb 20 07:08:18 crc kubenswrapper[5094]: E0220 07:08:18.860794 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="038d4354-a929-4f2d-9633-4cd7dadd4523" containerName="barbican-worker-log" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.860803 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="038d4354-a929-4f2d-9633-4cd7dadd4523" containerName="barbican-worker-log" Feb 20 07:08:18 crc kubenswrapper[5094]: E0220 07:08:18.860817 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15583b83-ce22-4b0b-9566-0e056b07c0d7" containerName="cinder-db-sync" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.860824 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="15583b83-ce22-4b0b-9566-0e056b07c0d7" containerName="cinder-db-sync" Feb 20 07:08:18 crc kubenswrapper[5094]: E0220 07:08:18.860837 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53cdb905-b22d-4849-ae24-6baa2838be39" containerName="dnsmasq-dns" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.860842 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="53cdb905-b22d-4849-ae24-6baa2838be39" containerName="dnsmasq-dns" Feb 20 07:08:18 crc kubenswrapper[5094]: E0220 07:08:18.860921 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="038d4354-a929-4f2d-9633-4cd7dadd4523" containerName="barbican-worker" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.860927 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="038d4354-a929-4f2d-9633-4cd7dadd4523" containerName="barbican-worker" Feb 20 07:08:18 crc kubenswrapper[5094]: E0220 07:08:18.860936 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="795db1e7-56e9-4ff2-91f1-b3589603d82c" containerName="barbican-keystone-listener-log" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 
07:08:18.860943 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="795db1e7-56e9-4ff2-91f1-b3589603d82c" containerName="barbican-keystone-listener-log" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.861104 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="795db1e7-56e9-4ff2-91f1-b3589603d82c" containerName="barbican-keystone-listener" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.861122 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="038d4354-a929-4f2d-9633-4cd7dadd4523" containerName="barbican-worker-log" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.861141 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="795db1e7-56e9-4ff2-91f1-b3589603d82c" containerName="barbican-keystone-listener-log" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.861155 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="15583b83-ce22-4b0b-9566-0e056b07c0d7" containerName="cinder-db-sync" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.861164 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="038d4354-a929-4f2d-9633-4cd7dadd4523" containerName="barbican-worker" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.861175 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="53cdb905-b22d-4849-ae24-6baa2838be39" containerName="dnsmasq-dns" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.866181 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.871248 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.871410 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-f4gh4" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.871574 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.871666 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.879440 5094 scope.go:117] "RemoveContainer" containerID="135ff79dc99efbd620593401c9d7a73c61d546032c68fab2fd094efa64fa4d62" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.897377 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.953826 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-config-data\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.954431 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25pbr\" (UniqueName: \"kubernetes.io/projected/71648c9f-0170-413c-9f26-d169c9933469-kube-api-access-25pbr\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.954573 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/71648c9f-0170-413c-9f26-d169c9933469-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.954652 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-scripts\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.954682 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:18 crc kubenswrapper[5094]: I0220 07:08:18.954876 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.023518 5094 scope.go:117] "RemoveContainer" containerID="31f1d9012990fa973db844a448a0432ed8205633a414c007cfd82b7a9d651223" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.057207 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/71648c9f-0170-413c-9f26-d169c9933469-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.057544 
5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-scripts\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.057637 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.057777 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.057966 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-config-data\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.058067 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25pbr\" (UniqueName: \"kubernetes.io/projected/71648c9f-0170-413c-9f26-d169c9933469-kube-api-access-25pbr\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.058591 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/71648c9f-0170-413c-9f26-d169c9933469-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.061944 5094 scope.go:117] "RemoveContainer" containerID="23d5d2a1c804a074977e389a7dd43487a8931e8cf7c459c46b6f0f8448361c8d" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.071326 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-scripts\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.102394 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.103500 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.112902 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-config-data\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.119104 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25pbr\" (UniqueName: 
\"kubernetes.io/projected/71648c9f-0170-413c-9f26-d169c9933469-kube-api-access-25pbr\") pod \"cinder-scheduler-0\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.176601 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw"] Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.178782 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.195633 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw"] Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.223766 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.370829 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.376043 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-dns-swift-storage-0\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.376137 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-ovsdbserver-sb\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.376185 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-ovsdbserver-nb\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.376288 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmkxt\" (UniqueName: \"kubernetes.io/projected/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-kube-api-access-dmkxt\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.376485 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-dns-svc\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.376544 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-config\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.389468 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.395990 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.436856 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.478822 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-dns-swift-storage-0\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.478886 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-ovsdbserver-sb\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.478925 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-ovsdbserver-nb\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.478974 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmkxt\" (UniqueName: \"kubernetes.io/projected/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-kube-api-access-dmkxt\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 
07:08:19.479053 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-dns-svc\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.479090 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-config\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.480065 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-dns-swift-storage-0\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.480089 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-ovsdbserver-sb\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.480194 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-config\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.480809 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-dns-svc\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.480895 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-ovsdbserver-nb\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.518185 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmkxt\" (UniqueName: \"kubernetes.io/projected/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-kube-api-access-dmkxt\") pod \"dnsmasq-dns-6c8dc7b4d9-xbwsw\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.531382 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.582213 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-config-data\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.582275 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26e7c194-b2d4-4578-90c0-d0f141b96bdd-logs\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.582354 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-config-data-custom\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.582515 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/26e7c194-b2d4-4578-90c0-d0f141b96bdd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.582536 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f825d\" (UniqueName: \"kubernetes.io/projected/26e7c194-b2d4-4578-90c0-d0f141b96bdd-kube-api-access-f825d\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 
07:08:19.582555 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.582594 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-scripts\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.624291 5094 generic.go:334] "Generic (PLEG): container finished" podID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerID="515e3e41bcc23544014d7b617f381b5649aaa881e54eee2c7379b43ae15d6516" exitCode=0 Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.642206 5094 generic.go:334] "Generic (PLEG): container finished" podID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerID="735b44943bdf895ffeb305a8a9b07202be3d3f21805babf681c05f7b0d65ad7e" exitCode=2 Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.642242 5094 generic.go:334] "Generic (PLEG): container finished" podID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerID="154c237a8cda216b92583d85eef501f14b59d72a35964b0b4be6a9a7f8415f61" exitCode=0 Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.625899 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19d2d34d-f935-40e4-a27a-a382c7634da2","Type":"ContainerDied","Data":"515e3e41bcc23544014d7b617f381b5649aaa881e54eee2c7379b43ae15d6516"} Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.642383 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"19d2d34d-f935-40e4-a27a-a382c7634da2","Type":"ContainerDied","Data":"735b44943bdf895ffeb305a8a9b07202be3d3f21805babf681c05f7b0d65ad7e"} Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.642433 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19d2d34d-f935-40e4-a27a-a382c7634da2","Type":"ContainerDied","Data":"154c237a8cda216b92583d85eef501f14b59d72a35964b0b4be6a9a7f8415f61"} Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.687890 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-scripts\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.687951 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-config-data\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.687973 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26e7c194-b2d4-4578-90c0-d0f141b96bdd-logs\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.688030 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-config-data-custom\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.688142 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/26e7c194-b2d4-4578-90c0-d0f141b96bdd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.688167 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f825d\" (UniqueName: \"kubernetes.io/projected/26e7c194-b2d4-4578-90c0-d0f141b96bdd-kube-api-access-f825d\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.688189 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.690439 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/26e7c194-b2d4-4578-90c0-d0f141b96bdd-etc-machine-id\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.690814 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26e7c194-b2d4-4578-90c0-d0f141b96bdd-logs\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.696186 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-config-data-custom\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: 
I0220 07:08:19.698228 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-scripts\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.698523 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.709313 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-config-data\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.714213 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f825d\" (UniqueName: \"kubernetes.io/projected/26e7c194-b2d4-4578-90c0-d0f141b96bdd-kube-api-access-f825d\") pod \"cinder-api-0\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.778672 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.883101 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="038d4354-a929-4f2d-9633-4cd7dadd4523" path="/var/lib/kubelet/pods/038d4354-a929-4f2d-9633-4cd7dadd4523/volumes" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.883922 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53cdb905-b22d-4849-ae24-6baa2838be39" path="/var/lib/kubelet/pods/53cdb905-b22d-4849-ae24-6baa2838be39/volumes" Feb 20 07:08:19 crc kubenswrapper[5094]: I0220 07:08:19.885458 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="795db1e7-56e9-4ff2-91f1-b3589603d82c" path="/var/lib/kubelet/pods/795db1e7-56e9-4ff2-91f1-b3589603d82c/volumes" Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.020532 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.119301 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.194923 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw"] Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.359836 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.390323 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.477975 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-9d777b794-txl9q"] Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.478282 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-9d777b794-txl9q" 
podUID="e9b56f38-2467-4518-bdc3-d6ee665987da" containerName="barbican-api-log" containerID="cri-o://da39043890404f802fa5323fb43efc928dbb515543671fe7e4daa4998c4f6b42" gracePeriod=30 Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.478555 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-9d777b794-txl9q" podUID="e9b56f38-2467-4518-bdc3-d6ee665987da" containerName="barbican-api" containerID="cri-o://3416645d606b6c11c1a4b3c83d2ebc2d75fd169ec7df3d3e76572e9a7ca20cfb" gracePeriod=30 Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.495019 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-9d777b794-txl9q" podUID="e9b56f38-2467-4518-bdc3-d6ee665987da" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": EOF" Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.747483 5094 generic.go:334] "Generic (PLEG): container finished" podID="e9b56f38-2467-4518-bdc3-d6ee665987da" containerID="da39043890404f802fa5323fb43efc928dbb515543671fe7e4daa4998c4f6b42" exitCode=143 Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.747987 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9d777b794-txl9q" event={"ID":"e9b56f38-2467-4518-bdc3-d6ee665987da","Type":"ContainerDied","Data":"da39043890404f802fa5323fb43efc928dbb515543671fe7e4daa4998c4f6b42"} Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.755318 5094 generic.go:334] "Generic (PLEG): container finished" podID="864952a5-1f2e-4930-be7f-7dbc3a2c2af8" containerID="e3d0972b678e16469c3366801fb2f8b9511912679cef680d0f581c3085fd2723" exitCode=0 Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.755362 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" event={"ID":"864952a5-1f2e-4930-be7f-7dbc3a2c2af8","Type":"ContainerDied","Data":"e3d0972b678e16469c3366801fb2f8b9511912679cef680d0f581c3085fd2723"} 
Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.755381 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" event={"ID":"864952a5-1f2e-4930-be7f-7dbc3a2c2af8","Type":"ContainerStarted","Data":"db6c4cfd73d84c6bf37834db171aab681839cbd0872d2e8b1d00c5c8feb0f4da"} Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.762989 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"26e7c194-b2d4-4578-90c0-d0f141b96bdd","Type":"ContainerStarted","Data":"7d1b6fe880fd5fac2cb8fe26f239b1778509510802a76237301a5a28526a7e35"} Feb 20 07:08:20 crc kubenswrapper[5094]: I0220 07:08:20.769867 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"71648c9f-0170-413c-9f26-d169c9933469","Type":"ContainerStarted","Data":"a5e9daf344d7df70576896055f545ff598209ddc3efe1ce29774c938244e44a7"} Feb 20 07:08:21 crc kubenswrapper[5094]: I0220 07:08:21.499925 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 20 07:08:21 crc kubenswrapper[5094]: I0220 07:08:21.784335 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" event={"ID":"864952a5-1f2e-4930-be7f-7dbc3a2c2af8","Type":"ContainerStarted","Data":"fdf1995e90b86d0bcffee929097bd15c99141d0d2d0b08867fe0841afc3d59b6"} Feb 20 07:08:21 crc kubenswrapper[5094]: I0220 07:08:21.784850 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:21 crc kubenswrapper[5094]: I0220 07:08:21.788351 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"71648c9f-0170-413c-9f26-d169c9933469","Type":"ContainerStarted","Data":"0686cfd57649205f2e479e85d8bd4cc3d24d29e2bf21f8972dd18e12a66c7bbc"} Feb 20 07:08:21 crc kubenswrapper[5094]: I0220 07:08:21.790444 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"26e7c194-b2d4-4578-90c0-d0f141b96bdd","Type":"ContainerStarted","Data":"151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639"} Feb 20 07:08:21 crc kubenswrapper[5094]: I0220 07:08:21.816488 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" podStartSLOduration=2.816464873 podStartE2EDuration="2.816464873s" podCreationTimestamp="2026-02-20 07:08:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:08:21.810400278 +0000 UTC m=+1316.683026989" watchObservedRunningTime="2026-02-20 07:08:21.816464873 +0000 UTC m=+1316.689091584" Feb 20 07:08:22 crc kubenswrapper[5094]: I0220 07:08:22.840775 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"71648c9f-0170-413c-9f26-d169c9933469","Type":"ContainerStarted","Data":"67148a4e684e1ddbbec96ddc574fd1f1d4b808c14bbd286654cdb053aeb29db9"} Feb 20 07:08:22 crc kubenswrapper[5094]: I0220 07:08:22.870286 5094 generic.go:334] "Generic (PLEG): container finished" podID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerID="8fbd0e1a5aa799bf4144c98e297964162b94c7e1f11e3c3574fd101cc616268d" exitCode=0 Feb 20 07:08:22 crc kubenswrapper[5094]: I0220 07:08:22.870432 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19d2d34d-f935-40e4-a27a-a382c7634da2","Type":"ContainerDied","Data":"8fbd0e1a5aa799bf4144c98e297964162b94c7e1f11e3c3574fd101cc616268d"} Feb 20 07:08:22 crc kubenswrapper[5094]: I0220 07:08:22.884787 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="26e7c194-b2d4-4578-90c0-d0f141b96bdd" containerName="cinder-api-log" containerID="cri-o://151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639" gracePeriod=30 Feb 20 07:08:22 crc kubenswrapper[5094]: 
I0220 07:08:22.884918 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"26e7c194-b2d4-4578-90c0-d0f141b96bdd","Type":"ContainerStarted","Data":"0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68"} Feb 20 07:08:22 crc kubenswrapper[5094]: I0220 07:08:22.885346 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 20 07:08:22 crc kubenswrapper[5094]: I0220 07:08:22.885965 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="26e7c194-b2d4-4578-90c0-d0f141b96bdd" containerName="cinder-api" containerID="cri-o://0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68" gracePeriod=30 Feb 20 07:08:22 crc kubenswrapper[5094]: I0220 07:08:22.904422 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.176236839 podStartE2EDuration="4.904391398s" podCreationTimestamp="2026-02-20 07:08:18 +0000 UTC" firstStartedPulling="2026-02-20 07:08:20.040151752 +0000 UTC m=+1314.912778463" lastFinishedPulling="2026-02-20 07:08:20.768306311 +0000 UTC m=+1315.640933022" observedRunningTime="2026-02-20 07:08:22.885575598 +0000 UTC m=+1317.758202309" watchObservedRunningTime="2026-02-20 07:08:22.904391398 +0000 UTC m=+1317.777018099" Feb 20 07:08:22 crc kubenswrapper[5094]: I0220 07:08:22.930327 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.9303035790000003 podStartE2EDuration="3.930303579s" podCreationTimestamp="2026-02-20 07:08:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:08:22.922972294 +0000 UTC m=+1317.795598995" watchObservedRunningTime="2026-02-20 07:08:22.930303579 +0000 UTC m=+1317.802930290" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 
07:08:23.224098 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.399171 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-scripts\") pod \"19d2d34d-f935-40e4-a27a-a382c7634da2\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.400963 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-combined-ca-bundle\") pod \"19d2d34d-f935-40e4-a27a-a382c7634da2\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.401221 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19d2d34d-f935-40e4-a27a-a382c7634da2-log-httpd\") pod \"19d2d34d-f935-40e4-a27a-a382c7634da2\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.401332 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-sg-core-conf-yaml\") pod \"19d2d34d-f935-40e4-a27a-a382c7634da2\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.401384 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-config-data\") pod \"19d2d34d-f935-40e4-a27a-a382c7634da2\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.401447 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19d2d34d-f935-40e4-a27a-a382c7634da2-run-httpd\") pod \"19d2d34d-f935-40e4-a27a-a382c7634da2\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.401491 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgl49\" (UniqueName: \"kubernetes.io/projected/19d2d34d-f935-40e4-a27a-a382c7634da2-kube-api-access-jgl49\") pod \"19d2d34d-f935-40e4-a27a-a382c7634da2\" (UID: \"19d2d34d-f935-40e4-a27a-a382c7634da2\") " Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.401857 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19d2d34d-f935-40e4-a27a-a382c7634da2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "19d2d34d-f935-40e4-a27a-a382c7634da2" (UID: "19d2d34d-f935-40e4-a27a-a382c7634da2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.401924 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19d2d34d-f935-40e4-a27a-a382c7634da2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "19d2d34d-f935-40e4-a27a-a382c7634da2" (UID: "19d2d34d-f935-40e4-a27a-a382c7634da2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.409953 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19d2d34d-f935-40e4-a27a-a382c7634da2-kube-api-access-jgl49" (OuterVolumeSpecName: "kube-api-access-jgl49") pod "19d2d34d-f935-40e4-a27a-a382c7634da2" (UID: "19d2d34d-f935-40e4-a27a-a382c7634da2"). InnerVolumeSpecName "kube-api-access-jgl49". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.413183 5094 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19d2d34d-f935-40e4-a27a-a382c7634da2-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.413230 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgl49\" (UniqueName: \"kubernetes.io/projected/19d2d34d-f935-40e4-a27a-a382c7634da2-kube-api-access-jgl49\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.413245 5094 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19d2d34d-f935-40e4-a27a-a382c7634da2-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.430730 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-scripts" (OuterVolumeSpecName: "scripts") pod "19d2d34d-f935-40e4-a27a-a382c7634da2" (UID: "19d2d34d-f935-40e4-a27a-a382c7634da2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.455811 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "19d2d34d-f935-40e4-a27a-a382c7634da2" (UID: "19d2d34d-f935-40e4-a27a-a382c7634da2"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.498118 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19d2d34d-f935-40e4-a27a-a382c7634da2" (UID: "19d2d34d-f935-40e4-a27a-a382c7634da2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.515952 5094 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.515986 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.515997 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.520420 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-config-data" (OuterVolumeSpecName: "config-data") pod "19d2d34d-f935-40e4-a27a-a382c7634da2" (UID: "19d2d34d-f935-40e4-a27a-a382c7634da2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.606182 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.617359 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-config-data\") pod \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.617419 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-combined-ca-bundle\") pod \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.617531 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-config-data-custom\") pod \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.617572 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26e7c194-b2d4-4578-90c0-d0f141b96bdd-logs\") pod \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.617680 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-scripts\") pod \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.617724 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f825d\" (UniqueName: 
\"kubernetes.io/projected/26e7c194-b2d4-4578-90c0-d0f141b96bdd-kube-api-access-f825d\") pod \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.617810 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/26e7c194-b2d4-4578-90c0-d0f141b96bdd-etc-machine-id\") pod \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.618359 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19d2d34d-f935-40e4-a27a-a382c7634da2-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.618422 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/26e7c194-b2d4-4578-90c0-d0f141b96bdd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "26e7c194-b2d4-4578-90c0-d0f141b96bdd" (UID: "26e7c194-b2d4-4578-90c0-d0f141b96bdd"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.618463 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26e7c194-b2d4-4578-90c0-d0f141b96bdd-logs" (OuterVolumeSpecName: "logs") pod "26e7c194-b2d4-4578-90c0-d0f141b96bdd" (UID: "26e7c194-b2d4-4578-90c0-d0f141b96bdd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.623572 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-scripts" (OuterVolumeSpecName: "scripts") pod "26e7c194-b2d4-4578-90c0-d0f141b96bdd" (UID: "26e7c194-b2d4-4578-90c0-d0f141b96bdd"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.623852 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26e7c194-b2d4-4578-90c0-d0f141b96bdd-kube-api-access-f825d" (OuterVolumeSpecName: "kube-api-access-f825d") pod "26e7c194-b2d4-4578-90c0-d0f141b96bdd" (UID: "26e7c194-b2d4-4578-90c0-d0f141b96bdd"). InnerVolumeSpecName "kube-api-access-f825d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.640900 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "26e7c194-b2d4-4578-90c0-d0f141b96bdd" (UID: "26e7c194-b2d4-4578-90c0-d0f141b96bdd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.657241 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26e7c194-b2d4-4578-90c0-d0f141b96bdd" (UID: "26e7c194-b2d4-4578-90c0-d0f141b96bdd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.719112 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-config-data" (OuterVolumeSpecName: "config-data") pod "26e7c194-b2d4-4578-90c0-d0f141b96bdd" (UID: "26e7c194-b2d4-4578-90c0-d0f141b96bdd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.720455 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-config-data\") pod \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\" (UID: \"26e7c194-b2d4-4578-90c0-d0f141b96bdd\") " Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.721140 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.721178 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.721193 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/26e7c194-b2d4-4578-90c0-d0f141b96bdd-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.721212 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f825d\" (UniqueName: \"kubernetes.io/projected/26e7c194-b2d4-4578-90c0-d0f141b96bdd-kube-api-access-f825d\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.721228 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.721240 5094 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/26e7c194-b2d4-4578-90c0-d0f141b96bdd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 20 
07:08:23 crc kubenswrapper[5094]: W0220 07:08:23.721384 5094 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/26e7c194-b2d4-4578-90c0-d0f141b96bdd/volumes/kubernetes.io~secret/config-data Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.721412 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-config-data" (OuterVolumeSpecName: "config-data") pod "26e7c194-b2d4-4578-90c0-d0f141b96bdd" (UID: "26e7c194-b2d4-4578-90c0-d0f141b96bdd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.822081 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26e7c194-b2d4-4578-90c0-d0f141b96bdd-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.898025 5094 generic.go:334] "Generic (PLEG): container finished" podID="26e7c194-b2d4-4578-90c0-d0f141b96bdd" containerID="0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68" exitCode=0 Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.898080 5094 generic.go:334] "Generic (PLEG): container finished" podID="26e7c194-b2d4-4578-90c0-d0f141b96bdd" containerID="151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639" exitCode=143 Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.898235 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.898371 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"26e7c194-b2d4-4578-90c0-d0f141b96bdd","Type":"ContainerDied","Data":"0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68"} Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.898432 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"26e7c194-b2d4-4578-90c0-d0f141b96bdd","Type":"ContainerDied","Data":"151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639"} Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.898454 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"26e7c194-b2d4-4578-90c0-d0f141b96bdd","Type":"ContainerDied","Data":"7d1b6fe880fd5fac2cb8fe26f239b1778509510802a76237301a5a28526a7e35"} Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.898486 5094 scope.go:117] "RemoveContainer" containerID="0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.905403 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.906019 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19d2d34d-f935-40e4-a27a-a382c7634da2","Type":"ContainerDied","Data":"7d9f8b3046d52c477cb1f1e73376c263067dfb9d891c04aea7389d4a8f986dbe"} Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.930012 5094 scope.go:117] "RemoveContainer" containerID="151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.945659 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.956777 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.980210 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.992634 5094 scope.go:117] "RemoveContainer" containerID="0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68" Feb 20 07:08:23 crc kubenswrapper[5094]: E0220 07:08:23.993312 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68\": container with ID starting with 0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68 not found: ID does not exist" containerID="0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.993345 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68"} err="failed to get container status \"0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68\": rpc error: code = NotFound desc = could not 
find container \"0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68\": container with ID starting with 0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68 not found: ID does not exist" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.993368 5094 scope.go:117] "RemoveContainer" containerID="151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639" Feb 20 07:08:23 crc kubenswrapper[5094]: E0220 07:08:23.993664 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639\": container with ID starting with 151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639 not found: ID does not exist" containerID="151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.993683 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639"} err="failed to get container status \"151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639\": rpc error: code = NotFound desc = could not find container \"151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639\": container with ID starting with 151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639 not found: ID does not exist" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.993695 5094 scope.go:117] "RemoveContainer" containerID="0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.993930 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68"} err="failed to get container status \"0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68\": rpc error: code = NotFound desc = 
could not find container \"0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68\": container with ID starting with 0d099694ccd0e8bc3e756e0e2562f59756f1986414106043fd40158e539bfc68 not found: ID does not exist" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.993947 5094 scope.go:117] "RemoveContainer" containerID="151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.994510 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639"} err="failed to get container status \"151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639\": rpc error: code = NotFound desc = could not find container \"151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639\": container with ID starting with 151c15464dd4acc5eaeebb5192ced31371bdb90c04ab2251f9ac235bcbdf3639 not found: ID does not exist" Feb 20 07:08:23 crc kubenswrapper[5094]: I0220 07:08:23.994531 5094 scope.go:117] "RemoveContainer" containerID="515e3e41bcc23544014d7b617f381b5649aaa881e54eee2c7379b43ae15d6516" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.003491 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 20 07:08:24 crc kubenswrapper[5094]: E0220 07:08:24.004040 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="ceilometer-notification-agent" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.004053 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="ceilometer-notification-agent" Feb 20 07:08:24 crc kubenswrapper[5094]: E0220 07:08:24.004068 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26e7c194-b2d4-4578-90c0-d0f141b96bdd" containerName="cinder-api" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 
07:08:24.004077 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="26e7c194-b2d4-4578-90c0-d0f141b96bdd" containerName="cinder-api" Feb 20 07:08:24 crc kubenswrapper[5094]: E0220 07:08:24.004100 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="proxy-httpd" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.004106 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="proxy-httpd" Feb 20 07:08:24 crc kubenswrapper[5094]: E0220 07:08:24.004118 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="sg-core" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.004125 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="sg-core" Feb 20 07:08:24 crc kubenswrapper[5094]: E0220 07:08:24.004149 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26e7c194-b2d4-4578-90c0-d0f141b96bdd" containerName="cinder-api-log" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.004155 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="26e7c194-b2d4-4578-90c0-d0f141b96bdd" containerName="cinder-api-log" Feb 20 07:08:24 crc kubenswrapper[5094]: E0220 07:08:24.004165 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="ceilometer-central-agent" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.004170 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="ceilometer-central-agent" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.004347 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="sg-core" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.004362 5094 
memory_manager.go:354] "RemoveStaleState removing state" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="ceilometer-central-agent" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.004372 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="ceilometer-notification-agent" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.004381 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" containerName="proxy-httpd" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.004392 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="26e7c194-b2d4-4578-90c0-d0f141b96bdd" containerName="cinder-api-log" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.004406 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="26e7c194-b2d4-4578-90c0-d0f141b96bdd" containerName="cinder-api" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.005434 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.013643 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.015233 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.015449 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.018736 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.027039 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.037966 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.041833 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.042380 5094 scope.go:117] "RemoveContainer" containerID="735b44943bdf895ffeb305a8a9b07202be3d3f21805babf681c05f7b0d65ad7e" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.044628 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.050868 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.059352 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.090132 5094 scope.go:117] "RemoveContainer" containerID="8fbd0e1a5aa799bf4144c98e297964162b94c7e1f11e3c3574fd101cc616268d" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.116244 5094 scope.go:117] "RemoveContainer" containerID="154c237a8cda216b92583d85eef501f14b59d72a35964b0b4be6a9a7f8415f61" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.130026 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb6j8\" (UniqueName: \"kubernetes.io/projected/8f8cb333-2939-4404-b242-67bcf4e6875b-kube-api-access-lb6j8\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.130087 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.130126 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.130210 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-scripts\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.130295 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8f8cb333-2939-4404-b242-67bcf4e6875b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.130332 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.130364 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-config-data\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.130418 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f8cb333-2939-4404-b242-67bcf4e6875b-logs\") pod \"cinder-api-0\" (UID: 
\"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.130438 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-config-data-custom\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.224917 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.229090 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5ddb8575b6-4wznv" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.231508 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.231628 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f8cb333-2939-4404-b242-67bcf4e6875b-logs\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.231755 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-config-data-custom\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.231846 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-config-data\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.231945 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb6j8\" (UniqueName: \"kubernetes.io/projected/8f8cb333-2939-4404-b242-67bcf4e6875b-kube-api-access-lb6j8\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.232346 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.232725 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f8cb333-2939-4404-b242-67bcf4e6875b-logs\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.232940 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-scripts\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.233119 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.233166 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.233262 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-scripts\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.233315 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb81c29-b634-4b80-a18d-53ccfdd8dd40-run-httpd\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.233383 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp4r9\" (UniqueName: \"kubernetes.io/projected/efb81c29-b634-4b80-a18d-53ccfdd8dd40-kube-api-access-vp4r9\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.233454 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8f8cb333-2939-4404-b242-67bcf4e6875b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.233506 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8f8cb333-2939-4404-b242-67bcf4e6875b-etc-machine-id\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.233599 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.233712 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-config-data\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.233809 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb81c29-b634-4b80-a18d-53ccfdd8dd40-log-httpd\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.237886 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-public-tls-certs\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.238803 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-scripts\") pod \"cinder-api-0\" (UID: 
\"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.239126 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-config-data\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.240448 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-config-data-custom\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.240524 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.246855 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.257296 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb6j8\" (UniqueName: \"kubernetes.io/projected/8f8cb333-2939-4404-b242-67bcf4e6875b-kube-api-access-lb6j8\") pod \"cinder-api-0\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.335485 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-scripts\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.335743 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.335796 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.335967 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb81c29-b634-4b80-a18d-53ccfdd8dd40-run-httpd\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.336035 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp4r9\" (UniqueName: \"kubernetes.io/projected/efb81c29-b634-4b80-a18d-53ccfdd8dd40-kube-api-access-vp4r9\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.336423 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb81c29-b634-4b80-a18d-53ccfdd8dd40-log-httpd\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.336585 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.336721 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-config-data\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.337132 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb81c29-b634-4b80-a18d-53ccfdd8dd40-log-httpd\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.337255 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb81c29-b634-4b80-a18d-53ccfdd8dd40-run-httpd\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.339452 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-scripts\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.342943 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-config-data\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.349560 5094 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.350144 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.358782 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp4r9\" (UniqueName: \"kubernetes.io/projected/efb81c29-b634-4b80-a18d-53ccfdd8dd40-kube-api-access-vp4r9\") pod \"ceilometer-0\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.362961 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.532463 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7d8645fb77-xprwl"] Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.533249 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7d8645fb77-xprwl" podUID="6240f946-dbc4-4fdb-b831-23e76bfe2ebc" containerName="neutron-api" containerID="cri-o://a55515cfe2e873dc49fe2e0829229193d635d2147a43cdfaca8db2114d963017" gracePeriod=30 Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.533457 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7d8645fb77-xprwl" podUID="6240f946-dbc4-4fdb-b831-23e76bfe2ebc" containerName="neutron-httpd" containerID="cri-o://be6f73969b02eaf2a0704385ffcaff6d176cc3adc9262bf72bf4d87acc597ac5" gracePeriod=30 Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.577755 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-54bd68f77-fkqmr"] Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.584631 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.624011 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-54bd68f77-fkqmr"] Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.660932 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-combined-ca-bundle\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.661052 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-httpd-config\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.661094 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-public-tls-certs\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.661149 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-internal-tls-certs\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.661206 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-ovndb-tls-certs\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.661364 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmlpd\" (UniqueName: \"kubernetes.io/projected/530069d2-7146-46eb-9c88-056cc8a583b2-kube-api-access-wmlpd\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.661452 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-config\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.763628 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-config\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.763802 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-combined-ca-bundle\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.763834 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-httpd-config\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.763876 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-public-tls-certs\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.763911 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-internal-tls-certs\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.763966 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-ovndb-tls-certs\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.763993 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmlpd\" (UniqueName: \"kubernetes.io/projected/530069d2-7146-46eb-9c88-056cc8a583b2-kube-api-access-wmlpd\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.771876 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-internal-tls-certs\") pod 
\"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.774776 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-httpd-config\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.776574 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-public-tls-certs\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.777689 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-combined-ca-bundle\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.778964 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-config\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.779615 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-ovndb-tls-certs\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 
07:08:24.783624 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmlpd\" (UniqueName: \"kubernetes.io/projected/530069d2-7146-46eb-9c88-056cc8a583b2-kube-api-access-wmlpd\") pod \"neutron-54bd68f77-fkqmr\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.905006 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-7d8645fb77-xprwl" podUID="6240f946-dbc4-4fdb-b831-23e76bfe2ebc" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.150:9696/\": read tcp 10.217.0.2:41686->10.217.0.150:9696: read: connection reset by peer" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.925109 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-9d777b794-txl9q" podUID="e9b56f38-2467-4518-bdc3-d6ee665987da" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": read tcp 10.217.0.2:56714->10.217.0.157:9311: read: connection reset by peer" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.925138 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-9d777b794-txl9q" podUID="e9b56f38-2467-4518-bdc3-d6ee665987da" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": read tcp 10.217.0.2:56716->10.217.0.157:9311: read: connection reset by peer" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.933069 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:24 crc kubenswrapper[5094]: I0220 07:08:24.947767 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 20 07:08:25 crc kubenswrapper[5094]: W0220 07:08:25.012750 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f8cb333_2939_4404_b242_67bcf4e6875b.slice/crio-32ffade3cb1f0b317c39a0a7865ad0bdf56c99cb77f0149f0bfd6e89ce114623 WatchSource:0}: Error finding container 32ffade3cb1f0b317c39a0a7865ad0bdf56c99cb77f0149f0bfd6e89ce114623: Status 404 returned error can't find the container with id 32ffade3cb1f0b317c39a0a7865ad0bdf56c99cb77f0149f0bfd6e89ce114623 Feb 20 07:08:25 crc kubenswrapper[5094]: I0220 07:08:25.035093 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:08:25 crc kubenswrapper[5094]: I0220 07:08:25.691149 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-54bd68f77-fkqmr"] Feb 20 07:08:25 crc kubenswrapper[5094]: W0220 07:08:25.700690 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod530069d2_7146_46eb_9c88_056cc8a583b2.slice/crio-bd1a8216869bf24d8d0b2b58cd3351c7ec8fb6113dac3dda70cf8b0a5e81d368 WatchSource:0}: Error finding container bd1a8216869bf24d8d0b2b58cd3351c7ec8fb6113dac3dda70cf8b0a5e81d368: Status 404 returned error can't find the container with id bd1a8216869bf24d8d0b2b58cd3351c7ec8fb6113dac3dda70cf8b0a5e81d368 Feb 20 07:08:25 crc kubenswrapper[5094]: I0220 07:08:25.858933 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19d2d34d-f935-40e4-a27a-a382c7634da2" path="/var/lib/kubelet/pods/19d2d34d-f935-40e4-a27a-a382c7634da2/volumes" Feb 20 07:08:25 crc kubenswrapper[5094]: I0220 07:08:25.860685 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="26e7c194-b2d4-4578-90c0-d0f141b96bdd" path="/var/lib/kubelet/pods/26e7c194-b2d4-4578-90c0-d0f141b96bdd/volumes" Feb 20 07:08:25 crc kubenswrapper[5094]: I0220 07:08:25.931597 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-9d777b794-txl9q" Feb 20 07:08:25 crc kubenswrapper[5094]: I0220 07:08:25.939063 5094 generic.go:334] "Generic (PLEG): container finished" podID="e9b56f38-2467-4518-bdc3-d6ee665987da" containerID="3416645d606b6c11c1a4b3c83d2ebc2d75fd169ec7df3d3e76572e9a7ca20cfb" exitCode=0 Feb 20 07:08:25 crc kubenswrapper[5094]: I0220 07:08:25.939188 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-9d777b794-txl9q" Feb 20 07:08:25 crc kubenswrapper[5094]: I0220 07:08:25.939862 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9d777b794-txl9q" event={"ID":"e9b56f38-2467-4518-bdc3-d6ee665987da","Type":"ContainerDied","Data":"3416645d606b6c11c1a4b3c83d2ebc2d75fd169ec7df3d3e76572e9a7ca20cfb"} Feb 20 07:08:25 crc kubenswrapper[5094]: I0220 07:08:25.939893 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-9d777b794-txl9q" event={"ID":"e9b56f38-2467-4518-bdc3-d6ee665987da","Type":"ContainerDied","Data":"62bcfccc8f6311f78f2ee50f1468178552942ffa51a8e7c08d763e6569fd3de7"} Feb 20 07:08:25 crc kubenswrapper[5094]: I0220 07:08:25.939916 5094 scope.go:117] "RemoveContainer" containerID="3416645d606b6c11c1a4b3c83d2ebc2d75fd169ec7df3d3e76572e9a7ca20cfb" Feb 20 07:08:25 crc kubenswrapper[5094]: I0220 07:08:25.942057 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb81c29-b634-4b80-a18d-53ccfdd8dd40","Type":"ContainerStarted","Data":"45ba2822d7e80855ff79a5771c243c313af7e75204cbacc92b81dca39bd3e6ea"} Feb 20 07:08:25 crc kubenswrapper[5094]: I0220 07:08:25.946235 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"8f8cb333-2939-4404-b242-67bcf4e6875b","Type":"ContainerStarted","Data":"32ffade3cb1f0b317c39a0a7865ad0bdf56c99cb77f0149f0bfd6e89ce114623"} Feb 20 07:08:25 crc kubenswrapper[5094]: I0220 07:08:25.949979 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54bd68f77-fkqmr" event={"ID":"530069d2-7146-46eb-9c88-056cc8a583b2","Type":"ContainerStarted","Data":"bd1a8216869bf24d8d0b2b58cd3351c7ec8fb6113dac3dda70cf8b0a5e81d368"} Feb 20 07:08:25 crc kubenswrapper[5094]: I0220 07:08:25.966831 5094 generic.go:334] "Generic (PLEG): container finished" podID="6240f946-dbc4-4fdb-b831-23e76bfe2ebc" containerID="be6f73969b02eaf2a0704385ffcaff6d176cc3adc9262bf72bf4d87acc597ac5" exitCode=0 Feb 20 07:08:25 crc kubenswrapper[5094]: I0220 07:08:25.966879 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d8645fb77-xprwl" event={"ID":"6240f946-dbc4-4fdb-b831-23e76bfe2ebc","Type":"ContainerDied","Data":"be6f73969b02eaf2a0704385ffcaff6d176cc3adc9262bf72bf4d87acc597ac5"} Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.000305 5094 scope.go:117] "RemoveContainer" containerID="da39043890404f802fa5323fb43efc928dbb515543671fe7e4daa4998c4f6b42" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.026879 5094 scope.go:117] "RemoveContainer" containerID="3416645d606b6c11c1a4b3c83d2ebc2d75fd169ec7df3d3e76572e9a7ca20cfb" Feb 20 07:08:26 crc kubenswrapper[5094]: E0220 07:08:26.029140 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3416645d606b6c11c1a4b3c83d2ebc2d75fd169ec7df3d3e76572e9a7ca20cfb\": container with ID starting with 3416645d606b6c11c1a4b3c83d2ebc2d75fd169ec7df3d3e76572e9a7ca20cfb not found: ID does not exist" containerID="3416645d606b6c11c1a4b3c83d2ebc2d75fd169ec7df3d3e76572e9a7ca20cfb" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.029218 5094 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"3416645d606b6c11c1a4b3c83d2ebc2d75fd169ec7df3d3e76572e9a7ca20cfb"} err="failed to get container status \"3416645d606b6c11c1a4b3c83d2ebc2d75fd169ec7df3d3e76572e9a7ca20cfb\": rpc error: code = NotFound desc = could not find container \"3416645d606b6c11c1a4b3c83d2ebc2d75fd169ec7df3d3e76572e9a7ca20cfb\": container with ID starting with 3416645d606b6c11c1a4b3c83d2ebc2d75fd169ec7df3d3e76572e9a7ca20cfb not found: ID does not exist" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.029253 5094 scope.go:117] "RemoveContainer" containerID="da39043890404f802fa5323fb43efc928dbb515543671fe7e4daa4998c4f6b42" Feb 20 07:08:26 crc kubenswrapper[5094]: E0220 07:08:26.033064 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da39043890404f802fa5323fb43efc928dbb515543671fe7e4daa4998c4f6b42\": container with ID starting with da39043890404f802fa5323fb43efc928dbb515543671fe7e4daa4998c4f6b42 not found: ID does not exist" containerID="da39043890404f802fa5323fb43efc928dbb515543671fe7e4daa4998c4f6b42" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.033139 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da39043890404f802fa5323fb43efc928dbb515543671fe7e4daa4998c4f6b42"} err="failed to get container status \"da39043890404f802fa5323fb43efc928dbb515543671fe7e4daa4998c4f6b42\": rpc error: code = NotFound desc = could not find container \"da39043890404f802fa5323fb43efc928dbb515543671fe7e4daa4998c4f6b42\": container with ID starting with da39043890404f802fa5323fb43efc928dbb515543671fe7e4daa4998c4f6b42 not found: ID does not exist" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.107108 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twrlp\" (UniqueName: \"kubernetes.io/projected/e9b56f38-2467-4518-bdc3-d6ee665987da-kube-api-access-twrlp\") pod 
\"e9b56f38-2467-4518-bdc3-d6ee665987da\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.107678 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-config-data-custom\") pod \"e9b56f38-2467-4518-bdc3-d6ee665987da\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.107901 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9b56f38-2467-4518-bdc3-d6ee665987da-logs\") pod \"e9b56f38-2467-4518-bdc3-d6ee665987da\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.107945 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-config-data\") pod \"e9b56f38-2467-4518-bdc3-d6ee665987da\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.107977 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-combined-ca-bundle\") pod \"e9b56f38-2467-4518-bdc3-d6ee665987da\" (UID: \"e9b56f38-2467-4518-bdc3-d6ee665987da\") " Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.108678 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9b56f38-2467-4518-bdc3-d6ee665987da-logs" (OuterVolumeSpecName: "logs") pod "e9b56f38-2467-4518-bdc3-d6ee665987da" (UID: "e9b56f38-2467-4518-bdc3-d6ee665987da"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.117636 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e9b56f38-2467-4518-bdc3-d6ee665987da" (UID: "e9b56f38-2467-4518-bdc3-d6ee665987da"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.117671 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9b56f38-2467-4518-bdc3-d6ee665987da-kube-api-access-twrlp" (OuterVolumeSpecName: "kube-api-access-twrlp") pod "e9b56f38-2467-4518-bdc3-d6ee665987da" (UID: "e9b56f38-2467-4518-bdc3-d6ee665987da"). InnerVolumeSpecName "kube-api-access-twrlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.154681 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9b56f38-2467-4518-bdc3-d6ee665987da" (UID: "e9b56f38-2467-4518-bdc3-d6ee665987da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.184270 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-config-data" (OuterVolumeSpecName: "config-data") pod "e9b56f38-2467-4518-bdc3-d6ee665987da" (UID: "e9b56f38-2467-4518-bdc3-d6ee665987da"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.210661 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9b56f38-2467-4518-bdc3-d6ee665987da-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.210694 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.210718 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.210731 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twrlp\" (UniqueName: \"kubernetes.io/projected/e9b56f38-2467-4518-bdc3-d6ee665987da-kube-api-access-twrlp\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.210742 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9b56f38-2467-4518-bdc3-d6ee665987da-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.296021 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-9d777b794-txl9q"] Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.305202 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-9d777b794-txl9q"] Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.981034 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54bd68f77-fkqmr" 
event={"ID":"530069d2-7146-46eb-9c88-056cc8a583b2","Type":"ContainerStarted","Data":"5cb136f403a952c3f992bdf43a3607fce1325d571764a30723172fab59ca2ce3"} Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.981514 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54bd68f77-fkqmr" event={"ID":"530069d2-7146-46eb-9c88-056cc8a583b2","Type":"ContainerStarted","Data":"2d88bbf120f5f105b4c231e353a818cac99e32ccbc4f6e26a6b8f4bf8f8c6db4"} Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.981948 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.987782 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb81c29-b634-4b80-a18d-53ccfdd8dd40","Type":"ContainerStarted","Data":"cd05d437d28e343e62d60fa6d0fe82e9917aa7dc5f2c11ce6e688c742734555f"} Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.987807 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb81c29-b634-4b80-a18d-53ccfdd8dd40","Type":"ContainerStarted","Data":"53cbc5e72918f62d25b4fe3bae2605aaa2a03057afdb2e8c72b4e5f0c49f6ad8"} Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.989222 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8f8cb333-2939-4404-b242-67bcf4e6875b","Type":"ContainerStarted","Data":"d19cd01d15b92dc7454d6cf3fb5973b463816ef9279ed3d67fe67ccb7ea9cc15"} Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.989251 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8f8cb333-2939-4404-b242-67bcf4e6875b","Type":"ContainerStarted","Data":"e83bab219c26de7197a4c8d483fd96f4ccf30f290122d4606fa75843efcfaa32"} Feb 20 07:08:26 crc kubenswrapper[5094]: I0220 07:08:26.989612 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 20 
07:08:27 crc kubenswrapper[5094]: I0220 07:08:27.003559 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-54bd68f77-fkqmr" podStartSLOduration=3.00354013 podStartE2EDuration="3.00354013s" podCreationTimestamp="2026-02-20 07:08:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:08:27.002223088 +0000 UTC m=+1321.874849809" watchObservedRunningTime="2026-02-20 07:08:27.00354013 +0000 UTC m=+1321.876166841" Feb 20 07:08:27 crc kubenswrapper[5094]: I0220 07:08:27.034604 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.034576273 podStartE2EDuration="4.034576273s" podCreationTimestamp="2026-02-20 07:08:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:08:27.023651511 +0000 UTC m=+1321.896278222" watchObservedRunningTime="2026-02-20 07:08:27.034576273 +0000 UTC m=+1321.907202984" Feb 20 07:08:27 crc kubenswrapper[5094]: I0220 07:08:27.737842 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-7d8645fb77-xprwl" podUID="6240f946-dbc4-4fdb-b831-23e76bfe2ebc" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.150:9696/\": dial tcp 10.217.0.150:9696: connect: connection refused" Feb 20 07:08:27 crc kubenswrapper[5094]: I0220 07:08:27.854141 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9b56f38-2467-4518-bdc3-d6ee665987da" path="/var/lib/kubelet/pods/e9b56f38-2467-4518-bdc3-d6ee665987da/volumes" Feb 20 07:08:28 crc kubenswrapper[5094]: I0220 07:08:28.001831 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"efb81c29-b634-4b80-a18d-53ccfdd8dd40","Type":"ContainerStarted","Data":"3ba1ebcfadf5f58f002b4207065f964408fa6f65e36f58017a720dfb37eee266"} Feb 20 07:08:29 crc kubenswrapper[5094]: I0220 07:08:29.046089 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb81c29-b634-4b80-a18d-53ccfdd8dd40","Type":"ContainerStarted","Data":"a4ea8efcf9ce84e8f850ee05a73654082ccde16843f950d6cef65dc52a82c120"} Feb 20 07:08:29 crc kubenswrapper[5094]: I0220 07:08:29.047507 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 07:08:29 crc kubenswrapper[5094]: I0220 07:08:29.095831 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.577898866 podStartE2EDuration="6.095806668s" podCreationTimestamp="2026-02-20 07:08:23 +0000 UTC" firstStartedPulling="2026-02-20 07:08:25.071611071 +0000 UTC m=+1319.944237782" lastFinishedPulling="2026-02-20 07:08:28.589518843 +0000 UTC m=+1323.462145584" observedRunningTime="2026-02-20 07:08:29.08880324 +0000 UTC m=+1323.961429971" watchObservedRunningTime="2026-02-20 07:08:29.095806668 +0000 UTC m=+1323.968433389" Feb 20 07:08:29 crc kubenswrapper[5094]: I0220 07:08:29.534186 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:08:29 crc kubenswrapper[5094]: I0220 07:08:29.536344 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 20 07:08:29 crc kubenswrapper[5094]: I0220 07:08:29.669537 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9d49dd75f-pdb4g"] Feb 20 07:08:29 crc kubenswrapper[5094]: I0220 07:08:29.670321 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g" podUID="b2a1712c-5268-4203-bab0-c427e96b217b" containerName="dnsmasq-dns" 
containerID="cri-o://a7f1752e88d647cbe7b50383d081ac516e62abd971b333187ad49b37e7117299" gracePeriod=10 Feb 20 07:08:29 crc kubenswrapper[5094]: I0220 07:08:29.709777 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.062992 5094 generic.go:334] "Generic (PLEG): container finished" podID="6240f946-dbc4-4fdb-b831-23e76bfe2ebc" containerID="a55515cfe2e873dc49fe2e0829229193d635d2147a43cdfaca8db2114d963017" exitCode=0 Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.063097 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d8645fb77-xprwl" event={"ID":"6240f946-dbc4-4fdb-b831-23e76bfe2ebc","Type":"ContainerDied","Data":"a55515cfe2e873dc49fe2e0829229193d635d2147a43cdfaca8db2114d963017"} Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.064842 5094 generic.go:334] "Generic (PLEG): container finished" podID="b2a1712c-5268-4203-bab0-c427e96b217b" containerID="a7f1752e88d647cbe7b50383d081ac516e62abd971b333187ad49b37e7117299" exitCode=0 Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.065176 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="71648c9f-0170-413c-9f26-d169c9933469" containerName="cinder-scheduler" containerID="cri-o://0686cfd57649205f2e479e85d8bd4cc3d24d29e2bf21f8972dd18e12a66c7bbc" gracePeriod=30 Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.065634 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g" event={"ID":"b2a1712c-5268-4203-bab0-c427e96b217b","Type":"ContainerDied","Data":"a7f1752e88d647cbe7b50383d081ac516e62abd971b333187ad49b37e7117299"} Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.067228 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="71648c9f-0170-413c-9f26-d169c9933469" containerName="probe" 
containerID="cri-o://67148a4e684e1ddbbec96ddc574fd1f1d4b808c14bbd286654cdb053aeb29db9" gracePeriod=30 Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.336136 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.343107 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7d8645fb77-xprwl" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.518780 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-combined-ca-bundle\") pod \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.518912 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7krch\" (UniqueName: \"kubernetes.io/projected/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-kube-api-access-7krch\") pod \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.518969 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-public-tls-certs\") pod \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.519022 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-config\") pod \"b2a1712c-5268-4203-bab0-c427e96b217b\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.519072 5094 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-ovsdbserver-nb\") pod \"b2a1712c-5268-4203-bab0-c427e96b217b\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.519120 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-config\") pod \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.519197 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-dns-swift-storage-0\") pod \"b2a1712c-5268-4203-bab0-c427e96b217b\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.519237 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-internal-tls-certs\") pod \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.519288 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-dns-svc\") pod \"b2a1712c-5268-4203-bab0-c427e96b217b\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.519432 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-ovndb-tls-certs\") pod \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\" (UID: 
\"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.519463 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x72mx\" (UniqueName: \"kubernetes.io/projected/b2a1712c-5268-4203-bab0-c427e96b217b-kube-api-access-x72mx\") pod \"b2a1712c-5268-4203-bab0-c427e96b217b\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.519487 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-ovsdbserver-sb\") pod \"b2a1712c-5268-4203-bab0-c427e96b217b\" (UID: \"b2a1712c-5268-4203-bab0-c427e96b217b\") " Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.519523 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-httpd-config\") pod \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\" (UID: \"6240f946-dbc4-4fdb-b831-23e76bfe2ebc\") " Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.528057 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-kube-api-access-7krch" (OuterVolumeSpecName: "kube-api-access-7krch") pod "6240f946-dbc4-4fdb-b831-23e76bfe2ebc" (UID: "6240f946-dbc4-4fdb-b831-23e76bfe2ebc"). InnerVolumeSpecName "kube-api-access-7krch". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.528594 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2a1712c-5268-4203-bab0-c427e96b217b-kube-api-access-x72mx" (OuterVolumeSpecName: "kube-api-access-x72mx") pod "b2a1712c-5268-4203-bab0-c427e96b217b" (UID: "b2a1712c-5268-4203-bab0-c427e96b217b"). InnerVolumeSpecName "kube-api-access-x72mx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.559221 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "6240f946-dbc4-4fdb-b831-23e76bfe2ebc" (UID: "6240f946-dbc4-4fdb-b831-23e76bfe2ebc"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.587589 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b2a1712c-5268-4203-bab0-c427e96b217b" (UID: "b2a1712c-5268-4203-bab0-c427e96b217b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.593078 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-config" (OuterVolumeSpecName: "config") pod "b2a1712c-5268-4203-bab0-c427e96b217b" (UID: "b2a1712c-5268-4203-bab0-c427e96b217b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.606604 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6240f946-dbc4-4fdb-b831-23e76bfe2ebc" (UID: "6240f946-dbc4-4fdb-b831-23e76bfe2ebc"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.609527 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6240f946-dbc4-4fdb-b831-23e76bfe2ebc" (UID: "6240f946-dbc4-4fdb-b831-23e76bfe2ebc"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.615856 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b2a1712c-5268-4203-bab0-c427e96b217b" (UID: "b2a1712c-5268-4203-bab0-c427e96b217b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.623636 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7krch\" (UniqueName: \"kubernetes.io/projected/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-kube-api-access-7krch\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.623667 5094 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.623679 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.623690 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-ovsdbserver-nb\") on node \"crc\" 
DevicePath \"\"" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.623735 5094 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.623745 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x72mx\" (UniqueName: \"kubernetes.io/projected/b2a1712c-5268-4203-bab0-c427e96b217b-kube-api-access-x72mx\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.623755 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.623765 5094 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.624893 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6240f946-dbc4-4fdb-b831-23e76bfe2ebc" (UID: "6240f946-dbc4-4fdb-b831-23e76bfe2ebc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.627574 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b2a1712c-5268-4203-bab0-c427e96b217b" (UID: "b2a1712c-5268-4203-bab0-c427e96b217b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.632903 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b2a1712c-5268-4203-bab0-c427e96b217b" (UID: "b2a1712c-5268-4203-bab0-c427e96b217b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.634119 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-config" (OuterVolumeSpecName: "config") pod "6240f946-dbc4-4fdb-b831-23e76bfe2ebc" (UID: "6240f946-dbc4-4fdb-b831-23e76bfe2ebc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.637671 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "6240f946-dbc4-4fdb-b831-23e76bfe2ebc" (UID: "6240f946-dbc4-4fdb-b831-23e76bfe2ebc"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.726631 5094 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.726677 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.726690 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6240f946-dbc4-4fdb-b831-23e76bfe2ebc-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.726722 5094 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:30 crc kubenswrapper[5094]: I0220 07:08:30.726735 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b2a1712c-5268-4203-bab0-c427e96b217b-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:31 crc kubenswrapper[5094]: I0220 07:08:31.079950 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7d8645fb77-xprwl" event={"ID":"6240f946-dbc4-4fdb-b831-23e76bfe2ebc","Type":"ContainerDied","Data":"b9098b5dbcb409320185fc3b697229991f7ab044a2834551fda65cf47f38a5d4"} Feb 20 07:08:31 crc kubenswrapper[5094]: I0220 07:08:31.080044 5094 scope.go:117] "RemoveContainer" containerID="be6f73969b02eaf2a0704385ffcaff6d176cc3adc9262bf72bf4d87acc597ac5" Feb 20 07:08:31 crc kubenswrapper[5094]: I0220 07:08:31.080282 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7d8645fb77-xprwl" Feb 20 07:08:31 crc kubenswrapper[5094]: I0220 07:08:31.085519 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g" event={"ID":"b2a1712c-5268-4203-bab0-c427e96b217b","Type":"ContainerDied","Data":"0dddba45fa8488a7532914b19dd0a9c232300eec4679520cb1b28149a6920d2e"} Feb 20 07:08:31 crc kubenswrapper[5094]: I0220 07:08:31.085547 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9d49dd75f-pdb4g" Feb 20 07:08:31 crc kubenswrapper[5094]: I0220 07:08:31.087836 5094 generic.go:334] "Generic (PLEG): container finished" podID="71648c9f-0170-413c-9f26-d169c9933469" containerID="67148a4e684e1ddbbec96ddc574fd1f1d4b808c14bbd286654cdb053aeb29db9" exitCode=0 Feb 20 07:08:31 crc kubenswrapper[5094]: I0220 07:08:31.087914 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"71648c9f-0170-413c-9f26-d169c9933469","Type":"ContainerDied","Data":"67148a4e684e1ddbbec96ddc574fd1f1d4b808c14bbd286654cdb053aeb29db9"} Feb 20 07:08:31 crc kubenswrapper[5094]: I0220 07:08:31.104540 5094 scope.go:117] "RemoveContainer" containerID="a55515cfe2e873dc49fe2e0829229193d635d2147a43cdfaca8db2114d963017" Feb 20 07:08:31 crc kubenswrapper[5094]: I0220 07:08:31.134945 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7d8645fb77-xprwl"] Feb 20 07:08:31 crc kubenswrapper[5094]: I0220 07:08:31.150933 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7d8645fb77-xprwl"] Feb 20 07:08:31 crc kubenswrapper[5094]: I0220 07:08:31.152545 5094 scope.go:117] "RemoveContainer" containerID="a7f1752e88d647cbe7b50383d081ac516e62abd971b333187ad49b37e7117299" Feb 20 07:08:31 crc kubenswrapper[5094]: I0220 07:08:31.163059 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9d49dd75f-pdb4g"] Feb 20 07:08:31 crc 
kubenswrapper[5094]: I0220 07:08:31.173470 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9d49dd75f-pdb4g"] Feb 20 07:08:31 crc kubenswrapper[5094]: I0220 07:08:31.191849 5094 scope.go:117] "RemoveContainer" containerID="548f803cb834d46aa79471e7e46c2cf5bba78c70a36499567ada0307a507be4e" Feb 20 07:08:31 crc kubenswrapper[5094]: I0220 07:08:31.859402 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6240f946-dbc4-4fdb-b831-23e76bfe2ebc" path="/var/lib/kubelet/pods/6240f946-dbc4-4fdb-b831-23e76bfe2ebc/volumes" Feb 20 07:08:31 crc kubenswrapper[5094]: I0220 07:08:31.861724 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2a1712c-5268-4203-bab0-c427e96b217b" path="/var/lib/kubelet/pods/b2a1712c-5268-4203-bab0-c427e96b217b/volumes" Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.637864 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.800301 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/71648c9f-0170-413c-9f26-d169c9933469-etc-machine-id\") pod \"71648c9f-0170-413c-9f26-d169c9933469\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.800427 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-combined-ca-bundle\") pod \"71648c9f-0170-413c-9f26-d169c9933469\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.800479 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-scripts\") pod 
\"71648c9f-0170-413c-9f26-d169c9933469\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.800572 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25pbr\" (UniqueName: \"kubernetes.io/projected/71648c9f-0170-413c-9f26-d169c9933469-kube-api-access-25pbr\") pod \"71648c9f-0170-413c-9f26-d169c9933469\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.800620 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-config-data-custom\") pod \"71648c9f-0170-413c-9f26-d169c9933469\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.800650 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-config-data\") pod \"71648c9f-0170-413c-9f26-d169c9933469\" (UID: \"71648c9f-0170-413c-9f26-d169c9933469\") " Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.800874 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71648c9f-0170-413c-9f26-d169c9933469-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "71648c9f-0170-413c-9f26-d169c9933469" (UID: "71648c9f-0170-413c-9f26-d169c9933469"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.802927 5094 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/71648c9f-0170-413c-9f26-d169c9933469-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.809772 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-scripts" (OuterVolumeSpecName: "scripts") pod "71648c9f-0170-413c-9f26-d169c9933469" (UID: "71648c9f-0170-413c-9f26-d169c9933469"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.809793 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71648c9f-0170-413c-9f26-d169c9933469-kube-api-access-25pbr" (OuterVolumeSpecName: "kube-api-access-25pbr") pod "71648c9f-0170-413c-9f26-d169c9933469" (UID: "71648c9f-0170-413c-9f26-d169c9933469"). InnerVolumeSpecName "kube-api-access-25pbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.820880 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "71648c9f-0170-413c-9f26-d169c9933469" (UID: "71648c9f-0170-413c-9f26-d169c9933469"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.864218 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71648c9f-0170-413c-9f26-d169c9933469" (UID: "71648c9f-0170-413c-9f26-d169c9933469"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.909109 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25pbr\" (UniqueName: \"kubernetes.io/projected/71648c9f-0170-413c-9f26-d169c9933469-kube-api-access-25pbr\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.909152 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.909167 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.909181 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:33 crc kubenswrapper[5094]: I0220 07:08:33.950803 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-config-data" (OuterVolumeSpecName: "config-data") pod "71648c9f-0170-413c-9f26-d169c9933469" (UID: "71648c9f-0170-413c-9f26-d169c9933469"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.011700 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71648c9f-0170-413c-9f26-d169c9933469-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.128108 5094 generic.go:334] "Generic (PLEG): container finished" podID="71648c9f-0170-413c-9f26-d169c9933469" containerID="0686cfd57649205f2e479e85d8bd4cc3d24d29e2bf21f8972dd18e12a66c7bbc" exitCode=0 Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.128154 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"71648c9f-0170-413c-9f26-d169c9933469","Type":"ContainerDied","Data":"0686cfd57649205f2e479e85d8bd4cc3d24d29e2bf21f8972dd18e12a66c7bbc"} Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.128184 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"71648c9f-0170-413c-9f26-d169c9933469","Type":"ContainerDied","Data":"a5e9daf344d7df70576896055f545ff598209ddc3efe1ce29774c938244e44a7"} Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.128202 5094 scope.go:117] "RemoveContainer" containerID="67148a4e684e1ddbbec96ddc574fd1f1d4b808c14bbd286654cdb053aeb29db9" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.128199 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.153041 5094 scope.go:117] "RemoveContainer" containerID="0686cfd57649205f2e479e85d8bd4cc3d24d29e2bf21f8972dd18e12a66c7bbc" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.171562 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.180024 5094 scope.go:117] "RemoveContainer" containerID="67148a4e684e1ddbbec96ddc574fd1f1d4b808c14bbd286654cdb053aeb29db9" Feb 20 07:08:34 crc kubenswrapper[5094]: E0220 07:08:34.180519 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67148a4e684e1ddbbec96ddc574fd1f1d4b808c14bbd286654cdb053aeb29db9\": container with ID starting with 67148a4e684e1ddbbec96ddc574fd1f1d4b808c14bbd286654cdb053aeb29db9 not found: ID does not exist" containerID="67148a4e684e1ddbbec96ddc574fd1f1d4b808c14bbd286654cdb053aeb29db9" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.180569 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67148a4e684e1ddbbec96ddc574fd1f1d4b808c14bbd286654cdb053aeb29db9"} err="failed to get container status \"67148a4e684e1ddbbec96ddc574fd1f1d4b808c14bbd286654cdb053aeb29db9\": rpc error: code = NotFound desc = could not find container \"67148a4e684e1ddbbec96ddc574fd1f1d4b808c14bbd286654cdb053aeb29db9\": container with ID starting with 67148a4e684e1ddbbec96ddc574fd1f1d4b808c14bbd286654cdb053aeb29db9 not found: ID does not exist" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.180603 5094 scope.go:117] "RemoveContainer" containerID="0686cfd57649205f2e479e85d8bd4cc3d24d29e2bf21f8972dd18e12a66c7bbc" Feb 20 07:08:34 crc kubenswrapper[5094]: E0220 07:08:34.181137 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0686cfd57649205f2e479e85d8bd4cc3d24d29e2bf21f8972dd18e12a66c7bbc\": container with ID starting with 0686cfd57649205f2e479e85d8bd4cc3d24d29e2bf21f8972dd18e12a66c7bbc not found: ID does not exist" containerID="0686cfd57649205f2e479e85d8bd4cc3d24d29e2bf21f8972dd18e12a66c7bbc" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.181174 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0686cfd57649205f2e479e85d8bd4cc3d24d29e2bf21f8972dd18e12a66c7bbc"} err="failed to get container status \"0686cfd57649205f2e479e85d8bd4cc3d24d29e2bf21f8972dd18e12a66c7bbc\": rpc error: code = NotFound desc = could not find container \"0686cfd57649205f2e479e85d8bd4cc3d24d29e2bf21f8972dd18e12a66c7bbc\": container with ID starting with 0686cfd57649205f2e479e85d8bd4cc3d24d29e2bf21f8972dd18e12a66c7bbc not found: ID does not exist" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.201294 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.225805 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 07:08:34 crc kubenswrapper[5094]: E0220 07:08:34.226233 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6240f946-dbc4-4fdb-b831-23e76bfe2ebc" containerName="neutron-httpd" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.226254 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6240f946-dbc4-4fdb-b831-23e76bfe2ebc" containerName="neutron-httpd" Feb 20 07:08:34 crc kubenswrapper[5094]: E0220 07:08:34.226279 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9b56f38-2467-4518-bdc3-d6ee665987da" containerName="barbican-api-log" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.226286 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9b56f38-2467-4518-bdc3-d6ee665987da" containerName="barbican-api-log" Feb 20 07:08:34 crc 
kubenswrapper[5094]: E0220 07:08:34.226293 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2a1712c-5268-4203-bab0-c427e96b217b" containerName="init" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.226299 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2a1712c-5268-4203-bab0-c427e96b217b" containerName="init" Feb 20 07:08:34 crc kubenswrapper[5094]: E0220 07:08:34.226309 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9b56f38-2467-4518-bdc3-d6ee665987da" containerName="barbican-api" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.226315 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9b56f38-2467-4518-bdc3-d6ee665987da" containerName="barbican-api" Feb 20 07:08:34 crc kubenswrapper[5094]: E0220 07:08:34.226332 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6240f946-dbc4-4fdb-b831-23e76bfe2ebc" containerName="neutron-api" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.226341 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6240f946-dbc4-4fdb-b831-23e76bfe2ebc" containerName="neutron-api" Feb 20 07:08:34 crc kubenswrapper[5094]: E0220 07:08:34.226351 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71648c9f-0170-413c-9f26-d169c9933469" containerName="probe" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.226357 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="71648c9f-0170-413c-9f26-d169c9933469" containerName="probe" Feb 20 07:08:34 crc kubenswrapper[5094]: E0220 07:08:34.226365 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2a1712c-5268-4203-bab0-c427e96b217b" containerName="dnsmasq-dns" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.226371 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2a1712c-5268-4203-bab0-c427e96b217b" containerName="dnsmasq-dns" Feb 20 07:08:34 crc kubenswrapper[5094]: E0220 07:08:34.226385 5094 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="71648c9f-0170-413c-9f26-d169c9933469" containerName="cinder-scheduler" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.226392 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="71648c9f-0170-413c-9f26-d169c9933469" containerName="cinder-scheduler" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.226570 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9b56f38-2467-4518-bdc3-d6ee665987da" containerName="barbican-api" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.226586 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="6240f946-dbc4-4fdb-b831-23e76bfe2ebc" containerName="neutron-api" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.226596 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="6240f946-dbc4-4fdb-b831-23e76bfe2ebc" containerName="neutron-httpd" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.226609 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9b56f38-2467-4518-bdc3-d6ee665987da" containerName="barbican-api-log" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.226624 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2a1712c-5268-4203-bab0-c427e96b217b" containerName="dnsmasq-dns" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.226631 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="71648c9f-0170-413c-9f26-d169c9933469" containerName="cinder-scheduler" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.226637 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="71648c9f-0170-413c-9f26-d169c9933469" containerName="probe" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.227649 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.233240 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.237792 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.318491 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d01cbaa4-5543-4cd5-b098-7e4600819d32-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.318789 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.318931 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8b4w\" (UniqueName: \"kubernetes.io/projected/d01cbaa4-5543-4cd5-b098-7e4600819d32-kube-api-access-f8b4w\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.319010 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc 
kubenswrapper[5094]: I0220 07:08:34.319318 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-config-data\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.319534 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-scripts\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.421837 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-config-data\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.421933 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-scripts\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.421978 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d01cbaa4-5543-4cd5-b098-7e4600819d32-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.422003 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.422032 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8b4w\" (UniqueName: \"kubernetes.io/projected/d01cbaa4-5543-4cd5-b098-7e4600819d32-kube-api-access-f8b4w\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.422060 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.422440 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d01cbaa4-5543-4cd5-b098-7e4600819d32-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.427226 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-config-data\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.428805 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " 
pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.429300 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.435156 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-scripts\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.446017 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8b4w\" (UniqueName: \"kubernetes.io/projected/d01cbaa4-5543-4cd5-b098-7e4600819d32-kube-api-access-f8b4w\") pod \"cinder-scheduler-0\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.562219 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 07:08:34 crc kubenswrapper[5094]: I0220 07:08:34.976069 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:08:35 crc kubenswrapper[5094]: I0220 07:08:35.123055 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 07:08:35 crc kubenswrapper[5094]: I0220 07:08:35.163919 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d01cbaa4-5543-4cd5-b098-7e4600819d32","Type":"ContainerStarted","Data":"e629f6202467daa64ea1c5522af0e65990925325c9e7a625f6d5ea287157f10f"} Feb 20 07:08:35 crc kubenswrapper[5094]: I0220 07:08:35.858354 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71648c9f-0170-413c-9f26-d169c9933469" path="/var/lib/kubelet/pods/71648c9f-0170-413c-9f26-d169c9933469/volumes" Feb 20 07:08:36 crc kubenswrapper[5094]: I0220 07:08:36.005200 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:36 crc kubenswrapper[5094]: I0220 07:08:36.009687 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:08:36 crc kubenswrapper[5094]: I0220 07:08:36.183575 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d01cbaa4-5543-4cd5-b098-7e4600819d32","Type":"ContainerStarted","Data":"2d7a76b02a624c041d0a977a291c39fc1dacc4367a61490ce8714745def9a3c7"} Feb 20 07:08:36 crc kubenswrapper[5094]: I0220 07:08:36.853151 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 20 07:08:37 crc kubenswrapper[5094]: I0220 07:08:37.192406 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"d01cbaa4-5543-4cd5-b098-7e4600819d32","Type":"ContainerStarted","Data":"0d80098740f3555dc54ef0c65032de273b3ce866905e15dc8682ab9661be5b23"} Feb 20 07:08:37 crc kubenswrapper[5094]: I0220 07:08:37.219918 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.219889674 podStartE2EDuration="3.219889674s" podCreationTimestamp="2026-02-20 07:08:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:08:37.212193009 +0000 UTC m=+1332.084819720" watchObservedRunningTime="2026-02-20 07:08:37.219889674 +0000 UTC m=+1332.092516385" Feb 20 07:08:37 crc kubenswrapper[5094]: I0220 07:08:37.804527 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 20 07:08:37 crc kubenswrapper[5094]: I0220 07:08:37.805749 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 20 07:08:37 crc kubenswrapper[5094]: I0220 07:08:37.807919 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 20 07:08:37 crc kubenswrapper[5094]: I0220 07:08:37.808255 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-68tnv" Feb 20 07:08:37 crc kubenswrapper[5094]: I0220 07:08:37.808791 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 20 07:08:37 crc kubenswrapper[5094]: I0220 07:08:37.822757 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 20 07:08:37 crc kubenswrapper[5094]: I0220 07:08:37.990508 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-openstack-config-secret\") 
pod \"openstackclient\" (UID: \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " pod="openstack/openstackclient" Feb 20 07:08:37 crc kubenswrapper[5094]: I0220 07:08:37.991020 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " pod="openstack/openstackclient" Feb 20 07:08:37 crc kubenswrapper[5094]: I0220 07:08:37.991102 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-openstack-config\") pod \"openstackclient\" (UID: \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " pod="openstack/openstackclient" Feb 20 07:08:37 crc kubenswrapper[5094]: I0220 07:08:37.991158 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4p89\" (UniqueName: \"kubernetes.io/projected/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-kube-api-access-w4p89\") pod \"openstackclient\" (UID: \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.092780 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4p89\" (UniqueName: \"kubernetes.io/projected/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-kube-api-access-w4p89\") pod \"openstackclient\" (UID: \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.092885 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-openstack-config-secret\") pod \"openstackclient\" (UID: \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " 
pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.092952 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.093004 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-openstack-config\") pod \"openstackclient\" (UID: \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.093966 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-openstack-config\") pod \"openstackclient\" (UID: \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.101091 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-openstack-config-secret\") pod \"openstackclient\" (UID: \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.113071 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-combined-ca-bundle\") pod \"openstackclient\" (UID: \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.117167 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-w4p89\" (UniqueName: \"kubernetes.io/projected/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-kube-api-access-w4p89\") pod \"openstackclient\" (UID: \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.137433 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.491365 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.499950 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.518686 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.520164 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.536186 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.605780 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f8ca33ba-f76e-4352-b6f1-54588dd25285-openstack-config-secret\") pod \"openstackclient\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.605830 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjn6q\" (UniqueName: \"kubernetes.io/projected/f8ca33ba-f76e-4352-b6f1-54588dd25285-kube-api-access-hjn6q\") pod \"openstackclient\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc 
kubenswrapper[5094]: I0220 07:08:38.605863 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f8ca33ba-f76e-4352-b6f1-54588dd25285-openstack-config\") pod \"openstackclient\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.605883 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8ca33ba-f76e-4352-b6f1-54588dd25285-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.707564 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f8ca33ba-f76e-4352-b6f1-54588dd25285-openstack-config-secret\") pod \"openstackclient\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.707603 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjn6q\" (UniqueName: \"kubernetes.io/projected/f8ca33ba-f76e-4352-b6f1-54588dd25285-kube-api-access-hjn6q\") pod \"openstackclient\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.707640 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f8ca33ba-f76e-4352-b6f1-54588dd25285-openstack-config\") pod \"openstackclient\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.707663 5094 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8ca33ba-f76e-4352-b6f1-54588dd25285-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.708848 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f8ca33ba-f76e-4352-b6f1-54588dd25285-openstack-config\") pod \"openstackclient\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.716517 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8ca33ba-f76e-4352-b6f1-54588dd25285-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.726336 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f8ca33ba-f76e-4352-b6f1-54588dd25285-openstack-config-secret\") pod \"openstackclient\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.732233 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjn6q\" (UniqueName: \"kubernetes.io/projected/f8ca33ba-f76e-4352-b6f1-54588dd25285-kube-api-access-hjn6q\") pod \"openstackclient\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: I0220 07:08:38.847500 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 20 07:08:38 crc kubenswrapper[5094]: E0220 07:08:38.951254 5094 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 20 07:08:38 crc kubenswrapper[5094]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec_0(45cde7565f0964897c8b71abc1733e8bbf88261ce2cd0d6de493e6761dc91576): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"45cde7565f0964897c8b71abc1733e8bbf88261ce2cd0d6de493e6761dc91576" Netns:"/var/run/netns/9b9b216a-1781-43ac-9654-e754c1788f6b" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=45cde7565f0964897c8b71abc1733e8bbf88261ce2cd0d6de493e6761dc91576;K8S_POD_UID=6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: [openstack/openstackclient/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec:ovn-kubernetes]: error adding container to network "ovn-kubernetes": CNI request failed with status 400: '[openstack/openstackclient 45cde7565f0964897c8b71abc1733e8bbf88261ce2cd0d6de493e6761dc91576 network default NAD default] [openstack/openstackclient 45cde7565f0964897c8b71abc1733e8bbf88261ce2cd0d6de493e6761dc91576 network default NAD default] failed to configure pod interface: canceled old pod sandbox waiting for OVS port binding for 0a:58:0a:d9:00:a7 [10.217.0.167/23] Feb 20 07:08:38 crc kubenswrapper[5094]: ' Feb 20 07:08:38 crc kubenswrapper[5094]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 20 07:08:38 crc kubenswrapper[5094]: > Feb 20 07:08:38 crc kubenswrapper[5094]: E0220 07:08:38.951362 5094 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 20 07:08:38 crc kubenswrapper[5094]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec_0(45cde7565f0964897c8b71abc1733e8bbf88261ce2cd0d6de493e6761dc91576): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"45cde7565f0964897c8b71abc1733e8bbf88261ce2cd0d6de493e6761dc91576" Netns:"/var/run/netns/9b9b216a-1781-43ac-9654-e754c1788f6b" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=45cde7565f0964897c8b71abc1733e8bbf88261ce2cd0d6de493e6761dc91576;K8S_POD_UID=6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: [openstack/openstackclient/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec:ovn-kubernetes]: error adding container to network "ovn-kubernetes": CNI request failed with status 400: '[openstack/openstackclient 45cde7565f0964897c8b71abc1733e8bbf88261ce2cd0d6de493e6761dc91576 network default NAD default] [openstack/openstackclient 45cde7565f0964897c8b71abc1733e8bbf88261ce2cd0d6de493e6761dc91576 network default NAD default] failed to configure pod interface: canceled old pod sandbox waiting for OVS port binding for 0a:58:0a:d9:00:a7 [10.217.0.167/23] Feb 20 
07:08:38 crc kubenswrapper[5094]: ' Feb 20 07:08:38 crc kubenswrapper[5094]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 20 07:08:38 crc kubenswrapper[5094]: > pod="openstack/openstackclient" Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.287910 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.359509 5094 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec" podUID="f8ca33ba-f76e-4352-b6f1-54588dd25285" Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.378825 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.539067 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 20 07:08:39 crc kubenswrapper[5094]: W0220 07:08:39.543896 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf8ca33ba_f76e_4352_b6f1_54588dd25285.slice/crio-7a5b6df81f5a888ddc75f1e10288eb4ffa4908ce2890d2f759d537c7527243db WatchSource:0}: Error finding container 7a5b6df81f5a888ddc75f1e10288eb4ffa4908ce2890d2f759d537c7527243db: Status 404 returned error can't find the container with id 7a5b6df81f5a888ddc75f1e10288eb4ffa4908ce2890d2f759d537c7527243db Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.563120 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.566044 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-openstack-config\") pod \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\" (UID: \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.566139 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4p89\" (UniqueName: \"kubernetes.io/projected/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-kube-api-access-w4p89\") pod \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\" (UID: \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.566188 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-openstack-config-secret\") pod \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\" (UID: 
\"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.566548 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-combined-ca-bundle\") pod \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\" (UID: \"6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec\") " Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.566604 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec" (UID: "6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.567246 5094 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.573802 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec" (UID: "6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.574457 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-kube-api-access-w4p89" (OuterVolumeSpecName: "kube-api-access-w4p89") pod "6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec" (UID: "6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec"). InnerVolumeSpecName "kube-api-access-w4p89". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.576793 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec" (UID: "6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.669117 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.669158 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4p89\" (UniqueName: \"kubernetes.io/projected/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-kube-api-access-w4p89\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.669184 5094 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:39 crc kubenswrapper[5094]: I0220 07:08:39.852263 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec" path="/var/lib/kubelet/pods/6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec/volumes" Feb 20 07:08:40 crc kubenswrapper[5094]: I0220 07:08:40.301979 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f8ca33ba-f76e-4352-b6f1-54588dd25285","Type":"ContainerStarted","Data":"7a5b6df81f5a888ddc75f1e10288eb4ffa4908ce2890d2f759d537c7527243db"} Feb 20 07:08:40 crc kubenswrapper[5094]: I0220 07:08:40.302021 5094 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/openstackclient" Feb 20 07:08:40 crc kubenswrapper[5094]: I0220 07:08:40.315521 5094 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="6a1d52a6-2ed3-42c7-b3f3-2a7ca82c04ec" podUID="f8ca33ba-f76e-4352-b6f1-54588dd25285" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.546071 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6964856c75-f7xdp"] Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.548330 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.555989 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.556836 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.556963 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.560261 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6964856c75-f7xdp"] Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.713820 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-combined-ca-bundle\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.714031 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-internal-tls-certs\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.714116 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-config-data\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.714263 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk69b\" (UniqueName: \"kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-kube-api-access-lk69b\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.714335 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e30bbcd-c206-4a74-ae52-21462356babf-log-httpd\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.714622 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e30bbcd-c206-4a74-ae52-21462356babf-run-httpd\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.714741 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-etc-swift\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.714811 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-public-tls-certs\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.817130 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e30bbcd-c206-4a74-ae52-21462356babf-log-httpd\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.817227 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e30bbcd-c206-4a74-ae52-21462356babf-run-httpd\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.817261 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-etc-swift\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.817292 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-public-tls-certs\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.817333 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-combined-ca-bundle\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.817368 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-internal-tls-certs\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.817406 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-config-data\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.817454 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk69b\" (UniqueName: \"kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-kube-api-access-lk69b\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.817956 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/8e30bbcd-c206-4a74-ae52-21462356babf-log-httpd\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.818333 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e30bbcd-c206-4a74-ae52-21462356babf-run-httpd\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.824788 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-config-data\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.826155 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-combined-ca-bundle\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.826818 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-internal-tls-certs\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.827231 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-public-tls-certs\") pod 
\"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.827639 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-etc-swift\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.837610 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk69b\" (UniqueName: \"kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-kube-api-access-lk69b\") pod \"swift-proxy-6964856c75-f7xdp\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:41 crc kubenswrapper[5094]: I0220 07:08:41.872187 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:42 crc kubenswrapper[5094]: I0220 07:08:42.154344 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:08:42 crc kubenswrapper[5094]: I0220 07:08:42.155104 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerName="ceilometer-central-agent" containerID="cri-o://53cbc5e72918f62d25b4fe3bae2605aaa2a03057afdb2e8c72b4e5f0c49f6ad8" gracePeriod=30 Feb 20 07:08:42 crc kubenswrapper[5094]: I0220 07:08:42.157160 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerName="proxy-httpd" containerID="cri-o://a4ea8efcf9ce84e8f850ee05a73654082ccde16843f950d6cef65dc52a82c120" gracePeriod=30 Feb 20 07:08:42 crc kubenswrapper[5094]: I0220 07:08:42.157218 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerName="sg-core" containerID="cri-o://3ba1ebcfadf5f58f002b4207065f964408fa6f65e36f58017a720dfb37eee266" gracePeriod=30 Feb 20 07:08:42 crc kubenswrapper[5094]: I0220 07:08:42.157240 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerName="ceilometer-notification-agent" containerID="cri-o://cd05d437d28e343e62d60fa6d0fe82e9917aa7dc5f2c11ce6e688c742734555f" gracePeriod=30 Feb 20 07:08:42 crc kubenswrapper[5094]: I0220 07:08:42.163623 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 20 07:08:42 crc kubenswrapper[5094]: I0220 07:08:42.663086 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6964856c75-f7xdp"] Feb 20 07:08:43 crc kubenswrapper[5094]: I0220 
07:08:43.335332 5094 generic.go:334] "Generic (PLEG): container finished" podID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerID="3ba1ebcfadf5f58f002b4207065f964408fa6f65e36f58017a720dfb37eee266" exitCode=2 Feb 20 07:08:43 crc kubenswrapper[5094]: I0220 07:08:43.335410 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb81c29-b634-4b80-a18d-53ccfdd8dd40","Type":"ContainerDied","Data":"3ba1ebcfadf5f58f002b4207065f964408fa6f65e36f58017a720dfb37eee266"} Feb 20 07:08:43 crc kubenswrapper[5094]: I0220 07:08:43.336301 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb81c29-b634-4b80-a18d-53ccfdd8dd40","Type":"ContainerDied","Data":"cd05d437d28e343e62d60fa6d0fe82e9917aa7dc5f2c11ce6e688c742734555f"} Feb 20 07:08:43 crc kubenswrapper[5094]: I0220 07:08:43.336639 5094 generic.go:334] "Generic (PLEG): container finished" podID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerID="cd05d437d28e343e62d60fa6d0fe82e9917aa7dc5f2c11ce6e688c742734555f" exitCode=0 Feb 20 07:08:43 crc kubenswrapper[5094]: I0220 07:08:43.336668 5094 generic.go:334] "Generic (PLEG): container finished" podID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerID="53cbc5e72918f62d25b4fe3bae2605aaa2a03057afdb2e8c72b4e5f0c49f6ad8" exitCode=0 Feb 20 07:08:43 crc kubenswrapper[5094]: I0220 07:08:43.336766 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb81c29-b634-4b80-a18d-53ccfdd8dd40","Type":"ContainerDied","Data":"53cbc5e72918f62d25b4fe3bae2605aaa2a03057afdb2e8c72b4e5f0c49f6ad8"} Feb 20 07:08:43 crc kubenswrapper[5094]: I0220 07:08:43.339322 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6964856c75-f7xdp" event={"ID":"8e30bbcd-c206-4a74-ae52-21462356babf","Type":"ContainerStarted","Data":"9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c"} Feb 20 07:08:43 crc kubenswrapper[5094]: I0220 07:08:43.339348 5094 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6964856c75-f7xdp" event={"ID":"8e30bbcd-c206-4a74-ae52-21462356babf","Type":"ContainerStarted","Data":"e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96"} Feb 20 07:08:43 crc kubenswrapper[5094]: I0220 07:08:43.339359 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6964856c75-f7xdp" event={"ID":"8e30bbcd-c206-4a74-ae52-21462356babf","Type":"ContainerStarted","Data":"d9175bae901767e94daa13c4878599492cbc5a434fa253663ae2586b3df20eb3"} Feb 20 07:08:43 crc kubenswrapper[5094]: I0220 07:08:43.339687 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:43 crc kubenswrapper[5094]: I0220 07:08:43.339838 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:43 crc kubenswrapper[5094]: I0220 07:08:43.365239 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6964856c75-f7xdp" podStartSLOduration=2.36521837 podStartE2EDuration="2.36521837s" podCreationTimestamp="2026-02-20 07:08:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:08:43.36191687 +0000 UTC m=+1338.234543581" watchObservedRunningTime="2026-02-20 07:08:43.36521837 +0000 UTC m=+1338.237845081" Feb 20 07:08:43 crc kubenswrapper[5094]: I0220 07:08:43.983623 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.094618 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb81c29-b634-4b80-a18d-53ccfdd8dd40-log-httpd\") pod \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.094669 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-sg-core-conf-yaml\") pod \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.094751 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-config-data\") pod \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.094786 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb81c29-b634-4b80-a18d-53ccfdd8dd40-run-httpd\") pod \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.094856 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-combined-ca-bundle\") pod \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.094904 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vp4r9\" (UniqueName: 
\"kubernetes.io/projected/efb81c29-b634-4b80-a18d-53ccfdd8dd40-kube-api-access-vp4r9\") pod \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.094979 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-scripts\") pod \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\" (UID: \"efb81c29-b634-4b80-a18d-53ccfdd8dd40\") " Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.097924 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efb81c29-b634-4b80-a18d-53ccfdd8dd40-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "efb81c29-b634-4b80-a18d-53ccfdd8dd40" (UID: "efb81c29-b634-4b80-a18d-53ccfdd8dd40"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.098829 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efb81c29-b634-4b80-a18d-53ccfdd8dd40-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "efb81c29-b634-4b80-a18d-53ccfdd8dd40" (UID: "efb81c29-b634-4b80-a18d-53ccfdd8dd40"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.104843 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-scripts" (OuterVolumeSpecName: "scripts") pod "efb81c29-b634-4b80-a18d-53ccfdd8dd40" (UID: "efb81c29-b634-4b80-a18d-53ccfdd8dd40"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.105319 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efb81c29-b634-4b80-a18d-53ccfdd8dd40-kube-api-access-vp4r9" (OuterVolumeSpecName: "kube-api-access-vp4r9") pod "efb81c29-b634-4b80-a18d-53ccfdd8dd40" (UID: "efb81c29-b634-4b80-a18d-53ccfdd8dd40"). InnerVolumeSpecName "kube-api-access-vp4r9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.133841 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "efb81c29-b634-4b80-a18d-53ccfdd8dd40" (UID: "efb81c29-b634-4b80-a18d-53ccfdd8dd40"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.197431 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vp4r9\" (UniqueName: \"kubernetes.io/projected/efb81c29-b634-4b80-a18d-53ccfdd8dd40-kube-api-access-vp4r9\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.197475 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.197500 5094 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb81c29-b634-4b80-a18d-53ccfdd8dd40-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.197512 5094 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-sg-core-conf-yaml\") on node 
\"crc\" DevicePath \"\"" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.197525 5094 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb81c29-b634-4b80-a18d-53ccfdd8dd40-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.218779 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "efb81c29-b634-4b80-a18d-53ccfdd8dd40" (UID: "efb81c29-b634-4b80-a18d-53ccfdd8dd40"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.229033 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-config-data" (OuterVolumeSpecName: "config-data") pod "efb81c29-b634-4b80-a18d-53ccfdd8dd40" (UID: "efb81c29-b634-4b80-a18d-53ccfdd8dd40"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.309536 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.309585 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb81c29-b634-4b80-a18d-53ccfdd8dd40-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.358130 5094 generic.go:334] "Generic (PLEG): container finished" podID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerID="a4ea8efcf9ce84e8f850ee05a73654082ccde16843f950d6cef65dc52a82c120" exitCode=0 Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.358209 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.358260 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb81c29-b634-4b80-a18d-53ccfdd8dd40","Type":"ContainerDied","Data":"a4ea8efcf9ce84e8f850ee05a73654082ccde16843f950d6cef65dc52a82c120"} Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.358295 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb81c29-b634-4b80-a18d-53ccfdd8dd40","Type":"ContainerDied","Data":"45ba2822d7e80855ff79a5771c243c313af7e75204cbacc92b81dca39bd3e6ea"} Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.358313 5094 scope.go:117] "RemoveContainer" containerID="a4ea8efcf9ce84e8f850ee05a73654082ccde16843f950d6cef65dc52a82c120" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.396972 5094 scope.go:117] "RemoveContainer" containerID="3ba1ebcfadf5f58f002b4207065f964408fa6f65e36f58017a720dfb37eee266" Feb 20 07:08:44 crc 
kubenswrapper[5094]: I0220 07:08:44.428839 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.444780 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.457552 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:08:44 crc kubenswrapper[5094]: E0220 07:08:44.458056 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerName="proxy-httpd" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.458076 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerName="proxy-httpd" Feb 20 07:08:44 crc kubenswrapper[5094]: E0220 07:08:44.458090 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerName="ceilometer-central-agent" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.458099 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerName="ceilometer-central-agent" Feb 20 07:08:44 crc kubenswrapper[5094]: E0220 07:08:44.458110 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerName="sg-core" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.458117 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerName="sg-core" Feb 20 07:08:44 crc kubenswrapper[5094]: E0220 07:08:44.458127 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerName="ceilometer-notification-agent" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.458133 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" 
containerName="ceilometer-notification-agent" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.458331 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerName="sg-core" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.458352 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerName="proxy-httpd" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.458367 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerName="ceilometer-notification-agent" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.458378 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" containerName="ceilometer-central-agent" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.473966 5094 scope.go:117] "RemoveContainer" containerID="cd05d437d28e343e62d60fa6d0fe82e9917aa7dc5f2c11ce6e688c742734555f" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.474161 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.476414 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.476982 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.479060 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.511100 5094 scope.go:117] "RemoveContainer" containerID="53cbc5e72918f62d25b4fe3bae2605aaa2a03057afdb2e8c72b4e5f0c49f6ad8" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.556389 5094 scope.go:117] "RemoveContainer" containerID="a4ea8efcf9ce84e8f850ee05a73654082ccde16843f950d6cef65dc52a82c120" Feb 20 07:08:44 crc kubenswrapper[5094]: E0220 07:08:44.562978 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4ea8efcf9ce84e8f850ee05a73654082ccde16843f950d6cef65dc52a82c120\": container with ID starting with a4ea8efcf9ce84e8f850ee05a73654082ccde16843f950d6cef65dc52a82c120 not found: ID does not exist" containerID="a4ea8efcf9ce84e8f850ee05a73654082ccde16843f950d6cef65dc52a82c120" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.563028 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4ea8efcf9ce84e8f850ee05a73654082ccde16843f950d6cef65dc52a82c120"} err="failed to get container status \"a4ea8efcf9ce84e8f850ee05a73654082ccde16843f950d6cef65dc52a82c120\": rpc error: code = NotFound desc = could not find container \"a4ea8efcf9ce84e8f850ee05a73654082ccde16843f950d6cef65dc52a82c120\": container with ID starting with a4ea8efcf9ce84e8f850ee05a73654082ccde16843f950d6cef65dc52a82c120 not found: ID does not exist" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 
07:08:44.563058 5094 scope.go:117] "RemoveContainer" containerID="3ba1ebcfadf5f58f002b4207065f964408fa6f65e36f58017a720dfb37eee266" Feb 20 07:08:44 crc kubenswrapper[5094]: E0220 07:08:44.563664 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ba1ebcfadf5f58f002b4207065f964408fa6f65e36f58017a720dfb37eee266\": container with ID starting with 3ba1ebcfadf5f58f002b4207065f964408fa6f65e36f58017a720dfb37eee266 not found: ID does not exist" containerID="3ba1ebcfadf5f58f002b4207065f964408fa6f65e36f58017a720dfb37eee266" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.563885 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ba1ebcfadf5f58f002b4207065f964408fa6f65e36f58017a720dfb37eee266"} err="failed to get container status \"3ba1ebcfadf5f58f002b4207065f964408fa6f65e36f58017a720dfb37eee266\": rpc error: code = NotFound desc = could not find container \"3ba1ebcfadf5f58f002b4207065f964408fa6f65e36f58017a720dfb37eee266\": container with ID starting with 3ba1ebcfadf5f58f002b4207065f964408fa6f65e36f58017a720dfb37eee266 not found: ID does not exist" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.563969 5094 scope.go:117] "RemoveContainer" containerID="cd05d437d28e343e62d60fa6d0fe82e9917aa7dc5f2c11ce6e688c742734555f" Feb 20 07:08:44 crc kubenswrapper[5094]: E0220 07:08:44.564327 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd05d437d28e343e62d60fa6d0fe82e9917aa7dc5f2c11ce6e688c742734555f\": container with ID starting with cd05d437d28e343e62d60fa6d0fe82e9917aa7dc5f2c11ce6e688c742734555f not found: ID does not exist" containerID="cd05d437d28e343e62d60fa6d0fe82e9917aa7dc5f2c11ce6e688c742734555f" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.564377 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cd05d437d28e343e62d60fa6d0fe82e9917aa7dc5f2c11ce6e688c742734555f"} err="failed to get container status \"cd05d437d28e343e62d60fa6d0fe82e9917aa7dc5f2c11ce6e688c742734555f\": rpc error: code = NotFound desc = could not find container \"cd05d437d28e343e62d60fa6d0fe82e9917aa7dc5f2c11ce6e688c742734555f\": container with ID starting with cd05d437d28e343e62d60fa6d0fe82e9917aa7dc5f2c11ce6e688c742734555f not found: ID does not exist" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.564414 5094 scope.go:117] "RemoveContainer" containerID="53cbc5e72918f62d25b4fe3bae2605aaa2a03057afdb2e8c72b4e5f0c49f6ad8" Feb 20 07:08:44 crc kubenswrapper[5094]: E0220 07:08:44.564654 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53cbc5e72918f62d25b4fe3bae2605aaa2a03057afdb2e8c72b4e5f0c49f6ad8\": container with ID starting with 53cbc5e72918f62d25b4fe3bae2605aaa2a03057afdb2e8c72b4e5f0c49f6ad8 not found: ID does not exist" containerID="53cbc5e72918f62d25b4fe3bae2605aaa2a03057afdb2e8c72b4e5f0c49f6ad8" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.564684 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53cbc5e72918f62d25b4fe3bae2605aaa2a03057afdb2e8c72b4e5f0c49f6ad8"} err="failed to get container status \"53cbc5e72918f62d25b4fe3bae2605aaa2a03057afdb2e8c72b4e5f0c49f6ad8\": rpc error: code = NotFound desc = could not find container \"53cbc5e72918f62d25b4fe3bae2605aaa2a03057afdb2e8c72b4e5f0c49f6ad8\": container with ID starting with 53cbc5e72918f62d25b4fe3bae2605aaa2a03057afdb2e8c72b4e5f0c49f6ad8 not found: ID does not exist" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.614563 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-sg-core-conf-yaml\") pod \"ceilometer-0\" 
(UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.614718 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7thfk\" (UniqueName: \"kubernetes.io/projected/7e271af7-d690-418f-a044-e9a87e519a5a-kube-api-access-7thfk\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.614745 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.614762 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-config-data\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.614850 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e271af7-d690-418f-a044-e9a87e519a5a-run-httpd\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.614872 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-scripts\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 
07:08:44.614928 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e271af7-d690-418f-a044-e9a87e519a5a-log-httpd\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.717446 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.717532 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7thfk\" (UniqueName: \"kubernetes.io/projected/7e271af7-d690-418f-a044-e9a87e519a5a-kube-api-access-7thfk\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.717571 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.717612 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-config-data\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.717673 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7e271af7-d690-418f-a044-e9a87e519a5a-run-httpd\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.717699 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-scripts\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.717743 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e271af7-d690-418f-a044-e9a87e519a5a-log-httpd\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.718412 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e271af7-d690-418f-a044-e9a87e519a5a-log-httpd\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.718841 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e271af7-d690-418f-a044-e9a87e519a5a-run-httpd\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.727582 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.727820 5094 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-scripts\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.728221 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.728986 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-config-data\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.737851 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7thfk\" (UniqueName: \"kubernetes.io/projected/7e271af7-d690-418f-a044-e9a87e519a5a-kube-api-access-7thfk\") pod \"ceilometer-0\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " pod="openstack/ceilometer-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.785947 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 20 07:08:44 crc kubenswrapper[5094]: I0220 07:08:44.813098 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:08:45 crc kubenswrapper[5094]: I0220 07:08:45.302299 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:08:45 crc kubenswrapper[5094]: I0220 07:08:45.374722 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e271af7-d690-418f-a044-e9a87e519a5a","Type":"ContainerStarted","Data":"72664ddb60921a7c82bcb5bf7c13f327aadde894768d3fc2b11fc7a9a5e76ba1"} Feb 20 07:08:45 crc kubenswrapper[5094]: I0220 07:08:45.878598 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efb81c29-b634-4b80-a18d-53ccfdd8dd40" path="/var/lib/kubelet/pods/efb81c29-b634-4b80-a18d-53ccfdd8dd40/volumes" Feb 20 07:08:46 crc kubenswrapper[5094]: I0220 07:08:46.699169 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 07:08:46 crc kubenswrapper[5094]: I0220 07:08:46.699623 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" containerName="glance-log" containerID="cri-o://d030747f97a82fa85fa58357fe602c22cf85f61f284bbee33cd22316a4d560ea" gracePeriod=30 Feb 20 07:08:46 crc kubenswrapper[5094]: I0220 07:08:46.701854 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" containerName="glance-httpd" containerID="cri-o://f8a1428170b952f6757cb59aec0fc05c990728916f1e69bbff78756ee58aca18" gracePeriod=30 Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.398651 5094 generic.go:334] "Generic (PLEG): container finished" podID="8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" containerID="d030747f97a82fa85fa58357fe602c22cf85f61f284bbee33cd22316a4d560ea" exitCode=143 Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.398863 5094 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b","Type":"ContainerDied","Data":"d030747f97a82fa85fa58357fe602c22cf85f61f284bbee33cd22316a4d560ea"} Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.726972 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-wrxqf"] Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.728496 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-wrxqf" Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.745048 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-wrxqf"] Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.830753 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-hlntr"] Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.832544 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-hlntr" Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.852398 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-d63e-account-create-update-hkz6h"] Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.854092 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-d63e-account-create-update-hkz6h" Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.855767 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-hlntr"] Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.859863 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.866108 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d63e-account-create-update-hkz6h"] Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.894084 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32b230f5-7de4-450c-90e6-e9c18a0d9c0e-operator-scripts\") pod \"nova-api-db-create-wrxqf\" (UID: \"32b230f5-7de4-450c-90e6-e9c18a0d9c0e\") " pod="openstack/nova-api-db-create-wrxqf" Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.894770 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg46j\" (UniqueName: \"kubernetes.io/projected/32b230f5-7de4-450c-90e6-e9c18a0d9c0e-kube-api-access-xg46j\") pod \"nova-api-db-create-wrxqf\" (UID: \"32b230f5-7de4-450c-90e6-e9c18a0d9c0e\") " pod="openstack/nova-api-db-create-wrxqf" Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.928492 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-lb6l5"] Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.929730 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-lb6l5" Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.986535 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-lb6l5"] Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.996277 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0626209e-1ab6-4bd1-a5cc-35a2f6525e5b-operator-scripts\") pod \"nova-api-d63e-account-create-update-hkz6h\" (UID: \"0626209e-1ab6-4bd1-a5cc-35a2f6525e5b\") " pod="openstack/nova-api-d63e-account-create-update-hkz6h" Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.996339 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg46j\" (UniqueName: \"kubernetes.io/projected/32b230f5-7de4-450c-90e6-e9c18a0d9c0e-kube-api-access-xg46j\") pod \"nova-api-db-create-wrxqf\" (UID: \"32b230f5-7de4-450c-90e6-e9c18a0d9c0e\") " pod="openstack/nova-api-db-create-wrxqf" Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.996400 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m24hg\" (UniqueName: \"kubernetes.io/projected/c32c030e-7f51-4f5e-a4bb-c27288d8f2e9-kube-api-access-m24hg\") pod \"nova-cell0-db-create-hlntr\" (UID: \"c32c030e-7f51-4f5e-a4bb-c27288d8f2e9\") " pod="openstack/nova-cell0-db-create-hlntr" Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.997483 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c32c030e-7f51-4f5e-a4bb-c27288d8f2e9-operator-scripts\") pod \"nova-cell0-db-create-hlntr\" (UID: \"c32c030e-7f51-4f5e-a4bb-c27288d8f2e9\") " pod="openstack/nova-cell0-db-create-hlntr" Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.997628 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7hfc\" (UniqueName: \"kubernetes.io/projected/0626209e-1ab6-4bd1-a5cc-35a2f6525e5b-kube-api-access-z7hfc\") pod \"nova-api-d63e-account-create-update-hkz6h\" (UID: \"0626209e-1ab6-4bd1-a5cc-35a2f6525e5b\") " pod="openstack/nova-api-d63e-account-create-update-hkz6h" Feb 20 07:08:47 crc kubenswrapper[5094]: I0220 07:08:47.997691 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32b230f5-7de4-450c-90e6-e9c18a0d9c0e-operator-scripts\") pod \"nova-api-db-create-wrxqf\" (UID: \"32b230f5-7de4-450c-90e6-e9c18a0d9c0e\") " pod="openstack/nova-api-db-create-wrxqf" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.000129 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32b230f5-7de4-450c-90e6-e9c18a0d9c0e-operator-scripts\") pod \"nova-api-db-create-wrxqf\" (UID: \"32b230f5-7de4-450c-90e6-e9c18a0d9c0e\") " pod="openstack/nova-api-db-create-wrxqf" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.033206 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg46j\" (UniqueName: \"kubernetes.io/projected/32b230f5-7de4-450c-90e6-e9c18a0d9c0e-kube-api-access-xg46j\") pod \"nova-api-db-create-wrxqf\" (UID: \"32b230f5-7de4-450c-90e6-e9c18a0d9c0e\") " pod="openstack/nova-api-db-create-wrxqf" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.059855 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-675a-account-create-update-7f6v9"] Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.061476 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-675a-account-create-update-7f6v9" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.064109 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.067616 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-wrxqf" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.084056 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-675a-account-create-update-7f6v9"] Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.125068 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7hfc\" (UniqueName: \"kubernetes.io/projected/0626209e-1ab6-4bd1-a5cc-35a2f6525e5b-kube-api-access-z7hfc\") pod \"nova-api-d63e-account-create-update-hkz6h\" (UID: \"0626209e-1ab6-4bd1-a5cc-35a2f6525e5b\") " pod="openstack/nova-api-d63e-account-create-update-hkz6h" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.125792 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t67jv\" (UniqueName: \"kubernetes.io/projected/43cfca6d-55e3-431f-b5b8-2b8db44bcee0-kube-api-access-t67jv\") pod \"nova-cell1-db-create-lb6l5\" (UID: \"43cfca6d-55e3-431f-b5b8-2b8db44bcee0\") " pod="openstack/nova-cell1-db-create-lb6l5" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.125902 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43cfca6d-55e3-431f-b5b8-2b8db44bcee0-operator-scripts\") pod \"nova-cell1-db-create-lb6l5\" (UID: \"43cfca6d-55e3-431f-b5b8-2b8db44bcee0\") " pod="openstack/nova-cell1-db-create-lb6l5" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.126133 5094 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0626209e-1ab6-4bd1-a5cc-35a2f6525e5b-operator-scripts\") pod \"nova-api-d63e-account-create-update-hkz6h\" (UID: \"0626209e-1ab6-4bd1-a5cc-35a2f6525e5b\") " pod="openstack/nova-api-d63e-account-create-update-hkz6h" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.126272 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m24hg\" (UniqueName: \"kubernetes.io/projected/c32c030e-7f51-4f5e-a4bb-c27288d8f2e9-kube-api-access-m24hg\") pod \"nova-cell0-db-create-hlntr\" (UID: \"c32c030e-7f51-4f5e-a4bb-c27288d8f2e9\") " pod="openstack/nova-cell0-db-create-hlntr" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.126315 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsqkn\" (UniqueName: \"kubernetes.io/projected/50bf9176-b504-436f-a845-7ab55506a258-kube-api-access-fsqkn\") pod \"nova-cell0-675a-account-create-update-7f6v9\" (UID: \"50bf9176-b504-436f-a845-7ab55506a258\") " pod="openstack/nova-cell0-675a-account-create-update-7f6v9" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.126377 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c32c030e-7f51-4f5e-a4bb-c27288d8f2e9-operator-scripts\") pod \"nova-cell0-db-create-hlntr\" (UID: \"c32c030e-7f51-4f5e-a4bb-c27288d8f2e9\") " pod="openstack/nova-cell0-db-create-hlntr" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.126412 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50bf9176-b504-436f-a845-7ab55506a258-operator-scripts\") pod \"nova-cell0-675a-account-create-update-7f6v9\" (UID: \"50bf9176-b504-436f-a845-7ab55506a258\") " pod="openstack/nova-cell0-675a-account-create-update-7f6v9" Feb 20 07:08:48 crc 
kubenswrapper[5094]: I0220 07:08:48.128001 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0626209e-1ab6-4bd1-a5cc-35a2f6525e5b-operator-scripts\") pod \"nova-api-d63e-account-create-update-hkz6h\" (UID: \"0626209e-1ab6-4bd1-a5cc-35a2f6525e5b\") " pod="openstack/nova-api-d63e-account-create-update-hkz6h" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.128309 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c32c030e-7f51-4f5e-a4bb-c27288d8f2e9-operator-scripts\") pod \"nova-cell0-db-create-hlntr\" (UID: \"c32c030e-7f51-4f5e-a4bb-c27288d8f2e9\") " pod="openstack/nova-cell0-db-create-hlntr" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.156264 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.169873 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m24hg\" (UniqueName: \"kubernetes.io/projected/c32c030e-7f51-4f5e-a4bb-c27288d8f2e9-kube-api-access-m24hg\") pod \"nova-cell0-db-create-hlntr\" (UID: \"c32c030e-7f51-4f5e-a4bb-c27288d8f2e9\") " pod="openstack/nova-cell0-db-create-hlntr" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.177904 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7hfc\" (UniqueName: \"kubernetes.io/projected/0626209e-1ab6-4bd1-a5cc-35a2f6525e5b-kube-api-access-z7hfc\") pod \"nova-api-d63e-account-create-update-hkz6h\" (UID: \"0626209e-1ab6-4bd1-a5cc-35a2f6525e5b\") " pod="openstack/nova-api-d63e-account-create-update-hkz6h" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.182344 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-d63e-account-create-update-hkz6h" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.229121 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsqkn\" (UniqueName: \"kubernetes.io/projected/50bf9176-b504-436f-a845-7ab55506a258-kube-api-access-fsqkn\") pod \"nova-cell0-675a-account-create-update-7f6v9\" (UID: \"50bf9176-b504-436f-a845-7ab55506a258\") " pod="openstack/nova-cell0-675a-account-create-update-7f6v9" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.229195 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50bf9176-b504-436f-a845-7ab55506a258-operator-scripts\") pod \"nova-cell0-675a-account-create-update-7f6v9\" (UID: \"50bf9176-b504-436f-a845-7ab55506a258\") " pod="openstack/nova-cell0-675a-account-create-update-7f6v9" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.229273 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t67jv\" (UniqueName: \"kubernetes.io/projected/43cfca6d-55e3-431f-b5b8-2b8db44bcee0-kube-api-access-t67jv\") pod \"nova-cell1-db-create-lb6l5\" (UID: \"43cfca6d-55e3-431f-b5b8-2b8db44bcee0\") " pod="openstack/nova-cell1-db-create-lb6l5" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.229324 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43cfca6d-55e3-431f-b5b8-2b8db44bcee0-operator-scripts\") pod \"nova-cell1-db-create-lb6l5\" (UID: \"43cfca6d-55e3-431f-b5b8-2b8db44bcee0\") " pod="openstack/nova-cell1-db-create-lb6l5" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.232318 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50bf9176-b504-436f-a845-7ab55506a258-operator-scripts\") pod 
\"nova-cell0-675a-account-create-update-7f6v9\" (UID: \"50bf9176-b504-436f-a845-7ab55506a258\") " pod="openstack/nova-cell0-675a-account-create-update-7f6v9" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.235447 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43cfca6d-55e3-431f-b5b8-2b8db44bcee0-operator-scripts\") pod \"nova-cell1-db-create-lb6l5\" (UID: \"43cfca6d-55e3-431f-b5b8-2b8db44bcee0\") " pod="openstack/nova-cell1-db-create-lb6l5" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.267275 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t67jv\" (UniqueName: \"kubernetes.io/projected/43cfca6d-55e3-431f-b5b8-2b8db44bcee0-kube-api-access-t67jv\") pod \"nova-cell1-db-create-lb6l5\" (UID: \"43cfca6d-55e3-431f-b5b8-2b8db44bcee0\") " pod="openstack/nova-cell1-db-create-lb6l5" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.281409 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsqkn\" (UniqueName: \"kubernetes.io/projected/50bf9176-b504-436f-a845-7ab55506a258-kube-api-access-fsqkn\") pod \"nova-cell0-675a-account-create-update-7f6v9\" (UID: \"50bf9176-b504-436f-a845-7ab55506a258\") " pod="openstack/nova-cell0-675a-account-create-update-7f6v9" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.294077 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-8a74-account-create-update-cdhcw"] Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.296268 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8a74-account-create-update-cdhcw" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.302315 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-lb6l5" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.303185 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.324558 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8a74-account-create-update-cdhcw"] Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.331919 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fb81f20-1f88-4c11-a37a-31db4472afd2-operator-scripts\") pod \"nova-cell1-8a74-account-create-update-cdhcw\" (UID: \"7fb81f20-1f88-4c11-a37a-31db4472afd2\") " pod="openstack/nova-cell1-8a74-account-create-update-cdhcw" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.332060 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztt8k\" (UniqueName: \"kubernetes.io/projected/7fb81f20-1f88-4c11-a37a-31db4472afd2-kube-api-access-ztt8k\") pod \"nova-cell1-8a74-account-create-update-cdhcw\" (UID: \"7fb81f20-1f88-4c11-a37a-31db4472afd2\") " pod="openstack/nova-cell1-8a74-account-create-update-cdhcw" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.413929 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-675a-account-create-update-7f6v9" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.435992 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fb81f20-1f88-4c11-a37a-31db4472afd2-operator-scripts\") pod \"nova-cell1-8a74-account-create-update-cdhcw\" (UID: \"7fb81f20-1f88-4c11-a37a-31db4472afd2\") " pod="openstack/nova-cell1-8a74-account-create-update-cdhcw" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.436108 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztt8k\" (UniqueName: \"kubernetes.io/projected/7fb81f20-1f88-4c11-a37a-31db4472afd2-kube-api-access-ztt8k\") pod \"nova-cell1-8a74-account-create-update-cdhcw\" (UID: \"7fb81f20-1f88-4c11-a37a-31db4472afd2\") " pod="openstack/nova-cell1-8a74-account-create-update-cdhcw" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.437216 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fb81f20-1f88-4c11-a37a-31db4472afd2-operator-scripts\") pod \"nova-cell1-8a74-account-create-update-cdhcw\" (UID: \"7fb81f20-1f88-4c11-a37a-31db4472afd2\") " pod="openstack/nova-cell1-8a74-account-create-update-cdhcw" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.454173 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-hlntr" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.479725 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztt8k\" (UniqueName: \"kubernetes.io/projected/7fb81f20-1f88-4c11-a37a-31db4472afd2-kube-api-access-ztt8k\") pod \"nova-cell1-8a74-account-create-update-cdhcw\" (UID: \"7fb81f20-1f88-4c11-a37a-31db4472afd2\") " pod="openstack/nova-cell1-8a74-account-create-update-cdhcw" Feb 20 07:08:48 crc kubenswrapper[5094]: I0220 07:08:48.668051 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8a74-account-create-update-cdhcw" Feb 20 07:08:50 crc kubenswrapper[5094]: I0220 07:08:50.445869 5094 generic.go:334] "Generic (PLEG): container finished" podID="8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" containerID="f8a1428170b952f6757cb59aec0fc05c990728916f1e69bbff78756ee58aca18" exitCode=0 Feb 20 07:08:50 crc kubenswrapper[5094]: I0220 07:08:50.446229 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b","Type":"ContainerDied","Data":"f8a1428170b952f6757cb59aec0fc05c990728916f1e69bbff78756ee58aca18"} Feb 20 07:08:51 crc kubenswrapper[5094]: I0220 07:08:51.881690 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:51 crc kubenswrapper[5094]: I0220 07:08:51.884776 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:08:51 crc kubenswrapper[5094]: I0220 07:08:51.909518 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 07:08:51 crc kubenswrapper[5094]: I0220 07:08:51.909811 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" 
podUID="1b5ca0fd-39ce-46da-8a15-cf0d7265e060" containerName="glance-log" containerID="cri-o://3903f2d2312370e437aca364ecb299fb413db5a66b7bc53bb3fb92764eec4ca7" gracePeriod=30 Feb 20 07:08:51 crc kubenswrapper[5094]: I0220 07:08:51.909961 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="1b5ca0fd-39ce-46da-8a15-cf0d7265e060" containerName="glance-httpd" containerID="cri-o://763800845ad8e2fe479385b37abae471ee069ec13ca28f866a8628fcda5f7859" gracePeriod=30 Feb 20 07:08:52 crc kubenswrapper[5094]: I0220 07:08:52.528516 5094 generic.go:334] "Generic (PLEG): container finished" podID="1b5ca0fd-39ce-46da-8a15-cf0d7265e060" containerID="3903f2d2312370e437aca364ecb299fb413db5a66b7bc53bb3fb92764eec4ca7" exitCode=143 Feb 20 07:08:52 crc kubenswrapper[5094]: I0220 07:08:52.528605 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1b5ca0fd-39ce-46da-8a15-cf0d7265e060","Type":"ContainerDied","Data":"3903f2d2312370e437aca364ecb299fb413db5a66b7bc53bb3fb92764eec4ca7"} Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.124686 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.159085 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-internal-tls-certs\") pod \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.159208 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-httpd-run\") pod \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.159277 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-combined-ca-bundle\") pod \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.159321 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-logs\") pod \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.159364 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-config-data\") pod \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.159391 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-scripts\") pod \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.159415 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkqb7\" (UniqueName: \"kubernetes.io/projected/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-kube-api-access-pkqb7\") pod \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.159442 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\" (UID: \"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b\") " Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.162051 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-logs" (OuterVolumeSpecName: "logs") pod "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" (UID: "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.162245 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" (UID: "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.178879 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" (UID: "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.193362 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-kube-api-access-pkqb7" (OuterVolumeSpecName: "kube-api-access-pkqb7") pod "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" (UID: "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b"). InnerVolumeSpecName "kube-api-access-pkqb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.197246 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-scripts" (OuterVolumeSpecName: "scripts") pod "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" (UID: "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.246388 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" (UID: "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.261848 5094 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.261887 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.261900 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.261909 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.261919 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkqb7\" (UniqueName: \"kubernetes.io/projected/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-kube-api-access-pkqb7\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.261951 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.284977 5094 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.290822 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-config-data" (OuterVolumeSpecName: "config-data") pod "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" (UID: "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.294491 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" (UID: "8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.364089 5094 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.364459 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.364470 5094 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.570212 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-wrxqf"] Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.574357 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d63e-account-create-update-hkz6h"] Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.586899 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" 
event={"ID":"f8ca33ba-f76e-4352-b6f1-54588dd25285","Type":"ContainerStarted","Data":"1db2a566904b3dc511755f2dbe6958ef04fbbaf45c4039fb8bb4f2d0b8ee27fe"} Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.619122 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e271af7-d690-418f-a044-e9a87e519a5a","Type":"ContainerStarted","Data":"3f9647d815121349b11fd3acdfdd813a022852ebeb4d626a22c011ec079172fc"} Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.673684 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b","Type":"ContainerDied","Data":"c37272ac6f9e924740c6d7aa103c2e64f6efab2b196866681821b669409f2ee4"} Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.673770 5094 scope.go:117] "RemoveContainer" containerID="f8a1428170b952f6757cb59aec0fc05c990728916f1e69bbff78756ee58aca18" Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.674003 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.689004 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-hlntr"] Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.761873 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.6335531469999998 podStartE2EDuration="16.761848919s" podCreationTimestamp="2026-02-20 07:08:38 +0000 UTC" firstStartedPulling="2026-02-20 07:08:39.546597646 +0000 UTC m=+1334.419224357" lastFinishedPulling="2026-02-20 07:08:53.674893418 +0000 UTC m=+1348.547520129" observedRunningTime="2026-02-20 07:08:54.64953982 +0000 UTC m=+1349.522166531" watchObservedRunningTime="2026-02-20 07:08:54.761848919 +0000 UTC m=+1349.634475630" Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.828525 5094 scope.go:117] "RemoveContainer" containerID="d030747f97a82fa85fa58357fe602c22cf85f61f284bbee33cd22316a4d560ea" Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.859431 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.908319 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.932379 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 07:08:54 crc kubenswrapper[5094]: E0220 07:08:54.933008 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" containerName="glance-log" Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.933025 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" containerName="glance-log" Feb 20 07:08:54 crc kubenswrapper[5094]: E0220 07:08:54.933078 5094 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" containerName="glance-httpd" Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.933084 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" containerName="glance-httpd" Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.933335 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" containerName="glance-httpd" Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.933350 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" containerName="glance-log" Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.935278 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.938728 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.950570 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.951530 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.961291 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:08:54 crc kubenswrapper[5094]: I0220 07:08:54.975803 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-lb6l5"] Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.004037 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-675a-account-create-update-7f6v9"] Feb 20 07:08:55 crc kubenswrapper[5094]: W0220 07:08:55.066649 5094 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7fb81f20_1f88_4c11_a37a_31db4472afd2.slice/crio-76d709d0c242731b3e4167bd010802ba2e12adacc0ebc9c94ef4cef8fb201f1a WatchSource:0}: Error finding container 76d709d0c242731b3e4167bd010802ba2e12adacc0ebc9c94ef4cef8fb201f1a: Status 404 returned error can't find the container with id 76d709d0c242731b3e4167bd010802ba2e12adacc0ebc9c94ef4cef8fb201f1a Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.076262 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8a74-account-create-update-cdhcw"] Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.086672 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5ddb8575b6-4wznv"] Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.086984 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5ddb8575b6-4wznv" podUID="c5f4dc72-bb77-44c4-8058-4939958d7a48" containerName="neutron-api" containerID="cri-o://eb7da5e5d31261eb9e520b471e20dcda54e4d83a9a60405577b7def9033da931" gracePeriod=30 Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.087560 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5ddb8575b6-4wznv" podUID="c5f4dc72-bb77-44c4-8058-4939958d7a48" containerName="neutron-httpd" containerID="cri-o://419f643d1ccf43ee276ce35ac60b7269f52b21d76747dcf57ea44f551a26690a" gracePeriod=30 Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.088600 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.088739 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2fsw\" (UniqueName: \"kubernetes.io/projected/762a565c-672e-4127-a8c6-90f721eeda81-kube-api-access-b2fsw\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.088969 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.089056 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.089183 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/762a565c-672e-4127-a8c6-90f721eeda81-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.089368 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/762a565c-672e-4127-a8c6-90f721eeda81-logs\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:08:55 crc 
kubenswrapper[5094]: I0220 07:08:55.089495 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-scripts\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.089693 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-config-data\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.195618 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.195678 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.195730 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/762a565c-672e-4127-a8c6-90f721eeda81-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 
07:08:55.195811 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/762a565c-672e-4127-a8c6-90f721eeda81-logs\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.195860 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-scripts\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.195943 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-config-data\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.195983 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.196001 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2fsw\" (UniqueName: \"kubernetes.io/projected/762a565c-672e-4127-a8c6-90f721eeda81-kube-api-access-b2fsw\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.197378 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/762a565c-672e-4127-a8c6-90f721eeda81-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.197992 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.199266 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/762a565c-672e-4127-a8c6-90f721eeda81-logs\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.218679 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-scripts\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.218683 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.218832 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.237544 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2fsw\" (UniqueName: \"kubernetes.io/projected/762a565c-672e-4127-a8c6-90f721eeda81-kube-api-access-b2fsw\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.237960 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-config-data\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.276534 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " pod="openstack/glance-default-internal-api-0" Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.310249 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.691023 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lb6l5" event={"ID":"43cfca6d-55e3-431f-b5b8-2b8db44bcee0","Type":"ContainerStarted","Data":"1616ce3dec6e69416e60df4bd5866fdeda9b06b012b17e0a3ceddd1a84ee25c5"} Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.691091 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lb6l5" event={"ID":"43cfca6d-55e3-431f-b5b8-2b8db44bcee0","Type":"ContainerStarted","Data":"a329d7a0add1e392fa4985f19f6e93cb9612c786391b2490ffb4a1810885dc7c"} Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.694533 5094 generic.go:334] "Generic (PLEG): container finished" podID="0626209e-1ab6-4bd1-a5cc-35a2f6525e5b" containerID="6d3b6790676924518ae410dca9464ac17e8adb8be7c1c0809abd3e37c9afadec" exitCode=0 Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.694641 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d63e-account-create-update-hkz6h" event={"ID":"0626209e-1ab6-4bd1-a5cc-35a2f6525e5b","Type":"ContainerDied","Data":"6d3b6790676924518ae410dca9464ac17e8adb8be7c1c0809abd3e37c9afadec"} Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.694665 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d63e-account-create-update-hkz6h" event={"ID":"0626209e-1ab6-4bd1-a5cc-35a2f6525e5b","Type":"ContainerStarted","Data":"da11e79f81a9148438c1b80c421638abdc63b2dce3f3a517fc7c35f6a21bff49"} Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.717591 5094 generic.go:334] "Generic (PLEG): container finished" podID="c32c030e-7f51-4f5e-a4bb-c27288d8f2e9" containerID="f74e9fd620d60bb7c55d8ca9b94a45b983b355d7aa77e6d394eb827e69cef1af" exitCode=0 Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.717904 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-db-create-hlntr" event={"ID":"c32c030e-7f51-4f5e-a4bb-c27288d8f2e9","Type":"ContainerDied","Data":"f74e9fd620d60bb7c55d8ca9b94a45b983b355d7aa77e6d394eb827e69cef1af"} Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.717946 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-hlntr" event={"ID":"c32c030e-7f51-4f5e-a4bb-c27288d8f2e9","Type":"ContainerStarted","Data":"4a19cbf70032a85b200b21fb37bf3d894dcadf7dd6710eb8d150415435e649c4"} Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.722182 5094 generic.go:334] "Generic (PLEG): container finished" podID="32b230f5-7de4-450c-90e6-e9c18a0d9c0e" containerID="514774461bdf76f918de93cfcbabf0b67e1bca119186db34bd24f1a423cf7e05" exitCode=0 Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.722266 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-wrxqf" event={"ID":"32b230f5-7de4-450c-90e6-e9c18a0d9c0e","Type":"ContainerDied","Data":"514774461bdf76f918de93cfcbabf0b67e1bca119186db34bd24f1a423cf7e05"} Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.722303 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-wrxqf" event={"ID":"32b230f5-7de4-450c-90e6-e9c18a0d9c0e","Type":"ContainerStarted","Data":"026cfb43d0585fc1caa48d4ac0edb6168f335e6a26fa3ac7ecc6a4d36f2a9e37"} Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.726218 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-lb6l5" podStartSLOduration=8.726194435 podStartE2EDuration="8.726194435s" podCreationTimestamp="2026-02-20 07:08:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:08:55.712130118 +0000 UTC m=+1350.584756829" watchObservedRunningTime="2026-02-20 07:08:55.726194435 +0000 UTC m=+1350.598821146" Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 
07:08:55.732822 5094 generic.go:334] "Generic (PLEG): container finished" podID="1b5ca0fd-39ce-46da-8a15-cf0d7265e060" containerID="763800845ad8e2fe479385b37abae471ee069ec13ca28f866a8628fcda5f7859" exitCode=0 Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.732936 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1b5ca0fd-39ce-46da-8a15-cf0d7265e060","Type":"ContainerDied","Data":"763800845ad8e2fe479385b37abae471ee069ec13ca28f866a8628fcda5f7859"} Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.736144 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-675a-account-create-update-7f6v9" event={"ID":"50bf9176-b504-436f-a845-7ab55506a258","Type":"ContainerStarted","Data":"87c896e6058562619d75ebeaacab8cf5ef47914c9fd0a960cbfc01098b412e02"} Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.736191 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-675a-account-create-update-7f6v9" event={"ID":"50bf9176-b504-436f-a845-7ab55506a258","Type":"ContainerStarted","Data":"65625c1b377aaa04513c4aa31e214dd00d79211619df4ac8f4928a84644d7951"} Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.738429 5094 generic.go:334] "Generic (PLEG): container finished" podID="c5f4dc72-bb77-44c4-8058-4939958d7a48" containerID="419f643d1ccf43ee276ce35ac60b7269f52b21d76747dcf57ea44f551a26690a" exitCode=0 Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.738475 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5ddb8575b6-4wznv" event={"ID":"c5f4dc72-bb77-44c4-8058-4939958d7a48","Type":"ContainerDied","Data":"419f643d1ccf43ee276ce35ac60b7269f52b21d76747dcf57ea44f551a26690a"} Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.744974 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8a74-account-create-update-cdhcw" 
event={"ID":"7fb81f20-1f88-4c11-a37a-31db4472afd2","Type":"ContainerStarted","Data":"5c648081449bf27866a97d9ce26a3b96bf18b0b2aa6ccd28cfd7c308a8ad471b"} Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.745026 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8a74-account-create-update-cdhcw" event={"ID":"7fb81f20-1f88-4c11-a37a-31db4472afd2","Type":"ContainerStarted","Data":"76d709d0c242731b3e4167bd010802ba2e12adacc0ebc9c94ef4cef8fb201f1a"} Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.754030 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e271af7-d690-418f-a044-e9a87e519a5a","Type":"ContainerStarted","Data":"00acae1bf2e8033c37174db166bb17e6c15833da2f59f9b70ebeb491b8ac41ed"} Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.847313 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-8a74-account-create-update-cdhcw" podStartSLOduration=7.847282845 podStartE2EDuration="7.847282845s" podCreationTimestamp="2026-02-20 07:08:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:08:55.817841529 +0000 UTC m=+1350.690468240" watchObservedRunningTime="2026-02-20 07:08:55.847282845 +0000 UTC m=+1350.719909556" Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.880110 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b" path="/var/lib/kubelet/pods/8a59e3ae-7a18-436f-b0f8-a0dffa9bb24b/volumes" Feb 20 07:08:55 crc kubenswrapper[5094]: I0220 07:08:55.881024 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.219365 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.377410 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-public-tls-certs\") pod \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.377528 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-config-data\") pod \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.377590 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzzp4\" (UniqueName: \"kubernetes.io/projected/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-kube-api-access-wzzp4\") pod \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.377639 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.377725 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-combined-ca-bundle\") pod \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.377754 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-scripts\") pod \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.377845 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-logs\") pod \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.377914 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-httpd-run\") pod \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\" (UID: \"1b5ca0fd-39ce-46da-8a15-cf0d7265e060\") " Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.379043 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1b5ca0fd-39ce-46da-8a15-cf0d7265e060" (UID: "1b5ca0fd-39ce-46da-8a15-cf0d7265e060"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.387569 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "1b5ca0fd-39ce-46da-8a15-cf0d7265e060" (UID: "1b5ca0fd-39ce-46da-8a15-cf0d7265e060"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.388742 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-logs" (OuterVolumeSpecName: "logs") pod "1b5ca0fd-39ce-46da-8a15-cf0d7265e060" (UID: "1b5ca0fd-39ce-46da-8a15-cf0d7265e060"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.397021 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-kube-api-access-wzzp4" (OuterVolumeSpecName: "kube-api-access-wzzp4") pod "1b5ca0fd-39ce-46da-8a15-cf0d7265e060" (UID: "1b5ca0fd-39ce-46da-8a15-cf0d7265e060"). InnerVolumeSpecName "kube-api-access-wzzp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.402717 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-scripts" (OuterVolumeSpecName: "scripts") pod "1b5ca0fd-39ce-46da-8a15-cf0d7265e060" (UID: "1b5ca0fd-39ce-46da-8a15-cf0d7265e060"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.426155 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b5ca0fd-39ce-46da-8a15-cf0d7265e060" (UID: "1b5ca0fd-39ce-46da-8a15-cf0d7265e060"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.450608 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-config-data" (OuterVolumeSpecName: "config-data") pod "1b5ca0fd-39ce-46da-8a15-cf0d7265e060" (UID: "1b5ca0fd-39ce-46da-8a15-cf0d7265e060"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.478817 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1b5ca0fd-39ce-46da-8a15-cf0d7265e060" (UID: "1b5ca0fd-39ce-46da-8a15-cf0d7265e060"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.480322 5094 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.480360 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.480370 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzzp4\" (UniqueName: \"kubernetes.io/projected/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-kube-api-access-wzzp4\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.480403 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 20 
07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.480413 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.480421 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.480430 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.480438 5094 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1b5ca0fd-39ce-46da-8a15-cf0d7265e060-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.509844 5094 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.582732 5094 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.766969 5094 generic.go:334] "Generic (PLEG): container finished" podID="7fb81f20-1f88-4c11-a37a-31db4472afd2" containerID="5c648081449bf27866a97d9ce26a3b96bf18b0b2aa6ccd28cfd7c308a8ad471b" exitCode=0 Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.767042 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8a74-account-create-update-cdhcw" 
event={"ID":"7fb81f20-1f88-4c11-a37a-31db4472afd2","Type":"ContainerDied","Data":"5c648081449bf27866a97d9ce26a3b96bf18b0b2aa6ccd28cfd7c308a8ad471b"} Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.770571 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e271af7-d690-418f-a044-e9a87e519a5a","Type":"ContainerStarted","Data":"5a0434c6a826a37d6e7bed76e99b0191ef536eaf264b53bd220828627c872178"} Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.772724 5094 generic.go:334] "Generic (PLEG): container finished" podID="43cfca6d-55e3-431f-b5b8-2b8db44bcee0" containerID="1616ce3dec6e69416e60df4bd5866fdeda9b06b012b17e0a3ceddd1a84ee25c5" exitCode=0 Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.772764 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lb6l5" event={"ID":"43cfca6d-55e3-431f-b5b8-2b8db44bcee0","Type":"ContainerDied","Data":"1616ce3dec6e69416e60df4bd5866fdeda9b06b012b17e0a3ceddd1a84ee25c5"} Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.797015 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"1b5ca0fd-39ce-46da-8a15-cf0d7265e060","Type":"ContainerDied","Data":"d752ec9568b97c7a9a1e0ea7c10ce0973a08508f8beb84b53fb4fc5635c2706f"} Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.797082 5094 scope.go:117] "RemoveContainer" containerID="763800845ad8e2fe479385b37abae471ee069ec13ca28f866a8628fcda5f7859" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.797269 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.808261 5094 generic.go:334] "Generic (PLEG): container finished" podID="50bf9176-b504-436f-a845-7ab55506a258" containerID="87c896e6058562619d75ebeaacab8cf5ef47914c9fd0a960cbfc01098b412e02" exitCode=0 Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.808382 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-675a-account-create-update-7f6v9" event={"ID":"50bf9176-b504-436f-a845-7ab55506a258","Type":"ContainerDied","Data":"87c896e6058562619d75ebeaacab8cf5ef47914c9fd0a960cbfc01098b412e02"} Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.831810 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"762a565c-672e-4127-a8c6-90f721eeda81","Type":"ContainerStarted","Data":"fbee36bb89639ece4abdb6a81c5837582f4784307b3986814aecb8c79e38a15c"} Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.832149 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"762a565c-672e-4127-a8c6-90f721eeda81","Type":"ContainerStarted","Data":"ca1059f5843b1683c2b383612bdd39e42db92c13986bf2a30fa4cc0e0bdde634"} Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.898247 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.913563 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.937352 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 07:08:56 crc kubenswrapper[5094]: E0220 07:08:56.949836 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b5ca0fd-39ce-46da-8a15-cf0d7265e060" containerName="glance-log" Feb 20 07:08:56 crc 
kubenswrapper[5094]: I0220 07:08:56.949873 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b5ca0fd-39ce-46da-8a15-cf0d7265e060" containerName="glance-log" Feb 20 07:08:56 crc kubenswrapper[5094]: E0220 07:08:56.949911 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b5ca0fd-39ce-46da-8a15-cf0d7265e060" containerName="glance-httpd" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.949917 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b5ca0fd-39ce-46da-8a15-cf0d7265e060" containerName="glance-httpd" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.950182 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b5ca0fd-39ce-46da-8a15-cf0d7265e060" containerName="glance-httpd" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.950202 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b5ca0fd-39ce-46da-8a15-cf0d7265e060" containerName="glance-log" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.951208 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.956506 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.958971 5094 scope.go:117] "RemoveContainer" containerID="3903f2d2312370e437aca364ecb299fb413db5a66b7bc53bb3fb92764eec4ca7" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.959242 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 20 07:08:56 crc kubenswrapper[5094]: I0220 07:08:56.971508 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.095582 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8j56\" (UniqueName: \"kubernetes.io/projected/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-kube-api-access-z8j56\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.095683 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.095732 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-config-data\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 
crc kubenswrapper[5094]: I0220 07:08:57.095777 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-scripts\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.095806 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.095826 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.095848 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-logs\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.095871 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 
07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.202770 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.203194 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.203813 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.203228 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-logs\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.204197 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-logs\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.204236 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.204679 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8j56\" (UniqueName: \"kubernetes.io/projected/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-kube-api-access-z8j56\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.204756 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.204793 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-config-data\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.204854 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-scripts\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.205363 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.211253 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-config-data\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.216621 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.220935 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.227374 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8j56\" (UniqueName: \"kubernetes.io/projected/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-kube-api-access-z8j56\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.227529 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-scripts\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.261805 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d63e-account-create-update-hkz6h" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.271028 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.335894 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.409592 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7hfc\" (UniqueName: \"kubernetes.io/projected/0626209e-1ab6-4bd1-a5cc-35a2f6525e5b-kube-api-access-z7hfc\") pod \"0626209e-1ab6-4bd1-a5cc-35a2f6525e5b\" (UID: \"0626209e-1ab6-4bd1-a5cc-35a2f6525e5b\") " Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.409867 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0626209e-1ab6-4bd1-a5cc-35a2f6525e5b-operator-scripts\") pod \"0626209e-1ab6-4bd1-a5cc-35a2f6525e5b\" (UID: \"0626209e-1ab6-4bd1-a5cc-35a2f6525e5b\") " Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.410686 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0626209e-1ab6-4bd1-a5cc-35a2f6525e5b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0626209e-1ab6-4bd1-a5cc-35a2f6525e5b" (UID: 
"0626209e-1ab6-4bd1-a5cc-35a2f6525e5b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.443384 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0626209e-1ab6-4bd1-a5cc-35a2f6525e5b-kube-api-access-z7hfc" (OuterVolumeSpecName: "kube-api-access-z7hfc") pod "0626209e-1ab6-4bd1-a5cc-35a2f6525e5b" (UID: "0626209e-1ab6-4bd1-a5cc-35a2f6525e5b"). InnerVolumeSpecName "kube-api-access-z7hfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.513640 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0626209e-1ab6-4bd1-a5cc-35a2f6525e5b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.513681 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7hfc\" (UniqueName: \"kubernetes.io/projected/0626209e-1ab6-4bd1-a5cc-35a2f6525e5b-kube-api-access-z7hfc\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.714743 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-hlntr" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.785232 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-675a-account-create-update-7f6v9" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.832960 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-wrxqf" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.836727 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c32c030e-7f51-4f5e-a4bb-c27288d8f2e9-operator-scripts\") pod \"c32c030e-7f51-4f5e-a4bb-c27288d8f2e9\" (UID: \"c32c030e-7f51-4f5e-a4bb-c27288d8f2e9\") " Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.836915 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m24hg\" (UniqueName: \"kubernetes.io/projected/c32c030e-7f51-4f5e-a4bb-c27288d8f2e9-kube-api-access-m24hg\") pod \"c32c030e-7f51-4f5e-a4bb-c27288d8f2e9\" (UID: \"c32c030e-7f51-4f5e-a4bb-c27288d8f2e9\") " Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.849478 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c32c030e-7f51-4f5e-a4bb-c27288d8f2e9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c32c030e-7f51-4f5e-a4bb-c27288d8f2e9" (UID: "c32c030e-7f51-4f5e-a4bb-c27288d8f2e9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.891549 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b5ca0fd-39ce-46da-8a15-cf0d7265e060" path="/var/lib/kubelet/pods/1b5ca0fd-39ce-46da-8a15-cf0d7265e060/volumes" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.894362 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c32c030e-7f51-4f5e-a4bb-c27288d8f2e9-kube-api-access-m24hg" (OuterVolumeSpecName: "kube-api-access-m24hg") pod "c32c030e-7f51-4f5e-a4bb-c27288d8f2e9" (UID: "c32c030e-7f51-4f5e-a4bb-c27288d8f2e9"). InnerVolumeSpecName "kube-api-access-m24hg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.947294 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50bf9176-b504-436f-a845-7ab55506a258-operator-scripts\") pod \"50bf9176-b504-436f-a845-7ab55506a258\" (UID: \"50bf9176-b504-436f-a845-7ab55506a258\") " Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.947383 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32b230f5-7de4-450c-90e6-e9c18a0d9c0e-operator-scripts\") pod \"32b230f5-7de4-450c-90e6-e9c18a0d9c0e\" (UID: \"32b230f5-7de4-450c-90e6-e9c18a0d9c0e\") " Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.947473 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsqkn\" (UniqueName: \"kubernetes.io/projected/50bf9176-b504-436f-a845-7ab55506a258-kube-api-access-fsqkn\") pod \"50bf9176-b504-436f-a845-7ab55506a258\" (UID: \"50bf9176-b504-436f-a845-7ab55506a258\") " Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.947507 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg46j\" (UniqueName: \"kubernetes.io/projected/32b230f5-7de4-450c-90e6-e9c18a0d9c0e-kube-api-access-xg46j\") pod \"32b230f5-7de4-450c-90e6-e9c18a0d9c0e\" (UID: \"32b230f5-7de4-450c-90e6-e9c18a0d9c0e\") " Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.948301 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c32c030e-7f51-4f5e-a4bb-c27288d8f2e9-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.948324 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m24hg\" (UniqueName: 
\"kubernetes.io/projected/c32c030e-7f51-4f5e-a4bb-c27288d8f2e9-kube-api-access-m24hg\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.949564 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-675a-account-create-update-7f6v9" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.968361 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50bf9176-b504-436f-a845-7ab55506a258-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "50bf9176-b504-436f-a845-7ab55506a258" (UID: "50bf9176-b504-436f-a845-7ab55506a258"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.972201 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50bf9176-b504-436f-a845-7ab55506a258-kube-api-access-fsqkn" (OuterVolumeSpecName: "kube-api-access-fsqkn") pod "50bf9176-b504-436f-a845-7ab55506a258" (UID: "50bf9176-b504-436f-a845-7ab55506a258"). InnerVolumeSpecName "kube-api-access-fsqkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.972340 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32b230f5-7de4-450c-90e6-e9c18a0d9c0e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "32b230f5-7de4-450c-90e6-e9c18a0d9c0e" (UID: "32b230f5-7de4-450c-90e6-e9c18a0d9c0e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.973035 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-675a-account-create-update-7f6v9" event={"ID":"50bf9176-b504-436f-a845-7ab55506a258","Type":"ContainerDied","Data":"65625c1b377aaa04513c4aa31e214dd00d79211619df4ac8f4928a84644d7951"} Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.973176 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65625c1b377aaa04513c4aa31e214dd00d79211619df4ac8f4928a84644d7951" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.975166 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d63e-account-create-update-hkz6h" event={"ID":"0626209e-1ab6-4bd1-a5cc-35a2f6525e5b","Type":"ContainerDied","Data":"da11e79f81a9148438c1b80c421638abdc63b2dce3f3a517fc7c35f6a21bff49"} Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.975238 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da11e79f81a9148438c1b80c421638abdc63b2dce3f3a517fc7c35f6a21bff49" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.975390 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d63e-account-create-update-hkz6h" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.977758 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-hlntr" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.977829 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-hlntr" event={"ID":"c32c030e-7f51-4f5e-a4bb-c27288d8f2e9","Type":"ContainerDied","Data":"4a19cbf70032a85b200b21fb37bf3d894dcadf7dd6710eb8d150415435e649c4"} Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.977878 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a19cbf70032a85b200b21fb37bf3d894dcadf7dd6710eb8d150415435e649c4" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.979210 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-wrxqf" event={"ID":"32b230f5-7de4-450c-90e6-e9c18a0d9c0e","Type":"ContainerDied","Data":"026cfb43d0585fc1caa48d4ac0edb6168f335e6a26fa3ac7ecc6a4d36f2a9e37"} Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.979261 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="026cfb43d0585fc1caa48d4ac0edb6168f335e6a26fa3ac7ecc6a4d36f2a9e37" Feb 20 07:08:57 crc kubenswrapper[5094]: I0220 07:08:57.982438 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-wrxqf" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.026923 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32b230f5-7de4-450c-90e6-e9c18a0d9c0e-kube-api-access-xg46j" (OuterVolumeSpecName: "kube-api-access-xg46j") pod "32b230f5-7de4-450c-90e6-e9c18a0d9c0e" (UID: "32b230f5-7de4-450c-90e6-e9c18a0d9c0e"). InnerVolumeSpecName "kube-api-access-xg46j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.061645 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50bf9176-b504-436f-a845-7ab55506a258-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.061682 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32b230f5-7de4-450c-90e6-e9c18a0d9c0e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.061695 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsqkn\" (UniqueName: \"kubernetes.io/projected/50bf9176-b504-436f-a845-7ab55506a258-kube-api-access-fsqkn\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.061795 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg46j\" (UniqueName: \"kubernetes.io/projected/32b230f5-7de4-450c-90e6-e9c18a0d9c0e-kube-api-access-xg46j\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.204091 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.404281 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8a74-account-create-update-cdhcw" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.483476 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztt8k\" (UniqueName: \"kubernetes.io/projected/7fb81f20-1f88-4c11-a37a-31db4472afd2-kube-api-access-ztt8k\") pod \"7fb81f20-1f88-4c11-a37a-31db4472afd2\" (UID: \"7fb81f20-1f88-4c11-a37a-31db4472afd2\") " Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.483597 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fb81f20-1f88-4c11-a37a-31db4472afd2-operator-scripts\") pod \"7fb81f20-1f88-4c11-a37a-31db4472afd2\" (UID: \"7fb81f20-1f88-4c11-a37a-31db4472afd2\") " Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.491084 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fb81f20-1f88-4c11-a37a-31db4472afd2-kube-api-access-ztt8k" (OuterVolumeSpecName: "kube-api-access-ztt8k") pod "7fb81f20-1f88-4c11-a37a-31db4472afd2" (UID: "7fb81f20-1f88-4c11-a37a-31db4472afd2"). InnerVolumeSpecName "kube-api-access-ztt8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.491261 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fb81f20-1f88-4c11-a37a-31db4472afd2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7fb81f20-1f88-4c11-a37a-31db4472afd2" (UID: "7fb81f20-1f88-4c11-a37a-31db4472afd2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.500026 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-lb6l5" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.588318 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43cfca6d-55e3-431f-b5b8-2b8db44bcee0-operator-scripts\") pod \"43cfca6d-55e3-431f-b5b8-2b8db44bcee0\" (UID: \"43cfca6d-55e3-431f-b5b8-2b8db44bcee0\") " Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.588563 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t67jv\" (UniqueName: \"kubernetes.io/projected/43cfca6d-55e3-431f-b5b8-2b8db44bcee0-kube-api-access-t67jv\") pod \"43cfca6d-55e3-431f-b5b8-2b8db44bcee0\" (UID: \"43cfca6d-55e3-431f-b5b8-2b8db44bcee0\") " Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.588799 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43cfca6d-55e3-431f-b5b8-2b8db44bcee0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "43cfca6d-55e3-431f-b5b8-2b8db44bcee0" (UID: "43cfca6d-55e3-431f-b5b8-2b8db44bcee0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.589852 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/43cfca6d-55e3-431f-b5b8-2b8db44bcee0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.589878 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztt8k\" (UniqueName: \"kubernetes.io/projected/7fb81f20-1f88-4c11-a37a-31db4472afd2-kube-api-access-ztt8k\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.589891 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7fb81f20-1f88-4c11-a37a-31db4472afd2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.618239 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43cfca6d-55e3-431f-b5b8-2b8db44bcee0-kube-api-access-t67jv" (OuterVolumeSpecName: "kube-api-access-t67jv") pod "43cfca6d-55e3-431f-b5b8-2b8db44bcee0" (UID: "43cfca6d-55e3-431f-b5b8-2b8db44bcee0"). InnerVolumeSpecName "kube-api-access-t67jv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.692048 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t67jv\" (UniqueName: \"kubernetes.io/projected/43cfca6d-55e3-431f-b5b8-2b8db44bcee0-kube-api-access-t67jv\") on node \"crc\" DevicePath \"\"" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.992071 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-lb6l5" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.992067 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lb6l5" event={"ID":"43cfca6d-55e3-431f-b5b8-2b8db44bcee0","Type":"ContainerDied","Data":"a329d7a0add1e392fa4985f19f6e93cb9612c786391b2490ffb4a1810885dc7c"} Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.992157 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a329d7a0add1e392fa4985f19f6e93cb9612c786391b2490ffb4a1810885dc7c" Feb 20 07:08:58 crc kubenswrapper[5094]: I0220 07:08:58.994234 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"762a565c-672e-4127-a8c6-90f721eeda81","Type":"ContainerStarted","Data":"d1f6db0391d91d17006e26254d5724c2e1250f967872ca29f449b7f22386e51b"} Feb 20 07:08:59 crc kubenswrapper[5094]: I0220 07:08:59.006322 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8a74-account-create-update-cdhcw" Feb 20 07:08:59 crc kubenswrapper[5094]: I0220 07:08:59.006896 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8a74-account-create-update-cdhcw" event={"ID":"7fb81f20-1f88-4c11-a37a-31db4472afd2","Type":"ContainerDied","Data":"76d709d0c242731b3e4167bd010802ba2e12adacc0ebc9c94ef4cef8fb201f1a"} Feb 20 07:08:59 crc kubenswrapper[5094]: I0220 07:08:59.007027 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76d709d0c242731b3e4167bd010802ba2e12adacc0ebc9c94ef4cef8fb201f1a" Feb 20 07:08:59 crc kubenswrapper[5094]: I0220 07:08:59.020208 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.020188254 podStartE2EDuration="5.020188254s" podCreationTimestamp="2026-02-20 07:08:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:08:59.016868314 +0000 UTC m=+1353.889495025" watchObservedRunningTime="2026-02-20 07:08:59.020188254 +0000 UTC m=+1353.892814965" Feb 20 07:08:59 crc kubenswrapper[5094]: I0220 07:08:59.020520 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4","Type":"ContainerStarted","Data":"0f4330768d5d427a2d77f0009f9dd9c0b2e0c4e46d6fb295f8aa0f285169a62c"} Feb 20 07:08:59 crc kubenswrapper[5094]: I0220 07:08:59.021545 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4","Type":"ContainerStarted","Data":"85e8b39c918088f619ee9b44d81cb6828488069406841b038890d491ba98168a"} Feb 20 07:08:59 crc kubenswrapper[5094]: I0220 07:08:59.032073 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7e271af7-d690-418f-a044-e9a87e519a5a","Type":"ContainerStarted","Data":"a832ec9199a5419d0ba4f2f2126d3b84e07fb41f365601017d388685421a0168"} Feb 20 07:08:59 crc kubenswrapper[5094]: I0220 07:08:59.032296 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="ceilometer-central-agent" containerID="cri-o://3f9647d815121349b11fd3acdfdd813a022852ebeb4d626a22c011ec079172fc" gracePeriod=30 Feb 20 07:08:59 crc kubenswrapper[5094]: I0220 07:08:59.032372 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 07:08:59 crc kubenswrapper[5094]: I0220 07:08:59.032445 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="proxy-httpd" containerID="cri-o://a832ec9199a5419d0ba4f2f2126d3b84e07fb41f365601017d388685421a0168" gracePeriod=30 Feb 20 07:08:59 crc kubenswrapper[5094]: I0220 07:08:59.032488 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="sg-core" containerID="cri-o://5a0434c6a826a37d6e7bed76e99b0191ef536eaf264b53bd220828627c872178" gracePeriod=30 Feb 20 07:08:59 crc kubenswrapper[5094]: I0220 07:08:59.032528 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="ceilometer-notification-agent" containerID="cri-o://00acae1bf2e8033c37174db166bb17e6c15833da2f59f9b70ebeb491b8ac41ed" gracePeriod=30 Feb 20 07:08:59 crc kubenswrapper[5094]: I0220 07:08:59.070922 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.088236484 podStartE2EDuration="15.070895948s" podCreationTimestamp="2026-02-20 07:08:44 +0000 UTC" 
firstStartedPulling="2026-02-20 07:08:45.314673488 +0000 UTC m=+1340.187300199" lastFinishedPulling="2026-02-20 07:08:57.297332952 +0000 UTC m=+1352.169959663" observedRunningTime="2026-02-20 07:08:59.065005727 +0000 UTC m=+1353.937632448" watchObservedRunningTime="2026-02-20 07:08:59.070895948 +0000 UTC m=+1353.943522659" Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.072378 5094 generic.go:334] "Generic (PLEG): container finished" podID="c5f4dc72-bb77-44c4-8058-4939958d7a48" containerID="eb7da5e5d31261eb9e520b471e20dcda54e4d83a9a60405577b7def9033da931" exitCode=0 Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.074463 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5ddb8575b6-4wznv" event={"ID":"c5f4dc72-bb77-44c4-8058-4939958d7a48","Type":"ContainerDied","Data":"eb7da5e5d31261eb9e520b471e20dcda54e4d83a9a60405577b7def9033da931"} Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.085870 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4","Type":"ContainerStarted","Data":"5c666f7fe1bbd7b65dc971fe96a85660b5d4b37560d1db6e6ea2ac21f9782b5c"} Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.090990 5094 generic.go:334] "Generic (PLEG): container finished" podID="7e271af7-d690-418f-a044-e9a87e519a5a" containerID="a832ec9199a5419d0ba4f2f2126d3b84e07fb41f365601017d388685421a0168" exitCode=0 Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.091031 5094 generic.go:334] "Generic (PLEG): container finished" podID="7e271af7-d690-418f-a044-e9a87e519a5a" containerID="5a0434c6a826a37d6e7bed76e99b0191ef536eaf264b53bd220828627c872178" exitCode=2 Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.091041 5094 generic.go:334] "Generic (PLEG): container finished" podID="7e271af7-d690-418f-a044-e9a87e519a5a" containerID="00acae1bf2e8033c37174db166bb17e6c15833da2f59f9b70ebeb491b8ac41ed" exitCode=0 Feb 20 07:09:00 crc 
kubenswrapper[5094]: I0220 07:09:00.091550 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e271af7-d690-418f-a044-e9a87e519a5a","Type":"ContainerDied","Data":"a832ec9199a5419d0ba4f2f2126d3b84e07fb41f365601017d388685421a0168"} Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.091722 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e271af7-d690-418f-a044-e9a87e519a5a","Type":"ContainerDied","Data":"5a0434c6a826a37d6e7bed76e99b0191ef536eaf264b53bd220828627c872178"} Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.091811 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e271af7-d690-418f-a044-e9a87e519a5a","Type":"ContainerDied","Data":"00acae1bf2e8033c37174db166bb17e6c15833da2f59f9b70ebeb491b8ac41ed"} Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.124810 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.124784008 podStartE2EDuration="4.124784008s" podCreationTimestamp="2026-02-20 07:08:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:09:00.114842149 +0000 UTC m=+1354.987468860" watchObservedRunningTime="2026-02-20 07:09:00.124784008 +0000 UTC m=+1354.997410719" Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.387505 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5ddb8575b6-4wznv" Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.434416 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5fx4\" (UniqueName: \"kubernetes.io/projected/c5f4dc72-bb77-44c4-8058-4939958d7a48-kube-api-access-t5fx4\") pod \"c5f4dc72-bb77-44c4-8058-4939958d7a48\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.434488 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-config\") pod \"c5f4dc72-bb77-44c4-8058-4939958d7a48\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.434530 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-ovndb-tls-certs\") pod \"c5f4dc72-bb77-44c4-8058-4939958d7a48\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.434638 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-httpd-config\") pod \"c5f4dc72-bb77-44c4-8058-4939958d7a48\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.434765 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-combined-ca-bundle\") pod \"c5f4dc72-bb77-44c4-8058-4939958d7a48\" (UID: \"c5f4dc72-bb77-44c4-8058-4939958d7a48\") " Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.444254 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "c5f4dc72-bb77-44c4-8058-4939958d7a48" (UID: "c5f4dc72-bb77-44c4-8058-4939958d7a48"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.449917 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f4dc72-bb77-44c4-8058-4939958d7a48-kube-api-access-t5fx4" (OuterVolumeSpecName: "kube-api-access-t5fx4") pod "c5f4dc72-bb77-44c4-8058-4939958d7a48" (UID: "c5f4dc72-bb77-44c4-8058-4939958d7a48"). InnerVolumeSpecName "kube-api-access-t5fx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.495566 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c5f4dc72-bb77-44c4-8058-4939958d7a48" (UID: "c5f4dc72-bb77-44c4-8058-4939958d7a48"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.497284 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-config" (OuterVolumeSpecName: "config") pod "c5f4dc72-bb77-44c4-8058-4939958d7a48" (UID: "c5f4dc72-bb77-44c4-8058-4939958d7a48"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.540671 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.540757 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5fx4\" (UniqueName: \"kubernetes.io/projected/c5f4dc72-bb77-44c4-8058-4939958d7a48-kube-api-access-t5fx4\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.540771 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.540786 5094 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.546001 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "c5f4dc72-bb77-44c4-8058-4939958d7a48" (UID: "c5f4dc72-bb77-44c4-8058-4939958d7a48"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:00 crc kubenswrapper[5094]: I0220 07:09:00.642914 5094 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c5f4dc72-bb77-44c4-8058-4939958d7a48-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.085184 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.105914 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5ddb8575b6-4wznv" event={"ID":"c5f4dc72-bb77-44c4-8058-4939958d7a48","Type":"ContainerDied","Data":"7216118bc6764d988378bcc95f70afbc24e44597d2724806d33cbd64bb7f1c0b"} Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.106261 5094 scope.go:117] "RemoveContainer" containerID="419f643d1ccf43ee276ce35ac60b7269f52b21d76747dcf57ea44f551a26690a" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.106225 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5ddb8575b6-4wznv" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.113434 5094 generic.go:334] "Generic (PLEG): container finished" podID="7e271af7-d690-418f-a044-e9a87e519a5a" containerID="3f9647d815121349b11fd3acdfdd813a022852ebeb4d626a22c011ec079172fc" exitCode=0 Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.115426 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.117325 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e271af7-d690-418f-a044-e9a87e519a5a","Type":"ContainerDied","Data":"3f9647d815121349b11fd3acdfdd813a022852ebeb4d626a22c011ec079172fc"} Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.117662 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e271af7-d690-418f-a044-e9a87e519a5a","Type":"ContainerDied","Data":"72664ddb60921a7c82bcb5bf7c13f327aadde894768d3fc2b11fc7a9a5e76ba1"} Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.140474 5094 scope.go:117] "RemoveContainer" containerID="eb7da5e5d31261eb9e520b471e20dcda54e4d83a9a60405577b7def9033da931" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.141901 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5ddb8575b6-4wznv"] Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.152183 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-config-data\") pod \"7e271af7-d690-418f-a044-e9a87e519a5a\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.152262 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-sg-core-conf-yaml\") pod \"7e271af7-d690-418f-a044-e9a87e519a5a\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.152184 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5ddb8575b6-4wznv"] Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.154251 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e271af7-d690-418f-a044-e9a87e519a5a-run-httpd\") pod \"7e271af7-d690-418f-a044-e9a87e519a5a\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.154368 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7thfk\" (UniqueName: \"kubernetes.io/projected/7e271af7-d690-418f-a044-e9a87e519a5a-kube-api-access-7thfk\") pod \"7e271af7-d690-418f-a044-e9a87e519a5a\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.154422 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e271af7-d690-418f-a044-e9a87e519a5a-log-httpd\") pod \"7e271af7-d690-418f-a044-e9a87e519a5a\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.154606 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-combined-ca-bundle\") pod \"7e271af7-d690-418f-a044-e9a87e519a5a\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.154863 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-scripts\") pod \"7e271af7-d690-418f-a044-e9a87e519a5a\" (UID: \"7e271af7-d690-418f-a044-e9a87e519a5a\") " Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.155565 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e271af7-d690-418f-a044-e9a87e519a5a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7e271af7-d690-418f-a044-e9a87e519a5a" (UID: "7e271af7-d690-418f-a044-e9a87e519a5a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.155729 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e271af7-d690-418f-a044-e9a87e519a5a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7e271af7-d690-418f-a044-e9a87e519a5a" (UID: "7e271af7-d690-418f-a044-e9a87e519a5a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.156492 5094 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e271af7-d690-418f-a044-e9a87e519a5a-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.156524 5094 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e271af7-d690-418f-a044-e9a87e519a5a-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.165340 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-scripts" (OuterVolumeSpecName: "scripts") pod "7e271af7-d690-418f-a044-e9a87e519a5a" (UID: "7e271af7-d690-418f-a044-e9a87e519a5a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.166030 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e271af7-d690-418f-a044-e9a87e519a5a-kube-api-access-7thfk" (OuterVolumeSpecName: "kube-api-access-7thfk") pod "7e271af7-d690-418f-a044-e9a87e519a5a" (UID: "7e271af7-d690-418f-a044-e9a87e519a5a"). InnerVolumeSpecName "kube-api-access-7thfk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.178665 5094 scope.go:117] "RemoveContainer" containerID="a832ec9199a5419d0ba4f2f2126d3b84e07fb41f365601017d388685421a0168" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.260438 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.260472 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7thfk\" (UniqueName: \"kubernetes.io/projected/7e271af7-d690-418f-a044-e9a87e519a5a-kube-api-access-7thfk\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.276878 5094 scope.go:117] "RemoveContainer" containerID="5a0434c6a826a37d6e7bed76e99b0191ef536eaf264b53bd220828627c872178" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.282883 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7e271af7-d690-418f-a044-e9a87e519a5a" (UID: "7e271af7-d690-418f-a044-e9a87e519a5a"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.348796 5094 scope.go:117] "RemoveContainer" containerID="00acae1bf2e8033c37174db166bb17e6c15833da2f59f9b70ebeb491b8ac41ed" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.366830 5094 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.418496 5094 scope.go:117] "RemoveContainer" containerID="3f9647d815121349b11fd3acdfdd813a022852ebeb4d626a22c011ec079172fc" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.422189 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e271af7-d690-418f-a044-e9a87e519a5a" (UID: "7e271af7-d690-418f-a044-e9a87e519a5a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.464104 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-config-data" (OuterVolumeSpecName: "config-data") pod "7e271af7-d690-418f-a044-e9a87e519a5a" (UID: "7e271af7-d690-418f-a044-e9a87e519a5a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.468373 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.468411 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e271af7-d690-418f-a044-e9a87e519a5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.470081 5094 scope.go:117] "RemoveContainer" containerID="a832ec9199a5419d0ba4f2f2126d3b84e07fb41f365601017d388685421a0168" Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.470622 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a832ec9199a5419d0ba4f2f2126d3b84e07fb41f365601017d388685421a0168\": container with ID starting with a832ec9199a5419d0ba4f2f2126d3b84e07fb41f365601017d388685421a0168 not found: ID does not exist" containerID="a832ec9199a5419d0ba4f2f2126d3b84e07fb41f365601017d388685421a0168" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.470666 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a832ec9199a5419d0ba4f2f2126d3b84e07fb41f365601017d388685421a0168"} err="failed to get container status \"a832ec9199a5419d0ba4f2f2126d3b84e07fb41f365601017d388685421a0168\": rpc error: code = NotFound desc = could not find container \"a832ec9199a5419d0ba4f2f2126d3b84e07fb41f365601017d388685421a0168\": container with ID starting with a832ec9199a5419d0ba4f2f2126d3b84e07fb41f365601017d388685421a0168 not found: ID does not exist" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.470712 5094 scope.go:117] "RemoveContainer" 
containerID="5a0434c6a826a37d6e7bed76e99b0191ef536eaf264b53bd220828627c872178" Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.471299 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a0434c6a826a37d6e7bed76e99b0191ef536eaf264b53bd220828627c872178\": container with ID starting with 5a0434c6a826a37d6e7bed76e99b0191ef536eaf264b53bd220828627c872178 not found: ID does not exist" containerID="5a0434c6a826a37d6e7bed76e99b0191ef536eaf264b53bd220828627c872178" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.471358 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a0434c6a826a37d6e7bed76e99b0191ef536eaf264b53bd220828627c872178"} err="failed to get container status \"5a0434c6a826a37d6e7bed76e99b0191ef536eaf264b53bd220828627c872178\": rpc error: code = NotFound desc = could not find container \"5a0434c6a826a37d6e7bed76e99b0191ef536eaf264b53bd220828627c872178\": container with ID starting with 5a0434c6a826a37d6e7bed76e99b0191ef536eaf264b53bd220828627c872178 not found: ID does not exist" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.471390 5094 scope.go:117] "RemoveContainer" containerID="00acae1bf2e8033c37174db166bb17e6c15833da2f59f9b70ebeb491b8ac41ed" Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.471766 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00acae1bf2e8033c37174db166bb17e6c15833da2f59f9b70ebeb491b8ac41ed\": container with ID starting with 00acae1bf2e8033c37174db166bb17e6c15833da2f59f9b70ebeb491b8ac41ed not found: ID does not exist" containerID="00acae1bf2e8033c37174db166bb17e6c15833da2f59f9b70ebeb491b8ac41ed" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.471791 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"00acae1bf2e8033c37174db166bb17e6c15833da2f59f9b70ebeb491b8ac41ed"} err="failed to get container status \"00acae1bf2e8033c37174db166bb17e6c15833da2f59f9b70ebeb491b8ac41ed\": rpc error: code = NotFound desc = could not find container \"00acae1bf2e8033c37174db166bb17e6c15833da2f59f9b70ebeb491b8ac41ed\": container with ID starting with 00acae1bf2e8033c37174db166bb17e6c15833da2f59f9b70ebeb491b8ac41ed not found: ID does not exist" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.471807 5094 scope.go:117] "RemoveContainer" containerID="3f9647d815121349b11fd3acdfdd813a022852ebeb4d626a22c011ec079172fc" Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.473072 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f9647d815121349b11fd3acdfdd813a022852ebeb4d626a22c011ec079172fc\": container with ID starting with 3f9647d815121349b11fd3acdfdd813a022852ebeb4d626a22c011ec079172fc not found: ID does not exist" containerID="3f9647d815121349b11fd3acdfdd813a022852ebeb4d626a22c011ec079172fc" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.473109 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f9647d815121349b11fd3acdfdd813a022852ebeb4d626a22c011ec079172fc"} err="failed to get container status \"3f9647d815121349b11fd3acdfdd813a022852ebeb4d626a22c011ec079172fc\": rpc error: code = NotFound desc = could not find container \"3f9647d815121349b11fd3acdfdd813a022852ebeb4d626a22c011ec079172fc\": container with ID starting with 3f9647d815121349b11fd3acdfdd813a022852ebeb4d626a22c011ec079172fc not found: ID does not exist" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.752238 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.761509 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 
20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.773621 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.774255 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50bf9176-b504-436f-a845-7ab55506a258" containerName="mariadb-account-create-update" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.774376 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="50bf9176-b504-436f-a845-7ab55506a258" containerName="mariadb-account-create-update" Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.774454 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5f4dc72-bb77-44c4-8058-4939958d7a48" containerName="neutron-httpd" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.774518 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f4dc72-bb77-44c4-8058-4939958d7a48" containerName="neutron-httpd" Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.774577 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="ceilometer-notification-agent" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.774642 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="ceilometer-notification-agent" Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.774795 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43cfca6d-55e3-431f-b5b8-2b8db44bcee0" containerName="mariadb-database-create" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.774862 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="43cfca6d-55e3-431f-b5b8-2b8db44bcee0" containerName="mariadb-database-create" Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.774917 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5f4dc72-bb77-44c4-8058-4939958d7a48" containerName="neutron-api" 
Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.774964 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f4dc72-bb77-44c4-8058-4939958d7a48" containerName="neutron-api" Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.775015 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="sg-core" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.775063 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="sg-core" Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.775120 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0626209e-1ab6-4bd1-a5cc-35a2f6525e5b" containerName="mariadb-account-create-update" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.775176 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0626209e-1ab6-4bd1-a5cc-35a2f6525e5b" containerName="mariadb-account-create-update" Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.775233 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fb81f20-1f88-4c11-a37a-31db4472afd2" containerName="mariadb-account-create-update" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.775283 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fb81f20-1f88-4c11-a37a-31db4472afd2" containerName="mariadb-account-create-update" Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.775340 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c32c030e-7f51-4f5e-a4bb-c27288d8f2e9" containerName="mariadb-database-create" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.775395 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c32c030e-7f51-4f5e-a4bb-c27288d8f2e9" containerName="mariadb-database-create" Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.775451 5094 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="32b230f5-7de4-450c-90e6-e9c18a0d9c0e" containerName="mariadb-database-create" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.775505 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="32b230f5-7de4-450c-90e6-e9c18a0d9c0e" containerName="mariadb-database-create" Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.775577 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="proxy-httpd" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.775644 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="proxy-httpd" Feb 20 07:09:01 crc kubenswrapper[5094]: E0220 07:09:01.775895 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="ceilometer-central-agent" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.775951 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="ceilometer-central-agent" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.776369 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="0626209e-1ab6-4bd1-a5cc-35a2f6525e5b" containerName="mariadb-account-create-update" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.776459 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="50bf9176-b504-436f-a845-7ab55506a258" containerName="mariadb-account-create-update" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.776525 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="32b230f5-7de4-450c-90e6-e9c18a0d9c0e" containerName="mariadb-database-create" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.776587 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="sg-core" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.776649 5094 
memory_manager.go:354] "RemoveStaleState removing state" podUID="43cfca6d-55e3-431f-b5b8-2b8db44bcee0" containerName="mariadb-database-create" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.776739 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="proxy-httpd" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.776795 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fb81f20-1f88-4c11-a37a-31db4472afd2" containerName="mariadb-account-create-update" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.776861 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="ceilometer-notification-agent" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.776928 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5f4dc72-bb77-44c4-8058-4939958d7a48" containerName="neutron-api" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.776995 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5f4dc72-bb77-44c4-8058-4939958d7a48" containerName="neutron-httpd" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.777057 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" containerName="ceilometer-central-agent" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.777128 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="c32c030e-7f51-4f5e-a4bb-c27288d8f2e9" containerName="mariadb-database-create" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.780027 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.783299 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.783515 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.787264 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.854002 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e271af7-d690-418f-a044-e9a87e519a5a" path="/var/lib/kubelet/pods/7e271af7-d690-418f-a044-e9a87e519a5a/volumes" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.854908 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5f4dc72-bb77-44c4-8058-4939958d7a48" path="/var/lib/kubelet/pods/c5f4dc72-bb77-44c4-8058-4939958d7a48/volumes" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.875986 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfk5k\" (UniqueName: \"kubernetes.io/projected/a0797f40-7313-479d-95ab-0a65d83b96d1-kube-api-access-bfk5k\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.876271 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0797f40-7313-479d-95ab-0a65d83b96d1-run-httpd\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.876387 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a0797f40-7313-479d-95ab-0a65d83b96d1-log-httpd\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.876557 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-config-data\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.876657 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-scripts\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.876803 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.876900 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.978651 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " 
pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.979148 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.979184 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfk5k\" (UniqueName: \"kubernetes.io/projected/a0797f40-7313-479d-95ab-0a65d83b96d1-kube-api-access-bfk5k\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.979249 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0797f40-7313-479d-95ab-0a65d83b96d1-run-httpd\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.979277 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0797f40-7313-479d-95ab-0a65d83b96d1-log-httpd\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.979327 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-config-data\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.979363 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-scripts\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.979949 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0797f40-7313-479d-95ab-0a65d83b96d1-run-httpd\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.980063 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0797f40-7313-479d-95ab-0a65d83b96d1-log-httpd\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.990545 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.990570 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.990685 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-scripts\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:01 crc kubenswrapper[5094]: I0220 07:09:01.996466 5094 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bfk5k\" (UniqueName: \"kubernetes.io/projected/a0797f40-7313-479d-95ab-0a65d83b96d1-kube-api-access-bfk5k\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:02 crc kubenswrapper[5094]: I0220 07:09:02.008734 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-config-data\") pod \"ceilometer-0\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " pod="openstack/ceilometer-0" Feb 20 07:09:02 crc kubenswrapper[5094]: I0220 07:09:02.100794 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:09:02 crc kubenswrapper[5094]: I0220 07:09:02.624002 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.140767 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0797f40-7313-479d-95ab-0a65d83b96d1","Type":"ContainerStarted","Data":"f94a0c0b0f09c18362f4c831a12784474e87ae445bfb68efe6344e1d738ee970"} Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.289945 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rfczb"] Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.292210 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.300666 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.301096 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.301389 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-j928v" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.317673 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rfczb"] Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.409597 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rfczb\" (UID: \"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.409665 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-scripts\") pod \"nova-cell0-conductor-db-sync-rfczb\" (UID: \"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.409716 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5bjh\" (UniqueName: \"kubernetes.io/projected/9b534507-5d2d-496b-9a60-f0b45e25bb23-kube-api-access-k5bjh\") pod \"nova-cell0-conductor-db-sync-rfczb\" (UID: \"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " 
pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.409816 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-config-data\") pod \"nova-cell0-conductor-db-sync-rfczb\" (UID: \"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.511629 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rfczb\" (UID: \"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.511696 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-scripts\") pod \"nova-cell0-conductor-db-sync-rfczb\" (UID: \"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.511752 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5bjh\" (UniqueName: \"kubernetes.io/projected/9b534507-5d2d-496b-9a60-f0b45e25bb23-kube-api-access-k5bjh\") pod \"nova-cell0-conductor-db-sync-rfczb\" (UID: \"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.511849 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-config-data\") pod \"nova-cell0-conductor-db-sync-rfczb\" (UID: 
\"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.520291 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-scripts\") pod \"nova-cell0-conductor-db-sync-rfczb\" (UID: \"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.520934 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rfczb\" (UID: \"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.522127 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-config-data\") pod \"nova-cell0-conductor-db-sync-rfczb\" (UID: \"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.528860 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5bjh\" (UniqueName: \"kubernetes.io/projected/9b534507-5d2d-496b-9a60-f0b45e25bb23-kube-api-access-k5bjh\") pod \"nova-cell0-conductor-db-sync-rfczb\" (UID: \"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:03 crc kubenswrapper[5094]: I0220 07:09:03.628368 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:04 crc kubenswrapper[5094]: I0220 07:09:04.185997 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0797f40-7313-479d-95ab-0a65d83b96d1","Type":"ContainerStarted","Data":"c4793f3606e7baf873c3ad7d2141782aac8bcc7350dfd8b20839747be94d7f25"} Feb 20 07:09:04 crc kubenswrapper[5094]: I0220 07:09:04.186927 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0797f40-7313-479d-95ab-0a65d83b96d1","Type":"ContainerStarted","Data":"1229efd8ed9ba1e9b30b374a627f3c7487afe05838a26a18548bf279f8959273"} Feb 20 07:09:04 crc kubenswrapper[5094]: I0220 07:09:04.252755 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rfczb"] Feb 20 07:09:04 crc kubenswrapper[5094]: W0220 07:09:04.266168 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b534507_5d2d_496b_9a60_f0b45e25bb23.slice/crio-00529373718eeeb13da913d19da79c13c97d537751cf2e72ac9762a9157f28af WatchSource:0}: Error finding container 00529373718eeeb13da913d19da79c13c97d537751cf2e72ac9762a9157f28af: Status 404 returned error can't find the container with id 00529373718eeeb13da913d19da79c13c97d537751cf2e72ac9762a9157f28af Feb 20 07:09:04 crc kubenswrapper[5094]: I0220 07:09:04.761925 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:09:05 crc kubenswrapper[5094]: I0220 07:09:05.224740 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rfczb" event={"ID":"9b534507-5d2d-496b-9a60-f0b45e25bb23","Type":"ContainerStarted","Data":"00529373718eeeb13da913d19da79c13c97d537751cf2e72ac9762a9157f28af"} Feb 20 07:09:05 crc kubenswrapper[5094]: I0220 07:09:05.235441 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"a0797f40-7313-479d-95ab-0a65d83b96d1","Type":"ContainerStarted","Data":"92042acb82a2b22307c932b8e95e8c09d3491dbdbdfc8249a49e90755238923c"} Feb 20 07:09:05 crc kubenswrapper[5094]: I0220 07:09:05.311225 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 20 07:09:05 crc kubenswrapper[5094]: I0220 07:09:05.311276 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 20 07:09:05 crc kubenswrapper[5094]: I0220 07:09:05.357479 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 20 07:09:05 crc kubenswrapper[5094]: I0220 07:09:05.362144 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 20 07:09:06 crc kubenswrapper[5094]: I0220 07:09:06.249476 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0797f40-7313-479d-95ab-0a65d83b96d1","Type":"ContainerStarted","Data":"52d0d24cfba05c590c4ce7431d5e10a363e5e5262aa05f4ff8f4f412c8232924"} Feb 20 07:09:06 crc kubenswrapper[5094]: I0220 07:09:06.249647 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="ceilometer-central-agent" containerID="cri-o://1229efd8ed9ba1e9b30b374a627f3c7487afe05838a26a18548bf279f8959273" gracePeriod=30 Feb 20 07:09:06 crc kubenswrapper[5094]: I0220 07:09:06.249673 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="proxy-httpd" containerID="cri-o://52d0d24cfba05c590c4ce7431d5e10a363e5e5262aa05f4ff8f4f412c8232924" gracePeriod=30 Feb 20 07:09:06 crc kubenswrapper[5094]: I0220 07:09:06.249733 5094 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="ceilometer-notification-agent" containerID="cri-o://c4793f3606e7baf873c3ad7d2141782aac8bcc7350dfd8b20839747be94d7f25" gracePeriod=30 Feb 20 07:09:06 crc kubenswrapper[5094]: I0220 07:09:06.249746 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="sg-core" containerID="cri-o://92042acb82a2b22307c932b8e95e8c09d3491dbdbdfc8249a49e90755238923c" gracePeriod=30 Feb 20 07:09:06 crc kubenswrapper[5094]: I0220 07:09:06.250568 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 07:09:06 crc kubenswrapper[5094]: I0220 07:09:06.250625 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 20 07:09:06 crc kubenswrapper[5094]: I0220 07:09:06.250640 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 20 07:09:06 crc kubenswrapper[5094]: I0220 07:09:06.279622 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.952487478 podStartE2EDuration="5.27959667s" podCreationTimestamp="2026-02-20 07:09:01 +0000 UTC" firstStartedPulling="2026-02-20 07:09:02.64145068 +0000 UTC m=+1357.514077381" lastFinishedPulling="2026-02-20 07:09:05.968559862 +0000 UTC m=+1360.841186573" observedRunningTime="2026-02-20 07:09:06.273963936 +0000 UTC m=+1361.146590647" watchObservedRunningTime="2026-02-20 07:09:06.27959667 +0000 UTC m=+1361.152223381" Feb 20 07:09:07 crc kubenswrapper[5094]: I0220 07:09:07.266882 5094 generic.go:334] "Generic (PLEG): container finished" podID="a0797f40-7313-479d-95ab-0a65d83b96d1" 
containerID="92042acb82a2b22307c932b8e95e8c09d3491dbdbdfc8249a49e90755238923c" exitCode=2 Feb 20 07:09:07 crc kubenswrapper[5094]: I0220 07:09:07.267464 5094 generic.go:334] "Generic (PLEG): container finished" podID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerID="c4793f3606e7baf873c3ad7d2141782aac8bcc7350dfd8b20839747be94d7f25" exitCode=0 Feb 20 07:09:07 crc kubenswrapper[5094]: I0220 07:09:07.266939 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0797f40-7313-479d-95ab-0a65d83b96d1","Type":"ContainerDied","Data":"92042acb82a2b22307c932b8e95e8c09d3491dbdbdfc8249a49e90755238923c"} Feb 20 07:09:07 crc kubenswrapper[5094]: I0220 07:09:07.267583 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0797f40-7313-479d-95ab-0a65d83b96d1","Type":"ContainerDied","Data":"c4793f3606e7baf873c3ad7d2141782aac8bcc7350dfd8b20839747be94d7f25"} Feb 20 07:09:07 crc kubenswrapper[5094]: I0220 07:09:07.336773 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 20 07:09:07 crc kubenswrapper[5094]: I0220 07:09:07.336844 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 20 07:09:07 crc kubenswrapper[5094]: I0220 07:09:07.463380 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 20 07:09:07 crc kubenswrapper[5094]: I0220 07:09:07.472184 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 20 07:09:08 crc kubenswrapper[5094]: I0220 07:09:08.278288 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 20 07:09:08 crc kubenswrapper[5094]: I0220 07:09:08.278349 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-external-api-0" Feb 20 07:09:08 crc kubenswrapper[5094]: I0220 07:09:08.482474 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 20 07:09:08 crc kubenswrapper[5094]: I0220 07:09:08.482980 5094 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 20 07:09:08 crc kubenswrapper[5094]: I0220 07:09:08.659302 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 20 07:09:10 crc kubenswrapper[5094]: I0220 07:09:10.252283 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 20 07:09:10 crc kubenswrapper[5094]: I0220 07:09:10.264020 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 20 07:09:16 crc kubenswrapper[5094]: I0220 07:09:16.435858 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rfczb" event={"ID":"9b534507-5d2d-496b-9a60-f0b45e25bb23","Type":"ContainerStarted","Data":"a7698ff70c057c639a740972f6d9a16bf4a7bb2202bca0a958050e7b6bf16b01"} Feb 20 07:09:16 crc kubenswrapper[5094]: I0220 07:09:16.468779 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-rfczb" podStartSLOduration=2.110722678 podStartE2EDuration="13.468751634s" podCreationTimestamp="2026-02-20 07:09:03 +0000 UTC" firstStartedPulling="2026-02-20 07:09:04.269283796 +0000 UTC m=+1359.141910497" lastFinishedPulling="2026-02-20 07:09:15.627312742 +0000 UTC m=+1370.499939453" observedRunningTime="2026-02-20 07:09:16.450873536 +0000 UTC m=+1371.323500277" watchObservedRunningTime="2026-02-20 07:09:16.468751634 +0000 UTC m=+1371.341378355" Feb 20 07:09:17 crc kubenswrapper[5094]: I0220 07:09:17.453546 5094 generic.go:334] "Generic (PLEG): container finished" 
podID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerID="1229efd8ed9ba1e9b30b374a627f3c7487afe05838a26a18548bf279f8959273" exitCode=0 Feb 20 07:09:17 crc kubenswrapper[5094]: I0220 07:09:17.454920 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0797f40-7313-479d-95ab-0a65d83b96d1","Type":"ContainerDied","Data":"1229efd8ed9ba1e9b30b374a627f3c7487afe05838a26a18548bf279f8959273"} Feb 20 07:09:22 crc kubenswrapper[5094]: I0220 07:09:22.978347 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-htpf7"] Feb 20 07:09:22 crc kubenswrapper[5094]: I0220 07:09:22.984625 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:22 crc kubenswrapper[5094]: I0220 07:09:22.990321 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-htpf7"] Feb 20 07:09:23 crc kubenswrapper[5094]: I0220 07:09:23.095311 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5fdx\" (UniqueName: \"kubernetes.io/projected/9a6d10f4-9048-4e6c-831c-8342e340d290-kube-api-access-b5fdx\") pod \"redhat-marketplace-htpf7\" (UID: \"9a6d10f4-9048-4e6c-831c-8342e340d290\") " pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:23 crc kubenswrapper[5094]: I0220 07:09:23.095397 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a6d10f4-9048-4e6c-831c-8342e340d290-utilities\") pod \"redhat-marketplace-htpf7\" (UID: \"9a6d10f4-9048-4e6c-831c-8342e340d290\") " pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:23 crc kubenswrapper[5094]: I0220 07:09:23.095424 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9a6d10f4-9048-4e6c-831c-8342e340d290-catalog-content\") pod \"redhat-marketplace-htpf7\" (UID: \"9a6d10f4-9048-4e6c-831c-8342e340d290\") " pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:23 crc kubenswrapper[5094]: I0220 07:09:23.198591 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5fdx\" (UniqueName: \"kubernetes.io/projected/9a6d10f4-9048-4e6c-831c-8342e340d290-kube-api-access-b5fdx\") pod \"redhat-marketplace-htpf7\" (UID: \"9a6d10f4-9048-4e6c-831c-8342e340d290\") " pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:23 crc kubenswrapper[5094]: I0220 07:09:23.199017 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a6d10f4-9048-4e6c-831c-8342e340d290-utilities\") pod \"redhat-marketplace-htpf7\" (UID: \"9a6d10f4-9048-4e6c-831c-8342e340d290\") " pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:23 crc kubenswrapper[5094]: I0220 07:09:23.199043 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a6d10f4-9048-4e6c-831c-8342e340d290-catalog-content\") pod \"redhat-marketplace-htpf7\" (UID: \"9a6d10f4-9048-4e6c-831c-8342e340d290\") " pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:23 crc kubenswrapper[5094]: I0220 07:09:23.199792 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a6d10f4-9048-4e6c-831c-8342e340d290-catalog-content\") pod \"redhat-marketplace-htpf7\" (UID: \"9a6d10f4-9048-4e6c-831c-8342e340d290\") " pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:23 crc kubenswrapper[5094]: I0220 07:09:23.199818 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9a6d10f4-9048-4e6c-831c-8342e340d290-utilities\") pod \"redhat-marketplace-htpf7\" (UID: \"9a6d10f4-9048-4e6c-831c-8342e340d290\") " pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:23 crc kubenswrapper[5094]: I0220 07:09:23.221556 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5fdx\" (UniqueName: \"kubernetes.io/projected/9a6d10f4-9048-4e6c-831c-8342e340d290-kube-api-access-b5fdx\") pod \"redhat-marketplace-htpf7\" (UID: \"9a6d10f4-9048-4e6c-831c-8342e340d290\") " pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:23 crc kubenswrapper[5094]: I0220 07:09:23.307289 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:23 crc kubenswrapper[5094]: I0220 07:09:23.801900 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-htpf7"] Feb 20 07:09:24 crc kubenswrapper[5094]: I0220 07:09:24.527813 5094 generic.go:334] "Generic (PLEG): container finished" podID="9a6d10f4-9048-4e6c-831c-8342e340d290" containerID="c1adf8db45a1096e54fcc03867ab2eaebd032c062e599ce4d2e371f076654953" exitCode=0 Feb 20 07:09:24 crc kubenswrapper[5094]: I0220 07:09:24.527929 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-htpf7" event={"ID":"9a6d10f4-9048-4e6c-831c-8342e340d290","Type":"ContainerDied","Data":"c1adf8db45a1096e54fcc03867ab2eaebd032c062e599ce4d2e371f076654953"} Feb 20 07:09:24 crc kubenswrapper[5094]: I0220 07:09:24.528249 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-htpf7" event={"ID":"9a6d10f4-9048-4e6c-831c-8342e340d290","Type":"ContainerStarted","Data":"06256a1ce1be59544899f5fa410cb2b0ac1384aa6f324b0c9ca14e8938c56f75"} Feb 20 07:09:25 crc kubenswrapper[5094]: I0220 07:09:25.542403 5094 generic.go:334] "Generic (PLEG): container finished" 
podID="9a6d10f4-9048-4e6c-831c-8342e340d290" containerID="21b883bbf98254c856096d0b01c81fe32d0d73ad5cc8f16e030435fa52dfc0db" exitCode=0 Feb 20 07:09:25 crc kubenswrapper[5094]: I0220 07:09:25.542480 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-htpf7" event={"ID":"9a6d10f4-9048-4e6c-831c-8342e340d290","Type":"ContainerDied","Data":"21b883bbf98254c856096d0b01c81fe32d0d73ad5cc8f16e030435fa52dfc0db"} Feb 20 07:09:26 crc kubenswrapper[5094]: I0220 07:09:26.569629 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-htpf7" event={"ID":"9a6d10f4-9048-4e6c-831c-8342e340d290","Type":"ContainerStarted","Data":"4e36992ceacec2103fd06fe1ba8d785e7dc84b537e9b805e8337c088a4f67592"} Feb 20 07:09:27 crc kubenswrapper[5094]: I0220 07:09:27.591930 5094 generic.go:334] "Generic (PLEG): container finished" podID="9b534507-5d2d-496b-9a60-f0b45e25bb23" containerID="a7698ff70c057c639a740972f6d9a16bf4a7bb2202bca0a958050e7b6bf16b01" exitCode=0 Feb 20 07:09:27 crc kubenswrapper[5094]: I0220 07:09:27.592255 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rfczb" event={"ID":"9b534507-5d2d-496b-9a60-f0b45e25bb23","Type":"ContainerDied","Data":"a7698ff70c057c639a740972f6d9a16bf4a7bb2202bca0a958050e7b6bf16b01"} Feb 20 07:09:27 crc kubenswrapper[5094]: I0220 07:09:27.635239 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-htpf7" podStartSLOduration=4.143455556 podStartE2EDuration="5.635201952s" podCreationTimestamp="2026-02-20 07:09:22 +0000 UTC" firstStartedPulling="2026-02-20 07:09:24.530865455 +0000 UTC m=+1379.403492176" lastFinishedPulling="2026-02-20 07:09:26.022611861 +0000 UTC m=+1380.895238572" observedRunningTime="2026-02-20 07:09:26.606379952 +0000 UTC m=+1381.479006683" watchObservedRunningTime="2026-02-20 07:09:27.635201952 +0000 UTC m=+1382.507828703" Feb 20 
07:09:28 crc kubenswrapper[5094]: I0220 07:09:28.977266 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.058139 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5bjh\" (UniqueName: \"kubernetes.io/projected/9b534507-5d2d-496b-9a60-f0b45e25bb23-kube-api-access-k5bjh\") pod \"9b534507-5d2d-496b-9a60-f0b45e25bb23\" (UID: \"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.070021 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b534507-5d2d-496b-9a60-f0b45e25bb23-kube-api-access-k5bjh" (OuterVolumeSpecName: "kube-api-access-k5bjh") pod "9b534507-5d2d-496b-9a60-f0b45e25bb23" (UID: "9b534507-5d2d-496b-9a60-f0b45e25bb23"). InnerVolumeSpecName "kube-api-access-k5bjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.159914 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-scripts\") pod \"9b534507-5d2d-496b-9a60-f0b45e25bb23\" (UID: \"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.160099 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-config-data\") pod \"9b534507-5d2d-496b-9a60-f0b45e25bb23\" (UID: \"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.160266 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-combined-ca-bundle\") pod 
\"9b534507-5d2d-496b-9a60-f0b45e25bb23\" (UID: \"9b534507-5d2d-496b-9a60-f0b45e25bb23\") " Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.161096 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5bjh\" (UniqueName: \"kubernetes.io/projected/9b534507-5d2d-496b-9a60-f0b45e25bb23-kube-api-access-k5bjh\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.166300 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-scripts" (OuterVolumeSpecName: "scripts") pod "9b534507-5d2d-496b-9a60-f0b45e25bb23" (UID: "9b534507-5d2d-496b-9a60-f0b45e25bb23"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.196091 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-config-data" (OuterVolumeSpecName: "config-data") pod "9b534507-5d2d-496b-9a60-f0b45e25bb23" (UID: "9b534507-5d2d-496b-9a60-f0b45e25bb23"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.219806 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b534507-5d2d-496b-9a60-f0b45e25bb23" (UID: "9b534507-5d2d-496b-9a60-f0b45e25bb23"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.264014 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.264082 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.264109 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b534507-5d2d-496b-9a60-f0b45e25bb23-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.650520 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rfczb" event={"ID":"9b534507-5d2d-496b-9a60-f0b45e25bb23","Type":"ContainerDied","Data":"00529373718eeeb13da913d19da79c13c97d537751cf2e72ac9762a9157f28af"} Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.650688 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00529373718eeeb13da913d19da79c13c97d537751cf2e72ac9762a9157f28af" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.651106 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rfczb" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.838662 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 07:09:29 crc kubenswrapper[5094]: E0220 07:09:29.839127 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b534507-5d2d-496b-9a60-f0b45e25bb23" containerName="nova-cell0-conductor-db-sync" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.839150 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b534507-5d2d-496b-9a60-f0b45e25bb23" containerName="nova-cell0-conductor-db-sync" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.839330 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b534507-5d2d-496b-9a60-f0b45e25bb23" containerName="nova-cell0-conductor-db-sync" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.840168 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.843380 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.856055 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.856807 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-j928v" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.880585 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whfw7\" (UniqueName: \"kubernetes.io/projected/a5bbb9ad-deeb-495f-9750-f7012c00061d-kube-api-access-whfw7\") pod \"nova-cell0-conductor-0\" (UID: \"a5bbb9ad-deeb-495f-9750-f7012c00061d\") " pod="openstack/nova-cell0-conductor-0" Feb 20 07:09:29 crc 
kubenswrapper[5094]: I0220 07:09:29.880652 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5bbb9ad-deeb-495f-9750-f7012c00061d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a5bbb9ad-deeb-495f-9750-f7012c00061d\") " pod="openstack/nova-cell0-conductor-0" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.880774 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5bbb9ad-deeb-495f-9750-f7012c00061d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a5bbb9ad-deeb-495f-9750-f7012c00061d\") " pod="openstack/nova-cell0-conductor-0" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.982171 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whfw7\" (UniqueName: \"kubernetes.io/projected/a5bbb9ad-deeb-495f-9750-f7012c00061d-kube-api-access-whfw7\") pod \"nova-cell0-conductor-0\" (UID: \"a5bbb9ad-deeb-495f-9750-f7012c00061d\") " pod="openstack/nova-cell0-conductor-0" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.982235 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5bbb9ad-deeb-495f-9750-f7012c00061d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a5bbb9ad-deeb-495f-9750-f7012c00061d\") " pod="openstack/nova-cell0-conductor-0" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.982307 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5bbb9ad-deeb-495f-9750-f7012c00061d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a5bbb9ad-deeb-495f-9750-f7012c00061d\") " pod="openstack/nova-cell0-conductor-0" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.987452 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5bbb9ad-deeb-495f-9750-f7012c00061d-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a5bbb9ad-deeb-495f-9750-f7012c00061d\") " pod="openstack/nova-cell0-conductor-0" Feb 20 07:09:29 crc kubenswrapper[5094]: I0220 07:09:29.987793 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5bbb9ad-deeb-495f-9750-f7012c00061d-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a5bbb9ad-deeb-495f-9750-f7012c00061d\") " pod="openstack/nova-cell0-conductor-0" Feb 20 07:09:30 crc kubenswrapper[5094]: I0220 07:09:30.002522 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whfw7\" (UniqueName: \"kubernetes.io/projected/a5bbb9ad-deeb-495f-9750-f7012c00061d-kube-api-access-whfw7\") pod \"nova-cell0-conductor-0\" (UID: \"a5bbb9ad-deeb-495f-9750-f7012c00061d\") " pod="openstack/nova-cell0-conductor-0" Feb 20 07:09:30 crc kubenswrapper[5094]: I0220 07:09:30.156322 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 07:09:30 crc kubenswrapper[5094]: I0220 07:09:30.628986 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 07:09:30 crc kubenswrapper[5094]: W0220 07:09:30.630164 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5bbb9ad_deeb_495f_9750_f7012c00061d.slice/crio-8fedbe0d4bad6502fc4850eabe0e6dd2cb68cdc0ac0b36644634a1ea4c48d937 WatchSource:0}: Error finding container 8fedbe0d4bad6502fc4850eabe0e6dd2cb68cdc0ac0b36644634a1ea4c48d937: Status 404 returned error can't find the container with id 8fedbe0d4bad6502fc4850eabe0e6dd2cb68cdc0ac0b36644634a1ea4c48d937 Feb 20 07:09:30 crc kubenswrapper[5094]: I0220 07:09:30.663158 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a5bbb9ad-deeb-495f-9750-f7012c00061d","Type":"ContainerStarted","Data":"8fedbe0d4bad6502fc4850eabe0e6dd2cb68cdc0ac0b36644634a1ea4c48d937"} Feb 20 07:09:31 crc kubenswrapper[5094]: I0220 07:09:31.677648 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a5bbb9ad-deeb-495f-9750-f7012c00061d","Type":"ContainerStarted","Data":"c9d8bef2aa627582c2efca41433b2b24a6bb42ef156ade330dc390cb501211cf"} Feb 20 07:09:31 crc kubenswrapper[5094]: I0220 07:09:31.678352 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 20 07:09:31 crc kubenswrapper[5094]: I0220 07:09:31.701835 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.7018136630000003 podStartE2EDuration="2.701813663s" podCreationTimestamp="2026-02-20 07:09:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 
07:09:31.694279983 +0000 UTC m=+1386.566906694" watchObservedRunningTime="2026-02-20 07:09:31.701813663 +0000 UTC m=+1386.574440384" Feb 20 07:09:32 crc kubenswrapper[5094]: I0220 07:09:32.108819 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 20 07:09:33 crc kubenswrapper[5094]: I0220 07:09:33.308402 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:33 crc kubenswrapper[5094]: I0220 07:09:33.309021 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:33 crc kubenswrapper[5094]: I0220 07:09:33.403494 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:33 crc kubenswrapper[5094]: I0220 07:09:33.787172 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:33 crc kubenswrapper[5094]: I0220 07:09:33.860340 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-htpf7"] Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.209653 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.723635 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-htpf7" podUID="9a6d10f4-9048-4e6c-831c-8342e340d290" containerName="registry-server" containerID="cri-o://4e36992ceacec2103fd06fe1ba8d785e7dc84b537e9b805e8337c088a4f67592" gracePeriod=2 Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.829803 5094 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-v6v6k"] Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.831277 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-v6v6k" Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.838265 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.838401 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.855037 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-scripts\") pod \"nova-cell0-cell-mapping-v6v6k\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") " pod="openstack/nova-cell0-cell-mapping-v6v6k" Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.856910 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw5mp\" (UniqueName: \"kubernetes.io/projected/61f66271-5ce9-4412-8ea3-9a63a934f307-kube-api-access-pw5mp\") pod \"nova-cell0-cell-mapping-v6v6k\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") " pod="openstack/nova-cell0-cell-mapping-v6v6k" Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.857137 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-v6v6k\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") " pod="openstack/nova-cell0-cell-mapping-v6v6k" Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.857191 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-config-data\") pod \"nova-cell0-cell-mapping-v6v6k\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") " pod="openstack/nova-cell0-cell-mapping-v6v6k" Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.861966 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-v6v6k"] Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.959550 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-scripts\") pod \"nova-cell0-cell-mapping-v6v6k\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") " pod="openstack/nova-cell0-cell-mapping-v6v6k" Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.959630 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw5mp\" (UniqueName: \"kubernetes.io/projected/61f66271-5ce9-4412-8ea3-9a63a934f307-kube-api-access-pw5mp\") pod \"nova-cell0-cell-mapping-v6v6k\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") " pod="openstack/nova-cell0-cell-mapping-v6v6k" Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.959841 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-v6v6k\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") " pod="openstack/nova-cell0-cell-mapping-v6v6k" Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.959894 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-config-data\") pod \"nova-cell0-cell-mapping-v6v6k\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") " pod="openstack/nova-cell0-cell-mapping-v6v6k" Feb 20 07:09:35 crc 
kubenswrapper[5094]: I0220 07:09:35.980736 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-scripts\") pod \"nova-cell0-cell-mapping-v6v6k\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") " pod="openstack/nova-cell0-cell-mapping-v6v6k" Feb 20 07:09:35 crc kubenswrapper[5094]: I0220 07:09:35.981296 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-config-data\") pod \"nova-cell0-cell-mapping-v6v6k\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") " pod="openstack/nova-cell0-cell-mapping-v6v6k" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:35.999930 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-v6v6k\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") " pod="openstack/nova-cell0-cell-mapping-v6v6k" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.013387 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw5mp\" (UniqueName: \"kubernetes.io/projected/61f66271-5ce9-4412-8ea3-9a63a934f307-kube-api-access-pw5mp\") pod \"nova-cell0-cell-mapping-v6v6k\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") " pod="openstack/nova-cell0-cell-mapping-v6v6k" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.158337 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.174248 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.182172 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.182355 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.205242 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-v6v6k" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.235853 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.238010 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.242752 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.277569 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.294057 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/049a4617-20ed-4f86-a0c4-a3a59bd44f26-config-data\") pod \"nova-api-0\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") " pod="openstack/nova-api-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.294128 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/049a4617-20ed-4f86-a0c4-a3a59bd44f26-logs\") pod \"nova-api-0\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") " pod="openstack/nova-api-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.294171 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/043bc1d7-f57a-481d-b132-71ef45e85480-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"043bc1d7-f57a-481d-b132-71ef45e85480\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.294191 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwglg\" (UniqueName: \"kubernetes.io/projected/043bc1d7-f57a-481d-b132-71ef45e85480-kube-api-access-zwglg\") pod \"nova-cell1-novncproxy-0\" (UID: \"043bc1d7-f57a-481d-b132-71ef45e85480\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.294278 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw5c6\" (UniqueName: \"kubernetes.io/projected/049a4617-20ed-4f86-a0c4-a3a59bd44f26-kube-api-access-sw5c6\") pod \"nova-api-0\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") " pod="openstack/nova-api-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.294306 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/049a4617-20ed-4f86-a0c4-a3a59bd44f26-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") " pod="openstack/nova-api-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.294338 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/043bc1d7-f57a-481d-b132-71ef45e85480-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"043bc1d7-f57a-481d-b132-71ef45e85480\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.391452 5094 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.393666 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.398765 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.402460 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw5c6\" (UniqueName: \"kubernetes.io/projected/049a4617-20ed-4f86-a0c4-a3a59bd44f26-kube-api-access-sw5c6\") pod \"nova-api-0\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") " pod="openstack/nova-api-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.402572 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/049a4617-20ed-4f86-a0c4-a3a59bd44f26-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") " pod="openstack/nova-api-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.402613 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/043bc1d7-f57a-481d-b132-71ef45e85480-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"043bc1d7-f57a-481d-b132-71ef45e85480\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.402677 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/049a4617-20ed-4f86-a0c4-a3a59bd44f26-config-data\") pod \"nova-api-0\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") " pod="openstack/nova-api-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.402712 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/049a4617-20ed-4f86-a0c4-a3a59bd44f26-logs\") pod \"nova-api-0\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") " pod="openstack/nova-api-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.402754 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/043bc1d7-f57a-481d-b132-71ef45e85480-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"043bc1d7-f57a-481d-b132-71ef45e85480\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.402775 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwglg\" (UniqueName: \"kubernetes.io/projected/043bc1d7-f57a-481d-b132-71ef45e85480-kube-api-access-zwglg\") pod \"nova-cell1-novncproxy-0\" (UID: \"043bc1d7-f57a-481d-b132-71ef45e85480\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.404524 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/049a4617-20ed-4f86-a0c4-a3a59bd44f26-logs\") pod \"nova-api-0\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") " pod="openstack/nova-api-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.411233 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/049a4617-20ed-4f86-a0c4-a3a59bd44f26-config-data\") pod \"nova-api-0\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") " pod="openstack/nova-api-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.424414 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.426327 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/043bc1d7-f57a-481d-b132-71ef45e85480-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"043bc1d7-f57a-481d-b132-71ef45e85480\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.428377 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwglg\" (UniqueName: \"kubernetes.io/projected/043bc1d7-f57a-481d-b132-71ef45e85480-kube-api-access-zwglg\") pod \"nova-cell1-novncproxy-0\" (UID: \"043bc1d7-f57a-481d-b132-71ef45e85480\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.435061 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/049a4617-20ed-4f86-a0c4-a3a59bd44f26-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") " pod="openstack/nova-api-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.437302 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/043bc1d7-f57a-481d-b132-71ef45e85480-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"043bc1d7-f57a-481d-b132-71ef45e85480\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.445806 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:09:36 crc kubenswrapper[5094]: E0220 07:09:36.446395 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a6d10f4-9048-4e6c-831c-8342e340d290" containerName="extract-content" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.446415 5094 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="9a6d10f4-9048-4e6c-831c-8342e340d290" containerName="extract-content" Feb 20 07:09:36 crc kubenswrapper[5094]: E0220 07:09:36.446467 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a6d10f4-9048-4e6c-831c-8342e340d290" containerName="extract-utilities" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.446474 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a6d10f4-9048-4e6c-831c-8342e340d290" containerName="extract-utilities" Feb 20 07:09:36 crc kubenswrapper[5094]: E0220 07:09:36.446493 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a6d10f4-9048-4e6c-831c-8342e340d290" containerName="registry-server" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.446499 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a6d10f4-9048-4e6c-831c-8342e340d290" containerName="registry-server" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.446660 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a6d10f4-9048-4e6c-831c-8342e340d290" containerName="registry-server" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.447113 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw5c6\" (UniqueName: \"kubernetes.io/projected/049a4617-20ed-4f86-a0c4-a3a59bd44f26-kube-api-access-sw5c6\") pod \"nova-api-0\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") " pod="openstack/nova-api-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.458056 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.468217 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.483134 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.503884 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.505925 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a6d10f4-9048-4e6c-831c-8342e340d290-catalog-content\") pod \"9a6d10f4-9048-4e6c-831c-8342e340d290\" (UID: \"9a6d10f4-9048-4e6c-831c-8342e340d290\") " Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.505958 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a6d10f4-9048-4e6c-831c-8342e340d290-utilities\") pod \"9a6d10f4-9048-4e6c-831c-8342e340d290\" (UID: \"9a6d10f4-9048-4e6c-831c-8342e340d290\") " Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.506084 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5fdx\" (UniqueName: \"kubernetes.io/projected/9a6d10f4-9048-4e6c-831c-8342e340d290-kube-api-access-b5fdx\") pod \"9a6d10f4-9048-4e6c-831c-8342e340d290\" (UID: \"9a6d10f4-9048-4e6c-831c-8342e340d290\") " Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.506287 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bae6adb7-0e70-4689-aebe-d027da87abbb-config-data\") pod \"nova-scheduler-0\" (UID: \"bae6adb7-0e70-4689-aebe-d027da87abbb\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:36 crc 
kubenswrapper[5094]: I0220 07:09:36.506318 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdz9n\" (UniqueName: \"kubernetes.io/projected/bae6adb7-0e70-4689-aebe-d027da87abbb-kube-api-access-vdz9n\") pod \"nova-scheduler-0\" (UID: \"bae6adb7-0e70-4689-aebe-d027da87abbb\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.506346 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1c7c378-78bc-48bd-932c-fa19cf4e6284-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") " pod="openstack/nova-metadata-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.506385 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4nvf\" (UniqueName: \"kubernetes.io/projected/c1c7c378-78bc-48bd-932c-fa19cf4e6284-kube-api-access-x4nvf\") pod \"nova-metadata-0\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") " pod="openstack/nova-metadata-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.506409 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bae6adb7-0e70-4689-aebe-d027da87abbb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bae6adb7-0e70-4689-aebe-d027da87abbb\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.506442 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1c7c378-78bc-48bd-932c-fa19cf4e6284-config-data\") pod \"nova-metadata-0\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") " pod="openstack/nova-metadata-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.506480 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1c7c378-78bc-48bd-932c-fa19cf4e6284-logs\") pod \"nova-metadata-0\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") " pod="openstack/nova-metadata-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.508688 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a6d10f4-9048-4e6c-831c-8342e340d290-utilities" (OuterVolumeSpecName: "utilities") pod "9a6d10f4-9048-4e6c-831c-8342e340d290" (UID: "9a6d10f4-9048-4e6c-831c-8342e340d290"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.519877 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a6d10f4-9048-4e6c-831c-8342e340d290-kube-api-access-b5fdx" (OuterVolumeSpecName: "kube-api-access-b5fdx") pod "9a6d10f4-9048-4e6c-831c-8342e340d290" (UID: "9a6d10f4-9048-4e6c-831c-8342e340d290"). InnerVolumeSpecName "kube-api-access-b5fdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.591331 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75ddbf7c75-smx5j"] Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.591682 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a6d10f4-9048-4e6c-831c-8342e340d290-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a6d10f4-9048-4e6c-831c-8342e340d290" (UID: "9a6d10f4-9048-4e6c-831c-8342e340d290"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.593584 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.627379 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1c7c378-78bc-48bd-932c-fa19cf4e6284-logs\") pod \"nova-metadata-0\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") " pod="openstack/nova-metadata-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.627802 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bae6adb7-0e70-4689-aebe-d027da87abbb-config-data\") pod \"nova-scheduler-0\" (UID: \"bae6adb7-0e70-4689-aebe-d027da87abbb\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.627873 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdz9n\" (UniqueName: \"kubernetes.io/projected/bae6adb7-0e70-4689-aebe-d027da87abbb-kube-api-access-vdz9n\") pod \"nova-scheduler-0\" (UID: \"bae6adb7-0e70-4689-aebe-d027da87abbb\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.627927 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1c7c378-78bc-48bd-932c-fa19cf4e6284-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") " pod="openstack/nova-metadata-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.627978 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1c7c378-78bc-48bd-932c-fa19cf4e6284-logs\") pod \"nova-metadata-0\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") " pod="openstack/nova-metadata-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.628056 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-x4nvf\" (UniqueName: \"kubernetes.io/projected/c1c7c378-78bc-48bd-932c-fa19cf4e6284-kube-api-access-x4nvf\") pod \"nova-metadata-0\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") " pod="openstack/nova-metadata-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.628087 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bae6adb7-0e70-4689-aebe-d027da87abbb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bae6adb7-0e70-4689-aebe-d027da87abbb\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.628117 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1c7c378-78bc-48bd-932c-fa19cf4e6284-config-data\") pod \"nova-metadata-0\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") " pod="openstack/nova-metadata-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.628183 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a6d10f4-9048-4e6c-831c-8342e340d290-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.628194 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a6d10f4-9048-4e6c-831c-8342e340d290-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.628204 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5fdx\" (UniqueName: \"kubernetes.io/projected/9a6d10f4-9048-4e6c-831c-8342e340d290-kube-api-access-b5fdx\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.635719 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75ddbf7c75-smx5j"] Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.636954 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bae6adb7-0e70-4689-aebe-d027da87abbb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bae6adb7-0e70-4689-aebe-d027da87abbb\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.637652 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1c7c378-78bc-48bd-932c-fa19cf4e6284-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") " pod="openstack/nova-metadata-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.638508 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1c7c378-78bc-48bd-932c-fa19cf4e6284-config-data\") pod \"nova-metadata-0\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") " pod="openstack/nova-metadata-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.644780 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bae6adb7-0e70-4689-aebe-d027da87abbb-config-data\") pod \"nova-scheduler-0\" (UID: \"bae6adb7-0e70-4689-aebe-d027da87abbb\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.657347 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.658591 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdz9n\" (UniqueName: \"kubernetes.io/projected/bae6adb7-0e70-4689-aebe-d027da87abbb-kube-api-access-vdz9n\") pod \"nova-scheduler-0\" (UID: \"bae6adb7-0e70-4689-aebe-d027da87abbb\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.677687 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.684212 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4nvf\" (UniqueName: \"kubernetes.io/projected/c1c7c378-78bc-48bd-932c-fa19cf4e6284-kube-api-access-x4nvf\") pod \"nova-metadata-0\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") " pod="openstack/nova-metadata-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.729410 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.729917 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88nqd\" (UniqueName: \"kubernetes.io/projected/5c3bfc89-edf8-4721-a74c-b01a81025919-kube-api-access-88nqd\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.730177 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-dns-svc\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.731017 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-ovsdbserver-sb\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.731077 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-config\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.731102 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-ovsdbserver-nb\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: 
\"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.731121 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-dns-swift-storage-0\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.747843 5094 generic.go:334] "Generic (PLEG): container finished" podID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerID="52d0d24cfba05c590c4ce7431d5e10a363e5e5262aa05f4ff8f4f412c8232924" exitCode=137 Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.747935 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0797f40-7313-479d-95ab-0a65d83b96d1","Type":"ContainerDied","Data":"52d0d24cfba05c590c4ce7431d5e10a363e5e5262aa05f4ff8f4f412c8232924"} Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.753771 5094 generic.go:334] "Generic (PLEG): container finished" podID="9a6d10f4-9048-4e6c-831c-8342e340d290" containerID="4e36992ceacec2103fd06fe1ba8d785e7dc84b537e9b805e8337c088a4f67592" exitCode=0 Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.753841 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-htpf7" event={"ID":"9a6d10f4-9048-4e6c-831c-8342e340d290","Type":"ContainerDied","Data":"4e36992ceacec2103fd06fe1ba8d785e7dc84b537e9b805e8337c088a4f67592"} Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.753916 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-htpf7" event={"ID":"9a6d10f4-9048-4e6c-831c-8342e340d290","Type":"ContainerDied","Data":"06256a1ce1be59544899f5fa410cb2b0ac1384aa6f324b0c9ca14e8938c56f75"} Feb 20 07:09:36 crc kubenswrapper[5094]: 
I0220 07:09:36.753939 5094 scope.go:117] "RemoveContainer" containerID="4e36992ceacec2103fd06fe1ba8d785e7dc84b537e9b805e8337c088a4f67592" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.753983 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-htpf7" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.803639 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-htpf7"] Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.832272 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-htpf7"] Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.834416 5094 scope.go:117] "RemoveContainer" containerID="21b883bbf98254c856096d0b01c81fe32d0d73ad5cc8f16e030435fa52dfc0db" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.845494 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.856059 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-config\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.856193 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-ovsdbserver-nb\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.856271 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-dns-swift-storage-0\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.856674 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88nqd\" (UniqueName: \"kubernetes.io/projected/5c3bfc89-edf8-4721-a74c-b01a81025919-kube-api-access-88nqd\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.857070 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-dns-svc\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.857235 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-ovsdbserver-sb\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.857908 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-config\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.858497 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-dns-swift-storage-0\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.858558 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-ovsdbserver-sb\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.859097 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-ovsdbserver-nb\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.859449 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-dns-svc\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.883898 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88nqd\" (UniqueName: \"kubernetes.io/projected/5c3bfc89-edf8-4721-a74c-b01a81025919-kube-api-access-88nqd\") pod \"dnsmasq-dns-75ddbf7c75-smx5j\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:36 crc kubenswrapper[5094]: I0220 07:09:36.970667 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.036692 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-v6v6k"] Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.079131 5094 scope.go:117] "RemoveContainer" containerID="c1adf8db45a1096e54fcc03867ab2eaebd032c062e599ce4d2e371f076654953" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.186449 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.207450 5094 scope.go:117] "RemoveContainer" containerID="4e36992ceacec2103fd06fe1ba8d785e7dc84b537e9b805e8337c088a4f67592" Feb 20 07:09:37 crc kubenswrapper[5094]: E0220 07:09:37.207983 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e36992ceacec2103fd06fe1ba8d785e7dc84b537e9b805e8337c088a4f67592\": container with ID starting with 4e36992ceacec2103fd06fe1ba8d785e7dc84b537e9b805e8337c088a4f67592 not found: ID does not exist" containerID="4e36992ceacec2103fd06fe1ba8d785e7dc84b537e9b805e8337c088a4f67592" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.208012 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e36992ceacec2103fd06fe1ba8d785e7dc84b537e9b805e8337c088a4f67592"} err="failed to get container status \"4e36992ceacec2103fd06fe1ba8d785e7dc84b537e9b805e8337c088a4f67592\": rpc error: code = NotFound desc = could not find container \"4e36992ceacec2103fd06fe1ba8d785e7dc84b537e9b805e8337c088a4f67592\": container with ID starting with 4e36992ceacec2103fd06fe1ba8d785e7dc84b537e9b805e8337c088a4f67592 not found: ID does not exist" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.208035 5094 scope.go:117] "RemoveContainer" 
containerID="21b883bbf98254c856096d0b01c81fe32d0d73ad5cc8f16e030435fa52dfc0db" Feb 20 07:09:37 crc kubenswrapper[5094]: E0220 07:09:37.208468 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21b883bbf98254c856096d0b01c81fe32d0d73ad5cc8f16e030435fa52dfc0db\": container with ID starting with 21b883bbf98254c856096d0b01c81fe32d0d73ad5cc8f16e030435fa52dfc0db not found: ID does not exist" containerID="21b883bbf98254c856096d0b01c81fe32d0d73ad5cc8f16e030435fa52dfc0db" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.208529 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21b883bbf98254c856096d0b01c81fe32d0d73ad5cc8f16e030435fa52dfc0db"} err="failed to get container status \"21b883bbf98254c856096d0b01c81fe32d0d73ad5cc8f16e030435fa52dfc0db\": rpc error: code = NotFound desc = could not find container \"21b883bbf98254c856096d0b01c81fe32d0d73ad5cc8f16e030435fa52dfc0db\": container with ID starting with 21b883bbf98254c856096d0b01c81fe32d0d73ad5cc8f16e030435fa52dfc0db not found: ID does not exist" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.208562 5094 scope.go:117] "RemoveContainer" containerID="c1adf8db45a1096e54fcc03867ab2eaebd032c062e599ce4d2e371f076654953" Feb 20 07:09:37 crc kubenswrapper[5094]: E0220 07:09:37.208853 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1adf8db45a1096e54fcc03867ab2eaebd032c062e599ce4d2e371f076654953\": container with ID starting with c1adf8db45a1096e54fcc03867ab2eaebd032c062e599ce4d2e371f076654953 not found: ID does not exist" containerID="c1adf8db45a1096e54fcc03867ab2eaebd032c062e599ce4d2e371f076654953" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.208919 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c1adf8db45a1096e54fcc03867ab2eaebd032c062e599ce4d2e371f076654953"} err="failed to get container status \"c1adf8db45a1096e54fcc03867ab2eaebd032c062e599ce4d2e371f076654953\": rpc error: code = NotFound desc = could not find container \"c1adf8db45a1096e54fcc03867ab2eaebd032c062e599ce4d2e371f076654953\": container with ID starting with c1adf8db45a1096e54fcc03867ab2eaebd032c062e599ce4d2e371f076654953 not found: ID does not exist" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.271533 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-scripts\") pod \"a0797f40-7313-479d-95ab-0a65d83b96d1\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.271605 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-sg-core-conf-yaml\") pod \"a0797f40-7313-479d-95ab-0a65d83b96d1\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.271640 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-config-data\") pod \"a0797f40-7313-479d-95ab-0a65d83b96d1\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.271815 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-combined-ca-bundle\") pod \"a0797f40-7313-479d-95ab-0a65d83b96d1\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.271908 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-bfk5k\" (UniqueName: \"kubernetes.io/projected/a0797f40-7313-479d-95ab-0a65d83b96d1-kube-api-access-bfk5k\") pod \"a0797f40-7313-479d-95ab-0a65d83b96d1\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.271942 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0797f40-7313-479d-95ab-0a65d83b96d1-log-httpd\") pod \"a0797f40-7313-479d-95ab-0a65d83b96d1\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.272011 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0797f40-7313-479d-95ab-0a65d83b96d1-run-httpd\") pod \"a0797f40-7313-479d-95ab-0a65d83b96d1\" (UID: \"a0797f40-7313-479d-95ab-0a65d83b96d1\") " Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.273183 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0797f40-7313-479d-95ab-0a65d83b96d1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a0797f40-7313-479d-95ab-0a65d83b96d1" (UID: "a0797f40-7313-479d-95ab-0a65d83b96d1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.273678 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0797f40-7313-479d-95ab-0a65d83b96d1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a0797f40-7313-479d-95ab-0a65d83b96d1" (UID: "a0797f40-7313-479d-95ab-0a65d83b96d1"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.281137 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0797f40-7313-479d-95ab-0a65d83b96d1-kube-api-access-bfk5k" (OuterVolumeSpecName: "kube-api-access-bfk5k") pod "a0797f40-7313-479d-95ab-0a65d83b96d1" (UID: "a0797f40-7313-479d-95ab-0a65d83b96d1"). InnerVolumeSpecName "kube-api-access-bfk5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.282692 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-scripts" (OuterVolumeSpecName: "scripts") pod "a0797f40-7313-479d-95ab-0a65d83b96d1" (UID: "a0797f40-7313-479d-95ab-0a65d83b96d1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.340389 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-pk97g"] Feb 20 07:09:37 crc kubenswrapper[5094]: E0220 07:09:37.340980 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="ceilometer-notification-agent" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.340995 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="ceilometer-notification-agent" Feb 20 07:09:37 crc kubenswrapper[5094]: E0220 07:09:37.341006 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="sg-core" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.341012 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="sg-core" Feb 20 07:09:37 crc kubenswrapper[5094]: E0220 07:09:37.341039 5094 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="ceilometer-central-agent" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.341047 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="ceilometer-central-agent" Feb 20 07:09:37 crc kubenswrapper[5094]: E0220 07:09:37.341056 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="proxy-httpd" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.341063 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="proxy-httpd" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.341243 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="ceilometer-central-agent" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.341269 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="sg-core" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.341288 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="ceilometer-notification-agent" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.341309 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" containerName="proxy-httpd" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.342314 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-pk97g" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.345429 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.345439 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.352286 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a0797f40-7313-479d-95ab-0a65d83b96d1" (UID: "a0797f40-7313-479d-95ab-0a65d83b96d1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.363100 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-pk97g"] Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.375019 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-pk97g\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") " pod="openstack/nova-cell1-conductor-db-sync-pk97g" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.375146 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-scripts\") pod \"nova-cell1-conductor-db-sync-pk97g\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") " pod="openstack/nova-cell1-conductor-db-sync-pk97g" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.375198 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppk9z\" (UniqueName: \"kubernetes.io/projected/74c08d54-fdef-4808-bf52-f8ea0894af36-kube-api-access-ppk9z\") pod \"nova-cell1-conductor-db-sync-pk97g\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") " pod="openstack/nova-cell1-conductor-db-sync-pk97g" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.375225 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-config-data\") pod \"nova-cell1-conductor-db-sync-pk97g\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") " pod="openstack/nova-cell1-conductor-db-sync-pk97g" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.375346 5094 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0797f40-7313-479d-95ab-0a65d83b96d1-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.375365 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.375472 5094 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.375508 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfk5k\" (UniqueName: \"kubernetes.io/projected/a0797f40-7313-479d-95ab-0a65d83b96d1-kube-api-access-bfk5k\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.375523 5094 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a0797f40-7313-479d-95ab-0a65d83b96d1-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.411257 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0797f40-7313-479d-95ab-0a65d83b96d1" (UID: "a0797f40-7313-479d-95ab-0a65d83b96d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.428431 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-config-data" (OuterVolumeSpecName: "config-data") pod "a0797f40-7313-479d-95ab-0a65d83b96d1" (UID: "a0797f40-7313-479d-95ab-0a65d83b96d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.478071 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-pk97g\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") " pod="openstack/nova-cell1-conductor-db-sync-pk97g" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.478252 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-scripts\") pod \"nova-cell1-conductor-db-sync-pk97g\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") " pod="openstack/nova-cell1-conductor-db-sync-pk97g" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.478303 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppk9z\" (UniqueName: 
\"kubernetes.io/projected/74c08d54-fdef-4808-bf52-f8ea0894af36-kube-api-access-ppk9z\") pod \"nova-cell1-conductor-db-sync-pk97g\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") " pod="openstack/nova-cell1-conductor-db-sync-pk97g" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.478329 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-config-data\") pod \"nova-cell1-conductor-db-sync-pk97g\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") " pod="openstack/nova-cell1-conductor-db-sync-pk97g" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.478430 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.478447 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0797f40-7313-479d-95ab-0a65d83b96d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.483564 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-scripts\") pod \"nova-cell1-conductor-db-sync-pk97g\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") " pod="openstack/nova-cell1-conductor-db-sync-pk97g" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.484207 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-pk97g\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") " pod="openstack/nova-cell1-conductor-db-sync-pk97g" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.490416 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-config-data\") pod \"nova-cell1-conductor-db-sync-pk97g\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") " pod="openstack/nova-cell1-conductor-db-sync-pk97g" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.507590 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppk9z\" (UniqueName: \"kubernetes.io/projected/74c08d54-fdef-4808-bf52-f8ea0894af36-kube-api-access-ppk9z\") pod \"nova-cell1-conductor-db-sync-pk97g\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") " pod="openstack/nova-cell1-conductor-db-sync-pk97g" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.543938 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.571508 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.585530 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.665564 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-pk97g" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.730367 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:09:37 crc kubenswrapper[5094]: W0220 07:09:37.744360 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1c7c378_78bc_48bd_932c_fa19cf4e6284.slice/crio-6f8e319488735091a18c4719ad1bc421ed45fbf1898270c70449e004acaf42c0 WatchSource:0}: Error finding container 6f8e319488735091a18c4719ad1bc421ed45fbf1898270c70449e004acaf42c0: Status 404 returned error can't find the container with id 6f8e319488735091a18c4719ad1bc421ed45fbf1898270c70449e004acaf42c0 Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.787541 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0797f40-7313-479d-95ab-0a65d83b96d1","Type":"ContainerDied","Data":"f94a0c0b0f09c18362f4c831a12784474e87ae445bfb68efe6344e1d738ee970"} Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.789655 5094 scope.go:117] "RemoveContainer" containerID="52d0d24cfba05c590c4ce7431d5e10a363e5e5262aa05f4ff8f4f412c8232924" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.787868 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.805898 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"043bc1d7-f57a-481d-b132-71ef45e85480","Type":"ContainerStarted","Data":"9e410d4605ab52b279cb99a108d57c885a2eae7e022b21ab722b6893f02390c6"} Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.808422 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bae6adb7-0e70-4689-aebe-d027da87abbb","Type":"ContainerStarted","Data":"d76d353af2545ad8202d965c9d3961ffa837c726f661aaff8084a4e9ecb335a6"} Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.810865 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75ddbf7c75-smx5j"] Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.818951 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-v6v6k" event={"ID":"61f66271-5ce9-4412-8ea3-9a63a934f307","Type":"ContainerStarted","Data":"20a41a106027a121889419317d3bee9adb413422fa7d51549986b6bb65338151"} Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.818983 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-v6v6k" event={"ID":"61f66271-5ce9-4412-8ea3-9a63a934f307","Type":"ContainerStarted","Data":"80f14b4d6e479a64d507e140a2b75fccd3a83dbdb2db5beaa1b87cdc4abdeef2"} Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.825451 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c1c7c378-78bc-48bd-932c-fa19cf4e6284","Type":"ContainerStarted","Data":"6f8e319488735091a18c4719ad1bc421ed45fbf1898270c70449e004acaf42c0"} Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.827194 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"049a4617-20ed-4f86-a0c4-a3a59bd44f26","Type":"ContainerStarted","Data":"4e611ddd33c80e60aea3bd50de5978b4641402d17394778695691b74d9d5ce40"} Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.844927 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-v6v6k" podStartSLOduration=2.844903016 podStartE2EDuration="2.844903016s" podCreationTimestamp="2026-02-20 07:09:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:09:37.842978009 +0000 UTC m=+1392.715604720" watchObservedRunningTime="2026-02-20 07:09:37.844903016 +0000 UTC m=+1392.717529727" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.849011 5094 scope.go:117] "RemoveContainer" containerID="92042acb82a2b22307c932b8e95e8c09d3491dbdbdfc8249a49e90755238923c" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.901162 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a6d10f4-9048-4e6c-831c-8342e340d290" path="/var/lib/kubelet/pods/9a6d10f4-9048-4e6c-831c-8342e340d290/volumes" Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.934346 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.945022 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.981803 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:09:37 crc kubenswrapper[5094]: I0220 07:09:37.988673 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.001270 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npk6v\" (UniqueName: \"kubernetes.io/projected/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-kube-api-access-npk6v\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.001336 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-log-httpd\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.001375 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-run-httpd\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.001451 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-scripts\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.001471 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.001492 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-config-data\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.001540 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.002622 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.005641 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.021570 5094 scope.go:117] "RemoveContainer" containerID="c4793f3606e7baf873c3ad7d2141782aac8bcc7350dfd8b20839747be94d7f25" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.027100 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.103062 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-log-httpd\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.103121 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-run-httpd\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") 
" pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.103196 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-scripts\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.103225 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.103251 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-config-data\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.103300 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.103332 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npk6v\" (UniqueName: \"kubernetes.io/projected/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-kube-api-access-npk6v\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.103692 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-log-httpd\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.104141 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-run-httpd\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.112190 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-config-data\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.113368 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-scripts\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.116883 5094 scope.go:117] "RemoveContainer" containerID="1229efd8ed9ba1e9b30b374a627f3c7487afe05838a26a18548bf279f8959273" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.118541 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.121397 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.125353 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npk6v\" (UniqueName: \"kubernetes.io/projected/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-kube-api-access-npk6v\") pod \"ceilometer-0\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") " pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.187982 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-pk97g"] Feb 20 07:09:38 crc kubenswrapper[5094]: W0220 07:09:38.212185 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74c08d54_fdef_4808_bf52_f8ea0894af36.slice/crio-ff75344bbc9886f9a77e9dcd22f9fe49181ad6479e98427b7a131e2ff1510a0b WatchSource:0}: Error finding container ff75344bbc9886f9a77e9dcd22f9fe49181ad6479e98427b7a131e2ff1510a0b: Status 404 returned error can't find the container with id ff75344bbc9886f9a77e9dcd22f9fe49181ad6479e98427b7a131e2ff1510a0b Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.371835 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.870434 5094 generic.go:334] "Generic (PLEG): container finished" podID="5c3bfc89-edf8-4721-a74c-b01a81025919" containerID="f1d81ab41f319c40d4f9b647076b0951c152348ace820e5806cf50ad3581a45e" exitCode=0 Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.872849 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" event={"ID":"5c3bfc89-edf8-4721-a74c-b01a81025919","Type":"ContainerDied","Data":"f1d81ab41f319c40d4f9b647076b0951c152348ace820e5806cf50ad3581a45e"} Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.872917 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" event={"ID":"5c3bfc89-edf8-4721-a74c-b01a81025919","Type":"ContainerStarted","Data":"b6d15829f95b8aff57b91d13f507a8fa1a3e6f6b0bdf9f807b5778fd0588a0ff"} Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.880821 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-pk97g" event={"ID":"74c08d54-fdef-4808-bf52-f8ea0894af36","Type":"ContainerStarted","Data":"1dec1e1d1106c53f0481c59fed514648b5c6714553079cbf04bed733473a6862"} Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.880865 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-pk97g" event={"ID":"74c08d54-fdef-4808-bf52-f8ea0894af36","Type":"ContainerStarted","Data":"ff75344bbc9886f9a77e9dcd22f9fe49181ad6479e98427b7a131e2ff1510a0b"} Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.925752 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:09:38 crc kubenswrapper[5094]: I0220 07:09:38.934816 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-pk97g" podStartSLOduration=1.934783247 podStartE2EDuration="1.934783247s" 
podCreationTimestamp="2026-02-20 07:09:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:09:38.917944604 +0000 UTC m=+1393.790571315" watchObservedRunningTime="2026-02-20 07:09:38.934783247 +0000 UTC m=+1393.807409958" Feb 20 07:09:39 crc kubenswrapper[5094]: I0220 07:09:39.858844 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0797f40-7313-479d-95ab-0a65d83b96d1" path="/var/lib/kubelet/pods/a0797f40-7313-479d-95ab-0a65d83b96d1/volumes" Feb 20 07:09:40 crc kubenswrapper[5094]: I0220 07:09:40.115598 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:09:40 crc kubenswrapper[5094]: I0220 07:09:40.185920 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 07:09:40 crc kubenswrapper[5094]: I0220 07:09:40.903570 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8","Type":"ContainerStarted","Data":"33dad10331613824fe73aa48d99c70dc5318fba2b2e24e20fa4abae1bee74f21"} Feb 20 07:09:41 crc kubenswrapper[5094]: I0220 07:09:41.928321 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"049a4617-20ed-4f86-a0c4-a3a59bd44f26","Type":"ContainerStarted","Data":"f1120a67726b82c35452578214de053dba930e494ccdb044a45f2d20acdc8e01"} Feb 20 07:09:41 crc kubenswrapper[5094]: I0220 07:09:41.931390 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8","Type":"ContainerStarted","Data":"897a4151c566a72e1d0e9378be05cebb168cc60b9817042c8a3829b3600f9baa"} Feb 20 07:09:41 crc kubenswrapper[5094]: I0220 07:09:41.961268 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" 
event={"ID":"5c3bfc89-edf8-4721-a74c-b01a81025919","Type":"ContainerStarted","Data":"6722282a6c774d888a17178f484ae2330f099371a021c9f8c4fb887feb5e8b4c"} Feb 20 07:09:41 crc kubenswrapper[5094]: I0220 07:09:41.965127 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:41 crc kubenswrapper[5094]: I0220 07:09:41.983833 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"043bc1d7-f57a-481d-b132-71ef45e85480","Type":"ContainerStarted","Data":"f58aa2d0199fdb5db167c5aef1bd053858fb706513548fbeaab67aa8e11ddcd5"} Feb 20 07:09:41 crc kubenswrapper[5094]: I0220 07:09:41.984103 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="043bc1d7-f57a-481d-b132-71ef45e85480" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f58aa2d0199fdb5db167c5aef1bd053858fb706513548fbeaab67aa8e11ddcd5" gracePeriod=30 Feb 20 07:09:41 crc kubenswrapper[5094]: I0220 07:09:41.994792 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bae6adb7-0e70-4689-aebe-d027da87abbb","Type":"ContainerStarted","Data":"f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85"} Feb 20 07:09:41 crc kubenswrapper[5094]: I0220 07:09:41.999123 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c1c7c378-78bc-48bd-932c-fa19cf4e6284","Type":"ContainerStarted","Data":"8fc9ffae76a3123cc0d37835acdbe04a2f9a8e499d724ece0380a09483cd0561"} Feb 20 07:09:41 crc kubenswrapper[5094]: I0220 07:09:41.999284 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c1c7c378-78bc-48bd-932c-fa19cf4e6284" containerName="nova-metadata-log" containerID="cri-o://8fc9ffae76a3123cc0d37835acdbe04a2f9a8e499d724ece0380a09483cd0561" gracePeriod=30 Feb 20 07:09:41 crc 
kubenswrapper[5094]: I0220 07:09:41.999531 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c1c7c378-78bc-48bd-932c-fa19cf4e6284" containerName="nova-metadata-metadata" containerID="cri-o://3b84d0ad082d2f55f39f713582ba0b77187948e8fdabcb2633a95c7f3cd76484" gracePeriod=30 Feb 20 07:09:42 crc kubenswrapper[5094]: I0220 07:09:42.002019 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" podStartSLOduration=6.001994385 podStartE2EDuration="6.001994385s" podCreationTimestamp="2026-02-20 07:09:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:09:41.98846035 +0000 UTC m=+1396.861087061" watchObservedRunningTime="2026-02-20 07:09:42.001994385 +0000 UTC m=+1396.874621096" Feb 20 07:09:42 crc kubenswrapper[5094]: I0220 07:09:42.016302 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.271058182 podStartE2EDuration="6.016278847s" podCreationTimestamp="2026-02-20 07:09:36 +0000 UTC" firstStartedPulling="2026-02-20 07:09:37.537190786 +0000 UTC m=+1392.409817497" lastFinishedPulling="2026-02-20 07:09:41.282411451 +0000 UTC m=+1396.155038162" observedRunningTime="2026-02-20 07:09:42.008679295 +0000 UTC m=+1396.881306006" watchObservedRunningTime="2026-02-20 07:09:42.016278847 +0000 UTC m=+1396.888905558" Feb 20 07:09:42 crc kubenswrapper[5094]: I0220 07:09:42.057907 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.320484644 podStartE2EDuration="6.057880503s" podCreationTimestamp="2026-02-20 07:09:36 +0000 UTC" firstStartedPulling="2026-02-20 07:09:37.551544619 +0000 UTC m=+1392.424171330" lastFinishedPulling="2026-02-20 07:09:41.288940478 +0000 UTC m=+1396.161567189" 
observedRunningTime="2026-02-20 07:09:42.037564816 +0000 UTC m=+1396.910191527" watchObservedRunningTime="2026-02-20 07:09:42.057880503 +0000 UTC m=+1396.930507204" Feb 20 07:09:42 crc kubenswrapper[5094]: I0220 07:09:42.065127 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.496635154 podStartE2EDuration="6.065104816s" podCreationTimestamp="2026-02-20 07:09:36 +0000 UTC" firstStartedPulling="2026-02-20 07:09:37.748367594 +0000 UTC m=+1392.620994295" lastFinishedPulling="2026-02-20 07:09:41.316837246 +0000 UTC m=+1396.189463957" observedRunningTime="2026-02-20 07:09:42.059791769 +0000 UTC m=+1396.932418480" watchObservedRunningTime="2026-02-20 07:09:42.065104816 +0000 UTC m=+1396.937731527" Feb 20 07:09:43 crc kubenswrapper[5094]: I0220 07:09:43.018368 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"049a4617-20ed-4f86-a0c4-a3a59bd44f26","Type":"ContainerStarted","Data":"ef54747f742ee43a493f0fa824a214fb6267fc5169fba7034a316b9cd14cd93e"} Feb 20 07:09:43 crc kubenswrapper[5094]: I0220 07:09:43.022844 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8","Type":"ContainerStarted","Data":"d18a7448181106e07e9f83589f66798d53c158f7b49b4cb2f63dd87bb3ea37e5"} Feb 20 07:09:43 crc kubenswrapper[5094]: I0220 07:09:43.027263 5094 generic.go:334] "Generic (PLEG): container finished" podID="c1c7c378-78bc-48bd-932c-fa19cf4e6284" containerID="8fc9ffae76a3123cc0d37835acdbe04a2f9a8e499d724ece0380a09483cd0561" exitCode=143 Feb 20 07:09:43 crc kubenswrapper[5094]: I0220 07:09:43.028815 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c1c7c378-78bc-48bd-932c-fa19cf4e6284","Type":"ContainerDied","Data":"8fc9ffae76a3123cc0d37835acdbe04a2f9a8e499d724ece0380a09483cd0561"} Feb 20 07:09:43 crc kubenswrapper[5094]: I0220 07:09:43.028872 5094 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c1c7c378-78bc-48bd-932c-fa19cf4e6284","Type":"ContainerStarted","Data":"3b84d0ad082d2f55f39f713582ba0b77187948e8fdabcb2633a95c7f3cd76484"} Feb 20 07:09:43 crc kubenswrapper[5094]: I0220 07:09:43.071806 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.300036494 podStartE2EDuration="7.071779935s" podCreationTimestamp="2026-02-20 07:09:36 +0000 UTC" firstStartedPulling="2026-02-20 07:09:37.537256867 +0000 UTC m=+1392.409883568" lastFinishedPulling="2026-02-20 07:09:41.309000298 +0000 UTC m=+1396.181627009" observedRunningTime="2026-02-20 07:09:43.053817465 +0000 UTC m=+1397.926444216" watchObservedRunningTime="2026-02-20 07:09:43.071779935 +0000 UTC m=+1397.944406656" Feb 20 07:09:44 crc kubenswrapper[5094]: I0220 07:09:44.041022 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8","Type":"ContainerStarted","Data":"40d3a0b02e5c779c99ffae4aa91b14ff147cdd2679151488afdea67f1c016caf"} Feb 20 07:09:46 crc kubenswrapper[5094]: I0220 07:09:46.080516 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8","Type":"ContainerStarted","Data":"07ea59e61fed8a3358551accb40261cf86e2779fdb107f968e111962734d98f4"} Feb 20 07:09:46 crc kubenswrapper[5094]: I0220 07:09:46.081021 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 07:09:46 crc kubenswrapper[5094]: I0220 07:09:46.082907 5094 generic.go:334] "Generic (PLEG): container finished" podID="61f66271-5ce9-4412-8ea3-9a63a934f307" containerID="20a41a106027a121889419317d3bee9adb413422fa7d51549986b6bb65338151" exitCode=0 Feb 20 07:09:46 crc kubenswrapper[5094]: I0220 07:09:46.082953 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-cell-mapping-v6v6k" event={"ID":"61f66271-5ce9-4412-8ea3-9a63a934f307","Type":"ContainerDied","Data":"20a41a106027a121889419317d3bee9adb413422fa7d51549986b6bb65338151"} Feb 20 07:09:46 crc kubenswrapper[5094]: I0220 07:09:46.107930 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.308940555 podStartE2EDuration="9.107903638s" podCreationTimestamp="2026-02-20 07:09:37 +0000 UTC" firstStartedPulling="2026-02-20 07:09:41.147139751 +0000 UTC m=+1396.019766462" lastFinishedPulling="2026-02-20 07:09:44.946102794 +0000 UTC m=+1399.818729545" observedRunningTime="2026-02-20 07:09:46.103084663 +0000 UTC m=+1400.975711384" watchObservedRunningTime="2026-02-20 07:09:46.107903638 +0000 UTC m=+1400.980530349" Feb 20 07:09:46 crc kubenswrapper[5094]: I0220 07:09:46.659513 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:09:46 crc kubenswrapper[5094]: I0220 07:09:46.680019 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 07:09:46 crc kubenswrapper[5094]: I0220 07:09:46.680124 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 07:09:46 crc kubenswrapper[5094]: I0220 07:09:46.730996 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 20 07:09:46 crc kubenswrapper[5094]: I0220 07:09:46.731104 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 20 07:09:46 crc kubenswrapper[5094]: I0220 07:09:46.769330 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 20 07:09:46 crc kubenswrapper[5094]: I0220 07:09:46.847532 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 07:09:46 crc 
kubenswrapper[5094]: I0220 07:09:46.847902 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 07:09:46 crc kubenswrapper[5094]: I0220 07:09:46.973012 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.059007 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw"] Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.059299 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" podUID="864952a5-1f2e-4930-be7f-7dbc3a2c2af8" containerName="dnsmasq-dns" containerID="cri-o://fdf1995e90b86d0bcffee929097bd15c99141d0d2d0b08867fe0841afc3d59b6" gracePeriod=10 Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.121767 5094 generic.go:334] "Generic (PLEG): container finished" podID="74c08d54-fdef-4808-bf52-f8ea0894af36" containerID="1dec1e1d1106c53f0481c59fed514648b5c6714553079cbf04bed733473a6862" exitCode=0 Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.123282 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-pk97g" event={"ID":"74c08d54-fdef-4808-bf52-f8ea0894af36","Type":"ContainerDied","Data":"1dec1e1d1106c53f0481c59fed514648b5c6714553079cbf04bed733473a6862"} Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.177776 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.713511 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-v6v6k" Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.719512 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.764913 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="049a4617-20ed-4f86-a0c4-a3a59bd44f26" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.765023 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="049a4617-20ed-4f86-a0c4-a3a59bd44f26" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.815845 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw5mp\" (UniqueName: \"kubernetes.io/projected/61f66271-5ce9-4412-8ea3-9a63a934f307-kube-api-access-pw5mp\") pod \"61f66271-5ce9-4412-8ea3-9a63a934f307\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") " Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.815962 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-config-data\") pod \"61f66271-5ce9-4412-8ea3-9a63a934f307\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") " Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.816072 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmkxt\" (UniqueName: \"kubernetes.io/projected/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-kube-api-access-dmkxt\") pod \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.816209 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-config\") pod \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.816252 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-dns-svc\") pod \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.816368 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-dns-swift-storage-0\") pod \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.816454 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-scripts\") pod \"61f66271-5ce9-4412-8ea3-9a63a934f307\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") " Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.816524 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-ovsdbserver-sb\") pod \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.816570 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-combined-ca-bundle\") pod \"61f66271-5ce9-4412-8ea3-9a63a934f307\" (UID: \"61f66271-5ce9-4412-8ea3-9a63a934f307\") " Feb 20 07:09:47 crc 
kubenswrapper[5094]: I0220 07:09:47.816598 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-ovsdbserver-nb\") pod \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\" (UID: \"864952a5-1f2e-4930-be7f-7dbc3a2c2af8\") " Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.868262 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-scripts" (OuterVolumeSpecName: "scripts") pod "61f66271-5ce9-4412-8ea3-9a63a934f307" (UID: "61f66271-5ce9-4412-8ea3-9a63a934f307"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.873287 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61f66271-5ce9-4412-8ea3-9a63a934f307-kube-api-access-pw5mp" (OuterVolumeSpecName: "kube-api-access-pw5mp") pod "61f66271-5ce9-4412-8ea3-9a63a934f307" (UID: "61f66271-5ce9-4412-8ea3-9a63a934f307"). InnerVolumeSpecName "kube-api-access-pw5mp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.874644 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-kube-api-access-dmkxt" (OuterVolumeSpecName: "kube-api-access-dmkxt") pod "864952a5-1f2e-4930-be7f-7dbc3a2c2af8" (UID: "864952a5-1f2e-4930-be7f-7dbc3a2c2af8"). InnerVolumeSpecName "kube-api-access-dmkxt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.891908 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-config-data" (OuterVolumeSpecName: "config-data") pod "61f66271-5ce9-4412-8ea3-9a63a934f307" (UID: "61f66271-5ce9-4412-8ea3-9a63a934f307"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.896841 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61f66271-5ce9-4412-8ea3-9a63a934f307" (UID: "61f66271-5ce9-4412-8ea3-9a63a934f307"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.930574 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.932624 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw5mp\" (UniqueName: \"kubernetes.io/projected/61f66271-5ce9-4412-8ea3-9a63a934f307-kube-api-access-pw5mp\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.932721 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.932804 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmkxt\" (UniqueName: \"kubernetes.io/projected/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-kube-api-access-dmkxt\") on 
node \"crc\" DevicePath \"\"" Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.932870 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61f66271-5ce9-4412-8ea3-9a63a934f307-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.934636 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "864952a5-1f2e-4930-be7f-7dbc3a2c2af8" (UID: "864952a5-1f2e-4930-be7f-7dbc3a2c2af8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.947652 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "864952a5-1f2e-4930-be7f-7dbc3a2c2af8" (UID: "864952a5-1f2e-4930-be7f-7dbc3a2c2af8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.969003 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-config" (OuterVolumeSpecName: "config") pod "864952a5-1f2e-4930-be7f-7dbc3a2c2af8" (UID: "864952a5-1f2e-4930-be7f-7dbc3a2c2af8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.976258 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "864952a5-1f2e-4930-be7f-7dbc3a2c2af8" (UID: "864952a5-1f2e-4930-be7f-7dbc3a2c2af8"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:09:47 crc kubenswrapper[5094]: I0220 07:09:47.977180 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "864952a5-1f2e-4930-be7f-7dbc3a2c2af8" (UID: "864952a5-1f2e-4930-be7f-7dbc3a2c2af8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.034805 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.034835 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.034845 5094 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.034860 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.034871 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/864952a5-1f2e-4930-be7f-7dbc3a2c2af8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.133619 5094 generic.go:334] "Generic (PLEG): container finished" podID="864952a5-1f2e-4930-be7f-7dbc3a2c2af8" 
containerID="fdf1995e90b86d0bcffee929097bd15c99141d0d2d0b08867fe0841afc3d59b6" exitCode=0 Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.133671 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" event={"ID":"864952a5-1f2e-4930-be7f-7dbc3a2c2af8","Type":"ContainerDied","Data":"fdf1995e90b86d0bcffee929097bd15c99141d0d2d0b08867fe0841afc3d59b6"} Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.134236 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" event={"ID":"864952a5-1f2e-4930-be7f-7dbc3a2c2af8","Type":"ContainerDied","Data":"db6c4cfd73d84c6bf37834db171aab681839cbd0872d2e8b1d00c5c8feb0f4da"} Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.134264 5094 scope.go:117] "RemoveContainer" containerID="fdf1995e90b86d0bcffee929097bd15c99141d0d2d0b08867fe0841afc3d59b6" Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.135952 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-v6v6k" event={"ID":"61f66271-5ce9-4412-8ea3-9a63a934f307","Type":"ContainerDied","Data":"80f14b4d6e479a64d507e140a2b75fccd3a83dbdb2db5beaa1b87cdc4abdeef2"} Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.136292 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80f14b4d6e479a64d507e140a2b75fccd3a83dbdb2db5beaa1b87cdc4abdeef2" Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.138011 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw" Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.138442 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-v6v6k" Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.178916 5094 scope.go:117] "RemoveContainer" containerID="e3d0972b678e16469c3366801fb2f8b9511912679cef680d0f581c3085fd2723" Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.281220 5094 scope.go:117] "RemoveContainer" containerID="fdf1995e90b86d0bcffee929097bd15c99141d0d2d0b08867fe0841afc3d59b6" Feb 20 07:09:48 crc kubenswrapper[5094]: E0220 07:09:48.282439 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdf1995e90b86d0bcffee929097bd15c99141d0d2d0b08867fe0841afc3d59b6\": container with ID starting with fdf1995e90b86d0bcffee929097bd15c99141d0d2d0b08867fe0841afc3d59b6 not found: ID does not exist" containerID="fdf1995e90b86d0bcffee929097bd15c99141d0d2d0b08867fe0841afc3d59b6" Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.282471 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdf1995e90b86d0bcffee929097bd15c99141d0d2d0b08867fe0841afc3d59b6"} err="failed to get container status \"fdf1995e90b86d0bcffee929097bd15c99141d0d2d0b08867fe0841afc3d59b6\": rpc error: code = NotFound desc = could not find container \"fdf1995e90b86d0bcffee929097bd15c99141d0d2d0b08867fe0841afc3d59b6\": container with ID starting with fdf1995e90b86d0bcffee929097bd15c99141d0d2d0b08867fe0841afc3d59b6 not found: ID does not exist" Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.282494 5094 scope.go:117] "RemoveContainer" containerID="e3d0972b678e16469c3366801fb2f8b9511912679cef680d0f581c3085fd2723" Feb 20 07:09:48 crc kubenswrapper[5094]: E0220 07:09:48.283871 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3d0972b678e16469c3366801fb2f8b9511912679cef680d0f581c3085fd2723\": container with ID starting with 
e3d0972b678e16469c3366801fb2f8b9511912679cef680d0f581c3085fd2723 not found: ID does not exist" containerID="e3d0972b678e16469c3366801fb2f8b9511912679cef680d0f581c3085fd2723" Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.283938 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3d0972b678e16469c3366801fb2f8b9511912679cef680d0f581c3085fd2723"} err="failed to get container status \"e3d0972b678e16469c3366801fb2f8b9511912679cef680d0f581c3085fd2723\": rpc error: code = NotFound desc = could not find container \"e3d0972b678e16469c3366801fb2f8b9511912679cef680d0f581c3085fd2723\": container with ID starting with e3d0972b678e16469c3366801fb2f8b9511912679cef680d0f581c3085fd2723 not found: ID does not exist" Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.345124 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw"] Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.368214 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c8dc7b4d9-xbwsw"] Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.380075 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.391600 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.391856 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="049a4617-20ed-4f86-a0c4-a3a59bd44f26" containerName="nova-api-log" containerID="cri-o://f1120a67726b82c35452578214de053dba930e494ccdb044a45f2d20acdc8e01" gracePeriod=30 Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.392024 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="049a4617-20ed-4f86-a0c4-a3a59bd44f26" containerName="nova-api-api" 
containerID="cri-o://ef54747f742ee43a493f0fa824a214fb6267fc5169fba7034a316b9cd14cd93e" gracePeriod=30 Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.593372 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-pk97g" Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.649982 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-config-data\") pod \"74c08d54-fdef-4808-bf52-f8ea0894af36\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") " Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.651613 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-scripts\") pod \"74c08d54-fdef-4808-bf52-f8ea0894af36\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") " Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.651799 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppk9z\" (UniqueName: \"kubernetes.io/projected/74c08d54-fdef-4808-bf52-f8ea0894af36-kube-api-access-ppk9z\") pod \"74c08d54-fdef-4808-bf52-f8ea0894af36\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") " Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.652049 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-combined-ca-bundle\") pod \"74c08d54-fdef-4808-bf52-f8ea0894af36\" (UID: \"74c08d54-fdef-4808-bf52-f8ea0894af36\") " Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.670326 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-scripts" (OuterVolumeSpecName: "scripts") pod "74c08d54-fdef-4808-bf52-f8ea0894af36" (UID: 
"74c08d54-fdef-4808-bf52-f8ea0894af36"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.685075 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74c08d54-fdef-4808-bf52-f8ea0894af36-kube-api-access-ppk9z" (OuterVolumeSpecName: "kube-api-access-ppk9z") pod "74c08d54-fdef-4808-bf52-f8ea0894af36" (UID: "74c08d54-fdef-4808-bf52-f8ea0894af36"). InnerVolumeSpecName "kube-api-access-ppk9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.692352 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-config-data" (OuterVolumeSpecName: "config-data") pod "74c08d54-fdef-4808-bf52-f8ea0894af36" (UID: "74c08d54-fdef-4808-bf52-f8ea0894af36"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.724814 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74c08d54-fdef-4808-bf52-f8ea0894af36" (UID: "74c08d54-fdef-4808-bf52-f8ea0894af36"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.756797 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.756837 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppk9z\" (UniqueName: \"kubernetes.io/projected/74c08d54-fdef-4808-bf52-f8ea0894af36-kube-api-access-ppk9z\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.756851 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:48 crc kubenswrapper[5094]: I0220 07:09:48.756862 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74c08d54-fdef-4808-bf52-f8ea0894af36-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.161161 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-pk97g" event={"ID":"74c08d54-fdef-4808-bf52-f8ea0894af36","Type":"ContainerDied","Data":"ff75344bbc9886f9a77e9dcd22f9fe49181ad6479e98427b7a131e2ff1510a0b"} Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.161579 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff75344bbc9886f9a77e9dcd22f9fe49181ad6479e98427b7a131e2ff1510a0b" Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.161219 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-pk97g" Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.167940 5094 generic.go:334] "Generic (PLEG): container finished" podID="049a4617-20ed-4f86-a0c4-a3a59bd44f26" containerID="f1120a67726b82c35452578214de053dba930e494ccdb044a45f2d20acdc8e01" exitCode=143 Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.168827 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"049a4617-20ed-4f86-a0c4-a3a59bd44f26","Type":"ContainerDied","Data":"f1120a67726b82c35452578214de053dba930e494ccdb044a45f2d20acdc8e01"} Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.169103 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="bae6adb7-0e70-4689-aebe-d027da87abbb" containerName="nova-scheduler-scheduler" containerID="cri-o://f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85" gracePeriod=30 Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.259862 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 07:09:49 crc kubenswrapper[5094]: E0220 07:09:49.260554 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74c08d54-fdef-4808-bf52-f8ea0894af36" containerName="nova-cell1-conductor-db-sync" Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.260573 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="74c08d54-fdef-4808-bf52-f8ea0894af36" containerName="nova-cell1-conductor-db-sync" Feb 20 07:09:49 crc kubenswrapper[5094]: E0220 07:09:49.260617 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61f66271-5ce9-4412-8ea3-9a63a934f307" containerName="nova-manage" Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.260626 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="61f66271-5ce9-4412-8ea3-9a63a934f307" containerName="nova-manage" Feb 20 07:09:49 crc kubenswrapper[5094]: 
E0220 07:09:49.260645 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="864952a5-1f2e-4930-be7f-7dbc3a2c2af8" containerName="dnsmasq-dns" Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.260653 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="864952a5-1f2e-4930-be7f-7dbc3a2c2af8" containerName="dnsmasq-dns" Feb 20 07:09:49 crc kubenswrapper[5094]: E0220 07:09:49.260670 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="864952a5-1f2e-4930-be7f-7dbc3a2c2af8" containerName="init" Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.260676 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="864952a5-1f2e-4930-be7f-7dbc3a2c2af8" containerName="init" Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.260925 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="74c08d54-fdef-4808-bf52-f8ea0894af36" containerName="nova-cell1-conductor-db-sync" Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.260954 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="864952a5-1f2e-4930-be7f-7dbc3a2c2af8" containerName="dnsmasq-dns" Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.260972 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="61f66271-5ce9-4412-8ea3-9a63a934f307" containerName="nova-manage" Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.261866 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.264855 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.272370 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.374596 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3caa33a-a0ec-4fdc-876b-266724a5af50-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f3caa33a-a0ec-4fdc-876b-266724a5af50\") " pod="openstack/nova-cell1-conductor-0" Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.375133 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3caa33a-a0ec-4fdc-876b-266724a5af50-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f3caa33a-a0ec-4fdc-876b-266724a5af50\") " pod="openstack/nova-cell1-conductor-0" Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.375303 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7snx5\" (UniqueName: \"kubernetes.io/projected/f3caa33a-a0ec-4fdc-876b-266724a5af50-kube-api-access-7snx5\") pod \"nova-cell1-conductor-0\" (UID: \"f3caa33a-a0ec-4fdc-876b-266724a5af50\") " pod="openstack/nova-cell1-conductor-0" Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.476828 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3caa33a-a0ec-4fdc-876b-266724a5af50-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f3caa33a-a0ec-4fdc-876b-266724a5af50\") " pod="openstack/nova-cell1-conductor-0" Feb 20 07:09:49 crc 
kubenswrapper[5094]: I0220 07:09:49.476943 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3caa33a-a0ec-4fdc-876b-266724a5af50-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f3caa33a-a0ec-4fdc-876b-266724a5af50\") " pod="openstack/nova-cell1-conductor-0" Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.477096 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7snx5\" (UniqueName: \"kubernetes.io/projected/f3caa33a-a0ec-4fdc-876b-266724a5af50-kube-api-access-7snx5\") pod \"nova-cell1-conductor-0\" (UID: \"f3caa33a-a0ec-4fdc-876b-266724a5af50\") " pod="openstack/nova-cell1-conductor-0" Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.484774 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3caa33a-a0ec-4fdc-876b-266724a5af50-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f3caa33a-a0ec-4fdc-876b-266724a5af50\") " pod="openstack/nova-cell1-conductor-0" Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.485931 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3caa33a-a0ec-4fdc-876b-266724a5af50-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f3caa33a-a0ec-4fdc-876b-266724a5af50\") " pod="openstack/nova-cell1-conductor-0" Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.494536 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7snx5\" (UniqueName: \"kubernetes.io/projected/f3caa33a-a0ec-4fdc-876b-266724a5af50-kube-api-access-7snx5\") pod \"nova-cell1-conductor-0\" (UID: \"f3caa33a-a0ec-4fdc-876b-266724a5af50\") " pod="openstack/nova-cell1-conductor-0" Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.584166 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 20 07:09:49 crc kubenswrapper[5094]: I0220 07:09:49.854546 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="864952a5-1f2e-4930-be7f-7dbc3a2c2af8" path="/var/lib/kubelet/pods/864952a5-1f2e-4930-be7f-7dbc3a2c2af8/volumes" Feb 20 07:09:50 crc kubenswrapper[5094]: I0220 07:09:50.098854 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 07:09:50 crc kubenswrapper[5094]: W0220 07:09:50.112553 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3caa33a_a0ec_4fdc_876b_266724a5af50.slice/crio-04b7e6ec6e40c5ff8ef2a4fd5c6eaae7045f9b9edc65925744ff48699dca0766 WatchSource:0}: Error finding container 04b7e6ec6e40c5ff8ef2a4fd5c6eaae7045f9b9edc65925744ff48699dca0766: Status 404 returned error can't find the container with id 04b7e6ec6e40c5ff8ef2a4fd5c6eaae7045f9b9edc65925744ff48699dca0766 Feb 20 07:09:50 crc kubenswrapper[5094]: I0220 07:09:50.183227 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f3caa33a-a0ec-4fdc-876b-266724a5af50","Type":"ContainerStarted","Data":"04b7e6ec6e40c5ff8ef2a4fd5c6eaae7045f9b9edc65925744ff48699dca0766"} Feb 20 07:09:51 crc kubenswrapper[5094]: I0220 07:09:51.199926 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f3caa33a-a0ec-4fdc-876b-266724a5af50","Type":"ContainerStarted","Data":"d82cb6bb5b6363e25ca9e64db1ea09c7494c05c93392e549c89755977ef55244"} Feb 20 07:09:51 crc kubenswrapper[5094]: I0220 07:09:51.200371 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 20 07:09:51 crc kubenswrapper[5094]: I0220 07:09:51.238894 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" 
podStartSLOduration=2.238862841 podStartE2EDuration="2.238862841s" podCreationTimestamp="2026-02-20 07:09:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:09:51.230748456 +0000 UTC m=+1406.103375167" watchObservedRunningTime="2026-02-20 07:09:51.238862841 +0000 UTC m=+1406.111489562" Feb 20 07:09:51 crc kubenswrapper[5094]: E0220 07:09:51.736915 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 20 07:09:51 crc kubenswrapper[5094]: E0220 07:09:51.739381 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 20 07:09:51 crc kubenswrapper[5094]: E0220 07:09:51.741645 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 20 07:09:51 crc kubenswrapper[5094]: E0220 07:09:51.741835 5094 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="bae6adb7-0e70-4689-aebe-d027da87abbb" containerName="nova-scheduler-scheduler" Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 
07:09:53.229039 5094 generic.go:334] "Generic (PLEG): container finished" podID="049a4617-20ed-4f86-a0c4-a3a59bd44f26" containerID="ef54747f742ee43a493f0fa824a214fb6267fc5169fba7034a316b9cd14cd93e" exitCode=0 Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.229159 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"049a4617-20ed-4f86-a0c4-a3a59bd44f26","Type":"ContainerDied","Data":"ef54747f742ee43a493f0fa824a214fb6267fc5169fba7034a316b9cd14cd93e"} Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.229727 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"049a4617-20ed-4f86-a0c4-a3a59bd44f26","Type":"ContainerDied","Data":"4e611ddd33c80e60aea3bd50de5978b4641402d17394778695691b74d9d5ce40"} Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.229746 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e611ddd33c80e60aea3bd50de5978b4641402d17394778695691b74d9d5ce40" Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.425442 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.492447 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/049a4617-20ed-4f86-a0c4-a3a59bd44f26-config-data\") pod \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") " Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.492544 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sw5c6\" (UniqueName: \"kubernetes.io/projected/049a4617-20ed-4f86-a0c4-a3a59bd44f26-kube-api-access-sw5c6\") pod \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") " Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.492608 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/049a4617-20ed-4f86-a0c4-a3a59bd44f26-combined-ca-bundle\") pod \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") " Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.492760 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/049a4617-20ed-4f86-a0c4-a3a59bd44f26-logs\") pod \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\" (UID: \"049a4617-20ed-4f86-a0c4-a3a59bd44f26\") " Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.493684 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/049a4617-20ed-4f86-a0c4-a3a59bd44f26-logs" (OuterVolumeSpecName: "logs") pod "049a4617-20ed-4f86-a0c4-a3a59bd44f26" (UID: "049a4617-20ed-4f86-a0c4-a3a59bd44f26"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.495478 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/049a4617-20ed-4f86-a0c4-a3a59bd44f26-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.535751 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/049a4617-20ed-4f86-a0c4-a3a59bd44f26-config-data" (OuterVolumeSpecName: "config-data") pod "049a4617-20ed-4f86-a0c4-a3a59bd44f26" (UID: "049a4617-20ed-4f86-a0c4-a3a59bd44f26"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.540547 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/049a4617-20ed-4f86-a0c4-a3a59bd44f26-kube-api-access-sw5c6" (OuterVolumeSpecName: "kube-api-access-sw5c6") pod "049a4617-20ed-4f86-a0c4-a3a59bd44f26" (UID: "049a4617-20ed-4f86-a0c4-a3a59bd44f26"). InnerVolumeSpecName "kube-api-access-sw5c6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.544679 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/049a4617-20ed-4f86-a0c4-a3a59bd44f26-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "049a4617-20ed-4f86-a0c4-a3a59bd44f26" (UID: "049a4617-20ed-4f86-a0c4-a3a59bd44f26"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.597748 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/049a4617-20ed-4f86-a0c4-a3a59bd44f26-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.597800 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sw5c6\" (UniqueName: \"kubernetes.io/projected/049a4617-20ed-4f86-a0c4-a3a59bd44f26-kube-api-access-sw5c6\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.597813 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/049a4617-20ed-4f86-a0c4-a3a59bd44f26-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.602134 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.698834 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bae6adb7-0e70-4689-aebe-d027da87abbb-combined-ca-bundle\") pod \"bae6adb7-0e70-4689-aebe-d027da87abbb\" (UID: \"bae6adb7-0e70-4689-aebe-d027da87abbb\") " Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.698909 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bae6adb7-0e70-4689-aebe-d027da87abbb-config-data\") pod \"bae6adb7-0e70-4689-aebe-d027da87abbb\" (UID: \"bae6adb7-0e70-4689-aebe-d027da87abbb\") " Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.699157 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdz9n\" (UniqueName: 
\"kubernetes.io/projected/bae6adb7-0e70-4689-aebe-d027da87abbb-kube-api-access-vdz9n\") pod \"bae6adb7-0e70-4689-aebe-d027da87abbb\" (UID: \"bae6adb7-0e70-4689-aebe-d027da87abbb\") " Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.704580 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bae6adb7-0e70-4689-aebe-d027da87abbb-kube-api-access-vdz9n" (OuterVolumeSpecName: "kube-api-access-vdz9n") pod "bae6adb7-0e70-4689-aebe-d027da87abbb" (UID: "bae6adb7-0e70-4689-aebe-d027da87abbb"). InnerVolumeSpecName "kube-api-access-vdz9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.726050 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bae6adb7-0e70-4689-aebe-d027da87abbb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bae6adb7-0e70-4689-aebe-d027da87abbb" (UID: "bae6adb7-0e70-4689-aebe-d027da87abbb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.739108 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bae6adb7-0e70-4689-aebe-d027da87abbb-config-data" (OuterVolumeSpecName: "config-data") pod "bae6adb7-0e70-4689-aebe-d027da87abbb" (UID: "bae6adb7-0e70-4689-aebe-d027da87abbb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.804413 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bae6adb7-0e70-4689-aebe-d027da87abbb-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.804460 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdz9n\" (UniqueName: \"kubernetes.io/projected/bae6adb7-0e70-4689-aebe-d027da87abbb-kube-api-access-vdz9n\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:53 crc kubenswrapper[5094]: I0220 07:09:53.804478 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bae6adb7-0e70-4689-aebe-d027da87abbb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.242517 5094 generic.go:334] "Generic (PLEG): container finished" podID="bae6adb7-0e70-4689-aebe-d027da87abbb" containerID="f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85" exitCode=0 Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.243034 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.244389 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.244833 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bae6adb7-0e70-4689-aebe-d027da87abbb","Type":"ContainerDied","Data":"f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85"} Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.244866 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bae6adb7-0e70-4689-aebe-d027da87abbb","Type":"ContainerDied","Data":"d76d353af2545ad8202d965c9d3961ffa837c726f661aaff8084a4e9ecb335a6"} Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.244886 5094 scope.go:117] "RemoveContainer" containerID="f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.284219 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.291096 5094 scope.go:117] "RemoveContainer" containerID="f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85" Feb 20 07:09:54 crc kubenswrapper[5094]: E0220 07:09:54.291622 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85\": container with ID starting with f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85 not found: ID does not exist" containerID="f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.291683 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85"} err="failed to get container status \"f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85\": rpc error: code = NotFound 
desc = could not find container \"f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85\": container with ID starting with f58ed2f0a80c2e018e48b0ad957c62ab9bd7aa5d88b44dabd89b67964ecb7d85 not found: ID does not exist" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.309275 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.330926 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.349383 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.366291 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:09:54 crc kubenswrapper[5094]: E0220 07:09:54.367284 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="049a4617-20ed-4f86-a0c4-a3a59bd44f26" containerName="nova-api-api" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.367311 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="049a4617-20ed-4f86-a0c4-a3a59bd44f26" containerName="nova-api-api" Feb 20 07:09:54 crc kubenswrapper[5094]: E0220 07:09:54.367528 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="049a4617-20ed-4f86-a0c4-a3a59bd44f26" containerName="nova-api-log" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.367539 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="049a4617-20ed-4f86-a0c4-a3a59bd44f26" containerName="nova-api-log" Feb 20 07:09:54 crc kubenswrapper[5094]: E0220 07:09:54.367571 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bae6adb7-0e70-4689-aebe-d027da87abbb" containerName="nova-scheduler-scheduler" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.367580 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="bae6adb7-0e70-4689-aebe-d027da87abbb" 
containerName="nova-scheduler-scheduler" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.367845 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="049a4617-20ed-4f86-a0c4-a3a59bd44f26" containerName="nova-api-api" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.367858 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="049a4617-20ed-4f86-a0c4-a3a59bd44f26" containerName="nova-api-log" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.367871 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="bae6adb7-0e70-4689-aebe-d027da87abbb" containerName="nova-scheduler-scheduler" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.369067 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.374181 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.397537 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.408067 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.409944 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.412567 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.423855 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.522972 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f31fb44b-cd20-42d8-a384-0e4a800ea177-logs\") pod \"nova-api-0\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " pod="openstack/nova-api-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.523022 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f31fb44b-cd20-42d8-a384-0e4a800ea177-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " pod="openstack/nova-api-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.523065 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8xqn\" (UniqueName: \"kubernetes.io/projected/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-kube-api-access-w8xqn\") pod \"nova-scheduler-0\" (UID: \"c1c68ac6-9f96-4a39-b477-7ad74a04dff9\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.523220 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-config-data\") pod \"nova-scheduler-0\" (UID: \"c1c68ac6-9f96-4a39-b477-7ad74a04dff9\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.523333 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh6fn\" (UniqueName: \"kubernetes.io/projected/f31fb44b-cd20-42d8-a384-0e4a800ea177-kube-api-access-qh6fn\") pod \"nova-api-0\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " pod="openstack/nova-api-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.523640 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f31fb44b-cd20-42d8-a384-0e4a800ea177-config-data\") pod \"nova-api-0\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " pod="openstack/nova-api-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.523799 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c1c68ac6-9f96-4a39-b477-7ad74a04dff9\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.627328 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f31fb44b-cd20-42d8-a384-0e4a800ea177-config-data\") pod \"nova-api-0\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " pod="openstack/nova-api-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.627389 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c1c68ac6-9f96-4a39-b477-7ad74a04dff9\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.627487 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f31fb44b-cd20-42d8-a384-0e4a800ea177-logs\") pod 
\"nova-api-0\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " pod="openstack/nova-api-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.627517 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f31fb44b-cd20-42d8-a384-0e4a800ea177-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " pod="openstack/nova-api-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.627549 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8xqn\" (UniqueName: \"kubernetes.io/projected/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-kube-api-access-w8xqn\") pod \"nova-scheduler-0\" (UID: \"c1c68ac6-9f96-4a39-b477-7ad74a04dff9\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.627569 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-config-data\") pod \"nova-scheduler-0\" (UID: \"c1c68ac6-9f96-4a39-b477-7ad74a04dff9\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.627594 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh6fn\" (UniqueName: \"kubernetes.io/projected/f31fb44b-cd20-42d8-a384-0e4a800ea177-kube-api-access-qh6fn\") pod \"nova-api-0\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " pod="openstack/nova-api-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.629646 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f31fb44b-cd20-42d8-a384-0e4a800ea177-logs\") pod \"nova-api-0\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " pod="openstack/nova-api-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.638201 5094 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"c1c68ac6-9f96-4a39-b477-7ad74a04dff9\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.638900 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f31fb44b-cd20-42d8-a384-0e4a800ea177-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " pod="openstack/nova-api-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.639745 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-config-data\") pod \"nova-scheduler-0\" (UID: \"c1c68ac6-9f96-4a39-b477-7ad74a04dff9\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.648990 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8xqn\" (UniqueName: \"kubernetes.io/projected/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-kube-api-access-w8xqn\") pod \"nova-scheduler-0\" (UID: \"c1c68ac6-9f96-4a39-b477-7ad74a04dff9\") " pod="openstack/nova-scheduler-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.649683 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f31fb44b-cd20-42d8-a384-0e4a800ea177-config-data\") pod \"nova-api-0\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " pod="openstack/nova-api-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.660569 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh6fn\" (UniqueName: \"kubernetes.io/projected/f31fb44b-cd20-42d8-a384-0e4a800ea177-kube-api-access-qh6fn\") pod \"nova-api-0\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " 
pod="openstack/nova-api-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.709011 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 07:09:54 crc kubenswrapper[5094]: I0220 07:09:54.729081 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 07:09:55 crc kubenswrapper[5094]: I0220 07:09:55.257562 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:09:55 crc kubenswrapper[5094]: I0220 07:09:55.259002 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c1c68ac6-9f96-4a39-b477-7ad74a04dff9","Type":"ContainerStarted","Data":"1ceadaa23027d492c09e5b48aebf25f3f67173315364e5177caaee816d00585c"} Feb 20 07:09:55 crc kubenswrapper[5094]: I0220 07:09:55.260251 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f31fb44b-cd20-42d8-a384-0e4a800ea177","Type":"ContainerStarted","Data":"7169a135be999d7785f50c1ff2763f918e250f0425691f0c798f5fb9269603bf"} Feb 20 07:09:55 crc kubenswrapper[5094]: I0220 07:09:55.272467 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:09:55 crc kubenswrapper[5094]: I0220 07:09:55.863048 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="049a4617-20ed-4f86-a0c4-a3a59bd44f26" path="/var/lib/kubelet/pods/049a4617-20ed-4f86-a0c4-a3a59bd44f26/volumes" Feb 20 07:09:55 crc kubenswrapper[5094]: I0220 07:09:55.864238 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bae6adb7-0e70-4689-aebe-d027da87abbb" path="/var/lib/kubelet/pods/bae6adb7-0e70-4689-aebe-d027da87abbb/volumes" Feb 20 07:09:56 crc kubenswrapper[5094]: I0220 07:09:56.276651 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"f31fb44b-cd20-42d8-a384-0e4a800ea177","Type":"ContainerStarted","Data":"18ddd6e3ad547c4944cb5e98b7ae2e945f7c76f0d0a718546312f974a906096b"} Feb 20 07:09:56 crc kubenswrapper[5094]: I0220 07:09:56.276775 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f31fb44b-cd20-42d8-a384-0e4a800ea177","Type":"ContainerStarted","Data":"f71fec55c80bd4551277892c0bac5ca2613045a77053e7cbf6abbd8434bf844c"} Feb 20 07:09:56 crc kubenswrapper[5094]: I0220 07:09:56.281170 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c1c68ac6-9f96-4a39-b477-7ad74a04dff9","Type":"ContainerStarted","Data":"af3a839b78e1bdb84a4e5e4f15b94bc87880557a5061b18ba71bafe2fbc48a2f"} Feb 20 07:09:56 crc kubenswrapper[5094]: I0220 07:09:56.312547 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.312518921 podStartE2EDuration="2.312518921s" podCreationTimestamp="2026-02-20 07:09:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:09:56.305522144 +0000 UTC m=+1411.178148895" watchObservedRunningTime="2026-02-20 07:09:56.312518921 +0000 UTC m=+1411.185145632" Feb 20 07:09:56 crc kubenswrapper[5094]: I0220 07:09:56.339630 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.33960481 podStartE2EDuration="2.33960481s" podCreationTimestamp="2026-02-20 07:09:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:09:56.326493686 +0000 UTC m=+1411.199120407" watchObservedRunningTime="2026-02-20 07:09:56.33960481 +0000 UTC m=+1411.212231531" Feb 20 07:09:59 crc kubenswrapper[5094]: I0220 07:09:59.634043 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-cell1-conductor-0" Feb 20 07:09:59 crc kubenswrapper[5094]: I0220 07:09:59.709931 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 20 07:10:04 crc kubenswrapper[5094]: I0220 07:10:04.106928 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:10:04 crc kubenswrapper[5094]: I0220 07:10:04.107904 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:10:04 crc kubenswrapper[5094]: I0220 07:10:04.709397 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 20 07:10:04 crc kubenswrapper[5094]: I0220 07:10:04.730475 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 07:10:04 crc kubenswrapper[5094]: I0220 07:10:04.730533 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 07:10:04 crc kubenswrapper[5094]: I0220 07:10:04.755163 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 20 07:10:05 crc kubenswrapper[5094]: I0220 07:10:05.432895 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 20 07:10:05 crc kubenswrapper[5094]: I0220 07:10:05.814868 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f31fb44b-cd20-42d8-a384-0e4a800ea177" 
containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 07:10:05 crc kubenswrapper[5094]: I0220 07:10:05.814874 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f31fb44b-cd20-42d8-a384-0e4a800ea177" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 07:10:08 crc kubenswrapper[5094]: I0220 07:10:08.380792 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.492205 5094 generic.go:334] "Generic (PLEG): container finished" podID="043bc1d7-f57a-481d-b132-71ef45e85480" containerID="f58aa2d0199fdb5db167c5aef1bd053858fb706513548fbeaab67aa8e11ddcd5" exitCode=137 Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.492392 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"043bc1d7-f57a-481d-b132-71ef45e85480","Type":"ContainerDied","Data":"f58aa2d0199fdb5db167c5aef1bd053858fb706513548fbeaab67aa8e11ddcd5"} Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.492937 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"043bc1d7-f57a-481d-b132-71ef45e85480","Type":"ContainerDied","Data":"9e410d4605ab52b279cb99a108d57c885a2eae7e022b21ab722b6893f02390c6"} Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.492960 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e410d4605ab52b279cb99a108d57c885a2eae7e022b21ab722b6893f02390c6" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.496242 5094 generic.go:334] "Generic (PLEG): container finished" podID="c1c7c378-78bc-48bd-932c-fa19cf4e6284" 
containerID="3b84d0ad082d2f55f39f713582ba0b77187948e8fdabcb2633a95c7f3cd76484" exitCode=137 Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.496274 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c1c7c378-78bc-48bd-932c-fa19cf4e6284","Type":"ContainerDied","Data":"3b84d0ad082d2f55f39f713582ba0b77187948e8fdabcb2633a95c7f3cd76484"} Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.496296 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c1c7c378-78bc-48bd-932c-fa19cf4e6284","Type":"ContainerDied","Data":"6f8e319488735091a18c4719ad1bc421ed45fbf1898270c70449e004acaf42c0"} Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.496306 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f8e319488735091a18c4719ad1bc421ed45fbf1898270c70449e004acaf42c0" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.539000 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.548474 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.589989 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.590311 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="715094df-6704-4332-b990-95d790fd5ff1" containerName="kube-state-metrics" containerID="cri-o://ffe736a6fc24efeb2e9463249f14ea2fb642857b322f6c26497919b56fb7314a" gracePeriod=30 Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.734186 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/043bc1d7-f57a-481d-b132-71ef45e85480-combined-ca-bundle\") pod \"043bc1d7-f57a-481d-b132-71ef45e85480\" (UID: \"043bc1d7-f57a-481d-b132-71ef45e85480\") " Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.734318 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwglg\" (UniqueName: \"kubernetes.io/projected/043bc1d7-f57a-481d-b132-71ef45e85480-kube-api-access-zwglg\") pod \"043bc1d7-f57a-481d-b132-71ef45e85480\" (UID: \"043bc1d7-f57a-481d-b132-71ef45e85480\") " Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.734365 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/043bc1d7-f57a-481d-b132-71ef45e85480-config-data\") pod \"043bc1d7-f57a-481d-b132-71ef45e85480\" (UID: \"043bc1d7-f57a-481d-b132-71ef45e85480\") " Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.734426 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1c7c378-78bc-48bd-932c-fa19cf4e6284-combined-ca-bundle\") pod \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") 
" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.734501 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1c7c378-78bc-48bd-932c-fa19cf4e6284-logs\") pod \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") " Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.734555 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4nvf\" (UniqueName: \"kubernetes.io/projected/c1c7c378-78bc-48bd-932c-fa19cf4e6284-kube-api-access-x4nvf\") pod \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") " Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.734593 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1c7c378-78bc-48bd-932c-fa19cf4e6284-config-data\") pod \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\" (UID: \"c1c7c378-78bc-48bd-932c-fa19cf4e6284\") " Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.735245 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1c7c378-78bc-48bd-932c-fa19cf4e6284-logs" (OuterVolumeSpecName: "logs") pod "c1c7c378-78bc-48bd-932c-fa19cf4e6284" (UID: "c1c7c378-78bc-48bd-932c-fa19cf4e6284"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.746603 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1c7c378-78bc-48bd-932c-fa19cf4e6284-kube-api-access-x4nvf" (OuterVolumeSpecName: "kube-api-access-x4nvf") pod "c1c7c378-78bc-48bd-932c-fa19cf4e6284" (UID: "c1c7c378-78bc-48bd-932c-fa19cf4e6284"). InnerVolumeSpecName "kube-api-access-x4nvf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.747338 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/043bc1d7-f57a-481d-b132-71ef45e85480-kube-api-access-zwglg" (OuterVolumeSpecName: "kube-api-access-zwglg") pod "043bc1d7-f57a-481d-b132-71ef45e85480" (UID: "043bc1d7-f57a-481d-b132-71ef45e85480"). InnerVolumeSpecName "kube-api-access-zwglg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.770674 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/043bc1d7-f57a-481d-b132-71ef45e85480-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "043bc1d7-f57a-481d-b132-71ef45e85480" (UID: "043bc1d7-f57a-481d-b132-71ef45e85480"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.776385 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/043bc1d7-f57a-481d-b132-71ef45e85480-config-data" (OuterVolumeSpecName: "config-data") pod "043bc1d7-f57a-481d-b132-71ef45e85480" (UID: "043bc1d7-f57a-481d-b132-71ef45e85480"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.783464 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1c7c378-78bc-48bd-932c-fa19cf4e6284-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1c7c378-78bc-48bd-932c-fa19cf4e6284" (UID: "c1c7c378-78bc-48bd-932c-fa19cf4e6284"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.784551 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1c7c378-78bc-48bd-932c-fa19cf4e6284-config-data" (OuterVolumeSpecName: "config-data") pod "c1c7c378-78bc-48bd-932c-fa19cf4e6284" (UID: "c1c7c378-78bc-48bd-932c-fa19cf4e6284"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.837840 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/043bc1d7-f57a-481d-b132-71ef45e85480-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.838037 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwglg\" (UniqueName: \"kubernetes.io/projected/043bc1d7-f57a-481d-b132-71ef45e85480-kube-api-access-zwglg\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.838127 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/043bc1d7-f57a-481d-b132-71ef45e85480-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.838187 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1c7c378-78bc-48bd-932c-fa19cf4e6284-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.838262 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1c7c378-78bc-48bd-932c-fa19cf4e6284-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.838334 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4nvf\" (UniqueName: 
\"kubernetes.io/projected/c1c7c378-78bc-48bd-932c-fa19cf4e6284-kube-api-access-x4nvf\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.838398 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1c7c378-78bc-48bd-932c-fa19cf4e6284-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:12 crc kubenswrapper[5094]: I0220 07:10:12.946138 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.041900 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7k8s\" (UniqueName: \"kubernetes.io/projected/715094df-6704-4332-b990-95d790fd5ff1-kube-api-access-b7k8s\") pod \"715094df-6704-4332-b990-95d790fd5ff1\" (UID: \"715094df-6704-4332-b990-95d790fd5ff1\") " Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.046313 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/715094df-6704-4332-b990-95d790fd5ff1-kube-api-access-b7k8s" (OuterVolumeSpecName: "kube-api-access-b7k8s") pod "715094df-6704-4332-b990-95d790fd5ff1" (UID: "715094df-6704-4332-b990-95d790fd5ff1"). InnerVolumeSpecName "kube-api-access-b7k8s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.144932 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7k8s\" (UniqueName: \"kubernetes.io/projected/715094df-6704-4332-b990-95d790fd5ff1-kube-api-access-b7k8s\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.512056 5094 generic.go:334] "Generic (PLEG): container finished" podID="715094df-6704-4332-b990-95d790fd5ff1" containerID="ffe736a6fc24efeb2e9463249f14ea2fb642857b322f6c26497919b56fb7314a" exitCode=2 Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.512181 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.512227 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"715094df-6704-4332-b990-95d790fd5ff1","Type":"ContainerDied","Data":"ffe736a6fc24efeb2e9463249f14ea2fb642857b322f6c26497919b56fb7314a"} Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.512274 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"715094df-6704-4332-b990-95d790fd5ff1","Type":"ContainerDied","Data":"3204759b1dff0695d3d73b3d9357f50ef02ad54eb0c3291abc0e31ecef319da3"} Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.512307 5094 scope.go:117] "RemoveContainer" containerID="ffe736a6fc24efeb2e9463249f14ea2fb642857b322f6c26497919b56fb7314a" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.512571 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.512206 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.564893 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.578207 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.588540 5094 scope.go:117] "RemoveContainer" containerID="ffe736a6fc24efeb2e9463249f14ea2fb642857b322f6c26497919b56fb7314a" Feb 20 07:10:13 crc kubenswrapper[5094]: E0220 07:10:13.589581 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffe736a6fc24efeb2e9463249f14ea2fb642857b322f6c26497919b56fb7314a\": container with ID starting with ffe736a6fc24efeb2e9463249f14ea2fb642857b322f6c26497919b56fb7314a not found: ID does not exist" containerID="ffe736a6fc24efeb2e9463249f14ea2fb642857b322f6c26497919b56fb7314a" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.589672 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffe736a6fc24efeb2e9463249f14ea2fb642857b322f6c26497919b56fb7314a"} err="failed to get container status \"ffe736a6fc24efeb2e9463249f14ea2fb642857b322f6c26497919b56fb7314a\": rpc error: code = NotFound desc = could not find container \"ffe736a6fc24efeb2e9463249f14ea2fb642857b322f6c26497919b56fb7314a\": container with ID starting with ffe736a6fc24efeb2e9463249f14ea2fb642857b322f6c26497919b56fb7314a not found: ID does not exist" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.616422 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.630306 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.659253 5094 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 07:10:13 crc kubenswrapper[5094]: E0220 07:10:13.668879 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1c7c378-78bc-48bd-932c-fa19cf4e6284" containerName="nova-metadata-log" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.668944 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1c7c378-78bc-48bd-932c-fa19cf4e6284" containerName="nova-metadata-log" Feb 20 07:10:13 crc kubenswrapper[5094]: E0220 07:10:13.669036 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715094df-6704-4332-b990-95d790fd5ff1" containerName="kube-state-metrics" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.669049 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="715094df-6704-4332-b990-95d790fd5ff1" containerName="kube-state-metrics" Feb 20 07:10:13 crc kubenswrapper[5094]: E0220 07:10:13.669073 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1c7c378-78bc-48bd-932c-fa19cf4e6284" containerName="nova-metadata-metadata" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.669087 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1c7c378-78bc-48bd-932c-fa19cf4e6284" containerName="nova-metadata-metadata" Feb 20 07:10:13 crc kubenswrapper[5094]: E0220 07:10:13.669115 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="043bc1d7-f57a-481d-b132-71ef45e85480" containerName="nova-cell1-novncproxy-novncproxy" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.669127 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="043bc1d7-f57a-481d-b132-71ef45e85480" containerName="nova-cell1-novncproxy-novncproxy" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.692664 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1c7c378-78bc-48bd-932c-fa19cf4e6284" containerName="nova-metadata-metadata" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 
07:10:13.692760 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="043bc1d7-f57a-481d-b132-71ef45e85480" containerName="nova-cell1-novncproxy-novncproxy" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.692794 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="715094df-6704-4332-b990-95d790fd5ff1" containerName="kube-state-metrics" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.692834 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1c7c378-78bc-48bd-932c-fa19cf4e6284" containerName="nova-metadata-log" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.694262 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.698395 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.702166 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.701672 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.723374 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.739253 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.744864 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.748635 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.748975 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.765341 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.776483 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.786656 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.794542 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.796307 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.802375 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.803137 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.803681 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.852307 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="043bc1d7-f57a-481d-b132-71ef45e85480" path="/var/lib/kubelet/pods/043bc1d7-f57a-481d-b132-71ef45e85480/volumes" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.853002 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="715094df-6704-4332-b990-95d790fd5ff1" path="/var/lib/kubelet/pods/715094df-6704-4332-b990-95d790fd5ff1/volumes" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.853740 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1c7c378-78bc-48bd-932c-fa19cf4e6284" path="/var/lib/kubelet/pods/c1c7c378-78bc-48bd-932c-fa19cf4e6284/volumes" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.890966 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " pod="openstack/nova-metadata-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.891051 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-config-data\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.891085 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " pod="openstack/nova-metadata-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.891129 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.891185 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgt5d\" (UniqueName: \"kubernetes.io/projected/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-kube-api-access-cgt5d\") pod \"nova-metadata-0\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " pod="openstack/nova-metadata-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.891213 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44cs2\" (UniqueName: \"kubernetes.io/projected/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-kube-api-access-44cs2\") pod \"nova-cell1-novncproxy-0\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.891284 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-config-data\") 
pod \"nova-metadata-0\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " pod="openstack/nova-metadata-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.891379 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.891541 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.891797 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-logs\") pod \"nova-metadata-0\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " pod="openstack/nova-metadata-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.994502 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1fe9db54-4204-4335-a272-c469e0923478\") " pod="openstack/kube-state-metrics-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.994591 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.994696 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.994905 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-logs\") pod \"nova-metadata-0\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " pod="openstack/nova-metadata-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.995020 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " pod="openstack/nova-metadata-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.995054 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.995099 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " pod="openstack/nova-metadata-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.995155 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1fe9db54-4204-4335-a272-c469e0923478\") " pod="openstack/kube-state-metrics-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.995199 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1fe9db54-4204-4335-a272-c469e0923478\") " pod="openstack/kube-state-metrics-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.995241 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.995316 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44cs2\" (UniqueName: \"kubernetes.io/projected/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-kube-api-access-44cs2\") pod \"nova-cell1-novncproxy-0\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.995354 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgt5d\" (UniqueName: \"kubernetes.io/projected/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-kube-api-access-cgt5d\") pod \"nova-metadata-0\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " pod="openstack/nova-metadata-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.995435 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjgxz\" (UniqueName: \"kubernetes.io/projected/1fe9db54-4204-4335-a272-c469e0923478-kube-api-access-gjgxz\") pod \"kube-state-metrics-0\" (UID: \"1fe9db54-4204-4335-a272-c469e0923478\") " pod="openstack/kube-state-metrics-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.996129 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-logs\") pod \"nova-metadata-0\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " pod="openstack/nova-metadata-0" Feb 20 07:10:13 crc kubenswrapper[5094]: I0220 07:10:13.995486 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-config-data\") pod \"nova-metadata-0\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " pod="openstack/nova-metadata-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.002864 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.003575 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.004574 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " pod="openstack/nova-metadata-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.004837 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " pod="openstack/nova-metadata-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.005847 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-config-data\") pod \"nova-metadata-0\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " pod="openstack/nova-metadata-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.017133 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.025546 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44cs2\" (UniqueName: \"kubernetes.io/projected/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-kube-api-access-44cs2\") pod \"nova-cell1-novncproxy-0\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.025987 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.033290 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgt5d\" (UniqueName: \"kubernetes.io/projected/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-kube-api-access-cgt5d\") pod \"nova-metadata-0\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " pod="openstack/nova-metadata-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.040800 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.070656 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.099917 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjgxz\" (UniqueName: \"kubernetes.io/projected/1fe9db54-4204-4335-a272-c469e0923478-kube-api-access-gjgxz\") pod \"kube-state-metrics-0\" (UID: \"1fe9db54-4204-4335-a272-c469e0923478\") " pod="openstack/kube-state-metrics-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.099978 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1fe9db54-4204-4335-a272-c469e0923478\") " pod="openstack/kube-state-metrics-0" Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.100105 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1fe9db54-4204-4335-a272-c469e0923478\") " pod="openstack/kube-state-metrics-0" Feb 20 
07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.100135 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1fe9db54-4204-4335-a272-c469e0923478\") " pod="openstack/kube-state-metrics-0"
Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.106395 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"1fe9db54-4204-4335-a272-c469e0923478\") " pod="openstack/kube-state-metrics-0"
Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.106444 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"1fe9db54-4204-4335-a272-c469e0923478\") " pod="openstack/kube-state-metrics-0"
Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.109377 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"1fe9db54-4204-4335-a272-c469e0923478\") " pod="openstack/kube-state-metrics-0"
Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.121374 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjgxz\" (UniqueName: \"kubernetes.io/projected/1fe9db54-4204-4335-a272-c469e0923478-kube-api-access-gjgxz\") pod \"kube-state-metrics-0\" (UID: \"1fe9db54-4204-4335-a272-c469e0923478\") " pod="openstack/kube-state-metrics-0"
Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.414268 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.617543 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.692544 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.735337 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.735927 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.739476 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.747172 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.798754 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.799431 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerName="ceilometer-central-agent" containerID="cri-o://897a4151c566a72e1d0e9378be05cebb168cc60b9817042c8a3829b3600f9baa" gracePeriod=30
Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.799509 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerName="proxy-httpd" containerID="cri-o://07ea59e61fed8a3358551accb40261cf86e2779fdb107f968e111962734d98f4" gracePeriod=30
Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.799606 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerName="sg-core" containerID="cri-o://40d3a0b02e5c779c99ffae4aa91b14ff147cdd2679151488afdea67f1c016caf" gracePeriod=30
Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.799553 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerName="ceilometer-notification-agent" containerID="cri-o://d18a7448181106e07e9f83589f66798d53c158f7b49b4cb2f63dd87bb3ea37e5" gracePeriod=30
Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.910865 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 20 07:10:14 crc kubenswrapper[5094]: I0220 07:10:14.933808 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.548621 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0eaf3b2-613b-41c2-9eac-ce8093ccec66","Type":"ContainerStarted","Data":"c7a53317b54066a403baba090862c5995bba55506254ad03280d9c1ace19b11d"}
Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.549147 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0eaf3b2-613b-41c2-9eac-ce8093ccec66","Type":"ContainerStarted","Data":"ae97ccd59962f48b4e76117ad4ab2034a3bfad9a1ff6a242afd02e6a32cafe32"}
Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.549161 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0eaf3b2-613b-41c2-9eac-ce8093ccec66","Type":"ContainerStarted","Data":"0d734b120ad598d02afd1f0952c1bd5ccf1f9badb30cba5e61b1f4e9fc055b42"}
Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.555153 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3db6d35c-dfd1-4a59-95d3-cc8a99151c12","Type":"ContainerStarted","Data":"2bdc91fbdf4016ef04e3a2baae7cf5e5ae714534246ea953292e2628283eecdf"}
Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.555218 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3db6d35c-dfd1-4a59-95d3-cc8a99151c12","Type":"ContainerStarted","Data":"d1d010e8fd8bc9707a8121e22dcd018ce86f613e9bbcc45a6bd9ab2c3e354582"}
Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.559844 5094 generic.go:334] "Generic (PLEG): container finished" podID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerID="07ea59e61fed8a3358551accb40261cf86e2779fdb107f968e111962734d98f4" exitCode=0
Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.559879 5094 generic.go:334] "Generic (PLEG): container finished" podID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerID="40d3a0b02e5c779c99ffae4aa91b14ff147cdd2679151488afdea67f1c016caf" exitCode=2
Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.559887 5094 generic.go:334] "Generic (PLEG): container finished" podID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerID="897a4151c566a72e1d0e9378be05cebb168cc60b9817042c8a3829b3600f9baa" exitCode=0
Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.559934 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8","Type":"ContainerDied","Data":"07ea59e61fed8a3358551accb40261cf86e2779fdb107f968e111962734d98f4"}
Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.559969 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8","Type":"ContainerDied","Data":"40d3a0b02e5c779c99ffae4aa91b14ff147cdd2679151488afdea67f1c016caf"}
Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.559981 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8","Type":"ContainerDied","Data":"897a4151c566a72e1d0e9378be05cebb168cc60b9817042c8a3829b3600f9baa"}
Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.566130 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1fe9db54-4204-4335-a272-c469e0923478","Type":"ContainerStarted","Data":"8a0b13fdbdedc5064e8f68c82ce215006ed4f58e7530fd19fcca453a9915c200"}
Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.567301 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.582271 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.601955 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.6019214059999998 podStartE2EDuration="2.601921406s" podCreationTimestamp="2026-02-20 07:10:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:10:15.572941583 +0000 UTC m=+1430.445568294" watchObservedRunningTime="2026-02-20 07:10:15.601921406 +0000 UTC m=+1430.474548127"
Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.614495 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.614470367 podStartE2EDuration="2.614470367s" podCreationTimestamp="2026-02-20 07:10:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:10:15.590735478 +0000 UTC m=+1430.463362209" watchObservedRunningTime="2026-02-20 07:10:15.614470367 +0000 UTC m=+1430.487097088"
Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.756676 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7677694455-llk7m"]
Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.760788 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7677694455-llk7m"
Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.814790 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7677694455-llk7m"]
Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.944101 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-ovsdbserver-nb\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m"
Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.944203 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-dns-svc\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m"
Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.944257 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djp42\" (UniqueName: \"kubernetes.io/projected/37533bd5-22b5-4b59-8672-35eaa19b9295-kube-api-access-djp42\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m"
Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.944335 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-dns-swift-storage-0\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m"
Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.944377 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-ovsdbserver-sb\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m"
Feb 20 07:10:15 crc kubenswrapper[5094]: I0220 07:10:15.944418 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-config\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m"
Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.047006 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-dns-swift-storage-0\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m"
Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.047095 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-ovsdbserver-sb\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m"
Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.047148 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-config\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m"
Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.047191 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-ovsdbserver-nb\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m"
Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.047243 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-dns-svc\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m"
Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.047270 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djp42\" (UniqueName: \"kubernetes.io/projected/37533bd5-22b5-4b59-8672-35eaa19b9295-kube-api-access-djp42\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m"
Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.048193 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-dns-swift-storage-0\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m"
Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.048240 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-config\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m"
Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.048629 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-dns-svc\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m"
Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.048628 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-ovsdbserver-sb\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m"
Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.048861 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-ovsdbserver-nb\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m"
Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.071486 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djp42\" (UniqueName: \"kubernetes.io/projected/37533bd5-22b5-4b59-8672-35eaa19b9295-kube-api-access-djp42\") pod \"dnsmasq-dns-7677694455-llk7m\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") " pod="openstack/dnsmasq-dns-7677694455-llk7m"
Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.106618 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7677694455-llk7m"
Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.591424 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1fe9db54-4204-4335-a272-c469e0923478","Type":"ContainerStarted","Data":"0fcfeb4fdfabbbcb036fd6cf631f390dac9fcd6e68173a7af3e76810e7a92db9"}
Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.593354 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.596720 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7677694455-llk7m"]
Feb 20 07:10:16 crc kubenswrapper[5094]: I0220 07:10:16.611306 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.228997945 podStartE2EDuration="3.61127687s" podCreationTimestamp="2026-02-20 07:10:13 +0000 UTC" firstStartedPulling="2026-02-20 07:10:14.93355958 +0000 UTC m=+1429.806186281" lastFinishedPulling="2026-02-20 07:10:15.315838495 +0000 UTC m=+1430.188465206" observedRunningTime="2026-02-20 07:10:16.608521694 +0000 UTC m=+1431.481148405" watchObservedRunningTime="2026-02-20 07:10:16.61127687 +0000 UTC m=+1431.483903581"
Feb 20 07:10:17 crc kubenswrapper[5094]: I0220 07:10:17.601800 5094 generic.go:334] "Generic (PLEG): container finished" podID="37533bd5-22b5-4b59-8672-35eaa19b9295" containerID="a4c4f92b36bb2b7d701dbf8c3f7817a427ce69bddf6bb34e82a6884705e2608c" exitCode=0
Feb 20 07:10:17 crc kubenswrapper[5094]: I0220 07:10:17.601913 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7677694455-llk7m" event={"ID":"37533bd5-22b5-4b59-8672-35eaa19b9295","Type":"ContainerDied","Data":"a4c4f92b36bb2b7d701dbf8c3f7817a427ce69bddf6bb34e82a6884705e2608c"}
Feb 20 07:10:17 crc kubenswrapper[5094]: I0220 07:10:17.602179 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7677694455-llk7m" event={"ID":"37533bd5-22b5-4b59-8672-35eaa19b9295","Type":"ContainerStarted","Data":"11d132d9afa27137f856e4e3ac63fa1a46eebfeb7ef403dea2957ddcdaf2acba"}
Feb 20 07:10:18 crc kubenswrapper[5094]: I0220 07:10:18.483276 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 20 07:10:18 crc kubenswrapper[5094]: I0220 07:10:18.613847 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7677694455-llk7m" event={"ID":"37533bd5-22b5-4b59-8672-35eaa19b9295","Type":"ContainerStarted","Data":"fabd932b0a47a0bf51f49258d5ff3d5003b1b88996a45ebe155061bab41c04ff"}
Feb 20 07:10:18 crc kubenswrapper[5094]: I0220 07:10:18.614039 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f31fb44b-cd20-42d8-a384-0e4a800ea177" containerName="nova-api-log" containerID="cri-o://f71fec55c80bd4551277892c0bac5ca2613045a77053e7cbf6abbd8434bf844c" gracePeriod=30
Feb 20 07:10:18 crc kubenswrapper[5094]: I0220 07:10:18.614102 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f31fb44b-cd20-42d8-a384-0e4a800ea177" containerName="nova-api-api" containerID="cri-o://18ddd6e3ad547c4944cb5e98b7ae2e945f7c76f0d0a718546312f974a906096b" gracePeriod=30
Feb 20 07:10:18 crc kubenswrapper[5094]: I0220 07:10:18.661893 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7677694455-llk7m" podStartSLOduration=3.66186123 podStartE2EDuration="3.66186123s" podCreationTimestamp="2026-02-20 07:10:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:10:18.651485661 +0000 UTC m=+1433.524112372" watchObservedRunningTime="2026-02-20 07:10:18.66186123 +0000 UTC m=+1433.534487971"
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.047061 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.070723 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.070772 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.207857 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.332693 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-combined-ca-bundle\") pod \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") "
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.332887 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-config-data\") pod \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") "
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.333834 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-scripts\") pod \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") "
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.333955 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-sg-core-conf-yaml\") pod \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") "
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.334044 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npk6v\" (UniqueName: \"kubernetes.io/projected/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-kube-api-access-npk6v\") pod \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") "
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.334116 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-log-httpd\") pod \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") "
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.334168 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-run-httpd\") pod \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\" (UID: \"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8\") "
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.334495 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" (UID: "941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.334833 5094 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.339857 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" (UID: "941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.347781 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-scripts" (OuterVolumeSpecName: "scripts") pod "941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" (UID: "941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.348302 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-kube-api-access-npk6v" (OuterVolumeSpecName: "kube-api-access-npk6v") pod "941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" (UID: "941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8"). InnerVolumeSpecName "kube-api-access-npk6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.382526 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" (UID: "941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.436862 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npk6v\" (UniqueName: \"kubernetes.io/projected/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-kube-api-access-npk6v\") on node \"crc\" DevicePath \"\""
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.437122 5094 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.437234 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.437294 5094 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.471919 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" (UID: "941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.481204 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-config-data" (OuterVolumeSpecName: "config-data") pod "941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" (UID: "941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.539628 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.539903 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.632311 5094 generic.go:334] "Generic (PLEG): container finished" podID="f31fb44b-cd20-42d8-a384-0e4a800ea177" containerID="f71fec55c80bd4551277892c0bac5ca2613045a77053e7cbf6abbd8434bf844c" exitCode=143
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.632409 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f31fb44b-cd20-42d8-a384-0e4a800ea177","Type":"ContainerDied","Data":"f71fec55c80bd4551277892c0bac5ca2613045a77053e7cbf6abbd8434bf844c"}
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.636058 5094 generic.go:334] "Generic (PLEG): container finished" podID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerID="d18a7448181106e07e9f83589f66798d53c158f7b49b4cb2f63dd87bb3ea37e5" exitCode=0
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.636140 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.636184 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8","Type":"ContainerDied","Data":"d18a7448181106e07e9f83589f66798d53c158f7b49b4cb2f63dd87bb3ea37e5"}
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.636232 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8","Type":"ContainerDied","Data":"33dad10331613824fe73aa48d99c70dc5318fba2b2e24e20fa4abae1bee74f21"}
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.636256 5094 scope.go:117] "RemoveContainer" containerID="07ea59e61fed8a3358551accb40261cf86e2779fdb107f968e111962734d98f4"
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.636854 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7677694455-llk7m"
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.663233 5094 scope.go:117] "RemoveContainer" containerID="40d3a0b02e5c779c99ffae4aa91b14ff147cdd2679151488afdea67f1c016caf"
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.677182 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.691036 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.706819 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.707660 5094 scope.go:117] "RemoveContainer" containerID="d18a7448181106e07e9f83589f66798d53c158f7b49b4cb2f63dd87bb3ea37e5"
Feb 20 07:10:19 crc kubenswrapper[5094]: E0220 07:10:19.709783 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerName="ceilometer-notification-agent"
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.709804 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerName="ceilometer-notification-agent"
Feb 20 07:10:19 crc kubenswrapper[5094]: E0220 07:10:19.709823 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerName="sg-core"
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.709829 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerName="sg-core"
Feb 20 07:10:19 crc kubenswrapper[5094]: E0220 07:10:19.709860 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerName="proxy-httpd"
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.709866 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerName="proxy-httpd"
Feb 20 07:10:19 crc kubenswrapper[5094]: E0220 07:10:19.710172 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerName="ceilometer-central-agent"
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.710184 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerName="ceilometer-central-agent"
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.710494 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerName="proxy-httpd"
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.710525 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerName="ceilometer-central-agent"
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.710532 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerName="sg-core"
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.710547 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" containerName="ceilometer-notification-agent"
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.712461 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.715778 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.715974 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.716003 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.728892 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.804508 5094 scope.go:117] "RemoveContainer" containerID="897a4151c566a72e1d0e9378be05cebb168cc60b9817042c8a3829b3600f9baa"
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.835403 5094 scope.go:117] "RemoveContainer" containerID="07ea59e61fed8a3358551accb40261cf86e2779fdb107f968e111962734d98f4"
Feb 20 07:10:19 crc kubenswrapper[5094]: E0220 07:10:19.836201 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07ea59e61fed8a3358551accb40261cf86e2779fdb107f968e111962734d98f4\": container with ID starting with 07ea59e61fed8a3358551accb40261cf86e2779fdb107f968e111962734d98f4 not found: ID does not exist" containerID="07ea59e61fed8a3358551accb40261cf86e2779fdb107f968e111962734d98f4"
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.836257 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07ea59e61fed8a3358551accb40261cf86e2779fdb107f968e111962734d98f4"} err="failed to get container status \"07ea59e61fed8a3358551accb40261cf86e2779fdb107f968e111962734d98f4\": rpc error: code = NotFound desc = could not find container \"07ea59e61fed8a3358551accb40261cf86e2779fdb107f968e111962734d98f4\": container with ID starting with 07ea59e61fed8a3358551accb40261cf86e2779fdb107f968e111962734d98f4 not found: ID does not exist"
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.836294 5094 scope.go:117] "RemoveContainer" containerID="40d3a0b02e5c779c99ffae4aa91b14ff147cdd2679151488afdea67f1c016caf"
Feb 20 07:10:19 crc kubenswrapper[5094]: E0220 07:10:19.836894 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40d3a0b02e5c779c99ffae4aa91b14ff147cdd2679151488afdea67f1c016caf\": container with ID starting with 40d3a0b02e5c779c99ffae4aa91b14ff147cdd2679151488afdea67f1c016caf not found: ID does not exist" containerID="40d3a0b02e5c779c99ffae4aa91b14ff147cdd2679151488afdea67f1c016caf"
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.836914 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40d3a0b02e5c779c99ffae4aa91b14ff147cdd2679151488afdea67f1c016caf"} err="failed to get container status \"40d3a0b02e5c779c99ffae4aa91b14ff147cdd2679151488afdea67f1c016caf\": rpc error: code = NotFound desc = could not find container \"40d3a0b02e5c779c99ffae4aa91b14ff147cdd2679151488afdea67f1c016caf\": container with ID starting with 40d3a0b02e5c779c99ffae4aa91b14ff147cdd2679151488afdea67f1c016caf not found: ID does not exist"
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.836929 5094 scope.go:117] "RemoveContainer" containerID="d18a7448181106e07e9f83589f66798d53c158f7b49b4cb2f63dd87bb3ea37e5"
Feb 20 07:10:19 crc kubenswrapper[5094]: E0220 07:10:19.837463 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d18a7448181106e07e9f83589f66798d53c158f7b49b4cb2f63dd87bb3ea37e5\": container with ID starting with d18a7448181106e07e9f83589f66798d53c158f7b49b4cb2f63dd87bb3ea37e5 not found: ID does not exist" containerID="d18a7448181106e07e9f83589f66798d53c158f7b49b4cb2f63dd87bb3ea37e5"
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.837542 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d18a7448181106e07e9f83589f66798d53c158f7b49b4cb2f63dd87bb3ea37e5"} err="failed to get container status \"d18a7448181106e07e9f83589f66798d53c158f7b49b4cb2f63dd87bb3ea37e5\": rpc error: code = NotFound desc = could not find container \"d18a7448181106e07e9f83589f66798d53c158f7b49b4cb2f63dd87bb3ea37e5\": container with ID starting with d18a7448181106e07e9f83589f66798d53c158f7b49b4cb2f63dd87bb3ea37e5 not found: ID does not exist"
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.837620 5094 scope.go:117] "RemoveContainer" containerID="897a4151c566a72e1d0e9378be05cebb168cc60b9817042c8a3829b3600f9baa"
Feb 20 07:10:19 crc kubenswrapper[5094]: E0220 07:10:19.838095 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"897a4151c566a72e1d0e9378be05cebb168cc60b9817042c8a3829b3600f9baa\": container with ID starting with 897a4151c566a72e1d0e9378be05cebb168cc60b9817042c8a3829b3600f9baa not found: ID does not exist" containerID="897a4151c566a72e1d0e9378be05cebb168cc60b9817042c8a3829b3600f9baa"
Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.838132 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"897a4151c566a72e1d0e9378be05cebb168cc60b9817042c8a3829b3600f9baa"} err="failed to get container status \"897a4151c566a72e1d0e9378be05cebb168cc60b9817042c8a3829b3600f9baa\": rpc
error: code = NotFound desc = could not find container \"897a4151c566a72e1d0e9378be05cebb168cc60b9817042c8a3829b3600f9baa\": container with ID starting with 897a4151c566a72e1d0e9378be05cebb168cc60b9817042c8a3829b3600f9baa not found: ID does not exist" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.845882 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-scripts\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.846111 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.846226 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-config-data\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.846278 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51465a58-4a65-4b58-b7fa-1180b1245e8a-log-httpd\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.846305 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51465a58-4a65-4b58-b7fa-1180b1245e8a-run-httpd\") pod 
\"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.846558 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.846785 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.846838 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmfjt\" (UniqueName: \"kubernetes.io/projected/51465a58-4a65-4b58-b7fa-1180b1245e8a-kube-api-access-fmfjt\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.855760 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8" path="/var/lib/kubelet/pods/941dc2db-aaa7-4d80-b2ed-1af8c40d0bd8/volumes" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.949359 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.949416 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-fmfjt\" (UniqueName: \"kubernetes.io/projected/51465a58-4a65-4b58-b7fa-1180b1245e8a-kube-api-access-fmfjt\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.949454 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-scripts\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.949586 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.949728 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-config-data\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.949768 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51465a58-4a65-4b58-b7fa-1180b1245e8a-log-httpd\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.949810 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51465a58-4a65-4b58-b7fa-1180b1245e8a-run-httpd\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 
crc kubenswrapper[5094]: I0220 07:10:19.949907 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.951865 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51465a58-4a65-4b58-b7fa-1180b1245e8a-run-httpd\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.952335 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51465a58-4a65-4b58-b7fa-1180b1245e8a-log-httpd\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.960042 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.960336 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.960406 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-ceilometer-tls-certs\") pod 
\"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.960496 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-scripts\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.960654 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-config-data\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:19 crc kubenswrapper[5094]: I0220 07:10:19.971761 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmfjt\" (UniqueName: \"kubernetes.io/projected/51465a58-4a65-4b58-b7fa-1180b1245e8a-kube-api-access-fmfjt\") pod \"ceilometer-0\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") " pod="openstack/ceilometer-0" Feb 20 07:10:20 crc kubenswrapper[5094]: I0220 07:10:20.105047 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:10:20 crc kubenswrapper[5094]: W0220 07:10:20.643895 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51465a58_4a65_4b58_b7fa_1180b1245e8a.slice/crio-5e96006a15cd910a0a78cfae59f1575755cc3a3f08785750cbeacf83d1bc1674 WatchSource:0}: Error finding container 5e96006a15cd910a0a78cfae59f1575755cc3a3f08785750cbeacf83d1bc1674: Status 404 returned error can't find the container with id 5e96006a15cd910a0a78cfae59f1575755cc3a3f08785750cbeacf83d1bc1674 Feb 20 07:10:20 crc kubenswrapper[5094]: I0220 07:10:20.661424 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:10:21 crc kubenswrapper[5094]: I0220 07:10:21.187639 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:10:21 crc kubenswrapper[5094]: I0220 07:10:21.666747 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51465a58-4a65-4b58-b7fa-1180b1245e8a","Type":"ContainerStarted","Data":"01ec1a5b35feb84d8641c01766f84e98d6cb85128044a4d510ffc163fa29f15c"} Feb 20 07:10:21 crc kubenswrapper[5094]: I0220 07:10:21.666866 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51465a58-4a65-4b58-b7fa-1180b1245e8a","Type":"ContainerStarted","Data":"5e96006a15cd910a0a78cfae59f1575755cc3a3f08785750cbeacf83d1bc1674"} Feb 20 07:10:21 crc kubenswrapper[5094]: E0220 07:10:21.996261 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf31fb44b_cd20_42d8_a384_0e4a800ea177.slice/crio-18ddd6e3ad547c4944cb5e98b7ae2e945f7c76f0d0a718546312f974a906096b.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf31fb44b_cd20_42d8_a384_0e4a800ea177.slice/crio-conmon-18ddd6e3ad547c4944cb5e98b7ae2e945f7c76f0d0a718546312f974a906096b.scope\": RecentStats: unable to find data in memory cache]" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.327655 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.516136 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f31fb44b-cd20-42d8-a384-0e4a800ea177-logs\") pod \"f31fb44b-cd20-42d8-a384-0e4a800ea177\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.516813 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f31fb44b-cd20-42d8-a384-0e4a800ea177-config-data\") pod \"f31fb44b-cd20-42d8-a384-0e4a800ea177\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.517539 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f31fb44b-cd20-42d8-a384-0e4a800ea177-logs" (OuterVolumeSpecName: "logs") pod "f31fb44b-cd20-42d8-a384-0e4a800ea177" (UID: "f31fb44b-cd20-42d8-a384-0e4a800ea177"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.518418 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh6fn\" (UniqueName: \"kubernetes.io/projected/f31fb44b-cd20-42d8-a384-0e4a800ea177-kube-api-access-qh6fn\") pod \"f31fb44b-cd20-42d8-a384-0e4a800ea177\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.518473 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f31fb44b-cd20-42d8-a384-0e4a800ea177-combined-ca-bundle\") pod \"f31fb44b-cd20-42d8-a384-0e4a800ea177\" (UID: \"f31fb44b-cd20-42d8-a384-0e4a800ea177\") " Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.519028 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f31fb44b-cd20-42d8-a384-0e4a800ea177-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.532216 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f31fb44b-cd20-42d8-a384-0e4a800ea177-kube-api-access-qh6fn" (OuterVolumeSpecName: "kube-api-access-qh6fn") pod "f31fb44b-cd20-42d8-a384-0e4a800ea177" (UID: "f31fb44b-cd20-42d8-a384-0e4a800ea177"). InnerVolumeSpecName "kube-api-access-qh6fn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.558574 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f31fb44b-cd20-42d8-a384-0e4a800ea177-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f31fb44b-cd20-42d8-a384-0e4a800ea177" (UID: "f31fb44b-cd20-42d8-a384-0e4a800ea177"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.561835 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f31fb44b-cd20-42d8-a384-0e4a800ea177-config-data" (OuterVolumeSpecName: "config-data") pod "f31fb44b-cd20-42d8-a384-0e4a800ea177" (UID: "f31fb44b-cd20-42d8-a384-0e4a800ea177"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.620540 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f31fb44b-cd20-42d8-a384-0e4a800ea177-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.620582 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh6fn\" (UniqueName: \"kubernetes.io/projected/f31fb44b-cd20-42d8-a384-0e4a800ea177-kube-api-access-qh6fn\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.620592 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f31fb44b-cd20-42d8-a384-0e4a800ea177-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.678814 5094 generic.go:334] "Generic (PLEG): container finished" podID="f31fb44b-cd20-42d8-a384-0e4a800ea177" containerID="18ddd6e3ad547c4944cb5e98b7ae2e945f7c76f0d0a718546312f974a906096b" exitCode=0 Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.678876 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f31fb44b-cd20-42d8-a384-0e4a800ea177","Type":"ContainerDied","Data":"18ddd6e3ad547c4944cb5e98b7ae2e945f7c76f0d0a718546312f974a906096b"} Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.678914 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.679859 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f31fb44b-cd20-42d8-a384-0e4a800ea177","Type":"ContainerDied","Data":"7169a135be999d7785f50c1ff2763f918e250f0425691f0c798f5fb9269603bf"} Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.679893 5094 scope.go:117] "RemoveContainer" containerID="18ddd6e3ad547c4944cb5e98b7ae2e945f7c76f0d0a718546312f974a906096b" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.695898 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51465a58-4a65-4b58-b7fa-1180b1245e8a","Type":"ContainerStarted","Data":"ba0a7d74c30a9d41a3210803bed113e6c1055b5c997ed973ff04e9fba9735eb0"} Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.725603 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.736163 5094 scope.go:117] "RemoveContainer" containerID="f71fec55c80bd4551277892c0bac5ca2613045a77053e7cbf6abbd8434bf844c" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.744206 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.761012 5094 scope.go:117] "RemoveContainer" containerID="18ddd6e3ad547c4944cb5e98b7ae2e945f7c76f0d0a718546312f974a906096b" Feb 20 07:10:22 crc kubenswrapper[5094]: E0220 07:10:22.761466 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18ddd6e3ad547c4944cb5e98b7ae2e945f7c76f0d0a718546312f974a906096b\": container with ID starting with 18ddd6e3ad547c4944cb5e98b7ae2e945f7c76f0d0a718546312f974a906096b not found: ID does not exist" containerID="18ddd6e3ad547c4944cb5e98b7ae2e945f7c76f0d0a718546312f974a906096b" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 
07:10:22.761522 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18ddd6e3ad547c4944cb5e98b7ae2e945f7c76f0d0a718546312f974a906096b"} err="failed to get container status \"18ddd6e3ad547c4944cb5e98b7ae2e945f7c76f0d0a718546312f974a906096b\": rpc error: code = NotFound desc = could not find container \"18ddd6e3ad547c4944cb5e98b7ae2e945f7c76f0d0a718546312f974a906096b\": container with ID starting with 18ddd6e3ad547c4944cb5e98b7ae2e945f7c76f0d0a718546312f974a906096b not found: ID does not exist" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.761571 5094 scope.go:117] "RemoveContainer" containerID="f71fec55c80bd4551277892c0bac5ca2613045a77053e7cbf6abbd8434bf844c" Feb 20 07:10:22 crc kubenswrapper[5094]: E0220 07:10:22.762108 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f71fec55c80bd4551277892c0bac5ca2613045a77053e7cbf6abbd8434bf844c\": container with ID starting with f71fec55c80bd4551277892c0bac5ca2613045a77053e7cbf6abbd8434bf844c not found: ID does not exist" containerID="f71fec55c80bd4551277892c0bac5ca2613045a77053e7cbf6abbd8434bf844c" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.762149 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f71fec55c80bd4551277892c0bac5ca2613045a77053e7cbf6abbd8434bf844c"} err="failed to get container status \"f71fec55c80bd4551277892c0bac5ca2613045a77053e7cbf6abbd8434bf844c\": rpc error: code = NotFound desc = could not find container \"f71fec55c80bd4551277892c0bac5ca2613045a77053e7cbf6abbd8434bf844c\": container with ID starting with f71fec55c80bd4551277892c0bac5ca2613045a77053e7cbf6abbd8434bf844c not found: ID does not exist" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.771881 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 20 07:10:22 crc kubenswrapper[5094]: E0220 07:10:22.772399 5094 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31fb44b-cd20-42d8-a384-0e4a800ea177" containerName="nova-api-api" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.772420 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31fb44b-cd20-42d8-a384-0e4a800ea177" containerName="nova-api-api" Feb 20 07:10:22 crc kubenswrapper[5094]: E0220 07:10:22.772463 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31fb44b-cd20-42d8-a384-0e4a800ea177" containerName="nova-api-log" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.772471 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31fb44b-cd20-42d8-a384-0e4a800ea177" containerName="nova-api-log" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.772665 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f31fb44b-cd20-42d8-a384-0e4a800ea177" containerName="nova-api-api" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.772731 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f31fb44b-cd20-42d8-a384-0e4a800ea177" containerName="nova-api-log" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.773964 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.777273 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.777548 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.778580 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.780776 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.925547 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-config-data\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.926031 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.926078 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.926134 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-logs\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.926167 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxhmb\" (UniqueName: \"kubernetes.io/projected/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-kube-api-access-cxhmb\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0" Feb 20 07:10:22 crc kubenswrapper[5094]: I0220 07:10:22.926188 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-public-tls-certs\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0" Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.028048 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0" Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.028167 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0" Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.028220 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-logs\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0" Feb 20 07:10:23 crc 
kubenswrapper[5094]: I0220 07:10:23.028269 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxhmb\" (UniqueName: \"kubernetes.io/projected/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-kube-api-access-cxhmb\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0" Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.028312 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-public-tls-certs\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0" Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.028546 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-config-data\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0" Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.030759 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-logs\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0" Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.032723 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0" Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.033571 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0" Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.033579 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-config-data\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0" Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.036092 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0" Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.060104 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxhmb\" (UniqueName: \"kubernetes.io/projected/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-kube-api-access-cxhmb\") pod \"nova-api-0\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " pod="openstack/nova-api-0" Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.103721 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.642473 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:10:23 crc kubenswrapper[5094]: W0220 07:10:23.649917 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee1fa388_c752_45bf_9bd0_25ef5ac0052e.slice/crio-0f5bc2605cd33284cc1bf9e4b5234398fc32ff7c0f92f9117b0968f05d2000c8 WatchSource:0}: Error finding container 0f5bc2605cd33284cc1bf9e4b5234398fc32ff7c0f92f9117b0968f05d2000c8: Status 404 returned error can't find the container with id 0f5bc2605cd33284cc1bf9e4b5234398fc32ff7c0f92f9117b0968f05d2000c8 Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.717048 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51465a58-4a65-4b58-b7fa-1180b1245e8a","Type":"ContainerStarted","Data":"e477f14e15963afda9a601fb4fbac2afcf99b9584a478ab0c3ba3be50b0e631b"} Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.720664 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ee1fa388-c752-45bf-9bd0-25ef5ac0052e","Type":"ContainerStarted","Data":"0f5bc2605cd33284cc1bf9e4b5234398fc32ff7c0f92f9117b0968f05d2000c8"} Feb 20 07:10:23 crc kubenswrapper[5094]: I0220 07:10:23.881023 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f31fb44b-cd20-42d8-a384-0e4a800ea177" path="/var/lib/kubelet/pods/f31fb44b-cd20-42d8-a384-0e4a800ea177/volumes" Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.041311 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.071897 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.071940 
5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.074860 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.439008 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.732860 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ee1fa388-c752-45bf-9bd0-25ef5ac0052e","Type":"ContainerStarted","Data":"3d3145d1a90086f1227a72125422cd468bd73a49677f9611c5a511d3d24411ae"} Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.733247 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ee1fa388-c752-45bf-9bd0-25ef5ac0052e","Type":"ContainerStarted","Data":"8455819637c85b383fa39a66aae025d6e59a87cd68c5fdcccaf581b0cceed857"} Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.742905 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="ceilometer-central-agent" containerID="cri-o://01ec1a5b35feb84d8641c01766f84e98d6cb85128044a4d510ffc163fa29f15c" gracePeriod=30 Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.746811 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51465a58-4a65-4b58-b7fa-1180b1245e8a","Type":"ContainerStarted","Data":"4d5fa03cd8cbabd10d5ecc1d66e47cf63ec12bc59870f2b8394b65efb92f55a0"} Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.746873 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.746908 5094 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/ceilometer-0" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="sg-core" containerID="cri-o://e477f14e15963afda9a601fb4fbac2afcf99b9584a478ab0c3ba3be50b0e631b" gracePeriod=30 Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.746961 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="proxy-httpd" containerID="cri-o://4d5fa03cd8cbabd10d5ecc1d66e47cf63ec12bc59870f2b8394b65efb92f55a0" gracePeriod=30 Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.746956 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="ceilometer-notification-agent" containerID="cri-o://ba0a7d74c30a9d41a3210803bed113e6c1055b5c997ed973ff04e9fba9735eb0" gracePeriod=30 Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.761959 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.769630 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.769611936 podStartE2EDuration="2.769611936s" podCreationTimestamp="2026-02-20 07:10:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:10:24.765773384 +0000 UTC m=+1439.638400095" watchObservedRunningTime="2026-02-20 07:10:24.769611936 +0000 UTC m=+1439.642238647" Feb 20 07:10:24 crc kubenswrapper[5094]: I0220 07:10:24.793863 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.555401157 podStartE2EDuration="5.793840996s" podCreationTimestamp="2026-02-20 07:10:19 +0000 UTC" firstStartedPulling="2026-02-20 07:10:20.647238508 +0000 UTC 
m=+1435.519865219" lastFinishedPulling="2026-02-20 07:10:23.885678327 +0000 UTC m=+1438.758305058" observedRunningTime="2026-02-20 07:10:24.783044398 +0000 UTC m=+1439.655671109" watchObservedRunningTime="2026-02-20 07:10:24.793840996 +0000 UTC m=+1439.666467707" Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.018223 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-tp2rg"] Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.019474 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tp2rg" Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.023854 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.023874 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.031979 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-tp2rg"] Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.083100 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-tp2rg\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") " pod="openstack/nova-cell1-cell-mapping-tp2rg" Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.083166 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-scripts\") pod \"nova-cell1-cell-mapping-tp2rg\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") " pod="openstack/nova-cell1-cell-mapping-tp2rg" Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.083214 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz9vw\" (UniqueName: \"kubernetes.io/projected/85a1c623-233b-4b7e-9a57-e761a5ad27ab-kube-api-access-lz9vw\") pod \"nova-cell1-cell-mapping-tp2rg\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") " pod="openstack/nova-cell1-cell-mapping-tp2rg" Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.084068 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-config-data\") pod \"nova-cell1-cell-mapping-tp2rg\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") " pod="openstack/nova-cell1-cell-mapping-tp2rg" Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.085069 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c0eaf3b2-613b-41c2-9eac-ce8093ccec66" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.085276 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c0eaf3b2-613b-41c2-9eac-ce8093ccec66" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.189160 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-tp2rg\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") " pod="openstack/nova-cell1-cell-mapping-tp2rg" Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.189223 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-scripts\") pod \"nova-cell1-cell-mapping-tp2rg\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") " pod="openstack/nova-cell1-cell-mapping-tp2rg" Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.189265 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz9vw\" (UniqueName: \"kubernetes.io/projected/85a1c623-233b-4b7e-9a57-e761a5ad27ab-kube-api-access-lz9vw\") pod \"nova-cell1-cell-mapping-tp2rg\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") " pod="openstack/nova-cell1-cell-mapping-tp2rg" Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.189329 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-config-data\") pod \"nova-cell1-cell-mapping-tp2rg\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") " pod="openstack/nova-cell1-cell-mapping-tp2rg" Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.199036 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-tp2rg\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") " pod="openstack/nova-cell1-cell-mapping-tp2rg" Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.199067 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-config-data\") pod \"nova-cell1-cell-mapping-tp2rg\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") " pod="openstack/nova-cell1-cell-mapping-tp2rg" Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.212447 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz9vw\" 
(UniqueName: \"kubernetes.io/projected/85a1c623-233b-4b7e-9a57-e761a5ad27ab-kube-api-access-lz9vw\") pod \"nova-cell1-cell-mapping-tp2rg\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") " pod="openstack/nova-cell1-cell-mapping-tp2rg" Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.212550 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-scripts\") pod \"nova-cell1-cell-mapping-tp2rg\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") " pod="openstack/nova-cell1-cell-mapping-tp2rg" Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.346597 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tp2rg" Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.756237 5094 generic.go:334] "Generic (PLEG): container finished" podID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerID="4d5fa03cd8cbabd10d5ecc1d66e47cf63ec12bc59870f2b8394b65efb92f55a0" exitCode=0 Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.756695 5094 generic.go:334] "Generic (PLEG): container finished" podID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerID="e477f14e15963afda9a601fb4fbac2afcf99b9584a478ab0c3ba3be50b0e631b" exitCode=2 Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.756722 5094 generic.go:334] "Generic (PLEG): container finished" podID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerID="ba0a7d74c30a9d41a3210803bed113e6c1055b5c997ed973ff04e9fba9735eb0" exitCode=0 Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.757662 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51465a58-4a65-4b58-b7fa-1180b1245e8a","Type":"ContainerDied","Data":"4d5fa03cd8cbabd10d5ecc1d66e47cf63ec12bc59870f2b8394b65efb92f55a0"} Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.757690 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"51465a58-4a65-4b58-b7fa-1180b1245e8a","Type":"ContainerDied","Data":"e477f14e15963afda9a601fb4fbac2afcf99b9584a478ab0c3ba3be50b0e631b"} Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.757700 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51465a58-4a65-4b58-b7fa-1180b1245e8a","Type":"ContainerDied","Data":"ba0a7d74c30a9d41a3210803bed113e6c1055b5c997ed973ff04e9fba9735eb0"} Feb 20 07:10:25 crc kubenswrapper[5094]: I0220 07:10:25.826167 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-tp2rg"] Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.026859 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kbmdf"] Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.029369 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kbmdf" Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.061845 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kbmdf"] Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.108683 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7677694455-llk7m" Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.115988 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b5cbe24-4197-464c-b995-1a1708b551c4-catalog-content\") pod \"certified-operators-kbmdf\" (UID: \"3b5cbe24-4197-464c-b995-1a1708b551c4\") " pod="openshift-marketplace/certified-operators-kbmdf" Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.116029 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4c5k\" (UniqueName: 
\"kubernetes.io/projected/3b5cbe24-4197-464c-b995-1a1708b551c4-kube-api-access-q4c5k\") pod \"certified-operators-kbmdf\" (UID: \"3b5cbe24-4197-464c-b995-1a1708b551c4\") " pod="openshift-marketplace/certified-operators-kbmdf" Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.116129 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b5cbe24-4197-464c-b995-1a1708b551c4-utilities\") pod \"certified-operators-kbmdf\" (UID: \"3b5cbe24-4197-464c-b995-1a1708b551c4\") " pod="openshift-marketplace/certified-operators-kbmdf" Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.193784 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75ddbf7c75-smx5j"] Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.194163 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" podUID="5c3bfc89-edf8-4721-a74c-b01a81025919" containerName="dnsmasq-dns" containerID="cri-o://6722282a6c774d888a17178f484ae2330f099371a021c9f8c4fb887feb5e8b4c" gracePeriod=10 Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.218200 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b5cbe24-4197-464c-b995-1a1708b551c4-catalog-content\") pod \"certified-operators-kbmdf\" (UID: \"3b5cbe24-4197-464c-b995-1a1708b551c4\") " pod="openshift-marketplace/certified-operators-kbmdf" Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.218259 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4c5k\" (UniqueName: \"kubernetes.io/projected/3b5cbe24-4197-464c-b995-1a1708b551c4-kube-api-access-q4c5k\") pod \"certified-operators-kbmdf\" (UID: \"3b5cbe24-4197-464c-b995-1a1708b551c4\") " pod="openshift-marketplace/certified-operators-kbmdf" Feb 20 07:10:26 crc kubenswrapper[5094]: 
I0220 07:10:26.218422 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b5cbe24-4197-464c-b995-1a1708b551c4-utilities\") pod \"certified-operators-kbmdf\" (UID: \"3b5cbe24-4197-464c-b995-1a1708b551c4\") " pod="openshift-marketplace/certified-operators-kbmdf" Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.219297 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b5cbe24-4197-464c-b995-1a1708b551c4-catalog-content\") pod \"certified-operators-kbmdf\" (UID: \"3b5cbe24-4197-464c-b995-1a1708b551c4\") " pod="openshift-marketplace/certified-operators-kbmdf" Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.220003 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b5cbe24-4197-464c-b995-1a1708b551c4-utilities\") pod \"certified-operators-kbmdf\" (UID: \"3b5cbe24-4197-464c-b995-1a1708b551c4\") " pod="openshift-marketplace/certified-operators-kbmdf" Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.243612 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4c5k\" (UniqueName: \"kubernetes.io/projected/3b5cbe24-4197-464c-b995-1a1708b551c4-kube-api-access-q4c5k\") pod \"certified-operators-kbmdf\" (UID: \"3b5cbe24-4197-464c-b995-1a1708b551c4\") " pod="openshift-marketplace/certified-operators-kbmdf" Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.413609 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kbmdf" Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.728381 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.774279 5094 generic.go:334] "Generic (PLEG): container finished" podID="5c3bfc89-edf8-4721-a74c-b01a81025919" containerID="6722282a6c774d888a17178f484ae2330f099371a021c9f8c4fb887feb5e8b4c" exitCode=0 Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.774374 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" event={"ID":"5c3bfc89-edf8-4721-a74c-b01a81025919","Type":"ContainerDied","Data":"6722282a6c774d888a17178f484ae2330f099371a021c9f8c4fb887feb5e8b4c"} Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.774408 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" event={"ID":"5c3bfc89-edf8-4721-a74c-b01a81025919","Type":"ContainerDied","Data":"b6d15829f95b8aff57b91d13f507a8fa1a3e6f6b0bdf9f807b5778fd0588a0ff"} Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.774430 5094 scope.go:117] "RemoveContainer" containerID="6722282a6c774d888a17178f484ae2330f099371a021c9f8c4fb887feb5e8b4c" Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.774610 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75ddbf7c75-smx5j" Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.791062 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-tp2rg" event={"ID":"85a1c623-233b-4b7e-9a57-e761a5ad27ab","Type":"ContainerStarted","Data":"f4dba9092d5d591bec2aa8645bd29d545b24d0697fb34f266f49cf7c75f5e2e2"} Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.791124 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-tp2rg" event={"ID":"85a1c623-233b-4b7e-9a57-e761a5ad27ab","Type":"ContainerStarted","Data":"a97113103dcc5c7ddecb6fa66034d60390c12cce955ecc18177eadd21f50c5ad"} Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.823522 5094 scope.go:117] "RemoveContainer" containerID="f1d81ab41f319c40d4f9b647076b0951c152348ace820e5806cf50ad3581a45e" Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.833780 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-tp2rg" podStartSLOduration=2.83375479 podStartE2EDuration="2.83375479s" podCreationTimestamp="2026-02-20 07:10:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:10:26.811878227 +0000 UTC m=+1441.684504938" watchObservedRunningTime="2026-02-20 07:10:26.83375479 +0000 UTC m=+1441.706381501" Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.836561 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-ovsdbserver-sb\") pod \"5c3bfc89-edf8-4721-a74c-b01a81025919\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.836623 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-dns-swift-storage-0\") pod \"5c3bfc89-edf8-4721-a74c-b01a81025919\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.836721 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-ovsdbserver-nb\") pod \"5c3bfc89-edf8-4721-a74c-b01a81025919\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.836954 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-dns-svc\") pod \"5c3bfc89-edf8-4721-a74c-b01a81025919\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.836984 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88nqd\" (UniqueName: \"kubernetes.io/projected/5c3bfc89-edf8-4721-a74c-b01a81025919-kube-api-access-88nqd\") pod \"5c3bfc89-edf8-4721-a74c-b01a81025919\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.837078 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-config\") pod \"5c3bfc89-edf8-4721-a74c-b01a81025919\" (UID: \"5c3bfc89-edf8-4721-a74c-b01a81025919\") " Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.848285 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c3bfc89-edf8-4721-a74c-b01a81025919-kube-api-access-88nqd" (OuterVolumeSpecName: "kube-api-access-88nqd") pod "5c3bfc89-edf8-4721-a74c-b01a81025919" (UID: "5c3bfc89-edf8-4721-a74c-b01a81025919"). InnerVolumeSpecName "kube-api-access-88nqd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.855587 5094 scope.go:117] "RemoveContainer" containerID="6722282a6c774d888a17178f484ae2330f099371a021c9f8c4fb887feb5e8b4c" Feb 20 07:10:26 crc kubenswrapper[5094]: E0220 07:10:26.856017 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6722282a6c774d888a17178f484ae2330f099371a021c9f8c4fb887feb5e8b4c\": container with ID starting with 6722282a6c774d888a17178f484ae2330f099371a021c9f8c4fb887feb5e8b4c not found: ID does not exist" containerID="6722282a6c774d888a17178f484ae2330f099371a021c9f8c4fb887feb5e8b4c" Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.856142 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6722282a6c774d888a17178f484ae2330f099371a021c9f8c4fb887feb5e8b4c"} err="failed to get container status \"6722282a6c774d888a17178f484ae2330f099371a021c9f8c4fb887feb5e8b4c\": rpc error: code = NotFound desc = could not find container \"6722282a6c774d888a17178f484ae2330f099371a021c9f8c4fb887feb5e8b4c\": container with ID starting with 6722282a6c774d888a17178f484ae2330f099371a021c9f8c4fb887feb5e8b4c not found: ID does not exist" Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.856229 5094 scope.go:117] "RemoveContainer" containerID="f1d81ab41f319c40d4f9b647076b0951c152348ace820e5806cf50ad3581a45e" Feb 20 07:10:26 crc kubenswrapper[5094]: E0220 07:10:26.856527 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1d81ab41f319c40d4f9b647076b0951c152348ace820e5806cf50ad3581a45e\": container with ID starting with f1d81ab41f319c40d4f9b647076b0951c152348ace820e5806cf50ad3581a45e not found: ID does not exist" containerID="f1d81ab41f319c40d4f9b647076b0951c152348ace820e5806cf50ad3581a45e" Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.856620 
5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1d81ab41f319c40d4f9b647076b0951c152348ace820e5806cf50ad3581a45e"} err="failed to get container status \"f1d81ab41f319c40d4f9b647076b0951c152348ace820e5806cf50ad3581a45e\": rpc error: code = NotFound desc = could not find container \"f1d81ab41f319c40d4f9b647076b0951c152348ace820e5806cf50ad3581a45e\": container with ID starting with f1d81ab41f319c40d4f9b647076b0951c152348ace820e5806cf50ad3581a45e not found: ID does not exist" Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.953964 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88nqd\" (UniqueName: \"kubernetes.io/projected/5c3bfc89-edf8-4721-a74c-b01a81025919-kube-api-access-88nqd\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.956926 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-config" (OuterVolumeSpecName: "config") pod "5c3bfc89-edf8-4721-a74c-b01a81025919" (UID: "5c3bfc89-edf8-4721-a74c-b01a81025919"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:10:26 crc kubenswrapper[5094]: I0220 07:10:26.984731 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5c3bfc89-edf8-4721-a74c-b01a81025919" (UID: "5c3bfc89-edf8-4721-a74c-b01a81025919"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:10:27 crc kubenswrapper[5094]: I0220 07:10:26.999020 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5c3bfc89-edf8-4721-a74c-b01a81025919" (UID: "5c3bfc89-edf8-4721-a74c-b01a81025919"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:10:27 crc kubenswrapper[5094]: I0220 07:10:27.010341 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5c3bfc89-edf8-4721-a74c-b01a81025919" (UID: "5c3bfc89-edf8-4721-a74c-b01a81025919"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:10:27 crc kubenswrapper[5094]: I0220 07:10:27.024014 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5c3bfc89-edf8-4721-a74c-b01a81025919" (UID: "5c3bfc89-edf8-4721-a74c-b01a81025919"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:10:27 crc kubenswrapper[5094]: I0220 07:10:27.093883 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 20 07:10:27 crc kubenswrapper[5094]: I0220 07:10:27.093914 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 20 07:10:27 crc kubenswrapper[5094]: I0220 07:10:27.093924 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-config\") on node \"crc\" DevicePath \"\""
Feb 20 07:10:27 crc kubenswrapper[5094]: I0220 07:10:27.093934 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 20 07:10:27 crc kubenswrapper[5094]: I0220 07:10:27.093943 5094 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5c3bfc89-edf8-4721-a74c-b01a81025919-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 20 07:10:27 crc kubenswrapper[5094]: I0220 07:10:27.159853 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75ddbf7c75-smx5j"]
Feb 20 07:10:27 crc kubenswrapper[5094]: I0220 07:10:27.166937 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75ddbf7c75-smx5j"]
Feb 20 07:10:27 crc kubenswrapper[5094]: I0220 07:10:27.187210 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kbmdf"]
Feb 20 07:10:27 crc kubenswrapper[5094]: I0220 07:10:27.806449 5094 generic.go:334] "Generic (PLEG): container finished" podID="3b5cbe24-4197-464c-b995-1a1708b551c4" containerID="ff4e8e219345288cfccd7aabbaf0290fadab4ba345665377f23e53ef7cae6632" exitCode=0
Feb 20 07:10:27 crc kubenswrapper[5094]: I0220 07:10:27.806503 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kbmdf" event={"ID":"3b5cbe24-4197-464c-b995-1a1708b551c4","Type":"ContainerDied","Data":"ff4e8e219345288cfccd7aabbaf0290fadab4ba345665377f23e53ef7cae6632"}
Feb 20 07:10:27 crc kubenswrapper[5094]: I0220 07:10:27.806852 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kbmdf" event={"ID":"3b5cbe24-4197-464c-b995-1a1708b551c4","Type":"ContainerStarted","Data":"66d7454047a3aa7a1aca34ae635b12f5cfc407d75ffd9cc4ed6c357443fe8696"}
Feb 20 07:10:27 crc kubenswrapper[5094]: I0220 07:10:27.854388 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c3bfc89-edf8-4721-a74c-b01a81025919" path="/var/lib/kubelet/pods/5c3bfc89-edf8-4721-a74c-b01a81025919/volumes"
Feb 20 07:10:28 crc kubenswrapper[5094]: I0220 07:10:28.825387 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kbmdf" event={"ID":"3b5cbe24-4197-464c-b995-1a1708b551c4","Type":"ContainerStarted","Data":"606cb4ba5ad2ea0198fffd5e046968227d6ed3efb0115787761f2b7c8968dac1"}
Feb 20 07:10:28 crc kubenswrapper[5094]: I0220 07:10:28.838804 5094 generic.go:334] "Generic (PLEG): container finished" podID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerID="01ec1a5b35feb84d8641c01766f84e98d6cb85128044a4d510ffc163fa29f15c" exitCode=0
Feb 20 07:10:28 crc kubenswrapper[5094]: I0220 07:10:28.838861 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51465a58-4a65-4b58-b7fa-1180b1245e8a","Type":"ContainerDied","Data":"01ec1a5b35feb84d8641c01766f84e98d6cb85128044a4d510ffc163fa29f15c"}
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.074817 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.144210 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-scripts\") pod \"51465a58-4a65-4b58-b7fa-1180b1245e8a\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") "
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.144322 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-ceilometer-tls-certs\") pod \"51465a58-4a65-4b58-b7fa-1180b1245e8a\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") "
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.144430 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmfjt\" (UniqueName: \"kubernetes.io/projected/51465a58-4a65-4b58-b7fa-1180b1245e8a-kube-api-access-fmfjt\") pod \"51465a58-4a65-4b58-b7fa-1180b1245e8a\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") "
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.144492 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-combined-ca-bundle\") pod \"51465a58-4a65-4b58-b7fa-1180b1245e8a\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") "
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.144721 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51465a58-4a65-4b58-b7fa-1180b1245e8a-log-httpd\") pod \"51465a58-4a65-4b58-b7fa-1180b1245e8a\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") "
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.144783 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-config-data\") pod \"51465a58-4a65-4b58-b7fa-1180b1245e8a\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") "
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.144827 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51465a58-4a65-4b58-b7fa-1180b1245e8a-run-httpd\") pod \"51465a58-4a65-4b58-b7fa-1180b1245e8a\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") "
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.144901 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-sg-core-conf-yaml\") pod \"51465a58-4a65-4b58-b7fa-1180b1245e8a\" (UID: \"51465a58-4a65-4b58-b7fa-1180b1245e8a\") "
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.146200 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51465a58-4a65-4b58-b7fa-1180b1245e8a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "51465a58-4a65-4b58-b7fa-1180b1245e8a" (UID: "51465a58-4a65-4b58-b7fa-1180b1245e8a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.146631 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51465a58-4a65-4b58-b7fa-1180b1245e8a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "51465a58-4a65-4b58-b7fa-1180b1245e8a" (UID: "51465a58-4a65-4b58-b7fa-1180b1245e8a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.157875 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-scripts" (OuterVolumeSpecName: "scripts") pod "51465a58-4a65-4b58-b7fa-1180b1245e8a" (UID: "51465a58-4a65-4b58-b7fa-1180b1245e8a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.157880 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51465a58-4a65-4b58-b7fa-1180b1245e8a-kube-api-access-fmfjt" (OuterVolumeSpecName: "kube-api-access-fmfjt") pod "51465a58-4a65-4b58-b7fa-1180b1245e8a" (UID: "51465a58-4a65-4b58-b7fa-1180b1245e8a"). InnerVolumeSpecName "kube-api-access-fmfjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.194201 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "51465a58-4a65-4b58-b7fa-1180b1245e8a" (UID: "51465a58-4a65-4b58-b7fa-1180b1245e8a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.214681 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "51465a58-4a65-4b58-b7fa-1180b1245e8a" (UID: "51465a58-4a65-4b58-b7fa-1180b1245e8a"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.239908 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51465a58-4a65-4b58-b7fa-1180b1245e8a" (UID: "51465a58-4a65-4b58-b7fa-1180b1245e8a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.247943 5094 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51465a58-4a65-4b58-b7fa-1180b1245e8a-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.247988 5094 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/51465a58-4a65-4b58-b7fa-1180b1245e8a-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.248001 5094 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.248017 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.248028 5094 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.248044 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmfjt\" (UniqueName: \"kubernetes.io/projected/51465a58-4a65-4b58-b7fa-1180b1245e8a-kube-api-access-fmfjt\") on node \"crc\" DevicePath \"\""
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.248056 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.290815 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-config-data" (OuterVolumeSpecName: "config-data") pod "51465a58-4a65-4b58-b7fa-1180b1245e8a" (UID: "51465a58-4a65-4b58-b7fa-1180b1245e8a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.349929 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51465a58-4a65-4b58-b7fa-1180b1245e8a-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.861024 5094 generic.go:334] "Generic (PLEG): container finished" podID="3b5cbe24-4197-464c-b995-1a1708b551c4" containerID="606cb4ba5ad2ea0198fffd5e046968227d6ed3efb0115787761f2b7c8968dac1" exitCode=0
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.861099 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kbmdf" event={"ID":"3b5cbe24-4197-464c-b995-1a1708b551c4","Type":"ContainerDied","Data":"606cb4ba5ad2ea0198fffd5e046968227d6ed3efb0115787761f2b7c8968dac1"}
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.887947 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"51465a58-4a65-4b58-b7fa-1180b1245e8a","Type":"ContainerDied","Data":"5e96006a15cd910a0a78cfae59f1575755cc3a3f08785750cbeacf83d1bc1674"}
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.888051 5094 scope.go:117] "RemoveContainer" containerID="4d5fa03cd8cbabd10d5ecc1d66e47cf63ec12bc59870f2b8394b65efb92f55a0"
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.888472 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.946965 5094 scope.go:117] "RemoveContainer" containerID="e477f14e15963afda9a601fb4fbac2afcf99b9584a478ab0c3ba3be50b0e631b"
Feb 20 07:10:29 crc kubenswrapper[5094]: I0220 07:10:29.990304 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.016384 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.030894 5094 scope.go:117] "RemoveContainer" containerID="ba0a7d74c30a9d41a3210803bed113e6c1055b5c997ed973ff04e9fba9735eb0"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.043539 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 20 07:10:30 crc kubenswrapper[5094]: E0220 07:10:30.044246 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="sg-core"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.044263 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="sg-core"
Feb 20 07:10:30 crc kubenswrapper[5094]: E0220 07:10:30.044291 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c3bfc89-edf8-4721-a74c-b01a81025919" containerName="dnsmasq-dns"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.044300 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c3bfc89-edf8-4721-a74c-b01a81025919" containerName="dnsmasq-dns"
Feb 20 07:10:30 crc kubenswrapper[5094]: E0220 07:10:30.044316 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="ceilometer-notification-agent"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.044324 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="ceilometer-notification-agent"
Feb 20 07:10:30 crc kubenswrapper[5094]: E0220 07:10:30.044337 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="proxy-httpd"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.044345 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="proxy-httpd"
Feb 20 07:10:30 crc kubenswrapper[5094]: E0220 07:10:30.044356 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c3bfc89-edf8-4721-a74c-b01a81025919" containerName="init"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.044363 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c3bfc89-edf8-4721-a74c-b01a81025919" containerName="init"
Feb 20 07:10:30 crc kubenswrapper[5094]: E0220 07:10:30.044403 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="ceilometer-central-agent"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.044412 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="ceilometer-central-agent"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.044661 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="sg-core"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.044687 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="proxy-httpd"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.045847 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="ceilometer-central-agent"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.045913 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" containerName="ceilometer-notification-agent"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.045984 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c3bfc89-edf8-4721-a74c-b01a81025919" containerName="dnsmasq-dns"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.049657 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.060616 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.061683 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.061867 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.093746 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.109967 5094 scope.go:117] "RemoveContainer" containerID="01ec1a5b35feb84d8641c01766f84e98d6cb85128044a4d510ffc163fa29f15c"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.182048 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-scripts\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.182118 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1218d679-0e51-4bef-9526-db16c8783d8b-log-httpd\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.182218 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.182247 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-config-data\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.182272 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1218d679-0e51-4bef-9526-db16c8783d8b-run-httpd\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.182290 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.182343 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.182375 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmtnp\" (UniqueName: \"kubernetes.io/projected/1218d679-0e51-4bef-9526-db16c8783d8b-kube-api-access-rmtnp\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.284093 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-scripts\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.284146 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1218d679-0e51-4bef-9526-db16c8783d8b-log-httpd\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.284227 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.284257 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-config-data\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.284277 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1218d679-0e51-4bef-9526-db16c8783d8b-run-httpd\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.284294 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.284339 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.284372 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmtnp\" (UniqueName: \"kubernetes.io/projected/1218d679-0e51-4bef-9526-db16c8783d8b-kube-api-access-rmtnp\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.292152 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.292359 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.292776 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1218d679-0e51-4bef-9526-db16c8783d8b-run-httpd\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.293217 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-scripts\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.293016 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1218d679-0e51-4bef-9526-db16c8783d8b-log-httpd\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.294242 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.294427 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-config-data\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.315092 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmtnp\" (UniqueName: \"kubernetes.io/projected/1218d679-0e51-4bef-9526-db16c8783d8b-kube-api-access-rmtnp\") pod \"ceilometer-0\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " pod="openstack/ceilometer-0"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.428615 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.902356 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kbmdf" event={"ID":"3b5cbe24-4197-464c-b995-1a1708b551c4","Type":"ContainerStarted","Data":"5ab65dc0d699ae8adcd5a321bb71d387f14dd304ffa2faad72488ad6d7cb83d4"}
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.942566 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kbmdf" podStartSLOduration=2.470435377 podStartE2EDuration="4.942540463s" podCreationTimestamp="2026-02-20 07:10:26 +0000 UTC" firstStartedPulling="2026-02-20 07:10:27.809648602 +0000 UTC m=+1442.682275333" lastFinishedPulling="2026-02-20 07:10:30.281753708 +0000 UTC m=+1445.154380419" observedRunningTime="2026-02-20 07:10:30.931790895 +0000 UTC m=+1445.804417616" watchObservedRunningTime="2026-02-20 07:10:30.942540463 +0000 UTC m=+1445.815167184"
Feb 20 07:10:30 crc kubenswrapper[5094]: I0220 07:10:30.987478 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 20 07:10:30 crc kubenswrapper[5094]: W0220 07:10:30.992984 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1218d679_0e51_4bef_9526_db16c8783d8b.slice/crio-98363b14b517b7143e52cebeb4cbb5e2929e2cad785f4ba33d41cbc51fc7ecfd WatchSource:0}: Error finding container 98363b14b517b7143e52cebeb4cbb5e2929e2cad785f4ba33d41cbc51fc7ecfd: Status 404 returned error can't find the container with id 98363b14b517b7143e52cebeb4cbb5e2929e2cad785f4ba33d41cbc51fc7ecfd
Feb 20 07:10:31 crc kubenswrapper[5094]: I0220 07:10:31.863471 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51465a58-4a65-4b58-b7fa-1180b1245e8a" path="/var/lib/kubelet/pods/51465a58-4a65-4b58-b7fa-1180b1245e8a/volumes"
Feb 20 07:10:31 crc kubenswrapper[5094]: I0220 07:10:31.915022 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1218d679-0e51-4bef-9526-db16c8783d8b","Type":"ContainerStarted","Data":"f580b2199d149d5c06b5c9bfb9b0ef745161fcfd54354c074e1bf52afa2fbbe4"}
Feb 20 07:10:31 crc kubenswrapper[5094]: I0220 07:10:31.915080 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1218d679-0e51-4bef-9526-db16c8783d8b","Type":"ContainerStarted","Data":"98363b14b517b7143e52cebeb4cbb5e2929e2cad785f4ba33d41cbc51fc7ecfd"}
Feb 20 07:10:32 crc kubenswrapper[5094]: E0220 07:10:32.292389 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85a1c623_233b_4b7e_9a57_e761a5ad27ab.slice/crio-f4dba9092d5d591bec2aa8645bd29d545b24d0697fb34f266f49cf7c75f5e2e2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85a1c623_233b_4b7e_9a57_e761a5ad27ab.slice/crio-conmon-f4dba9092d5d591bec2aa8645bd29d545b24d0697fb34f266f49cf7c75f5e2e2.scope\": RecentStats: unable to find data in memory cache]"
Feb 20 07:10:32 crc kubenswrapper[5094]: I0220 07:10:32.937261 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1218d679-0e51-4bef-9526-db16c8783d8b","Type":"ContainerStarted","Data":"0ea214df3349d0a0166b8f5bd5c44fd7b531097f01bbd476b5f80570546b706a"}
Feb 20 07:10:32 crc kubenswrapper[5094]: I0220 07:10:32.937632 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1218d679-0e51-4bef-9526-db16c8783d8b","Type":"ContainerStarted","Data":"b54b09ff79204e43835570c3e6dd4b61080c0d99a2704ffb7699787efee4998c"}
Feb 20 07:10:32 crc kubenswrapper[5094]: I0220 07:10:32.942061 5094 generic.go:334] "Generic (PLEG): container finished" podID="85a1c623-233b-4b7e-9a57-e761a5ad27ab" containerID="f4dba9092d5d591bec2aa8645bd29d545b24d0697fb34f266f49cf7c75f5e2e2" exitCode=0
Feb 20 07:10:32 crc kubenswrapper[5094]: I0220 07:10:32.942125 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-tp2rg" event={"ID":"85a1c623-233b-4b7e-9a57-e761a5ad27ab","Type":"ContainerDied","Data":"f4dba9092d5d591bec2aa8645bd29d545b24d0697fb34f266f49cf7c75f5e2e2"}
Feb 20 07:10:33 crc kubenswrapper[5094]: I0220 07:10:33.105028 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 20 07:10:33 crc kubenswrapper[5094]: I0220 07:10:33.105150 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.080977 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.097394 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.116924 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ee1fa388-c752-45bf-9bd0-25ef5ac0052e" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.199:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.117318 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ee1fa388-c752-45bf-9bd0-25ef5ac0052e" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.199:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.117670 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.117740 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.117838 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.419233 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tp2rg"
Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.485813 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-config-data\") pod \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") "
Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.485929 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-scripts\") pod \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") "
Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.486268 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9vw\" (UniqueName: \"kubernetes.io/projected/85a1c623-233b-4b7e-9a57-e761a5ad27ab-kube-api-access-lz9vw\") pod \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") "
Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.486320 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-combined-ca-bundle\") pod \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\" (UID: \"85a1c623-233b-4b7e-9a57-e761a5ad27ab\") "
Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.514301 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-scripts" (OuterVolumeSpecName: "scripts") pod "85a1c623-233b-4b7e-9a57-e761a5ad27ab" (UID: "85a1c623-233b-4b7e-9a57-e761a5ad27ab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.520054 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85a1c623-233b-4b7e-9a57-e761a5ad27ab-kube-api-access-lz9vw" (OuterVolumeSpecName: "kube-api-access-lz9vw") pod "85a1c623-233b-4b7e-9a57-e761a5ad27ab" (UID: "85a1c623-233b-4b7e-9a57-e761a5ad27ab"). InnerVolumeSpecName "kube-api-access-lz9vw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.564212 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-config-data" (OuterVolumeSpecName: "config-data") pod "85a1c623-233b-4b7e-9a57-e761a5ad27ab" (UID: "85a1c623-233b-4b7e-9a57-e761a5ad27ab"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.570547 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85a1c623-233b-4b7e-9a57-e761a5ad27ab" (UID: "85a1c623-233b-4b7e-9a57-e761a5ad27ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.590017 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9vw\" (UniqueName: \"kubernetes.io/projected/85a1c623-233b-4b7e-9a57-e761a5ad27ab-kube-api-access-lz9vw\") on node \"crc\" DevicePath \"\""
Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.590050 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.590060 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.590070 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85a1c623-233b-4b7e-9a57-e761a5ad27ab-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.967094 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-tp2rg" event={"ID":"85a1c623-233b-4b7e-9a57-e761a5ad27ab","Type":"ContainerDied","Data":"a97113103dcc5c7ddecb6fa66034d60390c12cce955ecc18177eadd21f50c5ad"}
Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.967142 5094 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-tp2rg" Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.967180 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a97113103dcc5c7ddecb6fa66034d60390c12cce955ecc18177eadd21f50c5ad" Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.980964 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1218d679-0e51-4bef-9526-db16c8783d8b","Type":"ContainerStarted","Data":"4073373c0648687276e84e986260ddd367c34f5129c119e3fce3d4726ca86b92"} Feb 20 07:10:34 crc kubenswrapper[5094]: I0220 07:10:34.996394 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 20 07:10:35 crc kubenswrapper[5094]: I0220 07:10:35.032536 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.874135313 podStartE2EDuration="6.032506184s" podCreationTimestamp="2026-02-20 07:10:29 +0000 UTC" firstStartedPulling="2026-02-20 07:10:30.996483114 +0000 UTC m=+1445.869109835" lastFinishedPulling="2026-02-20 07:10:34.154853985 +0000 UTC m=+1449.027480706" observedRunningTime="2026-02-20 07:10:35.014434232 +0000 UTC m=+1449.887060943" watchObservedRunningTime="2026-02-20 07:10:35.032506184 +0000 UTC m=+1449.905132915" Feb 20 07:10:35 crc kubenswrapper[5094]: I0220 07:10:35.219236 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:10:35 crc kubenswrapper[5094]: I0220 07:10:35.247775 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:10:35 crc kubenswrapper[5094]: I0220 07:10:35.248099 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ee1fa388-c752-45bf-9bd0-25ef5ac0052e" containerName="nova-api-log" containerID="cri-o://8455819637c85b383fa39a66aae025d6e59a87cd68c5fdcccaf581b0cceed857" 
gracePeriod=30 Feb 20 07:10:35 crc kubenswrapper[5094]: I0220 07:10:35.248653 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ee1fa388-c752-45bf-9bd0-25ef5ac0052e" containerName="nova-api-api" containerID="cri-o://3d3145d1a90086f1227a72125422cd468bd73a49677f9611c5a511d3d24411ae" gracePeriod=30 Feb 20 07:10:35 crc kubenswrapper[5094]: I0220 07:10:35.256717 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:10:35 crc kubenswrapper[5094]: I0220 07:10:35.258043 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="c1c68ac6-9f96-4a39-b477-7ad74a04dff9" containerName="nova-scheduler-scheduler" containerID="cri-o://af3a839b78e1bdb84a4e5e4f15b94bc87880557a5061b18ba71bafe2fbc48a2f" gracePeriod=30 Feb 20 07:10:35 crc kubenswrapper[5094]: I0220 07:10:35.995110 5094 generic.go:334] "Generic (PLEG): container finished" podID="ee1fa388-c752-45bf-9bd0-25ef5ac0052e" containerID="8455819637c85b383fa39a66aae025d6e59a87cd68c5fdcccaf581b0cceed857" exitCode=143 Feb 20 07:10:35 crc kubenswrapper[5094]: I0220 07:10:35.995233 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ee1fa388-c752-45bf-9bd0-25ef5ac0052e","Type":"ContainerDied","Data":"8455819637c85b383fa39a66aae025d6e59a87cd68c5fdcccaf581b0cceed857"} Feb 20 07:10:35 crc kubenswrapper[5094]: I0220 07:10:35.995754 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 07:10:36 crc kubenswrapper[5094]: I0220 07:10:36.414185 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kbmdf" Feb 20 07:10:36 crc kubenswrapper[5094]: I0220 07:10:36.414237 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kbmdf" Feb 20 07:10:36 crc 
kubenswrapper[5094]: I0220 07:10:36.483885 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kbmdf" Feb 20 07:10:37 crc kubenswrapper[5094]: I0220 07:10:37.007956 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c0eaf3b2-613b-41c2-9eac-ce8093ccec66" containerName="nova-metadata-metadata" containerID="cri-o://c7a53317b54066a403baba090862c5995bba55506254ad03280d9c1ace19b11d" gracePeriod=30 Feb 20 07:10:37 crc kubenswrapper[5094]: I0220 07:10:37.008044 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c0eaf3b2-613b-41c2-9eac-ce8093ccec66" containerName="nova-metadata-log" containerID="cri-o://ae97ccd59962f48b4e76117ad4ab2034a3bfad9a1ff6a242afd02e6a32cafe32" gracePeriod=30 Feb 20 07:10:37 crc kubenswrapper[5094]: I0220 07:10:37.100547 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kbmdf" Feb 20 07:10:37 crc kubenswrapper[5094]: I0220 07:10:37.164122 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kbmdf"] Feb 20 07:10:38 crc kubenswrapper[5094]: I0220 07:10:38.028546 5094 generic.go:334] "Generic (PLEG): container finished" podID="c0eaf3b2-613b-41c2-9eac-ce8093ccec66" containerID="ae97ccd59962f48b4e76117ad4ab2034a3bfad9a1ff6a242afd02e6a32cafe32" exitCode=143 Feb 20 07:10:38 crc kubenswrapper[5094]: I0220 07:10:38.028663 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0eaf3b2-613b-41c2-9eac-ce8093ccec66","Type":"ContainerDied","Data":"ae97ccd59962f48b4e76117ad4ab2034a3bfad9a1ff6a242afd02e6a32cafe32"} Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.044685 5094 generic.go:334] "Generic (PLEG): container finished" podID="c1c68ac6-9f96-4a39-b477-7ad74a04dff9" 
containerID="af3a839b78e1bdb84a4e5e4f15b94bc87880557a5061b18ba71bafe2fbc48a2f" exitCode=0 Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.045350 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kbmdf" podUID="3b5cbe24-4197-464c-b995-1a1708b551c4" containerName="registry-server" containerID="cri-o://5ab65dc0d699ae8adcd5a321bb71d387f14dd304ffa2faad72488ad6d7cb83d4" gracePeriod=2 Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.044866 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c1c68ac6-9f96-4a39-b477-7ad74a04dff9","Type":"ContainerDied","Data":"af3a839b78e1bdb84a4e5e4f15b94bc87880557a5061b18ba71bafe2fbc48a2f"} Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.479390 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.517558 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8xqn\" (UniqueName: \"kubernetes.io/projected/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-kube-api-access-w8xqn\") pod \"c1c68ac6-9f96-4a39-b477-7ad74a04dff9\" (UID: \"c1c68ac6-9f96-4a39-b477-7ad74a04dff9\") " Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.517632 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-combined-ca-bundle\") pod \"c1c68ac6-9f96-4a39-b477-7ad74a04dff9\" (UID: \"c1c68ac6-9f96-4a39-b477-7ad74a04dff9\") " Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.517773 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-config-data\") pod \"c1c68ac6-9f96-4a39-b477-7ad74a04dff9\" (UID: 
\"c1c68ac6-9f96-4a39-b477-7ad74a04dff9\") " Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.525172 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-kube-api-access-w8xqn" (OuterVolumeSpecName: "kube-api-access-w8xqn") pod "c1c68ac6-9f96-4a39-b477-7ad74a04dff9" (UID: "c1c68ac6-9f96-4a39-b477-7ad74a04dff9"). InnerVolumeSpecName "kube-api-access-w8xqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.554065 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-config-data" (OuterVolumeSpecName: "config-data") pod "c1c68ac6-9f96-4a39-b477-7ad74a04dff9" (UID: "c1c68ac6-9f96-4a39-b477-7ad74a04dff9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.577441 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1c68ac6-9f96-4a39-b477-7ad74a04dff9" (UID: "c1c68ac6-9f96-4a39-b477-7ad74a04dff9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.581892 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kbmdf" Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.619627 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b5cbe24-4197-464c-b995-1a1708b551c4-utilities\") pod \"3b5cbe24-4197-464c-b995-1a1708b551c4\" (UID: \"3b5cbe24-4197-464c-b995-1a1708b551c4\") " Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.619959 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4c5k\" (UniqueName: \"kubernetes.io/projected/3b5cbe24-4197-464c-b995-1a1708b551c4-kube-api-access-q4c5k\") pod \"3b5cbe24-4197-464c-b995-1a1708b551c4\" (UID: \"3b5cbe24-4197-464c-b995-1a1708b551c4\") " Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.619999 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b5cbe24-4197-464c-b995-1a1708b551c4-catalog-content\") pod \"3b5cbe24-4197-464c-b995-1a1708b551c4\" (UID: \"3b5cbe24-4197-464c-b995-1a1708b551c4\") " Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.620443 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8xqn\" (UniqueName: \"kubernetes.io/projected/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-kube-api-access-w8xqn\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.620464 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.620473 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1c68ac6-9f96-4a39-b477-7ad74a04dff9-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:39 crc 
kubenswrapper[5094]: I0220 07:10:39.622746 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b5cbe24-4197-464c-b995-1a1708b551c4-utilities" (OuterVolumeSpecName: "utilities") pod "3b5cbe24-4197-464c-b995-1a1708b551c4" (UID: "3b5cbe24-4197-464c-b995-1a1708b551c4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.627608 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b5cbe24-4197-464c-b995-1a1708b551c4-kube-api-access-q4c5k" (OuterVolumeSpecName: "kube-api-access-q4c5k") pod "3b5cbe24-4197-464c-b995-1a1708b551c4" (UID: "3b5cbe24-4197-464c-b995-1a1708b551c4"). InnerVolumeSpecName "kube-api-access-q4c5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.675959 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b5cbe24-4197-464c-b995-1a1708b551c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b5cbe24-4197-464c-b995-1a1708b551c4" (UID: "3b5cbe24-4197-464c-b995-1a1708b551c4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.721634 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4c5k\" (UniqueName: \"kubernetes.io/projected/3b5cbe24-4197-464c-b995-1a1708b551c4-kube-api-access-q4c5k\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.721675 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b5cbe24-4197-464c-b995-1a1708b551c4-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:39 crc kubenswrapper[5094]: I0220 07:10:39.721685 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b5cbe24-4197-464c-b995-1a1708b551c4-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.115386 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"c1c68ac6-9f96-4a39-b477-7ad74a04dff9","Type":"ContainerDied","Data":"1ceadaa23027d492c09e5b48aebf25f3f67173315364e5177caaee816d00585c"} Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.115772 5094 scope.go:117] "RemoveContainer" containerID="af3a839b78e1bdb84a4e5e4f15b94bc87880557a5061b18ba71bafe2fbc48a2f" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.115568 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.120079 5094 generic.go:334] "Generic (PLEG): container finished" podID="ee1fa388-c752-45bf-9bd0-25ef5ac0052e" containerID="3d3145d1a90086f1227a72125422cd468bd73a49677f9611c5a511d3d24411ae" exitCode=0 Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.120145 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ee1fa388-c752-45bf-9bd0-25ef5ac0052e","Type":"ContainerDied","Data":"3d3145d1a90086f1227a72125422cd468bd73a49677f9611c5a511d3d24411ae"} Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.122531 5094 generic.go:334] "Generic (PLEG): container finished" podID="3b5cbe24-4197-464c-b995-1a1708b551c4" containerID="5ab65dc0d699ae8adcd5a321bb71d387f14dd304ffa2faad72488ad6d7cb83d4" exitCode=0 Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.122558 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kbmdf" event={"ID":"3b5cbe24-4197-464c-b995-1a1708b551c4","Type":"ContainerDied","Data":"5ab65dc0d699ae8adcd5a321bb71d387f14dd304ffa2faad72488ad6d7cb83d4"} Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.122573 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kbmdf" event={"ID":"3b5cbe24-4197-464c-b995-1a1708b551c4","Type":"ContainerDied","Data":"66d7454047a3aa7a1aca34ae635b12f5cfc407d75ffd9cc4ed6c357443fe8696"} Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.122633 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kbmdf" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.140033 5094 scope.go:117] "RemoveContainer" containerID="5ab65dc0d699ae8adcd5a321bb71d387f14dd304ffa2faad72488ad6d7cb83d4" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.178345 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kbmdf"] Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.193891 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="c0eaf3b2-613b-41c2-9eac-ce8093ccec66" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": read tcp 10.217.0.2:37220->10.217.0.195:8775: read: connection reset by peer" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.194273 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="c0eaf3b2-613b-41c2-9eac-ce8093ccec66" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": read tcp 10.217.0.2:37206->10.217.0.195:8775: read: connection reset by peer" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.197225 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kbmdf"] Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.213953 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.225088 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.241221 5094 scope.go:117] "RemoveContainer" containerID="606cb4ba5ad2ea0198fffd5e046968227d6ed3efb0115787761f2b7c8968dac1" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.246261 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 
07:10:40 crc kubenswrapper[5094]: E0220 07:10:40.248626 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5cbe24-4197-464c-b995-1a1708b551c4" containerName="extract-content" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.248648 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5cbe24-4197-464c-b995-1a1708b551c4" containerName="extract-content" Feb 20 07:10:40 crc kubenswrapper[5094]: E0220 07:10:40.248672 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5cbe24-4197-464c-b995-1a1708b551c4" containerName="registry-server" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.248680 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5cbe24-4197-464c-b995-1a1708b551c4" containerName="registry-server" Feb 20 07:10:40 crc kubenswrapper[5094]: E0220 07:10:40.248717 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1c68ac6-9f96-4a39-b477-7ad74a04dff9" containerName="nova-scheduler-scheduler" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.248725 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1c68ac6-9f96-4a39-b477-7ad74a04dff9" containerName="nova-scheduler-scheduler" Feb 20 07:10:40 crc kubenswrapper[5094]: E0220 07:10:40.248741 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85a1c623-233b-4b7e-9a57-e761a5ad27ab" containerName="nova-manage" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.248748 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="85a1c623-233b-4b7e-9a57-e761a5ad27ab" containerName="nova-manage" Feb 20 07:10:40 crc kubenswrapper[5094]: E0220 07:10:40.248771 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5cbe24-4197-464c-b995-1a1708b551c4" containerName="extract-utilities" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.248778 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5cbe24-4197-464c-b995-1a1708b551c4" containerName="extract-utilities" Feb 20 
07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.249013 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1c68ac6-9f96-4a39-b477-7ad74a04dff9" containerName="nova-scheduler-scheduler" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.249032 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b5cbe24-4197-464c-b995-1a1708b551c4" containerName="registry-server" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.249055 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="85a1c623-233b-4b7e-9a57-e761a5ad27ab" containerName="nova-manage" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.249971 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.259123 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.318090 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.357676 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpdrv\" (UniqueName: \"kubernetes.io/projected/b1504790-ccaf-42d5-a28a-a25f0cd353c9-kube-api-access-fpdrv\") pod \"nova-scheduler-0\" (UID: \"b1504790-ccaf-42d5-a28a-a25f0cd353c9\") " pod="openstack/nova-scheduler-0" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.357748 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1504790-ccaf-42d5-a28a-a25f0cd353c9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b1504790-ccaf-42d5-a28a-a25f0cd353c9\") " pod="openstack/nova-scheduler-0" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.357914 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1504790-ccaf-42d5-a28a-a25f0cd353c9-config-data\") pod \"nova-scheduler-0\" (UID: \"b1504790-ccaf-42d5-a28a-a25f0cd353c9\") " pod="openstack/nova-scheduler-0" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.416526 5094 scope.go:117] "RemoveContainer" containerID="ff4e8e219345288cfccd7aabbaf0290fadab4ba345665377f23e53ef7cae6632" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.466583 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpdrv\" (UniqueName: \"kubernetes.io/projected/b1504790-ccaf-42d5-a28a-a25f0cd353c9-kube-api-access-fpdrv\") pod \"nova-scheduler-0\" (UID: \"b1504790-ccaf-42d5-a28a-a25f0cd353c9\") " pod="openstack/nova-scheduler-0" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.466641 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1504790-ccaf-42d5-a28a-a25f0cd353c9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b1504790-ccaf-42d5-a28a-a25f0cd353c9\") " pod="openstack/nova-scheduler-0" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.466771 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1504790-ccaf-42d5-a28a-a25f0cd353c9-config-data\") pod \"nova-scheduler-0\" (UID: \"b1504790-ccaf-42d5-a28a-a25f0cd353c9\") " pod="openstack/nova-scheduler-0" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.474447 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1504790-ccaf-42d5-a28a-a25f0cd353c9-config-data\") pod \"nova-scheduler-0\" (UID: \"b1504790-ccaf-42d5-a28a-a25f0cd353c9\") " pod="openstack/nova-scheduler-0" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.479803 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1504790-ccaf-42d5-a28a-a25f0cd353c9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b1504790-ccaf-42d5-a28a-a25f0cd353c9\") " pod="openstack/nova-scheduler-0" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.482401 5094 scope.go:117] "RemoveContainer" containerID="5ab65dc0d699ae8adcd5a321bb71d387f14dd304ffa2faad72488ad6d7cb83d4" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.485998 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpdrv\" (UniqueName: \"kubernetes.io/projected/b1504790-ccaf-42d5-a28a-a25f0cd353c9-kube-api-access-fpdrv\") pod \"nova-scheduler-0\" (UID: \"b1504790-ccaf-42d5-a28a-a25f0cd353c9\") " pod="openstack/nova-scheduler-0" Feb 20 07:10:40 crc kubenswrapper[5094]: E0220 07:10:40.488331 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ab65dc0d699ae8adcd5a321bb71d387f14dd304ffa2faad72488ad6d7cb83d4\": container with ID starting with 5ab65dc0d699ae8adcd5a321bb71d387f14dd304ffa2faad72488ad6d7cb83d4 not found: ID does not exist" containerID="5ab65dc0d699ae8adcd5a321bb71d387f14dd304ffa2faad72488ad6d7cb83d4" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.488379 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ab65dc0d699ae8adcd5a321bb71d387f14dd304ffa2faad72488ad6d7cb83d4"} err="failed to get container status \"5ab65dc0d699ae8adcd5a321bb71d387f14dd304ffa2faad72488ad6d7cb83d4\": rpc error: code = NotFound desc = could not find container \"5ab65dc0d699ae8adcd5a321bb71d387f14dd304ffa2faad72488ad6d7cb83d4\": container with ID starting with 5ab65dc0d699ae8adcd5a321bb71d387f14dd304ffa2faad72488ad6d7cb83d4 not found: ID does not exist" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.488413 5094 scope.go:117] 
"RemoveContainer" containerID="606cb4ba5ad2ea0198fffd5e046968227d6ed3efb0115787761f2b7c8968dac1" Feb 20 07:10:40 crc kubenswrapper[5094]: E0220 07:10:40.489140 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"606cb4ba5ad2ea0198fffd5e046968227d6ed3efb0115787761f2b7c8968dac1\": container with ID starting with 606cb4ba5ad2ea0198fffd5e046968227d6ed3efb0115787761f2b7c8968dac1 not found: ID does not exist" containerID="606cb4ba5ad2ea0198fffd5e046968227d6ed3efb0115787761f2b7c8968dac1" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.489173 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"606cb4ba5ad2ea0198fffd5e046968227d6ed3efb0115787761f2b7c8968dac1"} err="failed to get container status \"606cb4ba5ad2ea0198fffd5e046968227d6ed3efb0115787761f2b7c8968dac1\": rpc error: code = NotFound desc = could not find container \"606cb4ba5ad2ea0198fffd5e046968227d6ed3efb0115787761f2b7c8968dac1\": container with ID starting with 606cb4ba5ad2ea0198fffd5e046968227d6ed3efb0115787761f2b7c8968dac1 not found: ID does not exist" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.489203 5094 scope.go:117] "RemoveContainer" containerID="ff4e8e219345288cfccd7aabbaf0290fadab4ba345665377f23e53ef7cae6632" Feb 20 07:10:40 crc kubenswrapper[5094]: E0220 07:10:40.492148 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff4e8e219345288cfccd7aabbaf0290fadab4ba345665377f23e53ef7cae6632\": container with ID starting with ff4e8e219345288cfccd7aabbaf0290fadab4ba345665377f23e53ef7cae6632 not found: ID does not exist" containerID="ff4e8e219345288cfccd7aabbaf0290fadab4ba345665377f23e53ef7cae6632" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.492172 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ff4e8e219345288cfccd7aabbaf0290fadab4ba345665377f23e53ef7cae6632"} err="failed to get container status \"ff4e8e219345288cfccd7aabbaf0290fadab4ba345665377f23e53ef7cae6632\": rpc error: code = NotFound desc = could not find container \"ff4e8e219345288cfccd7aabbaf0290fadab4ba345665377f23e53ef7cae6632\": container with ID starting with ff4e8e219345288cfccd7aabbaf0290fadab4ba345665377f23e53ef7cae6632 not found: ID does not exist" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.540928 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.546828 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.650168 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.681022 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-internal-tls-certs\") pod \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.681074 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxhmb\" (UniqueName: \"kubernetes.io/projected/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-kube-api-access-cxhmb\") pod \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.681135 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-config-data\") pod 
\"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.681231 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-public-tls-certs\") pod \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.681263 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-logs\") pod \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.681310 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-combined-ca-bundle\") pod \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\" (UID: \"ee1fa388-c752-45bf-9bd0-25ef5ac0052e\") " Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.686757 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-logs" (OuterVolumeSpecName: "logs") pod "ee1fa388-c752-45bf-9bd0-25ef5ac0052e" (UID: "ee1fa388-c752-45bf-9bd0-25ef5ac0052e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.690843 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-kube-api-access-cxhmb" (OuterVolumeSpecName: "kube-api-access-cxhmb") pod "ee1fa388-c752-45bf-9bd0-25ef5ac0052e" (UID: "ee1fa388-c752-45bf-9bd0-25ef5ac0052e"). InnerVolumeSpecName "kube-api-access-cxhmb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.692346 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxhmb\" (UniqueName: \"kubernetes.io/projected/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-kube-api-access-cxhmb\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.692681 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.716603 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee1fa388-c752-45bf-9bd0-25ef5ac0052e" (UID: "ee1fa388-c752-45bf-9bd0-25ef5ac0052e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.719269 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-config-data" (OuterVolumeSpecName: "config-data") pod "ee1fa388-c752-45bf-9bd0-25ef5ac0052e" (UID: "ee1fa388-c752-45bf-9bd0-25ef5ac0052e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.766930 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ee1fa388-c752-45bf-9bd0-25ef5ac0052e" (UID: "ee1fa388-c752-45bf-9bd0-25ef5ac0052e"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.779748 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ee1fa388-c752-45bf-9bd0-25ef5ac0052e" (UID: "ee1fa388-c752-45bf-9bd0-25ef5ac0052e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.793858 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-combined-ca-bundle\") pod \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.793971 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-config-data\") pod \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.794116 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgt5d\" (UniqueName: \"kubernetes.io/projected/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-kube-api-access-cgt5d\") pod \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.794160 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-nova-metadata-tls-certs\") pod \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.794309 
5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-logs\") pod \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\" (UID: \"c0eaf3b2-613b-41c2-9eac-ce8093ccec66\") " Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.795074 5094 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.795115 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.795124 5094 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.795134 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee1fa388-c752-45bf-9bd0-25ef5ac0052e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.795146 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-logs" (OuterVolumeSpecName: "logs") pod "c0eaf3b2-613b-41c2-9eac-ce8093ccec66" (UID: "c0eaf3b2-613b-41c2-9eac-ce8093ccec66"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.797982 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-kube-api-access-cgt5d" (OuterVolumeSpecName: "kube-api-access-cgt5d") pod "c0eaf3b2-613b-41c2-9eac-ce8093ccec66" (UID: "c0eaf3b2-613b-41c2-9eac-ce8093ccec66"). InnerVolumeSpecName "kube-api-access-cgt5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.824386 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-config-data" (OuterVolumeSpecName: "config-data") pod "c0eaf3b2-613b-41c2-9eac-ce8093ccec66" (UID: "c0eaf3b2-613b-41c2-9eac-ce8093ccec66"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.824864 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0eaf3b2-613b-41c2-9eac-ce8093ccec66" (UID: "c0eaf3b2-613b-41c2-9eac-ce8093ccec66"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.849694 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c0eaf3b2-613b-41c2-9eac-ce8093ccec66" (UID: "c0eaf3b2-613b-41c2-9eac-ce8093ccec66"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.897505 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgt5d\" (UniqueName: \"kubernetes.io/projected/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-kube-api-access-cgt5d\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.897543 5094 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.897552 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.897576 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:40 crc kubenswrapper[5094]: I0220 07:10:40.897589 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0eaf3b2-613b-41c2-9eac-ce8093ccec66-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.075269 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.133466 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ee1fa388-c752-45bf-9bd0-25ef5ac0052e","Type":"ContainerDied","Data":"0f5bc2605cd33284cc1bf9e4b5234398fc32ff7c0f92f9117b0968f05d2000c8"} Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.133998 5094 scope.go:117] "RemoveContainer" 
containerID="3d3145d1a90086f1227a72125422cd468bd73a49677f9611c5a511d3d24411ae" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.133542 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.138690 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b1504790-ccaf-42d5-a28a-a25f0cd353c9","Type":"ContainerStarted","Data":"15996ec151f9ac116c6912aa4e992bb9af3fc72485808d76d5e14b93da12f57f"} Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.140922 5094 generic.go:334] "Generic (PLEG): container finished" podID="c0eaf3b2-613b-41c2-9eac-ce8093ccec66" containerID="c7a53317b54066a403baba090862c5995bba55506254ad03280d9c1ace19b11d" exitCode=0 Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.140965 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0eaf3b2-613b-41c2-9eac-ce8093ccec66","Type":"ContainerDied","Data":"c7a53317b54066a403baba090862c5995bba55506254ad03280d9c1ace19b11d"} Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.140979 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c0eaf3b2-613b-41c2-9eac-ce8093ccec66","Type":"ContainerDied","Data":"0d734b120ad598d02afd1f0952c1bd5ccf1f9badb30cba5e61b1f4e9fc055b42"} Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.141038 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.225232 5094 scope.go:117] "RemoveContainer" containerID="8455819637c85b383fa39a66aae025d6e59a87cd68c5fdcccaf581b0cceed857" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.229689 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.263441 5094 scope.go:117] "RemoveContainer" containerID="c7a53317b54066a403baba090862c5995bba55506254ad03280d9c1ace19b11d" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.282384 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.300537 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:10:41 crc kubenswrapper[5094]: E0220 07:10:41.301535 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0eaf3b2-613b-41c2-9eac-ce8093ccec66" containerName="nova-metadata-log" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.301626 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0eaf3b2-613b-41c2-9eac-ce8093ccec66" containerName="nova-metadata-log" Feb 20 07:10:41 crc kubenswrapper[5094]: E0220 07:10:41.301721 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0eaf3b2-613b-41c2-9eac-ce8093ccec66" containerName="nova-metadata-metadata" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.301779 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0eaf3b2-613b-41c2-9eac-ce8093ccec66" containerName="nova-metadata-metadata" Feb 20 07:10:41 crc kubenswrapper[5094]: E0220 07:10:41.301858 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee1fa388-c752-45bf-9bd0-25ef5ac0052e" containerName="nova-api-api" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.301916 5094 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ee1fa388-c752-45bf-9bd0-25ef5ac0052e" containerName="nova-api-api" Feb 20 07:10:41 crc kubenswrapper[5094]: E0220 07:10:41.302006 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee1fa388-c752-45bf-9bd0-25ef5ac0052e" containerName="nova-api-log" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.302312 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee1fa388-c752-45bf-9bd0-25ef5ac0052e" containerName="nova-api-log" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.302690 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0eaf3b2-613b-41c2-9eac-ce8093ccec66" containerName="nova-metadata-metadata" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.302861 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee1fa388-c752-45bf-9bd0-25ef5ac0052e" containerName="nova-api-api" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.302931 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0eaf3b2-613b-41c2-9eac-ce8093ccec66" containerName="nova-metadata-log" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.302990 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee1fa388-c752-45bf-9bd0-25ef5ac0052e" containerName="nova-api-log" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.304245 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.308867 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.310230 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.318940 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.341490 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.344996 5094 scope.go:117] "RemoveContainer" containerID="ae97ccd59962f48b4e76117ad4ab2034a3bfad9a1ff6a242afd02e6a32cafe32" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.351681 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.362731 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.366478 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.369535 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.369928 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.369967 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.371086 5094 scope.go:117] "RemoveContainer" containerID="c7a53317b54066a403baba090862c5995bba55506254ad03280d9c1ace19b11d" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.372514 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:10:41 crc kubenswrapper[5094]: E0220 07:10:41.374927 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7a53317b54066a403baba090862c5995bba55506254ad03280d9c1ace19b11d\": container with ID starting with c7a53317b54066a403baba090862c5995bba55506254ad03280d9c1ace19b11d not found: ID does not exist" containerID="c7a53317b54066a403baba090862c5995bba55506254ad03280d9c1ace19b11d" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.374977 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7a53317b54066a403baba090862c5995bba55506254ad03280d9c1ace19b11d"} err="failed to get container status \"c7a53317b54066a403baba090862c5995bba55506254ad03280d9c1ace19b11d\": rpc error: code = NotFound desc = could not find container \"c7a53317b54066a403baba090862c5995bba55506254ad03280d9c1ace19b11d\": container with ID starting with c7a53317b54066a403baba090862c5995bba55506254ad03280d9c1ace19b11d not found: ID does not exist" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.375013 5094 
scope.go:117] "RemoveContainer" containerID="ae97ccd59962f48b4e76117ad4ab2034a3bfad9a1ff6a242afd02e6a32cafe32" Feb 20 07:10:41 crc kubenswrapper[5094]: E0220 07:10:41.375550 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae97ccd59962f48b4e76117ad4ab2034a3bfad9a1ff6a242afd02e6a32cafe32\": container with ID starting with ae97ccd59962f48b4e76117ad4ab2034a3bfad9a1ff6a242afd02e6a32cafe32 not found: ID does not exist" containerID="ae97ccd59962f48b4e76117ad4ab2034a3bfad9a1ff6a242afd02e6a32cafe32" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.375584 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae97ccd59962f48b4e76117ad4ab2034a3bfad9a1ff6a242afd02e6a32cafe32"} err="failed to get container status \"ae97ccd59962f48b4e76117ad4ab2034a3bfad9a1ff6a242afd02e6a32cafe32\": rpc error: code = NotFound desc = could not find container \"ae97ccd59962f48b4e76117ad4ab2034a3bfad9a1ff6a242afd02e6a32cafe32\": container with ID starting with ae97ccd59962f48b4e76117ad4ab2034a3bfad9a1ff6a242afd02e6a32cafe32 not found: ID does not exist" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.415695 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f11aa87b-3964-4a62-871f-bdf7d1ad7848-logs\") pod \"nova-metadata-0\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " pod="openstack/nova-metadata-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.415752 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.415807 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " pod="openstack/nova-metadata-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.415867 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq8dx\" (UniqueName: \"kubernetes.io/projected/f11aa87b-3964-4a62-871f-bdf7d1ad7848-kube-api-access-gq8dx\") pod \"nova-metadata-0\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " pod="openstack/nova-metadata-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.415885 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-public-tls-certs\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.416145 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca3adffb-7baf-45db-ab16-cc1c63510fec-logs\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.416255 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " pod="openstack/nova-metadata-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.416406 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-p8nkm\" (UniqueName: \"kubernetes.io/projected/ca3adffb-7baf-45db-ab16-cc1c63510fec-kube-api-access-p8nkm\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.416640 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-config-data\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.416696 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.416813 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-config-data\") pod \"nova-metadata-0\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " pod="openstack/nova-metadata-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.520106 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8nkm\" (UniqueName: \"kubernetes.io/projected/ca3adffb-7baf-45db-ab16-cc1c63510fec-kube-api-access-p8nkm\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.520741 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-config-data\") pod \"nova-api-0\" (UID: 
\"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.521049 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.521454 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-config-data\") pod \"nova-metadata-0\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " pod="openstack/nova-metadata-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.523489 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f11aa87b-3964-4a62-871f-bdf7d1ad7848-logs\") pod \"nova-metadata-0\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " pod="openstack/nova-metadata-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.525011 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.525219 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " pod="openstack/nova-metadata-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.526610 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-gq8dx\" (UniqueName: \"kubernetes.io/projected/f11aa87b-3964-4a62-871f-bdf7d1ad7848-kube-api-access-gq8dx\") pod \"nova-metadata-0\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " pod="openstack/nova-metadata-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.526799 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-public-tls-certs\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.527087 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca3adffb-7baf-45db-ab16-cc1c63510fec-logs\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.527236 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " pod="openstack/nova-metadata-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.524696 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f11aa87b-3964-4a62-871f-bdf7d1ad7848-logs\") pod \"nova-metadata-0\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " pod="openstack/nova-metadata-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.528428 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-config-data\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0" Feb 20 07:10:41 crc 
kubenswrapper[5094]: I0220 07:10:41.528690 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca3adffb-7baf-45db-ab16-cc1c63510fec-logs\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.529868 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.531350 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-config-data\") pod \"nova-metadata-0\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " pod="openstack/nova-metadata-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.532802 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " pod="openstack/nova-metadata-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.533470 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-public-tls-certs\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.538424 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-combined-ca-bundle\") pod \"nova-metadata-0\" 
(UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " pod="openstack/nova-metadata-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.538781 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.539190 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8nkm\" (UniqueName: \"kubernetes.io/projected/ca3adffb-7baf-45db-ab16-cc1c63510fec-kube-api-access-p8nkm\") pod \"nova-api-0\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.547503 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq8dx\" (UniqueName: \"kubernetes.io/projected/f11aa87b-3964-4a62-871f-bdf7d1ad7848-kube-api-access-gq8dx\") pod \"nova-metadata-0\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " pod="openstack/nova-metadata-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.639519 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.692308 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.858319 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b5cbe24-4197-464c-b995-1a1708b551c4" path="/var/lib/kubelet/pods/3b5cbe24-4197-464c-b995-1a1708b551c4/volumes" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.859331 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0eaf3b2-613b-41c2-9eac-ce8093ccec66" path="/var/lib/kubelet/pods/c0eaf3b2-613b-41c2-9eac-ce8093ccec66/volumes" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.859907 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1c68ac6-9f96-4a39-b477-7ad74a04dff9" path="/var/lib/kubelet/pods/c1c68ac6-9f96-4a39-b477-7ad74a04dff9/volumes" Feb 20 07:10:41 crc kubenswrapper[5094]: I0220 07:10:41.862164 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee1fa388-c752-45bf-9bd0-25ef5ac0052e" path="/var/lib/kubelet/pods/ee1fa388-c752-45bf-9bd0-25ef5ac0052e/volumes" Feb 20 07:10:42 crc kubenswrapper[5094]: I0220 07:10:42.163406 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b1504790-ccaf-42d5-a28a-a25f0cd353c9","Type":"ContainerStarted","Data":"e21d77d7f75d6c24720a691c3dafa0724131a4a216a5cb72213f2651bb77470e"} Feb 20 07:10:42 crc kubenswrapper[5094]: I0220 07:10:42.187155 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.187132942 podStartE2EDuration="2.187132942s" podCreationTimestamp="2026-02-20 07:10:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:10:42.183080825 +0000 UTC m=+1457.055707546" watchObservedRunningTime="2026-02-20 07:10:42.187132942 +0000 UTC m=+1457.059759653" Feb 20 07:10:42 crc kubenswrapper[5094]: I0220 07:10:42.207283 5094 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:10:42 crc kubenswrapper[5094]: W0220 07:10:42.212857 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf11aa87b_3964_4a62_871f_bdf7d1ad7848.slice/crio-cf9b5c579a0d68acf1d445e79a8eef5359bd9a83ecb400c0f57115c47e73ceda WatchSource:0}: Error finding container cf9b5c579a0d68acf1d445e79a8eef5359bd9a83ecb400c0f57115c47e73ceda: Status 404 returned error can't find the container with id cf9b5c579a0d68acf1d445e79a8eef5359bd9a83ecb400c0f57115c47e73ceda Feb 20 07:10:42 crc kubenswrapper[5094]: I0220 07:10:42.289469 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:10:42 crc kubenswrapper[5094]: W0220 07:10:42.298540 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca3adffb_7baf_45db_ab16_cc1c63510fec.slice/crio-1139ba82db72b15ac53a4a8cb3e78462a70a685a8a4aaff7d443bdc0aa77867c WatchSource:0}: Error finding container 1139ba82db72b15ac53a4a8cb3e78462a70a685a8a4aaff7d443bdc0aa77867c: Status 404 returned error can't find the container with id 1139ba82db72b15ac53a4a8cb3e78462a70a685a8a4aaff7d443bdc0aa77867c Feb 20 07:10:43 crc kubenswrapper[5094]: I0220 07:10:43.212451 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f11aa87b-3964-4a62-871f-bdf7d1ad7848","Type":"ContainerStarted","Data":"50299a2c387cfc7c5adc90764eae8fbdc420d6bc7964e0d04844da05fc246e7d"} Feb 20 07:10:43 crc kubenswrapper[5094]: I0220 07:10:43.213007 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f11aa87b-3964-4a62-871f-bdf7d1ad7848","Type":"ContainerStarted","Data":"229e2aebe241cc126dc9491dbc9e726710e67939495c62f8914e915bce65e45d"} Feb 20 07:10:43 crc kubenswrapper[5094]: I0220 07:10:43.213063 5094 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f11aa87b-3964-4a62-871f-bdf7d1ad7848","Type":"ContainerStarted","Data":"cf9b5c579a0d68acf1d445e79a8eef5359bd9a83ecb400c0f57115c47e73ceda"} Feb 20 07:10:43 crc kubenswrapper[5094]: I0220 07:10:43.215825 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ca3adffb-7baf-45db-ab16-cc1c63510fec","Type":"ContainerStarted","Data":"92a9942e0db46da8477438409c4da3d36ea534be827434e466a4bec7f4c990ab"} Feb 20 07:10:43 crc kubenswrapper[5094]: I0220 07:10:43.215893 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ca3adffb-7baf-45db-ab16-cc1c63510fec","Type":"ContainerStarted","Data":"c34769ddae56119b12d9b51bd88bf7ef48671807437c5ad0d48869446003d663"} Feb 20 07:10:43 crc kubenswrapper[5094]: I0220 07:10:43.215911 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ca3adffb-7baf-45db-ab16-cc1c63510fec","Type":"ContainerStarted","Data":"1139ba82db72b15ac53a4a8cb3e78462a70a685a8a4aaff7d443bdc0aa77867c"} Feb 20 07:10:43 crc kubenswrapper[5094]: I0220 07:10:43.250656 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.250625273 podStartE2EDuration="2.250625273s" podCreationTimestamp="2026-02-20 07:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:10:43.238502412 +0000 UTC m=+1458.111129153" watchObservedRunningTime="2026-02-20 07:10:43.250625273 +0000 UTC m=+1458.123251994" Feb 20 07:10:43 crc kubenswrapper[5094]: I0220 07:10:43.277405 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.277381203 podStartE2EDuration="2.277381203s" podCreationTimestamp="2026-02-20 07:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:10:43.27146057 +0000 UTC m=+1458.144087301" watchObservedRunningTime="2026-02-20 07:10:43.277381203 +0000 UTC m=+1458.150007914" Feb 20 07:10:45 crc kubenswrapper[5094]: I0220 07:10:45.541373 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 20 07:10:46 crc kubenswrapper[5094]: I0220 07:10:46.639733 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 07:10:46 crc kubenswrapper[5094]: I0220 07:10:46.639807 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 07:10:50 crc kubenswrapper[5094]: I0220 07:10:50.542267 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 20 07:10:50 crc kubenswrapper[5094]: I0220 07:10:50.594624 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 20 07:10:51 crc kubenswrapper[5094]: I0220 07:10:51.388603 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 20 07:10:51 crc kubenswrapper[5094]: I0220 07:10:51.639894 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 20 07:10:51 crc kubenswrapper[5094]: I0220 07:10:51.639989 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 20 07:10:51 crc kubenswrapper[5094]: I0220 07:10:51.693136 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 07:10:51 crc kubenswrapper[5094]: I0220 07:10:51.693371 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 07:10:52 crc kubenswrapper[5094]: I0220 07:10:52.688061 5094 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-metadata-0" podUID="f11aa87b-3964-4a62-871f-bdf7d1ad7848" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 20 07:10:52 crc kubenswrapper[5094]: I0220 07:10:52.688100 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f11aa87b-3964-4a62-871f-bdf7d1ad7848" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 20 07:10:52 crc kubenswrapper[5094]: I0220 07:10:52.707896 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ca3adffb-7baf-45db-ab16-cc1c63510fec" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.205:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 20 07:10:52 crc kubenswrapper[5094]: I0220 07:10:52.707891 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ca3adffb-7baf-45db-ab16-cc1c63510fec" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.205:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 20 07:11:00 crc kubenswrapper[5094]: I0220 07:11:00.442681 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 20 07:11:01 crc kubenswrapper[5094]: I0220 07:11:01.648049 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 20 07:11:01 crc kubenswrapper[5094]: I0220 07:11:01.648895 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 20 07:11:01 crc kubenswrapper[5094]: I0220 07:11:01.655643 5094 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 20 07:11:01 crc kubenswrapper[5094]: I0220 07:11:01.708281 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 20 07:11:01 crc kubenswrapper[5094]: I0220 07:11:01.710027 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 20 07:11:01 crc kubenswrapper[5094]: I0220 07:11:01.710179 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 20 07:11:01 crc kubenswrapper[5094]: I0220 07:11:01.716664 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 20 07:11:02 crc kubenswrapper[5094]: I0220 07:11:02.479888 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 20 07:11:02 crc kubenswrapper[5094]: I0220 07:11:02.490824 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 20 07:11:02 crc kubenswrapper[5094]: I0220 07:11:02.509832 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 20 07:11:04 crc kubenswrapper[5094]: I0220 07:11:04.106794 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:11:04 crc kubenswrapper[5094]: I0220 07:11:04.107311 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:11:04 crc kubenswrapper[5094]: 
I0220 07:11:04.107370 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 07:11:04 crc kubenswrapper[5094]: I0220 07:11:04.108062 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 07:11:04 crc kubenswrapper[5094]: I0220 07:11:04.108132 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" gracePeriod=600 Feb 20 07:11:04 crc kubenswrapper[5094]: E0220 07:11:04.238793 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:11:04 crc kubenswrapper[5094]: I0220 07:11:04.509821 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" exitCode=0 Feb 20 07:11:04 crc kubenswrapper[5094]: I0220 07:11:04.509955 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310"} Feb 20 07:11:04 crc kubenswrapper[5094]: I0220 07:11:04.510390 5094 scope.go:117] "RemoveContainer" containerID="b476ab5b54b82460e3be1d827bfff187825879d93c9cc19cc5170f59943c3ef7" Feb 20 07:11:04 crc kubenswrapper[5094]: I0220 07:11:04.512358 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:11:04 crc kubenswrapper[5094]: E0220 07:11:04.512979 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:11:17 crc kubenswrapper[5094]: I0220 07:11:17.841090 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:11:17 crc kubenswrapper[5094]: E0220 07:11:17.842441 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.441806 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.443077 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" 
podUID="f8ca33ba-f76e-4352-b6f1-54588dd25285" containerName="openstackclient" containerID="cri-o://1db2a566904b3dc511755f2dbe6958ef04fbbaf45c4039fb8bb4f2d0b8ee27fe" gracePeriod=2 Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.504491 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.705950 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.747079 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-d63e-account-create-update-tvf55"] Feb 20 07:11:22 crc kubenswrapper[5094]: E0220 07:11:22.747744 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8ca33ba-f76e-4352-b6f1-54588dd25285" containerName="openstackclient" Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.747766 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8ca33ba-f76e-4352-b6f1-54588dd25285" containerName="openstackclient" Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.748003 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8ca33ba-f76e-4352-b6f1-54588dd25285" containerName="openstackclient" Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.748820 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-d63e-account-create-update-tvf55" Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.753208 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.798811 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a1b6a8a-aefe-4f59-8936-f08aed30d8f7-operator-scripts\") pod \"nova-api-d63e-account-create-update-tvf55\" (UID: \"2a1b6a8a-aefe-4f59-8936-f08aed30d8f7\") " pod="openstack/nova-api-d63e-account-create-update-tvf55" Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.799300 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7nxd\" (UniqueName: \"kubernetes.io/projected/2a1b6a8a-aefe-4f59-8936-f08aed30d8f7-kube-api-access-r7nxd\") pod \"nova-api-d63e-account-create-update-tvf55\" (UID: \"2a1b6a8a-aefe-4f59-8936-f08aed30d8f7\") " pod="openstack/nova-api-d63e-account-create-update-tvf55" Feb 20 07:11:22 crc kubenswrapper[5094]: E0220 07:11:22.799996 5094 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 20 07:11:22 crc kubenswrapper[5094]: E0220 07:11:22.800098 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data podName:219c74d6-9f45-4bf8-8c67-acdea3c0fab3 nodeName:}" failed. No retries permitted until 2026-02-20 07:11:23.3000593 +0000 UTC m=+1498.172686011 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data") pod "rabbitmq-server-0" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3") : configmap "rabbitmq-config-data" not found Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.830600 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d63e-account-create-update-tvf55"] Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.875819 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-d63e-account-create-update-hkz6h"] Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.901196 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a1b6a8a-aefe-4f59-8936-f08aed30d8f7-operator-scripts\") pod \"nova-api-d63e-account-create-update-tvf55\" (UID: \"2a1b6a8a-aefe-4f59-8936-f08aed30d8f7\") " pod="openstack/nova-api-d63e-account-create-update-tvf55" Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.901545 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7nxd\" (UniqueName: \"kubernetes.io/projected/2a1b6a8a-aefe-4f59-8936-f08aed30d8f7-kube-api-access-r7nxd\") pod \"nova-api-d63e-account-create-update-tvf55\" (UID: \"2a1b6a8a-aefe-4f59-8936-f08aed30d8f7\") " pod="openstack/nova-api-d63e-account-create-update-tvf55" Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.902613 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a1b6a8a-aefe-4f59-8936-f08aed30d8f7-operator-scripts\") pod \"nova-api-d63e-account-create-update-tvf55\" (UID: \"2a1b6a8a-aefe-4f59-8936-f08aed30d8f7\") " pod="openstack/nova-api-d63e-account-create-update-tvf55" Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.925660 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ovn-northd-0"] Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.926305 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="cd92d75e-9882-4bb7-a41e-cab9777424e8" containerName="ovn-northd" containerID="cri-o://3f7c742aada7ed25bb815042dbf9602749ab67ca6990bc8a7e5d047b638c6680" gracePeriod=30 Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.926970 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="cd92d75e-9882-4bb7-a41e-cab9777424e8" containerName="openstack-network-exporter" containerID="cri-o://aae8dde1864d6aaeb77909eb96b990c39e162c1e7c99e123f9bcf832ed144feb" gracePeriod=30 Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.953894 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-d63e-account-create-update-hkz6h"] Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.963250 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7nxd\" (UniqueName: \"kubernetes.io/projected/2a1b6a8a-aefe-4f59-8936-f08aed30d8f7-kube-api-access-r7nxd\") pod \"nova-api-d63e-account-create-update-tvf55\" (UID: \"2a1b6a8a-aefe-4f59-8936-f08aed30d8f7\") " pod="openstack/nova-api-d63e-account-create-update-tvf55" Feb 20 07:11:22 crc kubenswrapper[5094]: I0220 07:11:22.997112 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1fa0-account-create-update-8g9zk"] Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.031787 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-1fa0-account-create-update-8g9zk"] Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.057569 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-1fa0-account-create-update-zvvj2"] Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.063058 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1fa0-account-create-update-zvvj2" Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.072819 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.088160 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-1fa0-account-create-update-zvvj2"] Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.102943 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d63e-account-create-update-tvf55" Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.110953 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.112745 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/290e5022-8d17-4415-87c0-07891b0b66f5-operator-scripts\") pod \"glance-1fa0-account-create-update-zvvj2\" (UID: \"290e5022-8d17-4415-87c0-07891b0b66f5\") " pod="openstack/glance-1fa0-account-create-update-zvvj2" Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.112909 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-497kt\" (UniqueName: \"kubernetes.io/projected/290e5022-8d17-4415-87c0-07891b0b66f5-kube-api-access-497kt\") pod \"glance-1fa0-account-create-update-zvvj2\" (UID: \"290e5022-8d17-4415-87c0-07891b0b66f5\") " pod="openstack/glance-1fa0-account-create-update-zvvj2" Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.137096 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-b77jp"] Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.175845 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-b77jp"] Feb 20 
07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.206366 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-tj42x"] Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.222725 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/290e5022-8d17-4415-87c0-07891b0b66f5-operator-scripts\") pod \"glance-1fa0-account-create-update-zvvj2\" (UID: \"290e5022-8d17-4415-87c0-07891b0b66f5\") " pod="openstack/glance-1fa0-account-create-update-zvvj2" Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.222838 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-497kt\" (UniqueName: \"kubernetes.io/projected/290e5022-8d17-4415-87c0-07891b0b66f5-kube-api-access-497kt\") pod \"glance-1fa0-account-create-update-zvvj2\" (UID: \"290e5022-8d17-4415-87c0-07891b0b66f5\") " pod="openstack/glance-1fa0-account-create-update-zvvj2" Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.224098 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/290e5022-8d17-4415-87c0-07891b0b66f5-operator-scripts\") pod \"glance-1fa0-account-create-update-zvvj2\" (UID: \"290e5022-8d17-4415-87c0-07891b0b66f5\") " pod="openstack/glance-1fa0-account-create-update-zvvj2" Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.262173 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-jsvf2"] Feb 20 07:11:23 crc kubenswrapper[5094]: E0220 07:11:23.328101 5094 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 20 07:11:23 crc kubenswrapper[5094]: E0220 07:11:23.328188 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data podName:a829c6b3-7069-4544-90dc-40ae83aba524 
nodeName:}" failed. No retries permitted until 2026-02-20 07:11:23.828156618 +0000 UTC m=+1498.700783329 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data") pod "rabbitmq-cell1-server-0" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524") : configmap "rabbitmq-cell1-config-data" not found Feb 20 07:11:23 crc kubenswrapper[5094]: E0220 07:11:23.328289 5094 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 20 07:11:23 crc kubenswrapper[5094]: E0220 07:11:23.328409 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data podName:219c74d6-9f45-4bf8-8c67-acdea3c0fab3 nodeName:}" failed. No retries permitted until 2026-02-20 07:11:24.328373984 +0000 UTC m=+1499.201000685 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data") pod "rabbitmq-server-0" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3") : configmap "rabbitmq-config-data" not found Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.334792 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-jsvf2"] Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.355626 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-497kt\" (UniqueName: \"kubernetes.io/projected/290e5022-8d17-4415-87c0-07891b0b66f5-kube-api-access-497kt\") pod \"glance-1fa0-account-create-update-zvvj2\" (UID: \"290e5022-8d17-4415-87c0-07891b0b66f5\") " pod="openstack/glance-1fa0-account-create-update-zvvj2" Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.406829 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-5vc6z"] Feb 20 07:11:23 crc 
kubenswrapper[5094]: I0220 07:11:23.408310 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5vc6z" Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.417151 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.487734 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-5vc6z"] Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.534791 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-gg2h9"] Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.535200 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-gg2h9" podUID="ca32118b-2e77-4484-b753-3467e1ba8df1" containerName="openstack-network-exporter" containerID="cri-o://ca9c2e6bc34587959334740a47a8ea31e6c558cced5427d8002996ae38da9310" gracePeriod=30 Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.536325 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfrlw\" (UniqueName: \"kubernetes.io/projected/0330a367-c0c9-42a9-9993-1a3b6775fd3b-kube-api-access-bfrlw\") pod \"root-account-create-update-5vc6z\" (UID: \"0330a367-c0c9-42a9-9993-1a3b6775fd3b\") " pod="openstack/root-account-create-update-5vc6z" Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.536522 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0330a367-c0c9-42a9-9993-1a3b6775fd3b-operator-scripts\") pod \"root-account-create-update-5vc6z\" (UID: \"0330a367-c0c9-42a9-9993-1a3b6775fd3b\") " pod="openstack/root-account-create-update-5vc6z" Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.550821 5094 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/ovn-controller-lvlr2"] Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.573162 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8083-account-create-update-wxrzd"] Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.593027 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-8083-account-create-update-wxrzd"] Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.608355 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-1fa0-account-create-update-zvvj2" Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.623476 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-abcd-account-create-update-bwsmr"] Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.640061 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0330a367-c0c9-42a9-9993-1a3b6775fd3b-operator-scripts\") pod \"root-account-create-update-5vc6z\" (UID: \"0330a367-c0c9-42a9-9993-1a3b6775fd3b\") " pod="openstack/root-account-create-update-5vc6z" Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.640164 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfrlw\" (UniqueName: \"kubernetes.io/projected/0330a367-c0c9-42a9-9993-1a3b6775fd3b-kube-api-access-bfrlw\") pod \"root-account-create-update-5vc6z\" (UID: \"0330a367-c0c9-42a9-9993-1a3b6775fd3b\") " pod="openstack/root-account-create-update-5vc6z" Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.641377 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0330a367-c0c9-42a9-9993-1a3b6775fd3b-operator-scripts\") pod \"root-account-create-update-5vc6z\" (UID: \"0330a367-c0c9-42a9-9993-1a3b6775fd3b\") " pod="openstack/root-account-create-update-5vc6z" Feb 20 
07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.656409 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-abcd-account-create-update-bwsmr"] Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.689723 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfrlw\" (UniqueName: \"kubernetes.io/projected/0330a367-c0c9-42a9-9993-1a3b6775fd3b-kube-api-access-bfrlw\") pod \"root-account-create-update-5vc6z\" (UID: \"0330a367-c0c9-42a9-9993-1a3b6775fd3b\") " pod="openstack/root-account-create-update-5vc6z" Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.689846 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-d27ft"] Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.737883 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-675a-account-create-update-7f6v9"] Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.759426 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-d27ft"] Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.771743 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5vc6z" Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.860028 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-675a-account-create-update-7f6v9"] Feb 20 07:11:23 crc kubenswrapper[5094]: E0220 07:11:23.866053 5094 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 20 07:11:23 crc kubenswrapper[5094]: E0220 07:11:23.866130 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data podName:a829c6b3-7069-4544-90dc-40ae83aba524 nodeName:}" failed. 
No retries permitted until 2026-02-20 07:11:24.866106042 +0000 UTC m=+1499.738732753 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data") pod "rabbitmq-cell1-server-0" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524") : configmap "rabbitmq-cell1-config-data" not found Feb 20 07:11:23 crc kubenswrapper[5094]: I0220 07:11:23.912383 5094 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/swift-proxy-6964856c75-f7xdp" secret="" err="secret \"swift-swift-dockercfg-5btkp\" not found" Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.009489 5094 generic.go:334] "Generic (PLEG): container finished" podID="cd92d75e-9882-4bb7-a41e-cab9777424e8" containerID="aae8dde1864d6aaeb77909eb96b990c39e162c1e7c99e123f9bcf832ed144feb" exitCode=2 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.105860 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0626209e-1ab6-4bd1-a5cc-35a2f6525e5b" path="/var/lib/kubelet/pods/0626209e-1ab6-4bd1-a5cc-35a2f6525e5b/volumes" Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.106549 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23a44809-2f91-4dbe-80ed-733390b037d8" path="/var/lib/kubelet/pods/23a44809-2f91-4dbe-80ed-733390b037d8/volumes" Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.107143 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2538e2cc-781b-4c2a-b993-381e488fd5bb" path="/var/lib/kubelet/pods/2538e2cc-781b-4c2a-b993-381e488fd5bb/volumes" Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.107998 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5044f3da-a9aa-4f6e-b598-3b5e963f8731" path="/var/lib/kubelet/pods/5044f3da-a9aa-4f6e-b598-3b5e963f8731/volumes" Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.112522 5094 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="50bf9176-b504-436f-a845-7ab55506a258" path="/var/lib/kubelet/pods/50bf9176-b504-436f-a845-7ab55506a258/volumes" Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.125998 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="772e2155-8d29-40de-8aff-5e42112e6171" path="/var/lib/kubelet/pods/772e2155-8d29-40de-8aff-5e42112e6171/volumes" Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.127064 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4920eee-8485-4faa-892c-893c6466a90c" path="/var/lib/kubelet/pods/c4920eee-8485-4faa-892c-893c6466a90c/volumes" Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.128477 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd4e0644-4339-45bf-a919-0de0551c5baa" path="/var/lib/kubelet/pods/fd4e0644-4339-45bf-a919-0de0551c5baa/volumes" Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.128946 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-gg2h9_ca32118b-2e77-4484-b753-3467e1ba8df1/openstack-network-exporter/0.log" Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.129000 5094 generic.go:334] "Generic (PLEG): container finished" podID="ca32118b-2e77-4484-b753-3467e1ba8df1" containerID="ca9c2e6bc34587959334740a47a8ea31e6c558cced5427d8002996ae38da9310" exitCode=2 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.129243 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cd92d75e-9882-4bb7-a41e-cab9777424e8","Type":"ContainerDied","Data":"aae8dde1864d6aaeb77909eb96b990c39e162c1e7c99e123f9bcf832ed144feb"} Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.129287 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-fvmwf"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.129308 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-controller-metrics-gg2h9" event={"ID":"ca32118b-2e77-4484-b753-3467e1ba8df1","Type":"ContainerDied","Data":"ca9c2e6bc34587959334740a47a8ea31e6c558cced5427d8002996ae38da9310"} Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.131634 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-fvmwf"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.163122 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-t7hr7"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.202531 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-t7hr7"] Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.208625 5094 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.208662 5094 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.208682 5094 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-proxy-6964856c75-f7xdp: [secret "swift-conf" not found, configmap "swift-ring-files" not found] Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.208752 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-etc-swift podName:8e30bbcd-c206-4a74-ae52-21462356babf nodeName:}" failed. No retries permitted until 2026-02-20 07:11:24.708728647 +0000 UTC m=+1499.581355358 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-etc-swift") pod "swift-proxy-6964856c75-f7xdp" (UID: "8e30bbcd-c206-4a74-ae52-21462356babf") : [secret "swift-conf" not found, configmap "swift-ring-files" not found] Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.270124 5094 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 20 07:11:24 crc kubenswrapper[5094]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 20 07:11:24 crc kubenswrapper[5094]: Feb 20 07:11:24 crc kubenswrapper[5094]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 20 07:11:24 crc kubenswrapper[5094]: Feb 20 07:11:24 crc kubenswrapper[5094]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 20 07:11:24 crc kubenswrapper[5094]: Feb 20 07:11:24 crc kubenswrapper[5094]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 20 07:11:24 crc kubenswrapper[5094]: Feb 20 07:11:24 crc kubenswrapper[5094]: if [ -n "nova_api" ]; then Feb 20 07:11:24 crc kubenswrapper[5094]: GRANT_DATABASE="nova_api" Feb 20 07:11:24 crc kubenswrapper[5094]: else Feb 20 07:11:24 crc kubenswrapper[5094]: GRANT_DATABASE="*" Feb 20 07:11:24 crc kubenswrapper[5094]: fi Feb 20 07:11:24 crc kubenswrapper[5094]: Feb 20 07:11:24 crc kubenswrapper[5094]: # going for maximum compatibility here: Feb 20 07:11:24 crc kubenswrapper[5094]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 20 07:11:24 crc kubenswrapper[5094]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 20 07:11:24 crc kubenswrapper[5094]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Feb 20 07:11:24 crc kubenswrapper[5094]: # support updates Feb 20 07:11:24 crc kubenswrapper[5094]: Feb 20 07:11:24 crc kubenswrapper[5094]: $MYSQL_CMD < logger="UnhandledError" Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.271763 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-d63e-account-create-update-tvf55" podUID="2a1b6a8a-aefe-4f59-8936-f08aed30d8f7" Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.286296 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-8a74-account-create-update-cdhcw"] Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.330053 5094 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-lvlr2" message="Exiting ovn-controller (1) " Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.330098 5094 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-lvlr2" podUID="8ecb5d91-5ba1-457e-af42-0d78c8643250" containerName="ovn-controller" containerID="cri-o://4f0a991715f978e32d8d86416765ee44cb6bb02db7b4482e7725dce37e0de301" Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.330140 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-lvlr2" podUID="8ecb5d91-5ba1-457e-af42-0d78c8643250" containerName="ovn-controller" containerID="cri-o://4f0a991715f978e32d8d86416765ee44cb6bb02db7b4482e7725dce37e0de301" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 
07:11:24.351323 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-8a74-account-create-update-cdhcw"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.425620 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5b8f9d577d-pgn2k"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.426000 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5b8f9d577d-pgn2k" podUID="b90c9110-e4fd-461b-ad2c-a58ff01921d8" containerName="placement-log" containerID="cri-o://d1984bf7ea49205773f404c2e26dd668c4e64dc0f98adc5a3b111f4efeded4ec" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.426329 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5b8f9d577d-pgn2k" podUID="b90c9110-e4fd-461b-ad2c-a58ff01921d8" containerName="placement-api" containerID="cri-o://91947cfb40227973fecb8ecdcb4c5ccb4aacad2780a5f8683eda8de6a99c2b2e" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.431978 5094 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.432133 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data podName:219c74d6-9f45-4bf8-8c67-acdea3c0fab3 nodeName:}" failed. No retries permitted until 2026-02-20 07:11:26.432099557 +0000 UTC m=+1501.304726278 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data") pod "rabbitmq-server-0" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3") : configmap "rabbitmq-config-data" not found Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.468721 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-dnm22"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.480517 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-dnm22"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.548779 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-tp2rg"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.583909 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-v6v6k"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.632522 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.633355 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="cadd011d-8dde-4346-8608-c5f74376204d" containerName="openstack-network-exporter" containerID="cri-o://2a6097f5aaeab1082991a6d095181e3f753899eb178f0d829dd1ed8b74f20e47" gracePeriod=300 Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.664840 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca32118b_2e77_4484_b753_3467e1ba8df1.slice/crio-ca9c2e6bc34587959334740a47a8ea31e6c558cced5427d8002996ae38da9310.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca32118b_2e77_4484_b753_3467e1ba8df1.slice/crio-conmon-ca9c2e6bc34587959334740a47a8ea31e6c558cced5427d8002996ae38da9310.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ecb5d91_5ba1_457e_af42_0d78c8643250.slice/crio-conmon-4f0a991715f978e32d8d86416765ee44cb6bb02db7b4482e7725dce37e0de301.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ecb5d91_5ba1_457e_af42_0d78c8643250.slice/crio-4f0a991715f978e32d8d86416765ee44cb6bb02db7b4482e7725dce37e0de301.scope\": RecentStats: unable to find data in memory cache]" Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.680947 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.682082 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="d3ec8857-5a33-44ea-bdd0-97b343adfc8a" containerName="openstack-network-exporter" containerID="cri-o://d8891902715ab819be070436bd6baa8c209b3026708544fcd16518b5902e4976" gracePeriod=300 Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.734960 5094 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.734998 5094 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.735011 5094 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-proxy-6964856c75-f7xdp: [secret "swift-conf" not found, configmap "swift-ring-files" not found] Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.735072 5094 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-etc-swift podName:8e30bbcd-c206-4a74-ae52-21462356babf nodeName:}" failed. No retries permitted until 2026-02-20 07:11:25.735050252 +0000 UTC m=+1500.607676953 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-etc-swift") pod "swift-proxy-6964856c75-f7xdp" (UID: "8e30bbcd-c206-4a74-ae52-21462356babf") : [secret "swift-conf" not found, configmap "swift-ring-files" not found] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.740037 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-tp2rg"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.759901 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="cadd011d-8dde-4346-8608-c5f74376204d" containerName="ovsdbserver-sb" containerID="cri-o://333434623dc65ba599c492292fd799a78a2d1d5581438ea036aa6124a9583e68" gracePeriod=300 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.778876 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-v6v6k"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.785549 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7677694455-llk7m"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.785946 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7677694455-llk7m" podUID="37533bd5-22b5-4b59-8672-35eaa19b9295" containerName="dnsmasq-dns" containerID="cri-o://fabd932b0a47a0bf51f49258d5ff3d5003b1b88996a45ebe155061bab41c04ff" gracePeriod=10 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.792982 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.793431 5094 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-server" containerID="cri-o://77762b2b7b3381830ea1b52f38b5c149eed591367db41376905ffbc17a991c9c" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.793724 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="d3ec8857-5a33-44ea-bdd0-97b343adfc8a" containerName="ovsdbserver-nb" containerID="cri-o://87a953c9ab5036126a76e97a66b02ef98a2f8720e10ee1bc11a373190ac13d0d" gracePeriod=300 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.793812 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="swift-recon-cron" containerID="cri-o://694f121dca03e8e25df5cfb69cf088e987e8cad432e283e4814356d5f0848991" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.793864 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="rsync" containerID="cri-o://d95b17f09f786d48faa3531b254622503fd28358bef08fa635117e335717ec76" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.793902 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-expirer" containerID="cri-o://ec9511e4e89788b48919a84e31eb85bf09ba8ef7f2fd3f486aacf3733fac08ce" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.793960 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-auditor" 
containerID="cri-o://da22674644cab0e6f810eb224627362470513debe3726c5711ef469be1db004e" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.794020 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-updater" containerID="cri-o://997b5f8501f8553cfdec1b566ef6a822ad0fc847b09fa07489c9cc4bdaa9307c" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.794074 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-auditor" containerID="cri-o://c948d686dc04dce290ff45631530d5f46650b2f0c7f924c52cff39ae5c18105d" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.794135 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-replicator" containerID="cri-o://b29a2c6ccb9cf15778496c1427d8d36bd92976f1a511632fe77d63af46238ab2" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.794192 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-server" containerID="cri-o://6b2e14e49e3a0c6e84ab2a157f1e34b73c7517b9c172442d8359cdf3a71c1601" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.794230 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-updater" containerID="cri-o://ecea14d9b3d89c636aa8573b40d00132c5867dc6539f43130d1e69748ccb2dd3" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.794277 5094 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/swift-storage-0" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-reaper" containerID="cri-o://876ede354cfdccb940d2003cb9a673f9c21d14f390f0a6c414edee229821334b" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.794315 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-replicator" containerID="cri-o://798e86534a684e476081f370e9263cf67a92cf7ab886af18bececee01f397813" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.794364 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-server" containerID="cri-o://13486685a923bf2693233900e7c25d8638639ebc4de0d1f20f27c486791e4f53" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.794406 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-replicator" containerID="cri-o://4eef6c37c6d679f44709f4c17ae202933f7640edde1e1c08cde85643a4c659b9" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.794452 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-auditor" containerID="cri-o://87d22bfd10a9fea3abdf5984327defada78a003b41f7d6829b04b81bf140871a" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.800329 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-vpv24"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.804095 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-vpv24"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 
07:11:24.816864 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.817193 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" containerName="glance-log" containerID="cri-o://0f4330768d5d427a2d77f0009f9dd9c0b2e0c4e46d6fb295f8aa0f285169a62c" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.817922 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" containerName="glance-httpd" containerID="cri-o://5c666f7fe1bbd7b65dc971fe96a85660b5d4b37560d1db6e6ea2ac21f9782b5c" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.820641 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.821964 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="d01cbaa4-5543-4cd5-b098-7e4600819d32" containerName="cinder-scheduler" containerID="cri-o://2d7a76b02a624c041d0a977a291c39fc1dacc4367a61490ce8714745def9a3c7" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.822170 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="d01cbaa4-5543-4cd5-b098-7e4600819d32" containerName="probe" containerID="cri-o://0d80098740f3555dc54ef0c65032de273b3ce866905e15dc8682ab9661be5b23" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.868769 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.881770 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/neutron-54bd68f77-fkqmr"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.882056 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-54bd68f77-fkqmr" podUID="530069d2-7146-46eb-9c88-056cc8a583b2" containerName="neutron-api" containerID="cri-o://2d88bbf120f5f105b4c231e353a818cac99e32ccbc4f6e26a6b8f4bf8f8c6db4" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.882458 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-54bd68f77-fkqmr" podUID="530069d2-7146-46eb-9c88-056cc8a583b2" containerName="neutron-httpd" containerID="cri-o://5cb136f403a952c3f992bdf43a3607fce1325d571764a30723172fab59ca2ce3" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.898227 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.904497 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="762a565c-672e-4127-a8c6-90f721eeda81" containerName="glance-log" containerID="cri-o://fbee36bb89639ece4abdb6a81c5837582f4784307b3986814aecb8c79e38a15c" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.904931 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="762a565c-672e-4127-a8c6-90f721eeda81" containerName="glance-httpd" containerID="cri-o://d1f6db0391d91d17006e26254d5724c2e1250f967872ca29f449b7f22386e51b" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.937129 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.938953 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-7gn4d"] Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 
07:11:24.942161 5094 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 20 07:11:24 crc kubenswrapper[5094]: E0220 07:11:24.942215 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data podName:a829c6b3-7069-4544-90dc-40ae83aba524 nodeName:}" failed. No retries permitted until 2026-02-20 07:11:26.942198393 +0000 UTC m=+1501.814825104 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data") pod "rabbitmq-cell1-server-0" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524") : configmap "rabbitmq-cell1-config-data" not found Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.942450 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f11aa87b-3964-4a62-871f-bdf7d1ad7848" containerName="nova-metadata-log" containerID="cri-o://229e2aebe241cc126dc9491dbc9e726710e67939495c62f8914e915bce65e45d" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.942767 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f11aa87b-3964-4a62-871f-bdf7d1ad7848" containerName="nova-metadata-metadata" containerID="cri-o://50299a2c387cfc7c5adc90764eae8fbdc420d6bc7964e0d04844da05fc246e7d" gracePeriod=30 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.945874 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-tj42x" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovs-vswitchd" containerID="cri-o://381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" gracePeriod=29 Feb 20 07:11:24 crc kubenswrapper[5094]: I0220 07:11:24.966318 5094 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/rabbitmq-server-0" podUID="219c74d6-9f45-4bf8-8c67-acdea3c0fab3" containerName="rabbitmq" containerID="cri-o://74e0d7c23ec3f1be5316db26c770a1e0ec492750a824549bed30f944a01c88b6" gracePeriod=604800 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.015841 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-7gn4d"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.299934 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-gg2h9_ca32118b-2e77-4484-b753-3467e1ba8df1/openstack-network-exporter/0.log" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.300438 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.302077 5094 generic.go:334] "Generic (PLEG): container finished" podID="762a565c-672e-4127-a8c6-90f721eeda81" containerID="fbee36bb89639ece4abdb6a81c5837582f4784307b3986814aecb8c79e38a15c" exitCode=143 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.302160 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"762a565c-672e-4127-a8c6-90f721eeda81","Type":"ContainerDied","Data":"fbee36bb89639ece4abdb6a81c5837582f4784307b3986814aecb8c79e38a15c"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.319291 5094 generic.go:334] "Generic (PLEG): container finished" podID="b90c9110-e4fd-461b-ad2c-a58ff01921d8" containerID="d1984bf7ea49205773f404c2e26dd668c4e64dc0f98adc5a3b111f4efeded4ec" exitCode=143 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.319486 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7df9984bd9-6txsf"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.319536 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5b8f9d577d-pgn2k" 
event={"ID":"b90c9110-e4fd-461b-ad2c-a58ff01921d8","Type":"ContainerDied","Data":"d1984bf7ea49205773f404c2e26dd668c4e64dc0f98adc5a3b111f4efeded4ec"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.320023 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7df9984bd9-6txsf" podUID="92877559-6960-4dbf-890a-fb563f4b0bf8" containerName="barbican-worker-log" containerID="cri-o://7fc72402d88effbe5b7f66ca244892cffc5d212981c680da190e5ee72f72b92a" gracePeriod=30 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.320166 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7df9984bd9-6txsf" podUID="92877559-6960-4dbf-890a-fb563f4b0bf8" containerName="barbican-worker" containerID="cri-o://d397c0695b1183c0c759491ae377b37ffc66032936b01b69dbc1bd7fedbcab31" gracePeriod=30 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.332640 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lvlr2" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.337712 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d3ec8857-5a33-44ea-bdd0-97b343adfc8a/ovsdbserver-nb/0.log" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.337756 5094 generic.go:334] "Generic (PLEG): container finished" podID="d3ec8857-5a33-44ea-bdd0-97b343adfc8a" containerID="d8891902715ab819be070436bd6baa8c209b3026708544fcd16518b5902e4976" exitCode=2 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.337776 5094 generic.go:334] "Generic (PLEG): container finished" podID="d3ec8857-5a33-44ea-bdd0-97b343adfc8a" containerID="87a953c9ab5036126a76e97a66b02ef98a2f8720e10ee1bc11a373190ac13d0d" exitCode=143 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.337855 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d3ec8857-5a33-44ea-bdd0-97b343adfc8a","Type":"ContainerDied","Data":"d8891902715ab819be070436bd6baa8c209b3026708544fcd16518b5902e4976"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.337884 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d3ec8857-5a33-44ea-bdd0-97b343adfc8a","Type":"ContainerDied","Data":"87a953c9ab5036126a76e97a66b02ef98a2f8720e10ee1bc11a373190ac13d0d"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.338853 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.340025 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d63e-account-create-update-tvf55" event={"ID":"2a1b6a8a-aefe-4f59-8936-f08aed30d8f7","Type":"ContainerStarted","Data":"9e2e1470fd33e88144567dad9b332e30ed6e81a2d129cafee24b4fca5bfc7939"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.365988 5094 generic.go:334] "Generic (PLEG): container finished" podID="f11aa87b-3964-4a62-871f-bdf7d1ad7848" containerID="229e2aebe241cc126dc9491dbc9e726710e67939495c62f8914e915bce65e45d" exitCode=143 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.366092 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f11aa87b-3964-4a62-871f-bdf7d1ad7848","Type":"ContainerDied","Data":"229e2aebe241cc126dc9491dbc9e726710e67939495c62f8914e915bce65e45d"} Feb 20 07:11:25 crc kubenswrapper[5094]: E0220 07:11:25.366434 5094 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 20 07:11:25 crc kubenswrapper[5094]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 20 07:11:25 crc kubenswrapper[5094]: Feb 20 07:11:25 crc kubenswrapper[5094]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 20 07:11:25 crc kubenswrapper[5094]: Feb 20 07:11:25 crc kubenswrapper[5094]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 20 07:11:25 crc kubenswrapper[5094]: Feb 20 07:11:25 crc kubenswrapper[5094]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 20 07:11:25 crc kubenswrapper[5094]: Feb 20 07:11:25 crc kubenswrapper[5094]: if [ -n "nova_api" ]; then Feb 20 07:11:25 crc kubenswrapper[5094]: GRANT_DATABASE="nova_api" Feb 20 07:11:25 crc kubenswrapper[5094]: else Feb 20 
07:11:25 crc kubenswrapper[5094]: GRANT_DATABASE="*" Feb 20 07:11:25 crc kubenswrapper[5094]: fi Feb 20 07:11:25 crc kubenswrapper[5094]: Feb 20 07:11:25 crc kubenswrapper[5094]: # going for maximum compatibility here: Feb 20 07:11:25 crc kubenswrapper[5094]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 20 07:11:25 crc kubenswrapper[5094]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 20 07:11:25 crc kubenswrapper[5094]: # 3. create user with CREATE but then do all password and TLS with ALTER to Feb 20 07:11:25 crc kubenswrapper[5094]: # support updates Feb 20 07:11:25 crc kubenswrapper[5094]: Feb 20 07:11:25 crc kubenswrapper[5094]: $MYSQL_CMD < logger="UnhandledError" Feb 20 07:11:25 crc kubenswrapper[5094]: E0220 07:11:25.367695 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-d63e-account-create-update-tvf55" podUID="2a1b6a8a-aefe-4f59-8936-f08aed30d8f7" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.405338 5094 generic.go:334] "Generic (PLEG): container finished" podID="8ecb5d91-5ba1-457e-af42-0d78c8643250" containerID="4f0a991715f978e32d8d86416765ee44cb6bb02db7b4482e7725dce37e0de301" exitCode=0 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.405459 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-lvlr2" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.405498 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-lvlr2" event={"ID":"8ecb5d91-5ba1-457e-af42-0d78c8643250","Type":"ContainerDied","Data":"4f0a991715f978e32d8d86416765ee44cb6bb02db7b4482e7725dce37e0de301"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.406692 5094 scope.go:117] "RemoveContainer" containerID="4f0a991715f978e32d8d86416765ee44cb6bb02db7b4482e7725dce37e0de301" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.433763 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.434044 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="8f8cb333-2939-4404-b242-67bcf4e6875b" containerName="cinder-api-log" containerID="cri-o://e83bab219c26de7197a4c8d483fd96f4ccf30f290122d4606fa75843efcfaa32" gracePeriod=30 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.434141 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="8f8cb333-2939-4404-b242-67bcf4e6875b" containerName="cinder-api" containerID="cri-o://d19cd01d15b92dc7454d6cf3fb5973b463816ef9279ed3d67fe67ccb7ea9cc15" gracePeriod=30 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.438978 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8ca33ba-f76e-4352-b6f1-54588dd25285-combined-ca-bundle\") pod \"f8ca33ba-f76e-4352-b6f1-54588dd25285\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.439041 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjn6q\" (UniqueName: 
\"kubernetes.io/projected/f8ca33ba-f76e-4352-b6f1-54588dd25285-kube-api-access-hjn6q\") pod \"f8ca33ba-f76e-4352-b6f1-54588dd25285\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.439099 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ca32118b-2e77-4484-b753-3467e1ba8df1-ovs-rundir\") pod \"ca32118b-2e77-4484-b753-3467e1ba8df1\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.439122 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ca32118b-2e77-4484-b753-3467e1ba8df1-ovn-rundir\") pod \"ca32118b-2e77-4484-b753-3467e1ba8df1\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.439169 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-log-ovn\") pod \"8ecb5d91-5ba1-457e-af42-0d78c8643250\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.439224 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5tkf\" (UniqueName: \"kubernetes.io/projected/8ecb5d91-5ba1-457e-af42-0d78c8643250-kube-api-access-s5tkf\") pod \"8ecb5d91-5ba1-457e-af42-0d78c8643250\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.439287 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ecb5d91-5ba1-457e-af42-0d78c8643250-scripts\") pod \"8ecb5d91-5ba1-457e-af42-0d78c8643250\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.439326 5094 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca32118b-2e77-4484-b753-3467e1ba8df1-config\") pod \"ca32118b-2e77-4484-b753-3467e1ba8df1\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.439387 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-run\") pod \"8ecb5d91-5ba1-457e-af42-0d78c8643250\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.439448 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ecb5d91-5ba1-457e-af42-0d78c8643250-ovn-controller-tls-certs\") pod \"8ecb5d91-5ba1-457e-af42-0d78c8643250\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.439494 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ecb5d91-5ba1-457e-af42-0d78c8643250-combined-ca-bundle\") pod \"8ecb5d91-5ba1-457e-af42-0d78c8643250\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.439532 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca32118b-2e77-4484-b753-3467e1ba8df1-metrics-certs-tls-certs\") pod \"ca32118b-2e77-4484-b753-3467e1ba8df1\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.439594 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-run-ovn\") pod 
\"8ecb5d91-5ba1-457e-af42-0d78c8643250\" (UID: \"8ecb5d91-5ba1-457e-af42-0d78c8643250\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.447902 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-run" (OuterVolumeSpecName: "var-run") pod "8ecb5d91-5ba1-457e-af42-0d78c8643250" (UID: "8ecb5d91-5ba1-457e-af42-0d78c8643250"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.451799 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca32118b-2e77-4484-b753-3467e1ba8df1-config" (OuterVolumeSpecName: "config") pod "ca32118b-2e77-4484-b753-3467e1ba8df1" (UID: "ca32118b-2e77-4484-b753-3467e1ba8df1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.455390 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ecb5d91-5ba1-457e-af42-0d78c8643250-scripts" (OuterVolumeSpecName: "scripts") pod "8ecb5d91-5ba1-457e-af42-0d78c8643250" (UID: "8ecb5d91-5ba1-457e-af42-0d78c8643250"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.457340 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_cadd011d-8dde-4346-8608-c5f74376204d/ovsdbserver-sb/0.log" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.457387 5094 generic.go:334] "Generic (PLEG): container finished" podID="cadd011d-8dde-4346-8608-c5f74376204d" containerID="2a6097f5aaeab1082991a6d095181e3f753899eb178f0d829dd1ed8b74f20e47" exitCode=2 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.457407 5094 generic.go:334] "Generic (PLEG): container finished" podID="cadd011d-8dde-4346-8608-c5f74376204d" containerID="333434623dc65ba599c492292fd799a78a2d1d5581438ea036aa6124a9583e68" exitCode=143 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.457499 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"cadd011d-8dde-4346-8608-c5f74376204d","Type":"ContainerDied","Data":"2a6097f5aaeab1082991a6d095181e3f753899eb178f0d829dd1ed8b74f20e47"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.457531 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"cadd011d-8dde-4346-8608-c5f74376204d","Type":"ContainerDied","Data":"333434623dc65ba599c492292fd799a78a2d1d5581438ea036aa6124a9583e68"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.458021 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca32118b-2e77-4484-b753-3467e1ba8df1-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "ca32118b-2e77-4484-b753-3467e1ba8df1" (UID: "ca32118b-2e77-4484-b753-3467e1ba8df1"). InnerVolumeSpecName "ovs-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.458501 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ca32118b-2e77-4484-b753-3467e1ba8df1-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "ca32118b-2e77-4484-b753-3467e1ba8df1" (UID: "ca32118b-2e77-4484-b753-3467e1ba8df1"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.458542 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "8ecb5d91-5ba1-457e-af42-0d78c8643250" (UID: "8ecb5d91-5ba1-457e-af42-0d78c8643250"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.458686 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "8ecb5d91-5ba1-457e-af42-0d78c8643250" (UID: "8ecb5d91-5ba1-457e-af42-0d78c8643250"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.465411 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f8ca33ba-f76e-4352-b6f1-54588dd25285-openstack-config\") pod \"f8ca33ba-f76e-4352-b6f1-54588dd25285\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.465543 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca32118b-2e77-4484-b753-3467e1ba8df1-combined-ca-bundle\") pod \"ca32118b-2e77-4484-b753-3467e1ba8df1\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.465569 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f8ca33ba-f76e-4352-b6f1-54588dd25285-openstack-config-secret\") pod \"f8ca33ba-f76e-4352-b6f1-54588dd25285\" (UID: \"f8ca33ba-f76e-4352-b6f1-54588dd25285\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.465647 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6d2f\" (UniqueName: \"kubernetes.io/projected/ca32118b-2e77-4484-b753-3467e1ba8df1-kube-api-access-n6d2f\") pod \"ca32118b-2e77-4484-b753-3467e1ba8df1\" (UID: \"ca32118b-2e77-4484-b753-3467e1ba8df1\") " Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.466681 5094 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ca32118b-2e77-4484-b753-3467e1ba8df1-ovs-rundir\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.466717 5094 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ca32118b-2e77-4484-b753-3467e1ba8df1-ovn-rundir\") on 
node \"crc\" DevicePath \"\"" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.466727 5094 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.466737 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ecb5d91-5ba1-457e-af42-0d78c8643250-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.466748 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca32118b-2e77-4484-b753-3467e1ba8df1-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.466756 5094 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-run\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.466765 5094 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/8ecb5d91-5ba1-457e-af42-0d78c8643250-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.467843 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-d63e-account-create-update-tvf55"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.486909 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-54bd68f77-fkqmr" podUID="530069d2-7146-46eb-9c88-056cc8a583b2" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.165:9696/\": read tcp 10.217.0.2:45196->10.217.0.165:9696: read: connection reset by peer" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.488131 5094 generic.go:334] "Generic (PLEG): container finished" 
podID="f8ca33ba-f76e-4352-b6f1-54588dd25285" containerID="1db2a566904b3dc511755f2dbe6958ef04fbbaf45c4039fb8bb4f2d0b8ee27fe" exitCode=137 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.488387 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.511157 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-qqgpn"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.519092 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8ca33ba-f76e-4352-b6f1-54588dd25285-kube-api-access-hjn6q" (OuterVolumeSpecName: "kube-api-access-hjn6q") pod "f8ca33ba-f76e-4352-b6f1-54588dd25285" (UID: "f8ca33ba-f76e-4352-b6f1-54588dd25285"). InnerVolumeSpecName "kube-api-access-hjn6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.519248 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ecb5d91-5ba1-457e-af42-0d78c8643250-kube-api-access-s5tkf" (OuterVolumeSpecName: "kube-api-access-s5tkf") pod "8ecb5d91-5ba1-457e-af42-0d78c8643250" (UID: "8ecb5d91-5ba1-457e-af42-0d78c8643250"). InnerVolumeSpecName "kube-api-access-s5tkf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.547546 5094 scope.go:117] "RemoveContainer" containerID="1db2a566904b3dc511755f2dbe6958ef04fbbaf45c4039fb8bb4f2d0b8ee27fe" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.548263 5094 generic.go:334] "Generic (PLEG): container finished" podID="cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" containerID="0f4330768d5d427a2d77f0009f9dd9c0b2e0c4e46d6fb295f8aa0f285169a62c" exitCode=143 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.548342 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4","Type":"ContainerDied","Data":"0f4330768d5d427a2d77f0009f9dd9c0b2e0c4e46d6fb295f8aa0f285169a62c"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.577688 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5tkf\" (UniqueName: \"kubernetes.io/projected/8ecb5d91-5ba1-457e-af42-0d78c8643250-kube-api-access-s5tkf\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.577733 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjn6q\" (UniqueName: \"kubernetes.io/projected/f8ca33ba-f76e-4352-b6f1-54588dd25285-kube-api-access-hjn6q\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.585143 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-qqgpn"] Feb 20 07:11:25 crc kubenswrapper[5094]: E0220 07:11:25.587866 5094 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Feb 20 07:11:25 crc kubenswrapper[5094]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Feb 20 07:11:25 crc kubenswrapper[5094]: + source /usr/local/bin/container-scripts/functions Feb 20 07:11:25 crc kubenswrapper[5094]: ++ 
OVNBridge=br-int Feb 20 07:11:25 crc kubenswrapper[5094]: ++ OVNRemote=tcp:localhost:6642 Feb 20 07:11:25 crc kubenswrapper[5094]: ++ OVNEncapType=geneve Feb 20 07:11:25 crc kubenswrapper[5094]: ++ OVNAvailabilityZones= Feb 20 07:11:25 crc kubenswrapper[5094]: ++ EnableChassisAsGateway=true Feb 20 07:11:25 crc kubenswrapper[5094]: ++ PhysicalNetworks= Feb 20 07:11:25 crc kubenswrapper[5094]: ++ OVNHostName= Feb 20 07:11:25 crc kubenswrapper[5094]: ++ DB_FILE=/etc/openvswitch/conf.db Feb 20 07:11:25 crc kubenswrapper[5094]: ++ ovs_dir=/var/lib/openvswitch Feb 20 07:11:25 crc kubenswrapper[5094]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Feb 20 07:11:25 crc kubenswrapper[5094]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Feb 20 07:11:25 crc kubenswrapper[5094]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 20 07:11:25 crc kubenswrapper[5094]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 07:11:25 crc kubenswrapper[5094]: + sleep 0.5 Feb 20 07:11:25 crc kubenswrapper[5094]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 07:11:25 crc kubenswrapper[5094]: + sleep 0.5 Feb 20 07:11:25 crc kubenswrapper[5094]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 07:11:25 crc kubenswrapper[5094]: + sleep 0.5 Feb 20 07:11:25 crc kubenswrapper[5094]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 07:11:25 crc kubenswrapper[5094]: + cleanup_ovsdb_server_semaphore Feb 20 07:11:25 crc kubenswrapper[5094]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 20 07:11:25 crc kubenswrapper[5094]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Feb 20 07:11:25 crc kubenswrapper[5094]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-tj42x" message=< Feb 20 07:11:25 crc kubenswrapper[5094]: Exiting ovsdb-server (5) [ OK ] Feb 20 07:11:25 crc kubenswrapper[5094]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Feb 20 07:11:25 crc kubenswrapper[5094]: + source /usr/local/bin/container-scripts/functions Feb 20 07:11:25 crc kubenswrapper[5094]: ++ OVNBridge=br-int Feb 20 07:11:25 crc kubenswrapper[5094]: ++ OVNRemote=tcp:localhost:6642 Feb 20 07:11:25 crc kubenswrapper[5094]: ++ OVNEncapType=geneve Feb 20 07:11:25 crc kubenswrapper[5094]: ++ OVNAvailabilityZones= Feb 20 07:11:25 crc kubenswrapper[5094]: ++ EnableChassisAsGateway=true Feb 20 07:11:25 crc kubenswrapper[5094]: ++ PhysicalNetworks= Feb 20 07:11:25 crc kubenswrapper[5094]: ++ OVNHostName= Feb 20 07:11:25 crc kubenswrapper[5094]: ++ DB_FILE=/etc/openvswitch/conf.db Feb 20 07:11:25 crc kubenswrapper[5094]: ++ ovs_dir=/var/lib/openvswitch Feb 20 07:11:25 crc kubenswrapper[5094]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Feb 20 07:11:25 crc kubenswrapper[5094]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Feb 20 07:11:25 crc kubenswrapper[5094]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 20 07:11:25 crc kubenswrapper[5094]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 07:11:25 crc kubenswrapper[5094]: + sleep 0.5 Feb 20 07:11:25 crc kubenswrapper[5094]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 07:11:25 crc kubenswrapper[5094]: + sleep 0.5 Feb 20 07:11:25 crc kubenswrapper[5094]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 07:11:25 crc kubenswrapper[5094]: + sleep 0.5 Feb 20 07:11:25 crc kubenswrapper[5094]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 07:11:25 crc kubenswrapper[5094]: + cleanup_ovsdb_server_semaphore Feb 20 07:11:25 crc kubenswrapper[5094]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 20 07:11:25 crc kubenswrapper[5094]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Feb 20 07:11:25 crc kubenswrapper[5094]: > Feb 20 07:11:25 crc kubenswrapper[5094]: E0220 07:11:25.587916 5094 kuberuntime_container.go:691] "PreStop hook failed" err=< Feb 20 07:11:25 crc kubenswrapper[5094]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Feb 20 07:11:25 crc kubenswrapper[5094]: + source /usr/local/bin/container-scripts/functions Feb 20 07:11:25 crc kubenswrapper[5094]: ++ OVNBridge=br-int Feb 20 07:11:25 crc kubenswrapper[5094]: ++ OVNRemote=tcp:localhost:6642 Feb 20 07:11:25 crc kubenswrapper[5094]: ++ OVNEncapType=geneve Feb 20 07:11:25 crc kubenswrapper[5094]: ++ OVNAvailabilityZones= Feb 20 07:11:25 crc kubenswrapper[5094]: ++ EnableChassisAsGateway=true Feb 20 07:11:25 crc kubenswrapper[5094]: ++ PhysicalNetworks= Feb 20 07:11:25 crc kubenswrapper[5094]: ++ OVNHostName= Feb 20 07:11:25 crc kubenswrapper[5094]: ++ DB_FILE=/etc/openvswitch/conf.db Feb 20 07:11:25 crc kubenswrapper[5094]: ++ ovs_dir=/var/lib/openvswitch Feb 20 07:11:25 crc kubenswrapper[5094]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Feb 20 07:11:25 crc kubenswrapper[5094]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Feb 20 07:11:25 crc kubenswrapper[5094]: ++ 
SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 20 07:11:25 crc kubenswrapper[5094]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 07:11:25 crc kubenswrapper[5094]: + sleep 0.5 Feb 20 07:11:25 crc kubenswrapper[5094]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 07:11:25 crc kubenswrapper[5094]: + sleep 0.5 Feb 20 07:11:25 crc kubenswrapper[5094]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 07:11:25 crc kubenswrapper[5094]: + sleep 0.5 Feb 20 07:11:25 crc kubenswrapper[5094]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Feb 20 07:11:25 crc kubenswrapper[5094]: + cleanup_ovsdb_server_semaphore Feb 20 07:11:25 crc kubenswrapper[5094]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Feb 20 07:11:25 crc kubenswrapper[5094]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Feb 20 07:11:25 crc kubenswrapper[5094]: > pod="openstack/ovn-controller-ovs-tj42x" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovsdb-server" containerID="cri-o://ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.587957 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-tj42x" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovsdb-server" containerID="cri-o://ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" gracePeriod=28 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.599571 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.599953 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ca3adffb-7baf-45db-ab16-cc1c63510fec" containerName="nova-api-log" 
containerID="cri-o://c34769ddae56119b12d9b51bd88bf7ef48671807437c5ad0d48869446003d663" gracePeriod=30 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.600128 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ca3adffb-7baf-45db-ab16-cc1c63510fec" containerName="nova-api-api" containerID="cri-o://92a9942e0db46da8477438409c4da3d36ea534be827434e466a4bec7f4c990ab" gracePeriod=30 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.606750 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-hlntr"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.612057 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-79d9bcd9d4-9jrtt"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.612378 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" podUID="908e2706-d24f-41c9-b481-4c0d5415c5ca" containerName="barbican-api-log" containerID="cri-o://459bc2f2d89a8c0984a77dac5551d18970f5268790591ad15bb4ecd73c5d3e57" gracePeriod=30 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.612557 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" podUID="908e2706-d24f-41c9-b481-4c0d5415c5ca" containerName="barbican-api" containerID="cri-o://a9359fe0f2ca547fd93214e43c994a9c0538ad96741f69a95262122bb7dc4d7b" gracePeriod=30 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.617468 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-hlntr"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.624225 5094 generic.go:334] "Generic (PLEG): container finished" podID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerID="ec9511e4e89788b48919a84e31eb85bf09ba8ef7f2fd3f486aacf3733fac08ce" exitCode=0 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.624361 5094 generic.go:334] 
"Generic (PLEG): container finished" podID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerID="997b5f8501f8553cfdec1b566ef6a822ad0fc847b09fa07489c9cc4bdaa9307c" exitCode=0 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.624420 5094 generic.go:334] "Generic (PLEG): container finished" podID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerID="c948d686dc04dce290ff45631530d5f46650b2f0c7f924c52cff39ae5c18105d" exitCode=0 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.624495 5094 generic.go:334] "Generic (PLEG): container finished" podID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerID="b29a2c6ccb9cf15778496c1427d8d36bd92976f1a511632fe77d63af46238ab2" exitCode=0 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.624548 5094 generic.go:334] "Generic (PLEG): container finished" podID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerID="ecea14d9b3d89c636aa8573b40d00132c5867dc6539f43130d1e69748ccb2dd3" exitCode=0 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.624610 5094 generic.go:334] "Generic (PLEG): container finished" podID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerID="da22674644cab0e6f810eb224627362470513debe3726c5711ef469be1db004e" exitCode=0 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.624666 5094 generic.go:334] "Generic (PLEG): container finished" podID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerID="798e86534a684e476081f370e9263cf67a92cf7ab886af18bececee01f397813" exitCode=0 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.624741 5094 generic.go:334] "Generic (PLEG): container finished" podID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerID="876ede354cfdccb940d2003cb9a673f9c21d14f390f0a6c414edee229821334b" exitCode=0 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.624809 5094 generic.go:334] "Generic (PLEG): container finished" podID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerID="87d22bfd10a9fea3abdf5984327defada78a003b41f7d6829b04b81bf140871a" exitCode=0 Feb 20 07:11:25 crc 
kubenswrapper[5094]: I0220 07:11:25.624894 5094 generic.go:334] "Generic (PLEG): container finished" podID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerID="4eef6c37c6d679f44709f4c17ae202933f7640edde1e1c08cde85643a4c659b9" exitCode=0 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.625135 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"ec9511e4e89788b48919a84e31eb85bf09ba8ef7f2fd3f486aacf3733fac08ce"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.625228 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"997b5f8501f8553cfdec1b566ef6a822ad0fc847b09fa07489c9cc4bdaa9307c"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.625290 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"c948d686dc04dce290ff45631530d5f46650b2f0c7f924c52cff39ae5c18105d"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.625557 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"b29a2c6ccb9cf15778496c1427d8d36bd92976f1a511632fe77d63af46238ab2"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.625735 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"ecea14d9b3d89c636aa8573b40d00132c5867dc6539f43130d1e69748ccb2dd3"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.625812 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"da22674644cab0e6f810eb224627362470513debe3726c5711ef469be1db004e"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.625880 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"798e86534a684e476081f370e9263cf67a92cf7ab886af18bececee01f397813"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.625940 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"876ede354cfdccb940d2003cb9a673f9c21d14f390f0a6c414edee229821334b"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.626002 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"87d22bfd10a9fea3abdf5984327defada78a003b41f7d6829b04b81bf140871a"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.626074 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"4eef6c37c6d679f44709f4c17ae202933f7640edde1e1c08cde85643a4c659b9"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.627983 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-7c9ffdc684-v56nc"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.628242 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca32118b-2e77-4484-b753-3467e1ba8df1-kube-api-access-n6d2f" (OuterVolumeSpecName: "kube-api-access-n6d2f") pod "ca32118b-2e77-4484-b753-3467e1ba8df1" (UID: "ca32118b-2e77-4484-b753-3467e1ba8df1"). InnerVolumeSpecName "kube-api-access-n6d2f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.628276 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc" podUID="5d9f1f40-92cc-4f19-9f3b-49651f56bffb" containerName="barbican-keystone-listener-log" containerID="cri-o://c53dec831e54b024a954907e297fb4a37cd3b545f865b7a580f6bd56abb0a90d" gracePeriod=30 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.628445 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc" podUID="5d9f1f40-92cc-4f19-9f3b-49651f56bffb" containerName="barbican-keystone-listener" containerID="cri-o://b35c284d78ecbe2f2a62876408599b6c0ef8f8ad565bfbbe236f98afa60d9b08" gracePeriod=30 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.666560 5094 generic.go:334] "Generic (PLEG): container finished" podID="37533bd5-22b5-4b59-8672-35eaa19b9295" containerID="fabd932b0a47a0bf51f49258d5ff3d5003b1b88996a45ebe155061bab41c04ff" exitCode=0 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.666714 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7677694455-llk7m" event={"ID":"37533bd5-22b5-4b59-8672-35eaa19b9295","Type":"ContainerDied","Data":"fabd932b0a47a0bf51f49258d5ff3d5003b1b88996a45ebe155061bab41c04ff"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.670488 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.670793 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b1504790-ccaf-42d5-a28a-a25f0cd353c9" containerName="nova-scheduler-scheduler" containerID="cri-o://e21d77d7f75d6c24720a691c3dafa0724131a4a216a5cb72213f2651bb77470e" gracePeriod=30 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.679100 5094 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-gg2h9_ca32118b-2e77-4484-b753-3467e1ba8df1/openstack-network-exporter/0.log" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.679188 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-gg2h9" event={"ID":"ca32118b-2e77-4484-b753-3467e1ba8df1","Type":"ContainerDied","Data":"8220b8fae0a21f561557c1539a6ab409db96f4d4c24a493c8737608b37dc4bc1"} Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.679300 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-gg2h9" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.679890 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6d2f\" (UniqueName: \"kubernetes.io/projected/ca32118b-2e77-4484-b753-3467e1ba8df1-kube-api-access-n6d2f\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.695841 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-z4m42"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.724648 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-z4m42"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.746228 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8ca33ba-f76e-4352-b6f1-54588dd25285-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f8ca33ba-f76e-4352-b6f1-54588dd25285" (UID: "f8ca33ba-f76e-4352-b6f1-54588dd25285"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.761324 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca32118b-2e77-4484-b753-3467e1ba8df1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca32118b-2e77-4484-b753-3467e1ba8df1" (UID: "ca32118b-2e77-4484-b753-3467e1ba8df1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.782592 5094 scope.go:117] "RemoveContainer" containerID="ca9c2e6bc34587959334740a47a8ea31e6c558cced5427d8002996ae38da9310" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.785494 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8ca33ba-f76e-4352-b6f1-54588dd25285-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "f8ca33ba-f76e-4352-b6f1-54588dd25285" (UID: "f8ca33ba-f76e-4352-b6f1-54588dd25285"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:25 crc kubenswrapper[5094]: E0220 07:11:25.793636 5094 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Feb 20 07:11:25 crc kubenswrapper[5094]: E0220 07:11:25.793688 5094 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 20 07:11:25 crc kubenswrapper[5094]: E0220 07:11:25.793715 5094 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-proxy-6964856c75-f7xdp: [secret "swift-conf" not found, configmap "swift-ring-files" not found] Feb 20 07:11:25 crc kubenswrapper[5094]: E0220 07:11:25.793806 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-etc-swift podName:8e30bbcd-c206-4a74-ae52-21462356babf nodeName:}" failed. 
No retries permitted until 2026-02-20 07:11:27.793777547 +0000 UTC m=+1502.666404258 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-etc-swift") pod "swift-proxy-6964856c75-f7xdp" (UID: "8e30bbcd-c206-4a74-ae52-21462356babf") : [secret "swift-conf" not found, configmap "swift-ring-files" not found] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.799268 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ecb5d91-5ba1-457e-af42-0d78c8643250-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ecb5d91-5ba1-457e-af42-0d78c8643250" (UID: "8ecb5d91-5ba1-457e-af42-0d78c8643250"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.802856 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.810895 5094 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f8ca33ba-f76e-4352-b6f1-54588dd25285-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.832692 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca32118b-2e77-4484-b753-3467e1ba8df1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.833671 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8ca33ba-f76e-4352-b6f1-54588dd25285-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:25 crc kubenswrapper[5094]: E0220 07:11:25.847414 5094 info.go:109] Failed to get network devices: open /sys/class/net/11d132d9afa2713/address: no such 
file or directory Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.883432 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15583b83-ce22-4b0b-9566-0e056b07c0d7" path="/var/lib/kubelet/pods/15583b83-ce22-4b0b-9566-0e056b07c0d7/volumes" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.884498 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="317d32d8-9ad2-4bd1-87f4-745e3157c713" path="/var/lib/kubelet/pods/317d32d8-9ad2-4bd1-87f4-745e3157c713/volumes" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.885081 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d59abb8-e7c7-404f-8f03-13d2167bea54" path="/var/lib/kubelet/pods/3d59abb8-e7c7-404f-8f03-13d2167bea54/volumes" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.904263 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61f66271-5ce9-4412-8ea3-9a63a934f307" path="/var/lib/kubelet/pods/61f66271-5ce9-4412-8ea3-9a63a934f307/volumes" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.908811 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fb81f20-1f88-4c11-a37a-31db4472afd2" path="/var/lib/kubelet/pods/7fb81f20-1f88-4c11-a37a-31db4472afd2/volumes" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.909315 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85a1c623-233b-4b7e-9a57-e761a5ad27ab" path="/var/lib/kubelet/pods/85a1c623-233b-4b7e-9a57-e761a5ad27ab/volumes" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.909860 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a87399a2-42e4-4f46-b93c-cd4f25594a16" path="/var/lib/kubelet/pods/a87399a2-42e4-4f46-b93c-cd4f25594a16/volumes" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.917250 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c32c030e-7f51-4f5e-a4bb-c27288d8f2e9" 
path="/var/lib/kubelet/pods/c32c030e-7f51-4f5e-a4bb-c27288d8f2e9/volumes" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.917803 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6e6aec3-87a9-4f8a-b640-313ab241ec6f" path="/var/lib/kubelet/pods/d6e6aec3-87a9-4f8a-b640-313ab241ec6f/volumes" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.918443 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffc4926a-ede6-4124-ac91-c9912ffa8a23" path="/var/lib/kubelet/pods/ffc4926a-ede6-4124-ac91-c9912ffa8a23/volumes" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.925814 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffd170be-0f58-4016-a451-5fb1f7fd9f1b" path="/var/lib/kubelet/pods/ffd170be-0f58-4016-a451-5fb1f7fd9f1b/volumes" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.936237 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ecb5d91-5ba1-457e-af42-0d78c8643250-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.969270 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d3ec8857-5a33-44ea-bdd0-97b343adfc8a/ovsdbserver-nb/0.log" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.969353 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.990556 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1fa0-account-create-update-zvvj2"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.990599 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.990615 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-qvr99"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.990628 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-qvr99"] Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.990914 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="3db6d35c-dfd1-4a59-95d3-cc8a99151c12" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://2bdc91fbdf4016ef04e3a2baae7cf5e5ae714534246ea953292e2628283eecdf" gracePeriod=30 Feb 20 07:11:25 crc kubenswrapper[5094]: I0220 07:11:25.997866 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.010788 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-c179-account-create-update-sst4m"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.013622 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.017788 5094 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 20 07:11:26 crc kubenswrapper[5094]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 20 07:11:26 crc 
kubenswrapper[5094]: Feb 20 07:11:26 crc kubenswrapper[5094]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 20 07:11:26 crc kubenswrapper[5094]: Feb 20 07:11:26 crc kubenswrapper[5094]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 20 07:11:26 crc kubenswrapper[5094]: Feb 20 07:11:26 crc kubenswrapper[5094]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 20 07:11:26 crc kubenswrapper[5094]: Feb 20 07:11:26 crc kubenswrapper[5094]: if [ -n "glance" ]; then Feb 20 07:11:26 crc kubenswrapper[5094]: GRANT_DATABASE="glance" Feb 20 07:11:26 crc kubenswrapper[5094]: else Feb 20 07:11:26 crc kubenswrapper[5094]: GRANT_DATABASE="*" Feb 20 07:11:26 crc kubenswrapper[5094]: fi Feb 20 07:11:26 crc kubenswrapper[5094]: Feb 20 07:11:26 crc kubenswrapper[5094]: # going for maximum compatibility here: Feb 20 07:11:26 crc kubenswrapper[5094]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 20 07:11:26 crc kubenswrapper[5094]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 20 07:11:26 crc kubenswrapper[5094]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Feb 20 07:11:26 crc kubenswrapper[5094]: # support updates Feb 20 07:11:26 crc kubenswrapper[5094]: Feb 20 07:11:26 crc kubenswrapper[5094]: $MYSQL_CMD < logger="UnhandledError" Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.019047 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack/glance-1fa0-account-create-update-zvvj2" podUID="290e5022-8d17-4415-87c0-07891b0b66f5" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.022784 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-lb6l5"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.061455 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-lb6l5"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.081237 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-c179-account-create-update-sst4m"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.081328 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-p4nhd"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.083305 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8ca33ba-f76e-4352-b6f1-54588dd25285-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "f8ca33ba-f76e-4352-b6f1-54588dd25285" (UID: "f8ca33ba-f76e-4352-b6f1-54588dd25285"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.108118 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6e9d-account-create-update-f9bgk"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.131652 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-wrxqf"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.144029 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-d63e-account-create-update-tvf55"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.145056 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-scripts\") pod \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.145200 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-ovsdb-rundir\") pod \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.145222 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.145248 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-combined-ca-bundle\") pod \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 
07:11:26.145279 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-config\") pod \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.145342 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-ovsdbserver-nb-tls-certs\") pod \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.145367 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-metrics-certs-tls-certs\") pod \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.145491 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55pw9\" (UniqueName: \"kubernetes.io/projected/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-kube-api-access-55pw9\") pod \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\" (UID: \"d3ec8857-5a33-44ea-bdd0-97b343adfc8a\") " Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.146544 5094 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f8ca33ba-f76e-4352-b6f1-54588dd25285-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.149384 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ecb5d91-5ba1-457e-af42-0d78c8643250-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "8ecb5d91-5ba1-457e-af42-0d78c8643250" 
(UID: "8ecb5d91-5ba1-457e-af42-0d78c8643250"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.157437 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "d3ec8857-5a33-44ea-bdd0-97b343adfc8a" (UID: "d3ec8857-5a33-44ea-bdd0-97b343adfc8a"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.160981 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-config" (OuterVolumeSpecName: "config") pod "d3ec8857-5a33-44ea-bdd0-97b343adfc8a" (UID: "d3ec8857-5a33-44ea-bdd0-97b343adfc8a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.166562 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-kube-api-access-55pw9" (OuterVolumeSpecName: "kube-api-access-55pw9") pod "d3ec8857-5a33-44ea-bdd0-97b343adfc8a" (UID: "d3ec8857-5a33-44ea-bdd0-97b343adfc8a"). InnerVolumeSpecName "kube-api-access-55pw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.169991 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-scripts" (OuterVolumeSpecName: "scripts") pod "d3ec8857-5a33-44ea-bdd0-97b343adfc8a" (UID: "d3ec8857-5a33-44ea-bdd0-97b343adfc8a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.172533 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-wrxqf"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.185130 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca32118b-2e77-4484-b753-3467e1ba8df1-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "ca32118b-2e77-4484-b753-3467e1ba8df1" (UID: "ca32118b-2e77-4484-b753-3467e1ba8df1"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.222202 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "d3ec8857-5a33-44ea-bdd0-97b343adfc8a" (UID: "d3ec8857-5a33-44ea-bdd0-97b343adfc8a"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.224542 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="a829c6b3-7069-4544-90dc-40ae83aba524" containerName="rabbitmq" containerID="cri-o://bf7d170cd7c0f8170ef78ab632229324322185632d477a161a83826e71f489e8" gracePeriod=604800 Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.247956 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-p4nhd"] Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.252137 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55pw9\" (UniqueName: \"kubernetes.io/projected/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-kube-api-access-55pw9\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.252185 5094 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ecb5d91-5ba1-457e-af42-0d78c8643250-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.252198 5094 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca32118b-2e77-4484-b753-3467e1ba8df1-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.252212 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.252226 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 
07:11:26.252255 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.252269 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-config\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.295040 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6e9d-account-create-update-f9bgk"]
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.305196 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3ec8857-5a33-44ea-bdd0-97b343adfc8a" (UID: "d3ec8857-5a33-44ea-bdd0-97b343adfc8a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.307622 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.308084 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="f3caa33a-a0ec-4fdc-876b-266724a5af50" containerName="nova-cell1-conductor-conductor" containerID="cri-o://d82cb6bb5b6363e25ca9e64db1ea09c7494c05c93392e549c89755977ef55244" gracePeriod=30
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.310841 5094 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc"
Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.315913 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f7c742aada7ed25bb815042dbf9602749ab67ca6990bc8a7e5d047b638c6680" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.347262 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f7c742aada7ed25bb815042dbf9602749ab67ca6990bc8a7e5d047b638c6680" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.356054 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3f7c742aada7ed25bb815042dbf9602749ab67ca6990bc8a7e5d047b638c6680" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.356133 5094 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="cd92d75e-9882-4bb7-a41e-cab9777424e8" containerName="ovn-northd"
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.356810 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.359132 5094 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.359159 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.378270 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="30e79ba9-83fc-4246-9fb2-7136f6ae30a5" containerName="galera" containerID="cri-o://f1ef94c055965af4c60a329875d73bb23ffc248862090105b8c3a3484d931c14" gracePeriod=30
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.414160 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-pk97g"]
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.415664 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "d3ec8857-5a33-44ea-bdd0-97b343adfc8a" (UID: "d3ec8857-5a33-44ea-bdd0-97b343adfc8a"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.417075 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-pk97g"]
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.421735 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7677694455-llk7m"
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.426436 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rfczb"]
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.433974 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rfczb"]
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.439873 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "d3ec8857-5a33-44ea-bdd0-97b343adfc8a" (UID: "d3ec8857-5a33-44ea-bdd0-97b343adfc8a"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.446586 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.447121 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="a5bbb9ad-deeb-495f-9750-f7012c00061d" containerName="nova-cell0-conductor-conductor" containerID="cri-o://c9d8bef2aa627582c2efca41433b2b24a6bb42ef156ade330dc390cb501211cf" gracePeriod=30
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.465431 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.465469 5094 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3ec8857-5a33-44ea-bdd0-97b343adfc8a-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.465543 5094 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found
Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.465596 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data podName:219c74d6-9f45-4bf8-8c67-acdea3c0fab3 nodeName:}" failed. No retries permitted until 2026-02-20 07:11:30.465578077 +0000 UTC m=+1505.338204788 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data") pod "rabbitmq-server-0" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3") : configmap "rabbitmq-config-data" not found
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.518148 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_cadd011d-8dde-4346-8608-c5f74376204d/ovsdbserver-sb/0.log"
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.518237 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.574749 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-dns-swift-storage-0\") pod \"37533bd5-22b5-4b59-8672-35eaa19b9295\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") "
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.574828 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-config\") pod \"37533bd5-22b5-4b59-8672-35eaa19b9295\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") "
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.574902 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-ovsdbserver-sb\") pod \"37533bd5-22b5-4b59-8672-35eaa19b9295\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") "
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.575127 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djp42\" (UniqueName: \"kubernetes.io/projected/37533bd5-22b5-4b59-8672-35eaa19b9295-kube-api-access-djp42\") pod \"37533bd5-22b5-4b59-8672-35eaa19b9295\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") "
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.575185 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-dns-svc\") pod \"37533bd5-22b5-4b59-8672-35eaa19b9295\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") "
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.575235 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-ovsdbserver-nb\") pod \"37533bd5-22b5-4b59-8672-35eaa19b9295\" (UID: \"37533bd5-22b5-4b59-8672-35eaa19b9295\") "
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.609021 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37533bd5-22b5-4b59-8672-35eaa19b9295-kube-api-access-djp42" (OuterVolumeSpecName: "kube-api-access-djp42") pod "37533bd5-22b5-4b59-8672-35eaa19b9295" (UID: "37533bd5-22b5-4b59-8672-35eaa19b9295"). InnerVolumeSpecName "kube-api-access-djp42". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.610191 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1fa0-account-create-update-zvvj2"]
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.629945 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-5vc6z"]
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.656253 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-lvlr2"]
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.662533 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-lvlr2"]
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.672891 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-gg2h9"]
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.678413 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-combined-ca-bundle\") pod \"cadd011d-8dde-4346-8608-c5f74376204d\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") "
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.678459 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4rzz\" (UniqueName: \"kubernetes.io/projected/cadd011d-8dde-4346-8608-c5f74376204d-kube-api-access-s4rzz\") pod \"cadd011d-8dde-4346-8608-c5f74376204d\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") "
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.678521 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cadd011d-8dde-4346-8608-c5f74376204d-scripts\") pod \"cadd011d-8dde-4346-8608-c5f74376204d\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") "
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.678646 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-metrics-certs-tls-certs\") pod \"cadd011d-8dde-4346-8608-c5f74376204d\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") "
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.678793 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cadd011d-8dde-4346-8608-c5f74376204d\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") "
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.678830 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-ovsdbserver-sb-tls-certs\") pod \"cadd011d-8dde-4346-8608-c5f74376204d\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") "
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.678851 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cadd011d-8dde-4346-8608-c5f74376204d-ovsdb-rundir\") pod \"cadd011d-8dde-4346-8608-c5f74376204d\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") "
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.678938 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cadd011d-8dde-4346-8608-c5f74376204d-config\") pod \"cadd011d-8dde-4346-8608-c5f74376204d\" (UID: \"cadd011d-8dde-4346-8608-c5f74376204d\") "
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.679361 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djp42\" (UniqueName: \"kubernetes.io/projected/37533bd5-22b5-4b59-8672-35eaa19b9295-kube-api-access-djp42\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.687743 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-6964856c75-f7xdp"]
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.688012 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-6964856c75-f7xdp" podUID="8e30bbcd-c206-4a74-ae52-21462356babf" containerName="proxy-httpd" containerID="cri-o://e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96" gracePeriod=30
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.688556 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-6964856c75-f7xdp" podUID="8e30bbcd-c206-4a74-ae52-21462356babf" containerName="proxy-server" containerID="cri-o://9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c" gracePeriod=30
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.691617 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-gg2h9"]
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.692815 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cadd011d-8dde-4346-8608-c5f74376204d-scripts" (OuterVolumeSpecName: "scripts") pod "cadd011d-8dde-4346-8608-c5f74376204d" (UID: "cadd011d-8dde-4346-8608-c5f74376204d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.693245 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cadd011d-8dde-4346-8608-c5f74376204d-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "cadd011d-8dde-4346-8608-c5f74376204d" (UID: "cadd011d-8dde-4346-8608-c5f74376204d"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.693568 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cadd011d-8dde-4346-8608-c5f74376204d-config" (OuterVolumeSpecName: "config") pod "cadd011d-8dde-4346-8608-c5f74376204d" (UID: "cadd011d-8dde-4346-8608-c5f74376204d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.697935 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "cadd011d-8dde-4346-8608-c5f74376204d" (UID: "cadd011d-8dde-4346-8608-c5f74376204d"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.712352 5094 generic.go:334] "Generic (PLEG): container finished" podID="92877559-6960-4dbf-890a-fb563f4b0bf8" containerID="7fc72402d88effbe5b7f66ca244892cffc5d212981c680da190e5ee72f72b92a" exitCode=143
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.712419 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7df9984bd9-6txsf" event={"ID":"92877559-6960-4dbf-890a-fb563f4b0bf8","Type":"ContainerDied","Data":"7fc72402d88effbe5b7f66ca244892cffc5d212981c680da190e5ee72f72b92a"}
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.718084 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cadd011d-8dde-4346-8608-c5f74376204d-kube-api-access-s4rzz" (OuterVolumeSpecName: "kube-api-access-s4rzz") pod "cadd011d-8dde-4346-8608-c5f74376204d" (UID: "cadd011d-8dde-4346-8608-c5f74376204d"). InnerVolumeSpecName "kube-api-access-s4rzz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.728276 5094 generic.go:334] "Generic (PLEG): container finished" podID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerID="d95b17f09f786d48faa3531b254622503fd28358bef08fa635117e335717ec76" exitCode=0
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.728326 5094 generic.go:334] "Generic (PLEG): container finished" podID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerID="6b2e14e49e3a0c6e84ab2a157f1e34b73c7517b9c172442d8359cdf3a71c1601" exitCode=0
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.728336 5094 generic.go:334] "Generic (PLEG): container finished" podID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerID="13486685a923bf2693233900e7c25d8638639ebc4de0d1f20f27c486791e4f53" exitCode=0
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.728347 5094 generic.go:334] "Generic (PLEG): container finished" podID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerID="77762b2b7b3381830ea1b52f38b5c149eed591367db41376905ffbc17a991c9c" exitCode=0
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.728436 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"d95b17f09f786d48faa3531b254622503fd28358bef08fa635117e335717ec76"}
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.728541 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"6b2e14e49e3a0c6e84ab2a157f1e34b73c7517b9c172442d8359cdf3a71c1601"}
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.728646 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"13486685a923bf2693233900e7c25d8638639ebc4de0d1f20f27c486791e4f53"}
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.728660 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"77762b2b7b3381830ea1b52f38b5c149eed591367db41376905ffbc17a991c9c"}
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.735068 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d3ec8857-5a33-44ea-bdd0-97b343adfc8a/ovsdbserver-nb/0.log"
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.735196 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d3ec8857-5a33-44ea-bdd0-97b343adfc8a","Type":"ContainerDied","Data":"c486e86d620d37448242e0209a1c7bf77c53c9f654c68f860c22b2ce1ff67ce9"}
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.735267 5094 scope.go:117] "RemoveContainer" containerID="d8891902715ab819be070436bd6baa8c209b3026708544fcd16518b5902e4976"
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.735309 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.760650 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7677694455-llk7m" event={"ID":"37533bd5-22b5-4b59-8672-35eaa19b9295","Type":"ContainerDied","Data":"11d132d9afa27137f856e4e3ac63fa1a46eebfeb7ef403dea2957ddcdaf2acba"}
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.760856 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7677694455-llk7m"
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.768721 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-config" (OuterVolumeSpecName: "config") pod "37533bd5-22b5-4b59-8672-35eaa19b9295" (UID: "37533bd5-22b5-4b59-8672-35eaa19b9295"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.783921 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" "
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.783962 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cadd011d-8dde-4346-8608-c5f74376204d-ovsdb-rundir\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.783975 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cadd011d-8dde-4346-8608-c5f74376204d-config\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.783991 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4rzz\" (UniqueName: \"kubernetes.io/projected/cadd011d-8dde-4346-8608-c5f74376204d-kube-api-access-s4rzz\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.784004 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cadd011d-8dde-4346-8608-c5f74376204d-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.784017 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-config\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.795653 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_cadd011d-8dde-4346-8608-c5f74376204d/ovsdbserver-sb/0.log"
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.795751 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"cadd011d-8dde-4346-8608-c5f74376204d","Type":"ContainerDied","Data":"066aae04f2be17dda3fe7b7198c86cbe95ffc64bcd807181570b9426d3daf04a"}
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.795837 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.806944 5094 generic.go:334] "Generic (PLEG): container finished" podID="908e2706-d24f-41c9-b481-4c0d5415c5ca" containerID="459bc2f2d89a8c0984a77dac5551d18970f5268790591ad15bb4ecd73c5d3e57" exitCode=143
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.807061 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" event={"ID":"908e2706-d24f-41c9-b481-4c0d5415c5ca","Type":"ContainerDied","Data":"459bc2f2d89a8c0984a77dac5551d18970f5268790591ad15bb4ecd73c5d3e57"}
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.821407 5094 generic.go:334] "Generic (PLEG): container finished" podID="5d9f1f40-92cc-4f19-9f3b-49651f56bffb" containerID="c53dec831e54b024a954907e297fb4a37cd3b545f865b7a580f6bd56abb0a90d" exitCode=143
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.821564 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc" event={"ID":"5d9f1f40-92cc-4f19-9f3b-49651f56bffb","Type":"ContainerDied","Data":"c53dec831e54b024a954907e297fb4a37cd3b545f865b7a580f6bd56abb0a90d"}
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.827408 5094 scope.go:117] "RemoveContainer" containerID="87a953c9ab5036126a76e97a66b02ef98a2f8720e10ee1bc11a373190ac13d0d"
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.828471 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1fa0-account-create-update-zvvj2" event={"ID":"290e5022-8d17-4415-87c0-07891b0b66f5","Type":"ContainerStarted","Data":"53f6cb6ae1d8a22199f485c77e321317380823b606f56ad3e800d4265b641405"}
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.834753 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.850075 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.866095 5094 generic.go:334] "Generic (PLEG): container finished" podID="530069d2-7146-46eb-9c88-056cc8a583b2" containerID="5cb136f403a952c3f992bdf43a3607fce1325d571764a30723172fab59ca2ce3" exitCode=0
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.866169 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54bd68f77-fkqmr" event={"ID":"530069d2-7146-46eb-9c88-056cc8a583b2","Type":"ContainerDied","Data":"5cb136f403a952c3f992bdf43a3607fce1325d571764a30723172fab59ca2ce3"}
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.872066 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5vc6z" event={"ID":"0330a367-c0c9-42a9-9993-1a3b6775fd3b","Type":"ContainerStarted","Data":"09536db7b188d6d7e190e80f4a66c3a18be0cf58fe509f674089f0d4cd9626eb"}
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.873364 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-6964856c75-f7xdp" podUID="8e30bbcd-c206-4a74-ae52-21462356babf" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.169:8080/healthcheck\": dial tcp 10.217.0.169:8080: connect: connection refused"
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.873746 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-6964856c75-f7xdp" podUID="8e30bbcd-c206-4a74-ae52-21462356babf" containerName="proxy-server" probeResult="failure" output="Get \"https://10.217.0.169:8080/healthcheck\": dial tcp 10.217.0.169:8080: connect: connection refused"
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.875575 5094 generic.go:334] "Generic (PLEG): container finished" podID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" exitCode=0
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.875653 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tj42x" event={"ID":"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9","Type":"ContainerDied","Data":"ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d"}
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.889884 5094 generic.go:334] "Generic (PLEG): container finished" podID="ca3adffb-7baf-45db-ab16-cc1c63510fec" containerID="c34769ddae56119b12d9b51bd88bf7ef48671807437c5ad0d48869446003d663" exitCode=143
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.889963 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ca3adffb-7baf-45db-ab16-cc1c63510fec","Type":"ContainerDied","Data":"c34769ddae56119b12d9b51bd88bf7ef48671807437c5ad0d48869446003d663"}
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.898364 5094 scope.go:117] "RemoveContainer" containerID="fabd932b0a47a0bf51f49258d5ff3d5003b1b88996a45ebe155061bab41c04ff"
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.899212 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "37533bd5-22b5-4b59-8672-35eaa19b9295" (UID: "37533bd5-22b5-4b59-8672-35eaa19b9295"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.906395 5094 generic.go:334] "Generic (PLEG): container finished" podID="d01cbaa4-5543-4cd5-b098-7e4600819d32" containerID="0d80098740f3555dc54ef0c65032de273b3ce866905e15dc8682ab9661be5b23" exitCode=0
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.906507 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d01cbaa4-5543-4cd5-b098-7e4600819d32","Type":"ContainerDied","Data":"0d80098740f3555dc54ef0c65032de273b3ce866905e15dc8682ab9661be5b23"}
Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.969770 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.971290 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.971570 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.980159 5094 generic.go:334] "Generic (PLEG): container finished" podID="8f8cb333-2939-4404-b242-67bcf4e6875b" containerID="e83bab219c26de7197a4c8d483fd96f4ccf30f290122d4606fa75843efcfaa32" exitCode=143
Feb 20 07:11:26 crc kubenswrapper[5094]: I0220 07:11:26.980247 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8f8cb333-2939-4404-b242-67bcf4e6875b","Type":"ContainerDied","Data":"e83bab219c26de7197a4c8d483fd96f4ccf30f290122d4606fa75843efcfaa32"}
Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.980302 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"]
Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.980349 5094 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tj42x" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovsdb-server"
Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.980911 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.989861 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"]
Feb 20 07:11:26 crc kubenswrapper[5094]: E0220 07:11:26.989926 5094 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tj42x" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovs-vswitchd"
Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.001097 5094 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc"
Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.004398 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "37533bd5-22b5-4b59-8672-35eaa19b9295" (UID: "37533bd5-22b5-4b59-8672-35eaa19b9295"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.010308 5094 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:27 crc kubenswrapper[5094]: E0220 07:11:27.010342 5094 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found
Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.010351 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.010371 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 20 07:11:27 crc kubenswrapper[5094]: E0220 07:11:27.010404 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data podName:a829c6b3-7069-4544-90dc-40ae83aba524 nodeName:}" failed. No retries permitted until 2026-02-20 07:11:31.010381914 +0000 UTC m=+1505.883008625 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data") pod "rabbitmq-cell1-server-0" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524") : configmap "rabbitmq-cell1-config-data" not found
Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.018008 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cadd011d-8dde-4346-8608-c5f74376204d" (UID: "cadd011d-8dde-4346-8608-c5f74376204d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.039075 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "37533bd5-22b5-4b59-8672-35eaa19b9295" (UID: "37533bd5-22b5-4b59-8672-35eaa19b9295"). InnerVolumeSpecName "ovsdbserver-sb".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:27 crc kubenswrapper[5094]: E0220 07:11:27.054302 5094 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 20 07:11:27 crc kubenswrapper[5094]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:0f7943e02fbdd3daec1d3db72fa9396bf37ad3fdd6b0f3119c90e29629e095ed,Command:[/bin/sh -c #!/bin/bash Feb 20 07:11:27 crc kubenswrapper[5094]: Feb 20 07:11:27 crc kubenswrapper[5094]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Feb 20 07:11:27 crc kubenswrapper[5094]: Feb 20 07:11:27 crc kubenswrapper[5094]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Feb 20 07:11:27 crc kubenswrapper[5094]: Feb 20 07:11:27 crc kubenswrapper[5094]: MYSQL_CMD="mysql -h -u root -P 3306" Feb 20 07:11:27 crc kubenswrapper[5094]: Feb 20 07:11:27 crc kubenswrapper[5094]: if [ -n "nova_api" ]; then Feb 20 07:11:27 crc kubenswrapper[5094]: GRANT_DATABASE="nova_api" Feb 20 07:11:27 crc kubenswrapper[5094]: else Feb 20 07:11:27 crc kubenswrapper[5094]: GRANT_DATABASE="*" Feb 20 07:11:27 crc kubenswrapper[5094]: fi Feb 20 07:11:27 crc kubenswrapper[5094]: Feb 20 07:11:27 crc kubenswrapper[5094]: # going for maximum compatibility here: Feb 20 07:11:27 crc kubenswrapper[5094]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Feb 20 07:11:27 crc kubenswrapper[5094]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Feb 20 07:11:27 crc kubenswrapper[5094]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Feb 20 07:11:27 crc kubenswrapper[5094]: # support updates Feb 20 07:11:27 crc kubenswrapper[5094]: Feb 20 07:11:27 crc kubenswrapper[5094]: $MYSQL_CMD < logger="UnhandledError" Feb 20 07:11:27 crc kubenswrapper[5094]: E0220 07:11:27.055996 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-d63e-account-create-update-tvf55" podUID="2a1b6a8a-aefe-4f59-8936-f08aed30d8f7" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.087916 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "37533bd5-22b5-4b59-8672-35eaa19b9295" (UID: "37533bd5-22b5-4b59-8672-35eaa19b9295"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.088051 5094 scope.go:117] "RemoveContainer" containerID="a4c4f92b36bb2b7d701dbf8c3f7817a427ce69bddf6bb34e82a6884705e2608c" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.097334 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "cadd011d-8dde-4346-8608-c5f74376204d" (UID: "cadd011d-8dde-4346-8608-c5f74376204d"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.100081 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "cadd011d-8dde-4346-8608-c5f74376204d" (UID: "cadd011d-8dde-4346-8608-c5f74376204d"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.114626 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.114720 5094 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.115014 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37533bd5-22b5-4b59-8672-35eaa19b9295-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.115024 5094 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.115039 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cadd011d-8dde-4346-8608-c5f74376204d-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.220384 5094 scope.go:117] 
"RemoveContainer" containerID="2a6097f5aaeab1082991a6d095181e3f753899eb178f0d829dd1ed8b74f20e47" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.295930 5094 scope.go:117] "RemoveContainer" containerID="333434623dc65ba599c492292fd799a78a2d1d5581438ea036aa6124a9583e68" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.401957 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7677694455-llk7m"] Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.409610 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7677694455-llk7m"] Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.521797 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.528821 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.635006 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1fa0-account-create-update-zvvj2" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.730818 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-497kt\" (UniqueName: \"kubernetes.io/projected/290e5022-8d17-4415-87c0-07891b0b66f5-kube-api-access-497kt\") pod \"290e5022-8d17-4415-87c0-07891b0b66f5\" (UID: \"290e5022-8d17-4415-87c0-07891b0b66f5\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.731030 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/290e5022-8d17-4415-87c0-07891b0b66f5-operator-scripts\") pod \"290e5022-8d17-4415-87c0-07891b0b66f5\" (UID: \"290e5022-8d17-4415-87c0-07891b0b66f5\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.733851 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/290e5022-8d17-4415-87c0-07891b0b66f5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "290e5022-8d17-4415-87c0-07891b0b66f5" (UID: "290e5022-8d17-4415-87c0-07891b0b66f5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.743472 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/290e5022-8d17-4415-87c0-07891b0b66f5-kube-api-access-497kt" (OuterVolumeSpecName: "kube-api-access-497kt") pod "290e5022-8d17-4415-87c0-07891b0b66f5" (UID: "290e5022-8d17-4415-87c0-07891b0b66f5"). InnerVolumeSpecName "kube-api-access-497kt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.834312 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-497kt\" (UniqueName: \"kubernetes.io/projected/290e5022-8d17-4415-87c0-07891b0b66f5-kube-api-access-497kt\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.834361 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/290e5022-8d17-4415-87c0-07891b0b66f5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:27 crc kubenswrapper[5094]: E0220 07:11:27.834476 5094 projected.go:263] Couldn't get secret openstack/swift-proxy-config-data: secret "swift-proxy-config-data" not found Feb 20 07:11:27 crc kubenswrapper[5094]: E0220 07:11:27.834493 5094 projected.go:263] Couldn't get secret openstack/swift-conf: secret "swift-conf" not found Feb 20 07:11:27 crc kubenswrapper[5094]: E0220 07:11:27.834505 5094 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 20 07:11:27 crc kubenswrapper[5094]: E0220 07:11:27.834519 5094 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-proxy-6964856c75-f7xdp: [secret "swift-proxy-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Feb 20 07:11:27 crc kubenswrapper[5094]: E0220 07:11:27.834582 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-etc-swift podName:8e30bbcd-c206-4a74-ae52-21462356babf nodeName:}" failed. No retries permitted until 2026-02-20 07:11:31.834559733 +0000 UTC m=+1506.707186434 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-etc-swift") pod "swift-proxy-6964856c75-f7xdp" (UID: "8e30bbcd-c206-4a74-ae52-21462356babf") : [secret "swift-proxy-config-data" not found, secret "swift-conf" not found, configmap "swift-ring-files" not found] Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.879438 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20ff73f2-0b55-4d81-9342-92dbe47435f0" path="/var/lib/kubelet/pods/20ff73f2-0b55-4d81-9342-92dbe47435f0/volumes" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.880344 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32b230f5-7de4-450c-90e6-e9c18a0d9c0e" path="/var/lib/kubelet/pods/32b230f5-7de4-450c-90e6-e9c18a0d9c0e/volumes" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.881123 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37533bd5-22b5-4b59-8672-35eaa19b9295" path="/var/lib/kubelet/pods/37533bd5-22b5-4b59-8672-35eaa19b9295/volumes" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.882323 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43cfca6d-55e3-431f-b5b8-2b8db44bcee0" path="/var/lib/kubelet/pods/43cfca6d-55e3-431f-b5b8-2b8db44bcee0/volumes" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.882879 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="462ace9b-51c7-4cd0-850a-65d714c5f3b6" path="/var/lib/kubelet/pods/462ace9b-51c7-4cd0-850a-65d714c5f3b6/volumes" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.883579 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c0f5daa-28f1-412d-8749-5b11f6b8f26d" path="/var/lib/kubelet/pods/5c0f5daa-28f1-412d-8749-5b11f6b8f26d/volumes" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.884300 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="74c08d54-fdef-4808-bf52-f8ea0894af36" path="/var/lib/kubelet/pods/74c08d54-fdef-4808-bf52-f8ea0894af36/volumes" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.885589 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75a27624-eac7-47c9-9f3b-98604d88fb3a" path="/var/lib/kubelet/pods/75a27624-eac7-47c9-9f3b-98604d88fb3a/volumes" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.893795 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.904412 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.905749 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ecb5d91-5ba1-457e-af42-0d78c8643250" path="/var/lib/kubelet/pods/8ecb5d91-5ba1-457e-af42-0d78c8643250/volumes" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.907548 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b534507-5d2d-496b-9a60-f0b45e25bb23" path="/var/lib/kubelet/pods/9b534507-5d2d-496b-9a60-f0b45e25bb23/volumes" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.908849 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca32118b-2e77-4484-b753-3467e1ba8df1" path="/var/lib/kubelet/pods/ca32118b-2e77-4484-b753-3467e1ba8df1/volumes" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.910269 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cadd011d-8dde-4346-8608-c5f74376204d" path="/var/lib/kubelet/pods/cadd011d-8dde-4346-8608-c5f74376204d/volumes" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.911387 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3ec8857-5a33-44ea-bdd0-97b343adfc8a" path="/var/lib/kubelet/pods/d3ec8857-5a33-44ea-bdd0-97b343adfc8a/volumes" Feb 
20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.919932 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8ca33ba-f76e-4352-b6f1-54588dd25285" path="/var/lib/kubelet/pods/f8ca33ba-f76e-4352-b6f1-54588dd25285/volumes" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.927769 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.944842 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e30bbcd-c206-4a74-ae52-21462356babf-log-httpd\") pod \"8e30bbcd-c206-4a74-ae52-21462356babf\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.944914 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44cs2\" (UniqueName: \"kubernetes.io/projected/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-kube-api-access-44cs2\") pod \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.944999 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-nova-novncproxy-tls-certs\") pod \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945031 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-combined-ca-bundle\") pod \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945087 5094 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-config-data\") pod \"8e30bbcd-c206-4a74-ae52-21462356babf\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945153 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-vencrypt-tls-certs\") pod \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945223 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-kolla-config\") pod \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945269 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-config-data-generated\") pod \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945321 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk69b\" (UniqueName: \"kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-kube-api-access-lk69b\") pod \"8e30bbcd-c206-4a74-ae52-21462356babf\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945386 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-operator-scripts\") pod 
\"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945419 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-config-data\") pod \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\" (UID: \"3db6d35c-dfd1-4a59-95d3-cc8a99151c12\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945444 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-config-data-default\") pod \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945469 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e30bbcd-c206-4a74-ae52-21462356babf-run-httpd\") pod \"8e30bbcd-c206-4a74-ae52-21462356babf\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945491 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-combined-ca-bundle\") pod \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945512 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-combined-ca-bundle\") pod \"8e30bbcd-c206-4a74-ae52-21462356babf\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945678 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945726 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-internal-tls-certs\") pod \"8e30bbcd-c206-4a74-ae52-21462356babf\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945757 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-public-tls-certs\") pod \"8e30bbcd-c206-4a74-ae52-21462356babf\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945791 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-galera-tls-certs\") pod \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945849 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-etc-swift\") pod \"8e30bbcd-c206-4a74-ae52-21462356babf\" (UID: \"8e30bbcd-c206-4a74-ae52-21462356babf\") " Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.945892 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fss5r\" (UniqueName: \"kubernetes.io/projected/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-kube-api-access-fss5r\") pod \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\" (UID: \"30e79ba9-83fc-4246-9fb2-7136f6ae30a5\") " Feb 20 07:11:27 crc 
kubenswrapper[5094]: I0220 07:11:27.952503 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e30bbcd-c206-4a74-ae52-21462356babf-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8e30bbcd-c206-4a74-ae52-21462356babf" (UID: "8e30bbcd-c206-4a74-ae52-21462356babf"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.954440 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "30e79ba9-83fc-4246-9fb2-7136f6ae30a5" (UID: "30e79ba9-83fc-4246-9fb2-7136f6ae30a5"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.966863 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-kube-api-access-44cs2" (OuterVolumeSpecName: "kube-api-access-44cs2") pod "3db6d35c-dfd1-4a59-95d3-cc8a99151c12" (UID: "3db6d35c-dfd1-4a59-95d3-cc8a99151c12"). InnerVolumeSpecName "kube-api-access-44cs2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.968767 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e30bbcd-c206-4a74-ae52-21462356babf-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8e30bbcd-c206-4a74-ae52-21462356babf" (UID: "8e30bbcd-c206-4a74-ae52-21462356babf"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.969023 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "30e79ba9-83fc-4246-9fb2-7136f6ae30a5" (UID: "30e79ba9-83fc-4246-9fb2-7136f6ae30a5"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.971868 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "30e79ba9-83fc-4246-9fb2-7136f6ae30a5" (UID: "30e79ba9-83fc-4246-9fb2-7136f6ae30a5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.980026 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-kube-api-access-lk69b" (OuterVolumeSpecName: "kube-api-access-lk69b") pod "8e30bbcd-c206-4a74-ae52-21462356babf" (UID: "8e30bbcd-c206-4a74-ae52-21462356babf"). InnerVolumeSpecName "kube-api-access-lk69b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:27 crc kubenswrapper[5094]: I0220 07:11:27.994942 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "30e79ba9-83fc-4246-9fb2-7136f6ae30a5" (UID: "30e79ba9-83fc-4246-9fb2-7136f6ae30a5"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.019271 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-kube-api-access-fss5r" (OuterVolumeSpecName: "kube-api-access-fss5r") pod "30e79ba9-83fc-4246-9fb2-7136f6ae30a5" (UID: "30e79ba9-83fc-4246-9fb2-7136f6ae30a5"). InnerVolumeSpecName "kube-api-access-fss5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.052680 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8e30bbcd-c206-4a74-ae52-21462356babf" (UID: "8e30bbcd-c206-4a74-ae52-21462356babf"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.066619 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-config-data" (OuterVolumeSpecName: "config-data") pod "3db6d35c-dfd1-4a59-95d3-cc8a99151c12" (UID: "3db6d35c-dfd1-4a59-95d3-cc8a99151c12"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.067283 5094 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e30bbcd-c206-4a74-ae52-21462356babf-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.067335 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44cs2\" (UniqueName: \"kubernetes.io/projected/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-kube-api-access-44cs2\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.067346 5094 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.067357 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-config-data-generated\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.067367 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lk69b\" (UniqueName: \"kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-kube-api-access-lk69b\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.067382 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.067391 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-config-data-default\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc 
kubenswrapper[5094]: I0220 07:11:28.068585 5094 generic.go:334] "Generic (PLEG): container finished" podID="3db6d35c-dfd1-4a59-95d3-cc8a99151c12" containerID="2bdc91fbdf4016ef04e3a2baae7cf5e5ae714534246ea953292e2628283eecdf" exitCode=0 Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.068759 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3db6d35c-dfd1-4a59-95d3-cc8a99151c12","Type":"ContainerDied","Data":"2bdc91fbdf4016ef04e3a2baae7cf5e5ae714534246ea953292e2628283eecdf"} Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.068831 5094 scope.go:117] "RemoveContainer" containerID="2bdc91fbdf4016ef04e3a2baae7cf5e5ae714534246ea953292e2628283eecdf" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.069074 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.069221 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3db6d35c-dfd1-4a59-95d3-cc8a99151c12","Type":"ContainerDied","Data":"d1d010e8fd8bc9707a8121e22dcd018ce86f613e9bbcc45a6bd9ab2c3e354582"} Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.070837 5094 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8e30bbcd-c206-4a74-ae52-21462356babf-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.071087 5094 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8e30bbcd-c206-4a74-ae52-21462356babf-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.071119 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fss5r\" (UniqueName: \"kubernetes.io/projected/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-kube-api-access-fss5r\") on node \"crc\" 
DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.077624 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "mysql-db") pod "30e79ba9-83fc-4246-9fb2-7136f6ae30a5" (UID: "30e79ba9-83fc-4246-9fb2-7136f6ae30a5"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.115727 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.178:9292/healthcheck\": read tcp 10.217.0.2:52950->10.217.0.178:9292: read: connection reset by peer" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.115758 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.178:9292/healthcheck\": read tcp 10.217.0.2:52948->10.217.0.178:9292: read: connection reset by peer" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.121529 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "3db6d35c-dfd1-4a59-95d3-cc8a99151c12" (UID: "3db6d35c-dfd1-4a59-95d3-cc8a99151c12"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.123979 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30e79ba9-83fc-4246-9fb2-7136f6ae30a5" (UID: "30e79ba9-83fc-4246-9fb2-7136f6ae30a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.151977 5094 generic.go:334] "Generic (PLEG): container finished" podID="30e79ba9-83fc-4246-9fb2-7136f6ae30a5" containerID="f1ef94c055965af4c60a329875d73bb23ffc248862090105b8c3a3484d931c14" exitCode=0 Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.152102 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"30e79ba9-83fc-4246-9fb2-7136f6ae30a5","Type":"ContainerDied","Data":"f1ef94c055965af4c60a329875d73bb23ffc248862090105b8c3a3484d931c14"} Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.152139 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"30e79ba9-83fc-4246-9fb2-7136f6ae30a5","Type":"ContainerDied","Data":"ab6b9e58f533ca8387a46ffe5e0cb304794c4450b59c803c80417c57e86e76ef"} Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.152233 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.165020 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "3db6d35c-dfd1-4a59-95d3-cc8a99151c12" (UID: "3db6d35c-dfd1-4a59-95d3-cc8a99151c12"). InnerVolumeSpecName "vencrypt-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.176926 5094 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.176954 5094 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.176963 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.176974 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.177000 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.181967 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-1fa0-account-create-update-zvvj2" event={"ID":"290e5022-8d17-4415-87c0-07891b0b66f5","Type":"ContainerDied","Data":"53f6cb6ae1d8a22199f485c77e321317380823b606f56ad3e800d4265b641405"} Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.182121 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-1fa0-account-create-update-zvvj2" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.197415 5094 generic.go:334] "Generic (PLEG): container finished" podID="8e30bbcd-c206-4a74-ae52-21462356babf" containerID="9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c" exitCode=0 Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.197463 5094 generic.go:334] "Generic (PLEG): container finished" podID="8e30bbcd-c206-4a74-ae52-21462356babf" containerID="e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96" exitCode=0 Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.197538 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6964856c75-f7xdp" event={"ID":"8e30bbcd-c206-4a74-ae52-21462356babf","Type":"ContainerDied","Data":"9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c"} Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.197593 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6964856c75-f7xdp" event={"ID":"8e30bbcd-c206-4a74-ae52-21462356babf","Type":"ContainerDied","Data":"e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96"} Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.197605 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6964856c75-f7xdp" event={"ID":"8e30bbcd-c206-4a74-ae52-21462356babf","Type":"ContainerDied","Data":"d9175bae901767e94daa13c4878599492cbc5a434fa253663ae2586b3df20eb3"} Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.197687 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6964856c75-f7xdp" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.256252 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-config-data" (OuterVolumeSpecName: "config-data") pod "8e30bbcd-c206-4a74-ae52-21462356babf" (UID: "8e30bbcd-c206-4a74-ae52-21462356babf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.256851 5094 generic.go:334] "Generic (PLEG): container finished" podID="b90c9110-e4fd-461b-ad2c-a58ff01921d8" containerID="91947cfb40227973fecb8ecdcb4c5ccb4aacad2780a5f8683eda8de6a99c2b2e" exitCode=0 Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.256948 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5b8f9d577d-pgn2k" event={"ID":"b90c9110-e4fd-461b-ad2c-a58ff01921d8","Type":"ContainerDied","Data":"91947cfb40227973fecb8ecdcb4c5ccb4aacad2780a5f8683eda8de6a99c2b2e"} Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.274216 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8e30bbcd-c206-4a74-ae52-21462356babf" (UID: "8e30bbcd-c206-4a74-ae52-21462356babf"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.281070 5094 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.281105 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.286135 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "30e79ba9-83fc-4246-9fb2-7136f6ae30a5" (UID: "30e79ba9-83fc-4246-9fb2-7136f6ae30a5"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.319913 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3db6d35c-dfd1-4a59-95d3-cc8a99151c12" (UID: "3db6d35c-dfd1-4a59-95d3-cc8a99151c12"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.339542 5094 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.354891 5094 generic.go:334] "Generic (PLEG): container finished" podID="0330a367-c0c9-42a9-9993-1a3b6775fd3b" containerID="8d9599986f47c1f116606b523c4cbef0203820a63f129ca2e7a4866193abc8ce" exitCode=1 Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.355002 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5vc6z" event={"ID":"0330a367-c0c9-42a9-9993-1a3b6775fd3b","Type":"ContainerDied","Data":"8d9599986f47c1f116606b523c4cbef0203820a63f129ca2e7a4866193abc8ce"} Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.355777 5094 scope.go:117] "RemoveContainer" containerID="8d9599986f47c1f116606b523c4cbef0203820a63f129ca2e7a4866193abc8ce" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.386011 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8e30bbcd-c206-4a74-ae52-21462356babf" (UID: "8e30bbcd-c206-4a74-ae52-21462356babf"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.398871 5094 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.398910 5094 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.398923 5094 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/30e79ba9-83fc-4246-9fb2-7136f6ae30a5-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.398933 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db6d35c-dfd1-4a59-95d3-cc8a99151c12-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.484811 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e30bbcd-c206-4a74-ae52-21462356babf" (UID: "8e30bbcd-c206-4a74-ae52-21462356babf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.505675 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e30bbcd-c206-4a74-ae52-21462356babf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.629371 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="762a565c-672e-4127-a8c6-90f721eeda81" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.177:9292/healthcheck\": read tcp 10.217.0.2:47654->10.217.0.177:9292: read: connection reset by peer" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.629807 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="762a565c-672e-4127-a8c6-90f721eeda81" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.177:9292/healthcheck\": read tcp 10.217.0.2:47664->10.217.0.177:9292: read: connection reset by peer" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.634738 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="f11aa87b-3964-4a62-871f-bdf7d1ad7848" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": read tcp 10.217.0.2:34896->10.217.0.204:8775: read: connection reset by peer" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.634918 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="f11aa87b-3964-4a62-871f-bdf7d1ad7848" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": read tcp 10.217.0.2:34910->10.217.0.204:8775: read: connection reset by peer" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.832412 5094 scope.go:117] "RemoveContainer" 
containerID="2bdc91fbdf4016ef04e3a2baae7cf5e5ae714534246ea953292e2628283eecdf" Feb 20 07:11:28 crc kubenswrapper[5094]: E0220 07:11:28.832904 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bdc91fbdf4016ef04e3a2baae7cf5e5ae714534246ea953292e2628283eecdf\": container with ID starting with 2bdc91fbdf4016ef04e3a2baae7cf5e5ae714534246ea953292e2628283eecdf not found: ID does not exist" containerID="2bdc91fbdf4016ef04e3a2baae7cf5e5ae714534246ea953292e2628283eecdf" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.832936 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bdc91fbdf4016ef04e3a2baae7cf5e5ae714534246ea953292e2628283eecdf"} err="failed to get container status \"2bdc91fbdf4016ef04e3a2baae7cf5e5ae714534246ea953292e2628283eecdf\": rpc error: code = NotFound desc = could not find container \"2bdc91fbdf4016ef04e3a2baae7cf5e5ae714534246ea953292e2628283eecdf\": container with ID starting with 2bdc91fbdf4016ef04e3a2baae7cf5e5ae714534246ea953292e2628283eecdf not found: ID does not exist" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.832974 5094 scope.go:117] "RemoveContainer" containerID="f1ef94c055965af4c60a329875d73bb23ffc248862090105b8c3a3484d931c14" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.848917 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" podUID="908e2706-d24f-41c9-b481-4c0d5415c5ca" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.159:9311/healthcheck\": read tcp 10.217.0.2:52130->10.217.0.159:9311: read: connection reset by peer" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.849226 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" podUID="908e2706-d24f-41c9-b481-4c0d5415c5ca" containerName="barbican-api" probeResult="failure" output="Get 
\"https://10.217.0.159:9311/healthcheck\": read tcp 10.217.0.2:52138->10.217.0.159:9311: read: connection reset by peer" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.873250 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.901069 5094 scope.go:117] "RemoveContainer" containerID="1a298d17b5fd33f9f36c09a0339c08e7faff16875504e0c410ecbbdc1176b9c8" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.924363 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxpk6\" (UniqueName: \"kubernetes.io/projected/b90c9110-e4fd-461b-ad2c-a58ff01921d8-kube-api-access-qxpk6\") pod \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.924614 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-internal-tls-certs\") pod \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.924681 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-combined-ca-bundle\") pod \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.924890 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-public-tls-certs\") pod \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.924955 5094 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-config-data\") pod \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.924980 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-scripts\") pod \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.924997 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b90c9110-e4fd-461b-ad2c-a58ff01921d8-logs\") pod \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\" (UID: \"b90c9110-e4fd-461b-ad2c-a58ff01921d8\") " Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.927006 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b90c9110-e4fd-461b-ad2c-a58ff01921d8-logs" (OuterVolumeSpecName: "logs") pod "b90c9110-e4fd-461b-ad2c-a58ff01921d8" (UID: "b90c9110-e4fd-461b-ad2c-a58ff01921d8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.944662 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-scripts" (OuterVolumeSpecName: "scripts") pod "b90c9110-e4fd-461b-ad2c-a58ff01921d8" (UID: "b90c9110-e4fd-461b-ad2c-a58ff01921d8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:28 crc kubenswrapper[5094]: I0220 07:11:28.944879 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b90c9110-e4fd-461b-ad2c-a58ff01921d8-kube-api-access-qxpk6" (OuterVolumeSpecName: "kube-api-access-qxpk6") pod "b90c9110-e4fd-461b-ad2c-a58ff01921d8" (UID: "b90c9110-e4fd-461b-ad2c-a58ff01921d8"). InnerVolumeSpecName "kube-api-access-qxpk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.033251 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b90c9110-e4fd-461b-ad2c-a58ff01921d8-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.033751 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.033765 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxpk6\" (UniqueName: \"kubernetes.io/projected/b90c9110-e4fd-461b-ad2c-a58ff01921d8-kube-api-access-qxpk6\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.042194 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b90c9110-e4fd-461b-ad2c-a58ff01921d8" (UID: "b90c9110-e4fd-461b-ad2c-a58ff01921d8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.044264 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.051390 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.105244 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-config-data" (OuterVolumeSpecName: "config-data") pod "b90c9110-e4fd-461b-ad2c-a58ff01921d8" (UID: "b90c9110-e4fd-461b-ad2c-a58ff01921d8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.122202 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-1fa0-account-create-update-zvvj2"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.144120 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.144150 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.174004 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-1fa0-account-create-update-zvvj2"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.210882 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.211212 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="ceilometer-central-agent" containerID="cri-o://f580b2199d149d5c06b5c9bfb9b0ef745161fcfd54354c074e1bf52afa2fbbe4" gracePeriod=30 Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.214231 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="proxy-httpd" containerID="cri-o://4073373c0648687276e84e986260ddd367c34f5129c119e3fce3d4726ca86b92" gracePeriod=30 Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.214321 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="sg-core" containerID="cri-o://0ea214df3349d0a0166b8f5bd5c44fd7b531097f01bbd476b5f80570546b706a" gracePeriod=30 Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.214377 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="ceilometer-notification-agent" containerID="cri-o://b54b09ff79204e43835570c3e6dd4b61080c0d99a2704ffb7699787efee4998c" gracePeriod=30 Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.233406 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.241296 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "b90c9110-e4fd-461b-ad2c-a58ff01921d8" (UID: "b90c9110-e4fd-461b-ad2c-a58ff01921d8"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.246259 5094 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.265901 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.296826 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.297191 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="1fe9db54-4204-4335-a272-c469e0923478" containerName="kube-state-metrics" containerID="cri-o://0fcfeb4fdfabbbcb036fd6cf631f390dac9fcd6e68173a7af3e76810e7a92db9" gracePeriod=30 Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.325186 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-6964856c75-f7xdp"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.344211 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="8f8cb333-2939-4404-b242-67bcf4e6875b" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.163:8776/healthcheck\": dial tcp 10.217.0.163:8776: connect: connection refused" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.362238 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-6964856c75-f7xdp"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.387179 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b90c9110-e4fd-461b-ad2c-a58ff01921d8" (UID: 
"b90c9110-e4fd-461b-ad2c-a58ff01921d8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.390119 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.390469 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="7dd0ff85-ae3a-4035-a096-fea5952b19a7" containerName="memcached" containerID="cri-o://631fbada038605f5c51cfb450102e200cb6c919e6e8732fe4d271c4b0f19cc19" gracePeriod=30 Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.424613 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6cc2-account-create-update-vqjjf"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.455553 5094 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b90c9110-e4fd-461b-ad2c-a58ff01921d8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.466373 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-6cc2-account-create-update-vqjjf"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.473147 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6cc2-account-create-update-98lt6"] Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.473634 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b90c9110-e4fd-461b-ad2c-a58ff01921d8" containerName="placement-log" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.473648 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b90c9110-e4fd-461b-ad2c-a58ff01921d8" containerName="placement-log" Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.473664 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cadd011d-8dde-4346-8608-c5f74376204d" 
containerName="ovsdbserver-sb" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.473670 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="cadd011d-8dde-4346-8608-c5f74376204d" containerName="ovsdbserver-sb" Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.473683 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ecb5d91-5ba1-457e-af42-0d78c8643250" containerName="ovn-controller" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.473689 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ecb5d91-5ba1-457e-af42-0d78c8643250" containerName="ovn-controller" Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.473913 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37533bd5-22b5-4b59-8672-35eaa19b9295" containerName="init" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.473923 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="37533bd5-22b5-4b59-8672-35eaa19b9295" containerName="init" Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.473945 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3ec8857-5a33-44ea-bdd0-97b343adfc8a" containerName="openstack-network-exporter" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.473951 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ec8857-5a33-44ea-bdd0-97b343adfc8a" containerName="openstack-network-exporter" Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.473968 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cadd011d-8dde-4346-8608-c5f74376204d" containerName="openstack-network-exporter" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.473999 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="cadd011d-8dde-4346-8608-c5f74376204d" containerName="openstack-network-exporter" Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.474012 5094 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d3ec8857-5a33-44ea-bdd0-97b343adfc8a" containerName="ovsdbserver-nb" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474020 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3ec8857-5a33-44ea-bdd0-97b343adfc8a" containerName="ovsdbserver-nb" Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.474028 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e30bbcd-c206-4a74-ae52-21462356babf" containerName="proxy-httpd" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474034 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e30bbcd-c206-4a74-ae52-21462356babf" containerName="proxy-httpd" Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.474046 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30e79ba9-83fc-4246-9fb2-7136f6ae30a5" containerName="galera" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474052 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="30e79ba9-83fc-4246-9fb2-7136f6ae30a5" containerName="galera" Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.474068 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b90c9110-e4fd-461b-ad2c-a58ff01921d8" containerName="placement-api" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474074 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b90c9110-e4fd-461b-ad2c-a58ff01921d8" containerName="placement-api" Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.474085 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca32118b-2e77-4484-b753-3467e1ba8df1" containerName="openstack-network-exporter" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474092 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca32118b-2e77-4484-b753-3467e1ba8df1" containerName="openstack-network-exporter" Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.474101 5094 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="37533bd5-22b5-4b59-8672-35eaa19b9295" containerName="dnsmasq-dns" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474107 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="37533bd5-22b5-4b59-8672-35eaa19b9295" containerName="dnsmasq-dns" Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.474114 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3db6d35c-dfd1-4a59-95d3-cc8a99151c12" containerName="nova-cell1-novncproxy-novncproxy" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474122 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db6d35c-dfd1-4a59-95d3-cc8a99151c12" containerName="nova-cell1-novncproxy-novncproxy" Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.474135 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e30bbcd-c206-4a74-ae52-21462356babf" containerName="proxy-server" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474141 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e30bbcd-c206-4a74-ae52-21462356babf" containerName="proxy-server" Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.474152 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30e79ba9-83fc-4246-9fb2-7136f6ae30a5" containerName="mysql-bootstrap" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474166 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="30e79ba9-83fc-4246-9fb2-7136f6ae30a5" containerName="mysql-bootstrap" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474334 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="cadd011d-8dde-4346-8608-c5f74376204d" containerName="openstack-network-exporter" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474348 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca32118b-2e77-4484-b753-3467e1ba8df1" containerName="openstack-network-exporter" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474360 5094 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="37533bd5-22b5-4b59-8672-35eaa19b9295" containerName="dnsmasq-dns" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474370 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="3db6d35c-dfd1-4a59-95d3-cc8a99151c12" containerName="nova-cell1-novncproxy-novncproxy" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474381 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b90c9110-e4fd-461b-ad2c-a58ff01921d8" containerName="placement-api" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474393 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3ec8857-5a33-44ea-bdd0-97b343adfc8a" containerName="ovsdbserver-nb" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474402 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e30bbcd-c206-4a74-ae52-21462356babf" containerName="proxy-httpd" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474412 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3ec8857-5a33-44ea-bdd0-97b343adfc8a" containerName="openstack-network-exporter" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474420 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="30e79ba9-83fc-4246-9fb2-7136f6ae30a5" containerName="galera" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474429 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e30bbcd-c206-4a74-ae52-21462356babf" containerName="proxy-server" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474441 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b90c9110-e4fd-461b-ad2c-a58ff01921d8" containerName="placement-log" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.474452 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ecb5d91-5ba1-457e-af42-0d78c8643250" containerName="ovn-controller" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 
07:11:29.474462 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="cadd011d-8dde-4346-8608-c5f74376204d" containerName="ovsdbserver-sb" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.475140 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6cc2-account-create-update-98lt6" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.477581 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.492371 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6cc2-account-create-update-98lt6"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.500921 5094 generic.go:334] "Generic (PLEG): container finished" podID="f11aa87b-3964-4a62-871f-bdf7d1ad7848" containerID="50299a2c387cfc7c5adc90764eae8fbdc420d6bc7964e0d04844da05fc246e7d" exitCode=0 Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.501057 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f11aa87b-3964-4a62-871f-bdf7d1ad7848","Type":"ContainerDied","Data":"50299a2c387cfc7c5adc90764eae8fbdc420d6bc7964e0d04844da05fc246e7d"} Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.515786 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xnj8t"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.529082 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xnj8t"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.544463 5094 generic.go:334] "Generic (PLEG): container finished" podID="cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" containerID="5c666f7fe1bbd7b65dc971fe96a85660b5d4b37560d1db6e6ea2ac21f9782b5c" exitCode=0 Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.545000 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-plbtm"] Feb 20 07:11:29 
crc kubenswrapper[5094]: I0220 07:11:29.545037 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4","Type":"ContainerDied","Data":"5c666f7fe1bbd7b65dc971fe96a85660b5d4b37560d1db6e6ea2ac21f9782b5c"} Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.547759 5094 scope.go:117] "RemoveContainer" containerID="f1ef94c055965af4c60a329875d73bb23ffc248862090105b8c3a3484d931c14" Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.557820 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1ef94c055965af4c60a329875d73bb23ffc248862090105b8c3a3484d931c14\": container with ID starting with f1ef94c055965af4c60a329875d73bb23ffc248862090105b8c3a3484d931c14 not found: ID does not exist" containerID="f1ef94c055965af4c60a329875d73bb23ffc248862090105b8c3a3484d931c14" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.557874 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1ef94c055965af4c60a329875d73bb23ffc248862090105b8c3a3484d931c14"} err="failed to get container status \"f1ef94c055965af4c60a329875d73bb23ffc248862090105b8c3a3484d931c14\": rpc error: code = NotFound desc = could not find container \"f1ef94c055965af4c60a329875d73bb23ffc248862090105b8c3a3484d931c14\": container with ID starting with f1ef94c055965af4c60a329875d73bb23ffc248862090105b8c3a3484d931c14 not found: ID does not exist" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.557908 5094 scope.go:117] "RemoveContainer" containerID="1a298d17b5fd33f9f36c09a0339c08e7faff16875504e0c410ecbbdc1176b9c8" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.559278 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-operator-scripts\") pod 
\"keystone-6cc2-account-create-update-98lt6\" (UID: \"c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6\") " pod="openstack/keystone-6cc2-account-create-update-98lt6" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.559351 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvlxh\" (UniqueName: \"kubernetes.io/projected/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-kube-api-access-tvlxh\") pod \"keystone-6cc2-account-create-update-98lt6\" (UID: \"c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6\") " pod="openstack/keystone-6cc2-account-create-update-98lt6" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.562502 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-plbtm"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.569462 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.572925 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a298d17b5fd33f9f36c09a0339c08e7faff16875504e0c410ecbbdc1176b9c8\": container with ID starting with 1a298d17b5fd33f9f36c09a0339c08e7faff16875504e0c410ecbbdc1176b9c8 not found: ID does not exist" containerID="1a298d17b5fd33f9f36c09a0339c08e7faff16875504e0c410ecbbdc1176b9c8" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.572977 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a298d17b5fd33f9f36c09a0339c08e7faff16875504e0c410ecbbdc1176b9c8"} err="failed to get container status \"1a298d17b5fd33f9f36c09a0339c08e7faff16875504e0c410ecbbdc1176b9c8\": rpc error: code = NotFound desc = could not find container \"1a298d17b5fd33f9f36c09a0339c08e7faff16875504e0c410ecbbdc1176b9c8\": container with ID starting with 1a298d17b5fd33f9f36c09a0339c08e7faff16875504e0c410ecbbdc1176b9c8 not found: ID does not exist" Feb 20 07:11:29 crc 
kubenswrapper[5094]: I0220 07:11:29.573013 5094 scope.go:117] "RemoveContainer" containerID="9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.584533 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-55468cd684-wv6dn"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.584813 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-55468cd684-wv6dn" podUID="732b4015-53b2-4422-b7d1-12b65f6e0c92" containerName="keystone-api" containerID="cri-o://d6784972d1434d152307a7464021eb59a6a508fa634f71c81ae53ac06db7b051" gracePeriod=30 Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.608277 5094 generic.go:334] "Generic (PLEG): container finished" podID="5d9f1f40-92cc-4f19-9f3b-49651f56bffb" containerID="b35c284d78ecbe2f2a62876408599b6c0ef8f8ad565bfbbe236f98afa60d9b08" exitCode=0 Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.608371 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc" event={"ID":"5d9f1f40-92cc-4f19-9f3b-49651f56bffb","Type":"ContainerDied","Data":"b35c284d78ecbe2f2a62876408599b6c0ef8f8ad565bfbbe236f98afa60d9b08"} Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.608723 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d82cb6bb5b6363e25ca9e64db1ea09c7494c05c93392e549c89755977ef55244" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.615319 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d82cb6bb5b6363e25ca9e64db1ea09c7494c05c93392e549c89755977ef55244" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.619481 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d82cb6bb5b6363e25ca9e64db1ea09c7494c05c93392e549c89755977ef55244" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.619517 5094 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="f3caa33a-a0ec-4fdc-876b-266724a5af50" containerName="nova-cell1-conductor-conductor" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.619508 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-kbc28"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.627176 5094 generic.go:334] "Generic (PLEG): container finished" podID="92877559-6960-4dbf-890a-fb563f4b0bf8" containerID="d397c0695b1183c0c759491ae377b37ffc66032936b01b69dbc1bd7fedbcab31" exitCode=0 Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.627283 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7df9984bd9-6txsf" event={"ID":"92877559-6960-4dbf-890a-fb563f4b0bf8","Type":"ContainerDied","Data":"d397c0695b1183c0c759491ae377b37ffc66032936b01b69dbc1bd7fedbcab31"} Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.633556 5094 generic.go:334] "Generic (PLEG): container finished" podID="1218d679-0e51-4bef-9526-db16c8783d8b" containerID="0ea214df3349d0a0166b8f5bd5c44fd7b531097f01bbd476b5f80570546b706a" exitCode=2 Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.633632 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1218d679-0e51-4bef-9526-db16c8783d8b","Type":"ContainerDied","Data":"0ea214df3349d0a0166b8f5bd5c44fd7b531097f01bbd476b5f80570546b706a"} Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.636886 5094 generic.go:334] "Generic (PLEG): container finished" podID="ca3adffb-7baf-45db-ab16-cc1c63510fec" containerID="92a9942e0db46da8477438409c4da3d36ea534be827434e466a4bec7f4c990ab" exitCode=0 Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.636953 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ca3adffb-7baf-45db-ab16-cc1c63510fec","Type":"ContainerDied","Data":"92a9942e0db46da8477438409c4da3d36ea534be827434e466a4bec7f4c990ab"} Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.655517 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d63e-account-create-update-tvf55" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.661486 5094 generic.go:334] "Generic (PLEG): container finished" podID="762a565c-672e-4127-a8c6-90f721eeda81" containerID="d1f6db0391d91d17006e26254d5724c2e1250f967872ca29f449b7f22386e51b" exitCode=0 Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.661575 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"762a565c-672e-4127-a8c6-90f721eeda81","Type":"ContainerDied","Data":"d1f6db0391d91d17006e26254d5724c2e1250f967872ca29f449b7f22386e51b"} Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.661765 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.667776 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-operator-scripts\") pod \"keystone-6cc2-account-create-update-98lt6\" (UID: \"c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6\") " pod="openstack/keystone-6cc2-account-create-update-98lt6" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.667855 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvlxh\" (UniqueName: \"kubernetes.io/projected/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-kube-api-access-tvlxh\") pod \"keystone-6cc2-account-create-update-98lt6\" (UID: \"c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6\") " pod="openstack/keystone-6cc2-account-create-update-98lt6" Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.668211 5094 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.668273 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-operator-scripts podName:c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6 nodeName:}" failed. No retries permitted until 2026-02-20 07:11:30.168252959 +0000 UTC m=+1505.040879670 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-operator-scripts") pod "keystone-6cc2-account-create-update-98lt6" (UID: "c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6") : configmap "openstack-scripts" not found Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.669323 5094 generic.go:334] "Generic (PLEG): container finished" podID="908e2706-d24f-41c9-b481-4c0d5415c5ca" containerID="a9359fe0f2ca547fd93214e43c994a9c0538ad96741f69a95262122bb7dc4d7b" exitCode=0 Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.672971 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" event={"ID":"908e2706-d24f-41c9-b481-4c0d5415c5ca","Type":"ContainerDied","Data":"a9359fe0f2ca547fd93214e43c994a9c0538ad96741f69a95262122bb7dc4d7b"} Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.679241 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-kbc28"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.676227 5094 scope.go:117] "RemoveContainer" containerID="e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96" Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.683012 5094 projected.go:194] Error preparing data for projected volume kube-api-access-tvlxh for pod openstack/keystone-6cc2-account-create-update-98lt6: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.683120 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-kube-api-access-tvlxh podName:c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6 nodeName:}" failed. No retries permitted until 2026-02-20 07:11:30.183088784 +0000 UTC m=+1505.055715495 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tvlxh" (UniqueName: "kubernetes.io/projected/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-kube-api-access-tvlxh") pod "keystone-6cc2-account-create-update-98lt6" (UID: "c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.706559 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5b8f9d577d-pgn2k" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.706645 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5b8f9d577d-pgn2k" event={"ID":"b90c9110-e4fd-461b-ad2c-a58ff01921d8","Type":"ContainerDied","Data":"b019c482612055cc0918048f8a12a69d96710169b8244b6ca81050099107cecc"} Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.727400 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6cc2-account-create-update-98lt6"] Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.735120 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-tvlxh operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-6cc2-account-create-update-98lt6" podUID="c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.765098 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-5vc6z"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.769420 5094 scope.go:117] "RemoveContainer" containerID="9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.770388 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gq8dx\" (UniqueName: \"kubernetes.io/projected/f11aa87b-3964-4a62-871f-bdf7d1ad7848-kube-api-access-gq8dx\") pod 
\"f11aa87b-3964-4a62-871f-bdf7d1ad7848\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.770453 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f11aa87b-3964-4a62-871f-bdf7d1ad7848-logs\") pod \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.770538 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a1b6a8a-aefe-4f59-8936-f08aed30d8f7-operator-scripts\") pod \"2a1b6a8a-aefe-4f59-8936-f08aed30d8f7\" (UID: \"2a1b6a8a-aefe-4f59-8936-f08aed30d8f7\") " Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.770569 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-nova-metadata-tls-certs\") pod \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.770683 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-config-data\") pod \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.770721 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-combined-ca-bundle\") pod \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\" (UID: \"f11aa87b-3964-4a62-871f-bdf7d1ad7848\") " Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.770802 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-r7nxd\" (UniqueName: \"kubernetes.io/projected/2a1b6a8a-aefe-4f59-8936-f08aed30d8f7-kube-api-access-r7nxd\") pod \"2a1b6a8a-aefe-4f59-8936-f08aed30d8f7\" (UID: \"2a1b6a8a-aefe-4f59-8936-f08aed30d8f7\") " Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.779222 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc" Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.779348 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c\": container with ID starting with 9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c not found: ID does not exist" containerID="9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.779379 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c"} err="failed to get container status \"9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c\": rpc error: code = NotFound desc = could not find container \"9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c\": container with ID starting with 9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c not found: ID does not exist" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.779409 5094 scope.go:117] "RemoveContainer" containerID="e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.781438 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a1b6a8a-aefe-4f59-8936-f08aed30d8f7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2a1b6a8a-aefe-4f59-8936-f08aed30d8f7" (UID: 
"2a1b6a8a-aefe-4f59-8936-f08aed30d8f7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:29 crc kubenswrapper[5094]: E0220 07:11:29.781542 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96\": container with ID starting with e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96 not found: ID does not exist" containerID="e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.781570 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96"} err="failed to get container status \"e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96\": rpc error: code = NotFound desc = could not find container \"e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96\": container with ID starting with e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96 not found: ID does not exist" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.781587 5094 scope.go:117] "RemoveContainer" containerID="9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.782556 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f11aa87b-3964-4a62-871f-bdf7d1ad7848-logs" (OuterVolumeSpecName: "logs") pod "f11aa87b-3964-4a62-871f-bdf7d1ad7848" (UID: "f11aa87b-3964-4a62-871f-bdf7d1ad7848"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.793483 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a1b6a8a-aefe-4f59-8936-f08aed30d8f7-kube-api-access-r7nxd" (OuterVolumeSpecName: "kube-api-access-r7nxd") pod "2a1b6a8a-aefe-4f59-8936-f08aed30d8f7" (UID: "2a1b6a8a-aefe-4f59-8936-f08aed30d8f7"). InnerVolumeSpecName "kube-api-access-r7nxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.793605 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c"} err="failed to get container status \"9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c\": rpc error: code = NotFound desc = could not find container \"9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c\": container with ID starting with 9c55371c9ce95c67d6ad66dc073054f663e0d346ff30a9a243f2a1ced82a302c not found: ID does not exist" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.793675 5094 scope.go:117] "RemoveContainer" containerID="e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.793915 5094 generic.go:334] "Generic (PLEG): container finished" podID="8f8cb333-2939-4404-b242-67bcf4e6875b" containerID="d19cd01d15b92dc7454d6cf3fb5973b463816ef9279ed3d67fe67ccb7ea9cc15" exitCode=0 Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.793977 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8f8cb333-2939-4404-b242-67bcf4e6875b","Type":"ContainerDied","Data":"d19cd01d15b92dc7454d6cf3fb5973b463816ef9279ed3d67fe67ccb7ea9cc15"} Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.806153 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.806278 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96"} err="failed to get container status \"e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96\": rpc error: code = NotFound desc = could not find container \"e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96\": container with ID starting with e86e0d4565ee4890c6f4c219582f6adefb1fd06ffe5ccee10384f63917ca7d96 not found: ID does not exist" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.806309 5094 scope.go:117] "RemoveContainer" containerID="91947cfb40227973fecb8ecdcb4c5ccb4aacad2780a5f8683eda8de6a99c2b2e" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.813897 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5b8f9d577d-pgn2k"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.818551 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f11aa87b-3964-4a62-871f-bdf7d1ad7848-kube-api-access-gq8dx" (OuterVolumeSpecName: "kube-api-access-gq8dx") pod "f11aa87b-3964-4a62-871f-bdf7d1ad7848" (UID: "f11aa87b-3964-4a62-871f-bdf7d1ad7848"). InnerVolumeSpecName "kube-api-access-gq8dx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.820644 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5b8f9d577d-pgn2k"] Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.850751 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-config-data" (OuterVolumeSpecName: "config-data") pod "f11aa87b-3964-4a62-871f-bdf7d1ad7848" (UID: "f11aa87b-3964-4a62-871f-bdf7d1ad7848"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.874568 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bf7m\" (UniqueName: \"kubernetes.io/projected/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-kube-api-access-9bf7m\") pod \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.874616 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-logs\") pod \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.874650 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8j56\" (UniqueName: \"kubernetes.io/projected/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-kube-api-access-z8j56\") pod \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.874754 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-config-data\") pod \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.874791 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-combined-ca-bundle\") pod \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.874824 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-logs\") pod \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.874867 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-config-data\") pod \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.874932 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-config-data-custom\") pod \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.874950 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-combined-ca-bundle\") pod \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\" (UID: \"5d9f1f40-92cc-4f19-9f3b-49651f56bffb\") " Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.876368 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-logs" (OuterVolumeSpecName: "logs") pod "5d9f1f40-92cc-4f19-9f3b-49651f56bffb" (UID: "5d9f1f40-92cc-4f19-9f3b-49651f56bffb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.877402 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dd5a2da-23e3-4bbd-a211-b3f527e4e28b" path="/var/lib/kubelet/pods/0dd5a2da-23e3-4bbd-a211-b3f527e4e28b/volumes" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.878118 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="290e5022-8d17-4415-87c0-07891b0b66f5" path="/var/lib/kubelet/pods/290e5022-8d17-4415-87c0-07891b0b66f5/volumes" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.878529 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-logs" (OuterVolumeSpecName: "logs") pod "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" (UID: "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.878603 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30e79ba9-83fc-4246-9fb2-7136f6ae30a5" path="/var/lib/kubelet/pods/30e79ba9-83fc-4246-9fb2-7136f6ae30a5/volumes" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.879806 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3db6d35c-dfd1-4a59-95d3-cc8a99151c12" path="/var/lib/kubelet/pods/3db6d35c-dfd1-4a59-95d3-cc8a99151c12/volumes" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.880347 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="403a4371-09f4-4206-8d60-5b970d7e4faf" path="/var/lib/kubelet/pods/403a4371-09f4-4206-8d60-5b970d7e4faf/volumes" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.880912 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="876bc507-6cf2-466a-9cd3-6131a1cc590e" path="/var/lib/kubelet/pods/876bc507-6cf2-466a-9cd3-6131a1cc590e/volumes" Feb 20 07:11:29 crc 
kubenswrapper[5094]: I0220 07:11:29.881430 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e30bbcd-c206-4a74-ae52-21462356babf" path="/var/lib/kubelet/pods/8e30bbcd-c206-4a74-ae52-21462356babf/volumes" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.882187 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-scripts\") pod \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.882214 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-public-tls-certs\") pod \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.882249 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-httpd-run\") pod \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.882297 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\" (UID: \"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4\") " Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.882852 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.882869 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.882880 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7nxd\" (UniqueName: \"kubernetes.io/projected/2a1b6a8a-aefe-4f59-8936-f08aed30d8f7-kube-api-access-r7nxd\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.882893 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gq8dx\" (UniqueName: \"kubernetes.io/projected/f11aa87b-3964-4a62-871f-bdf7d1ad7848-kube-api-access-gq8dx\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.882902 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.882910 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f11aa87b-3964-4a62-871f-bdf7d1ad7848-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.882919 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a1b6a8a-aefe-4f59-8936-f08aed30d8f7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.884448 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0e18d8b-2657-4e87-b6ca-009df89bbac8" path="/var/lib/kubelet/pods/a0e18d8b-2657-4e87-b6ca-009df89bbac8/volumes" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.885014 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b90c9110-e4fd-461b-ad2c-a58ff01921d8" path="/var/lib/kubelet/pods/b90c9110-e4fd-461b-ad2c-a58ff01921d8/volumes" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 
07:11:29.887062 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" (UID: "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.898578 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-kube-api-access-9bf7m" (OuterVolumeSpecName: "kube-api-access-9bf7m") pod "5d9f1f40-92cc-4f19-9f3b-49651f56bffb" (UID: "5d9f1f40-92cc-4f19-9f3b-49651f56bffb"). InnerVolumeSpecName "kube-api-access-9bf7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.898788 5094 scope.go:117] "RemoveContainer" containerID="d1984bf7ea49205773f404c2e26dd668c4e64dc0f98adc5a3b111f4efeded4ec" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.904204 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" (UID: "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.918152 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-kube-api-access-z8j56" (OuterVolumeSpecName: "kube-api-access-z8j56") pod "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" (UID: "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4"). InnerVolumeSpecName "kube-api-access-z8j56". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.942541 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5d9f1f40-92cc-4f19-9f3b-49651f56bffb" (UID: "5d9f1f40-92cc-4f19-9f3b-49651f56bffb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.946791 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-scripts" (OuterVolumeSpecName: "scripts") pod "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" (UID: "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.984954 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "f11aa87b-3964-4a62-871f-bdf7d1ad7848" (UID: "f11aa87b-3964-4a62-871f-bdf7d1ad7848"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.985946 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.985978 5094 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.986014 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.986024 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bf7m\" (UniqueName: \"kubernetes.io/projected/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-kube-api-access-9bf7m\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.986037 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8j56\" (UniqueName: \"kubernetes.io/projected/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-kube-api-access-z8j56\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.986047 5094 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:29 crc kubenswrapper[5094]: I0220 07:11:29.986057 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.014936 
5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.091752 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-public-tls-certs\") pod \"908e2706-d24f-41c9-b481-4c0d5415c5ca\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.091896 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ns5d\" (UniqueName: \"kubernetes.io/projected/908e2706-d24f-41c9-b481-4c0d5415c5ca-kube-api-access-4ns5d\") pod \"908e2706-d24f-41c9-b481-4c0d5415c5ca\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.091968 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-config-data-custom\") pod \"908e2706-d24f-41c9-b481-4c0d5415c5ca\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.092006 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-config-data\") pod \"908e2706-d24f-41c9-b481-4c0d5415c5ca\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.092026 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-combined-ca-bundle\") pod \"908e2706-d24f-41c9-b481-4c0d5415c5ca\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.092095 5094 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-internal-tls-certs\") pod \"908e2706-d24f-41c9-b481-4c0d5415c5ca\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.092147 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/908e2706-d24f-41c9-b481-4c0d5415c5ca-logs\") pod \"908e2706-d24f-41c9-b481-4c0d5415c5ca\" (UID: \"908e2706-d24f-41c9-b481-4c0d5415c5ca\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.093495 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/908e2706-d24f-41c9-b481-4c0d5415c5ca-logs" (OuterVolumeSpecName: "logs") pod "908e2706-d24f-41c9-b481-4c0d5415c5ca" (UID: "908e2706-d24f-41c9-b481-4c0d5415c5ca"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.095712 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-config-data" (OuterVolumeSpecName: "config-data") pod "5d9f1f40-92cc-4f19-9f3b-49651f56bffb" (UID: "5d9f1f40-92cc-4f19-9f3b-49651f56bffb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.097918 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "908e2706-d24f-41c9-b481-4c0d5415c5ca" (UID: "908e2706-d24f-41c9-b481-4c0d5415c5ca"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.099322 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/908e2706-d24f-41c9-b481-4c0d5415c5ca-kube-api-access-4ns5d" (OuterVolumeSpecName: "kube-api-access-4ns5d") pod "908e2706-d24f-41c9-b481-4c0d5415c5ca" (UID: "908e2706-d24f-41c9-b481-4c0d5415c5ca"). InnerVolumeSpecName "kube-api-access-4ns5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.143665 5094 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.164121 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "908e2706-d24f-41c9-b481-4c0d5415c5ca" (UID: "908e2706-d24f-41c9-b481-4c0d5415c5ca"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.200166 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-operator-scripts\") pod \"keystone-6cc2-account-create-update-98lt6\" (UID: \"c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6\") " pod="openstack/keystone-6cc2-account-create-update-98lt6" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.200748 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvlxh\" (UniqueName: \"kubernetes.io/projected/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-kube-api-access-tvlxh\") pod \"keystone-6cc2-account-create-update-98lt6\" (UID: \"c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6\") " pod="openstack/keystone-6cc2-account-create-update-98lt6" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.200833 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ns5d\" (UniqueName: \"kubernetes.io/projected/908e2706-d24f-41c9-b481-4c0d5415c5ca-kube-api-access-4ns5d\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.200845 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.200854 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.200864 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 
07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.200875 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/908e2706-d24f-41c9-b481-4c0d5415c5ca-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.200884 5094 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.200301 5094 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.201337 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-operator-scripts podName:c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6 nodeName:}" failed. No retries permitted until 2026-02-20 07:11:31.201302604 +0000 UTC m=+1506.073929315 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-operator-scripts") pod "keystone-6cc2-account-create-update-98lt6" (UID: "c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6") : configmap "openstack-scripts" not found Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.205171 5094 projected.go:194] Error preparing data for projected volume kube-api-access-tvlxh for pod openstack/keystone-6cc2-account-create-update-98lt6: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.205576 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-kube-api-access-tvlxh podName:c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6 nodeName:}" failed. 
No retries permitted until 2026-02-20 07:11:31.205217448 +0000 UTC m=+1506.077844149 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-tvlxh" (UniqueName: "kubernetes.io/projected/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-kube-api-access-tvlxh") pod "keystone-6cc2-account-create-update-98lt6" (UID: "c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.206009 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d9f1f40-92cc-4f19-9f3b-49651f56bffb" (UID: "5d9f1f40-92cc-4f19-9f3b-49651f56bffb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.214993 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c9d8bef2aa627582c2efca41433b2b24a6bb42ef156ade330dc390cb501211cf" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.229570 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c9d8bef2aa627582c2efca41433b2b24a6bb42ef156ade330dc390cb501211cf" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.248110 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="c9d8bef2aa627582c2efca41433b2b24a6bb42ef156ade330dc390cb501211cf" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.248209 5094 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="a5bbb9ad-deeb-495f-9750-f7012c00061d" containerName="nova-cell0-conductor-conductor" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.248680 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f11aa87b-3964-4a62-871f-bdf7d1ad7848" (UID: "f11aa87b-3964-4a62-871f-bdf7d1ad7848"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.260869 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="3d3ab399-3fc6-47e1-995c-5e855c554e9e" containerName="galera" containerID="cri-o://b512a557484cc85ddc51b76d4e4921f265f1c6139b4d9b37aa3c645ee24c7d27" gracePeriod=30 Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.270076 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" (UID: "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.271952 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" (UID: "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.302425 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.302456 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f11aa87b-3964-4a62-871f-bdf7d1ad7848-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.302466 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d9f1f40-92cc-4f19-9f3b-49651f56bffb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.302476 5094 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.319058 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-config-data" (OuterVolumeSpecName: "config-data") pod "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" (UID: "cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.322164 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "908e2706-d24f-41c9-b481-4c0d5415c5ca" (UID: "908e2706-d24f-41c9-b481-4c0d5415c5ca"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.352584 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-config-data" (OuterVolumeSpecName: "config-data") pod "908e2706-d24f-41c9-b481-4c0d5415c5ca" (UID: "908e2706-d24f-41c9-b481-4c0d5415c5ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.404543 5094 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.404584 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.404597 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.418145 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod 
"908e2706-d24f-41c9-b481-4c0d5415c5ca" (UID: "908e2706-d24f-41c9-b481-4c0d5415c5ca"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.461988 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.202:3000/\": dial tcp 10.217.0.202:3000: connect: connection refused" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.507120 5094 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/908e2706-d24f-41c9-b481-4c0d5415c5ca-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.507252 5094 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.507303 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data podName:219c74d6-9f45-4bf8-8c67-acdea3c0fab3 nodeName:}" failed. No retries permitted until 2026-02-20 07:11:38.507287733 +0000 UTC m=+1513.379914444 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data") pod "rabbitmq-server-0" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3") : configmap "rabbitmq-config-data" not found Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.551325 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e21d77d7f75d6c24720a691c3dafa0724131a4a216a5cb72213f2651bb77470e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.554795 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e21d77d7f75d6c24720a691c3dafa0724131a4a216a5cb72213f2651bb77470e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.556565 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e21d77d7f75d6c24720a691c3dafa0724131a4a216a5cb72213f2651bb77470e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.556656 5094 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="b1504790-ccaf-42d5-a28a-a25f0cd353c9" containerName="nova-scheduler-scheduler" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.649338 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7df9984bd9-6txsf" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.655547 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.674962 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.705191 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.711361 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/762a565c-672e-4127-a8c6-90f721eeda81-httpd-run\") pod \"762a565c-672e-4127-a8c6-90f721eeda81\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.711722 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-combined-ca-bundle\") pod \"762a565c-672e-4127-a8c6-90f721eeda81\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.711765 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-internal-tls-certs\") pod \"ca3adffb-7baf-45db-ab16-cc1c63510fec\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.711844 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-config-data\") pod \"762a565c-672e-4127-a8c6-90f721eeda81\" (UID: 
\"762a565c-672e-4127-a8c6-90f721eeda81\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.711880 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8nkm\" (UniqueName: \"kubernetes.io/projected/ca3adffb-7baf-45db-ab16-cc1c63510fec-kube-api-access-p8nkm\") pod \"ca3adffb-7baf-45db-ab16-cc1c63510fec\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.711914 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-combined-ca-bundle\") pod \"92877559-6960-4dbf-890a-fb563f4b0bf8\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.711993 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-internal-tls-certs\") pod \"762a565c-672e-4127-a8c6-90f721eeda81\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.712080 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/762a565c-672e-4127-a8c6-90f721eeda81-logs\") pod \"762a565c-672e-4127-a8c6-90f721eeda81\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.712123 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-combined-ca-bundle\") pod \"ca3adffb-7baf-45db-ab16-cc1c63510fec\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.712166 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-public-tls-certs\") pod \"ca3adffb-7baf-45db-ab16-cc1c63510fec\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.712195 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-config-data-custom\") pod \"92877559-6960-4dbf-890a-fb563f4b0bf8\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.712217 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmx2z\" (UniqueName: \"kubernetes.io/projected/92877559-6960-4dbf-890a-fb563f4b0bf8-kube-api-access-fmx2z\") pod \"92877559-6960-4dbf-890a-fb563f4b0bf8\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.712244 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-scripts\") pod \"762a565c-672e-4127-a8c6-90f721eeda81\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.712268 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92877559-6960-4dbf-890a-fb563f4b0bf8-logs\") pod \"92877559-6960-4dbf-890a-fb563f4b0bf8\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.712305 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2fsw\" (UniqueName: \"kubernetes.io/projected/762a565c-672e-4127-a8c6-90f721eeda81-kube-api-access-b2fsw\") pod \"762a565c-672e-4127-a8c6-90f721eeda81\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " Feb 20 07:11:30 crc kubenswrapper[5094]: 
I0220 07:11:30.712330 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca3adffb-7baf-45db-ab16-cc1c63510fec-logs\") pod \"ca3adffb-7baf-45db-ab16-cc1c63510fec\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.712349 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-config-data\") pod \"ca3adffb-7baf-45db-ab16-cc1c63510fec\" (UID: \"ca3adffb-7baf-45db-ab16-cc1c63510fec\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.712375 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-config-data\") pod \"92877559-6960-4dbf-890a-fb563f4b0bf8\" (UID: \"92877559-6960-4dbf-890a-fb563f4b0bf8\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.712396 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"762a565c-672e-4127-a8c6-90f721eeda81\" (UID: \"762a565c-672e-4127-a8c6-90f721eeda81\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.712572 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/762a565c-672e-4127-a8c6-90f721eeda81-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "762a565c-672e-4127-a8c6-90f721eeda81" (UID: "762a565c-672e-4127-a8c6-90f721eeda81"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.713303 5094 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/762a565c-672e-4127-a8c6-90f721eeda81-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.718451 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "762a565c-672e-4127-a8c6-90f721eeda81" (UID: "762a565c-672e-4127-a8c6-90f721eeda81"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.719155 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92877559-6960-4dbf-890a-fb563f4b0bf8-logs" (OuterVolumeSpecName: "logs") pod "92877559-6960-4dbf-890a-fb563f4b0bf8" (UID: "92877559-6960-4dbf-890a-fb563f4b0bf8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.724796 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "92877559-6960-4dbf-890a-fb563f4b0bf8" (UID: "92877559-6960-4dbf-890a-fb563f4b0bf8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.726186 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/762a565c-672e-4127-a8c6-90f721eeda81-logs" (OuterVolumeSpecName: "logs") pod "762a565c-672e-4127-a8c6-90f721eeda81" (UID: "762a565c-672e-4127-a8c6-90f721eeda81"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.726273 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-scripts" (OuterVolumeSpecName: "scripts") pod "762a565c-672e-4127-a8c6-90f721eeda81" (UID: "762a565c-672e-4127-a8c6-90f721eeda81"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.726197 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92877559-6960-4dbf-890a-fb563f4b0bf8-kube-api-access-fmx2z" (OuterVolumeSpecName: "kube-api-access-fmx2z") pod "92877559-6960-4dbf-890a-fb563f4b0bf8" (UID: "92877559-6960-4dbf-890a-fb563f4b0bf8"). InnerVolumeSpecName "kube-api-access-fmx2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.737204 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca3adffb-7baf-45db-ab16-cc1c63510fec-logs" (OuterVolumeSpecName: "logs") pod "ca3adffb-7baf-45db-ab16-cc1c63510fec" (UID: "ca3adffb-7baf-45db-ab16-cc1c63510fec"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.744063 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca3adffb-7baf-45db-ab16-cc1c63510fec-kube-api-access-p8nkm" (OuterVolumeSpecName: "kube-api-access-p8nkm") pod "ca3adffb-7baf-45db-ab16-cc1c63510fec" (UID: "ca3adffb-7baf-45db-ab16-cc1c63510fec"). InnerVolumeSpecName "kube-api-access-p8nkm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.745511 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/762a565c-672e-4127-a8c6-90f721eeda81-kube-api-access-b2fsw" (OuterVolumeSpecName: "kube-api-access-b2fsw") pod "762a565c-672e-4127-a8c6-90f721eeda81" (UID: "762a565c-672e-4127-a8c6-90f721eeda81"). InnerVolumeSpecName "kube-api-access-b2fsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.757054 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.777586 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.792905 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-config-data" (OuterVolumeSpecName: "config-data") pod "ca3adffb-7baf-45db-ab16-cc1c63510fec" (UID: "ca3adffb-7baf-45db-ab16-cc1c63510fec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.794813 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92877559-6960-4dbf-890a-fb563f4b0bf8" (UID: "92877559-6960-4dbf-890a-fb563f4b0bf8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.801638 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca3adffb-7baf-45db-ab16-cc1c63510fec" (UID: "ca3adffb-7baf-45db-ab16-cc1c63510fec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.801666 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "762a565c-672e-4127-a8c6-90f721eeda81" (UID: "762a565c-672e-4127-a8c6-90f721eeda81"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.814508 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-combined-ca-bundle\") pod \"8f8cb333-2939-4404-b242-67bcf4e6875b\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.814556 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f8cb333-2939-4404-b242-67bcf4e6875b-logs\") pod \"8f8cb333-2939-4404-b242-67bcf4e6875b\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.814611 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-kube-state-metrics-tls-config\") pod \"1fe9db54-4204-4335-a272-c469e0923478\" (UID: 
\"1fe9db54-4204-4335-a272-c469e0923478\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.814641 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-internal-tls-certs\") pod \"8f8cb333-2939-4404-b242-67bcf4e6875b\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.814660 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-combined-ca-bundle\") pod \"1fe9db54-4204-4335-a272-c469e0923478\" (UID: \"1fe9db54-4204-4335-a272-c469e0923478\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.814769 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-config-data-custom\") pod \"8f8cb333-2939-4404-b242-67bcf4e6875b\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.814825 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lb6j8\" (UniqueName: \"kubernetes.io/projected/8f8cb333-2939-4404-b242-67bcf4e6875b-kube-api-access-lb6j8\") pod \"8f8cb333-2939-4404-b242-67bcf4e6875b\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.814846 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7dd0ff85-ae3a-4035-a096-fea5952b19a7-kolla-config\") pod \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.814899 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-scripts\") pod \"8f8cb333-2939-4404-b242-67bcf4e6875b\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.814950 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knph6\" (UniqueName: \"kubernetes.io/projected/7dd0ff85-ae3a-4035-a096-fea5952b19a7-kube-api-access-knph6\") pod \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815002 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7dd0ff85-ae3a-4035-a096-fea5952b19a7-config-data\") pod \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815027 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-config-data\") pod \"8f8cb333-2939-4404-b242-67bcf4e6875b\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815067 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-kube-state-metrics-tls-certs\") pod \"1fe9db54-4204-4335-a272-c469e0923478\" (UID: \"1fe9db54-4204-4335-a272-c469e0923478\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815164 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8f8cb333-2939-4404-b242-67bcf4e6875b-etc-machine-id\") pod \"8f8cb333-2939-4404-b242-67bcf4e6875b\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " Feb 20 07:11:30 crc 
kubenswrapper[5094]: I0220 07:11:30.815259 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjgxz\" (UniqueName: \"kubernetes.io/projected/1fe9db54-4204-4335-a272-c469e0923478-kube-api-access-gjgxz\") pod \"1fe9db54-4204-4335-a272-c469e0923478\" (UID: \"1fe9db54-4204-4335-a272-c469e0923478\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815302 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dd0ff85-ae3a-4035-a096-fea5952b19a7-memcached-tls-certs\") pod \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815326 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd0ff85-ae3a-4035-a096-fea5952b19a7-combined-ca-bundle\") pod \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\" (UID: \"7dd0ff85-ae3a-4035-a096-fea5952b19a7\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815355 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-public-tls-certs\") pod \"8f8cb333-2939-4404-b242-67bcf4e6875b\" (UID: \"8f8cb333-2939-4404-b242-67bcf4e6875b\") " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815790 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815818 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc 
kubenswrapper[5094]: I0220 07:11:30.815828 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmx2z\" (UniqueName: \"kubernetes.io/projected/92877559-6960-4dbf-890a-fb563f4b0bf8-kube-api-access-fmx2z\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815840 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815851 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/92877559-6960-4dbf-890a-fb563f4b0bf8-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815860 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2fsw\" (UniqueName: \"kubernetes.io/projected/762a565c-672e-4127-a8c6-90f721eeda81-kube-api-access-b2fsw\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815869 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815877 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca3adffb-7baf-45db-ab16-cc1c63510fec-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815901 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815911 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815921 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8nkm\" (UniqueName: \"kubernetes.io/projected/ca3adffb-7baf-45db-ab16-cc1c63510fec-kube-api-access-p8nkm\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815931 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.815941 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/762a565c-672e-4127-a8c6-90f721eeda81-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.817578 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f8cb333-2939-4404-b242-67bcf4e6875b-logs" (OuterVolumeSpecName: "logs") pod "8f8cb333-2939-4404-b242-67bcf4e6875b" (UID: "8f8cb333-2939-4404-b242-67bcf4e6875b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.817987 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dd0ff85-ae3a-4035-a096-fea5952b19a7-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "7dd0ff85-ae3a-4035-a096-fea5952b19a7" (UID: "7dd0ff85-ae3a-4035-a096-fea5952b19a7"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.818019 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8f8cb333-2939-4404-b242-67bcf4e6875b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8f8cb333-2939-4404-b242-67bcf4e6875b" (UID: "8f8cb333-2939-4404-b242-67bcf4e6875b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.818599 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dd0ff85-ae3a-4035-a096-fea5952b19a7-config-data" (OuterVolumeSpecName: "config-data") pod "7dd0ff85-ae3a-4035-a096-fea5952b19a7" (UID: "7dd0ff85-ae3a-4035-a096-fea5952b19a7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.824401 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-scripts" (OuterVolumeSpecName: "scripts") pod "8f8cb333-2939-4404-b242-67bcf4e6875b" (UID: "8f8cb333-2939-4404-b242-67bcf4e6875b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.824652 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f8cb333-2939-4404-b242-67bcf4e6875b-kube-api-access-lb6j8" (OuterVolumeSpecName: "kube-api-access-lb6j8") pod "8f8cb333-2939-4404-b242-67bcf4e6875b" (UID: "8f8cb333-2939-4404-b242-67bcf4e6875b"). InnerVolumeSpecName "kube-api-access-lb6j8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.826800 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8f8cb333-2939-4404-b242-67bcf4e6875b" (UID: "8f8cb333-2939-4404-b242-67bcf4e6875b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.839086 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dd0ff85-ae3a-4035-a096-fea5952b19a7-kube-api-access-knph6" (OuterVolumeSpecName: "kube-api-access-knph6") pod "7dd0ff85-ae3a-4035-a096-fea5952b19a7" (UID: "7dd0ff85-ae3a-4035-a096-fea5952b19a7"). InnerVolumeSpecName "kube-api-access-knph6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.840652 5094 generic.go:334] "Generic (PLEG): container finished" podID="f3caa33a-a0ec-4fdc-876b-266724a5af50" containerID="d82cb6bb5b6363e25ca9e64db1ea09c7494c05c93392e549c89755977ef55244" exitCode=0 Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.840756 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f3caa33a-a0ec-4fdc-876b-266724a5af50","Type":"ContainerDied","Data":"d82cb6bb5b6363e25ca9e64db1ea09c7494c05c93392e549c89755977ef55244"} Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.841048 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fe9db54-4204-4335-a272-c469e0923478-kube-api-access-gjgxz" (OuterVolumeSpecName: "kube-api-access-gjgxz") pod "1fe9db54-4204-4335-a272-c469e0923478" (UID: "1fe9db54-4204-4335-a272-c469e0923478"). InnerVolumeSpecName "kube-api-access-gjgxz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.844317 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-config-data" (OuterVolumeSpecName: "config-data") pod "762a565c-672e-4127-a8c6-90f721eeda81" (UID: "762a565c-672e-4127-a8c6-90f721eeda81"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.844338 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "762a565c-672e-4127-a8c6-90f721eeda81" (UID: "762a565c-672e-4127-a8c6-90f721eeda81"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.848854 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ca3adffb-7baf-45db-ab16-cc1c63510fec" (UID: "ca3adffb-7baf-45db-ab16-cc1c63510fec"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.853257 5094 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.855089 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.855807 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.865989 5094 generic.go:334] "Generic (PLEG): container finished" podID="d01cbaa4-5543-4cd5-b098-7e4600819d32" containerID="2d7a76b02a624c041d0a977a291c39fc1dacc4367a61490ce8714745def9a3c7" exitCode=0 Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.866074 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d01cbaa4-5543-4cd5-b098-7e4600819d32","Type":"ContainerDied","Data":"2d7a76b02a624c041d0a977a291c39fc1dacc4367a61490ce8714745def9a3c7"} Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.882829 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ca3adffb-7baf-45db-ab16-cc1c63510fec" (UID: "ca3adffb-7baf-45db-ab16-cc1c63510fec"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.898633 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dd0ff85-ae3a-4035-a096-fea5952b19a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7dd0ff85-ae3a-4035-a096-fea5952b19a7" (UID: "7dd0ff85-ae3a-4035-a096-fea5952b19a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.899543 5094 generic.go:334] "Generic (PLEG): container finished" podID="0330a367-c0c9-42a9-9993-1a3b6775fd3b" containerID="e29fe67f4e435861ca66b7f08084602af4c30866304a951f1e2cec0fe4e96725" exitCode=1 Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.899647 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5vc6z" event={"ID":"0330a367-c0c9-42a9-9993-1a3b6775fd3b","Type":"ContainerDied","Data":"e29fe67f4e435861ca66b7f08084602af4c30866304a951f1e2cec0fe4e96725"} Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.899761 5094 scope.go:117] "RemoveContainer" containerID="8d9599986f47c1f116606b523c4cbef0203820a63f129ca2e7a4866193abc8ce" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.900226 5094 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/root-account-create-update-5vc6z" secret="" err="secret \"galera-openstack-dockercfg-cp9vz\" not found" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.900262 5094 scope.go:117] "RemoveContainer" containerID="e29fe67f4e435861ca66b7f08084602af4c30866304a951f1e2cec0fe4e96725" Feb 20 07:11:30 crc kubenswrapper[5094]: E0220 07:11:30.900601 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-create-update pod=root-account-create-update-5vc6z_openstack(0330a367-c0c9-42a9-9993-1a3b6775fd3b)\"" pod="openstack/root-account-create-update-5vc6z" podUID="0330a367-c0c9-42a9-9993-1a3b6775fd3b" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.918181 5094 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.918220 5094 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8f8cb333-2939-4404-b242-67bcf4e6875b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.918234 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjgxz\" (UniqueName: \"kubernetes.io/projected/1fe9db54-4204-4335-a272-c469e0923478-kube-api-access-gjgxz\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.918246 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd0ff85-ae3a-4035-a096-fea5952b19a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.918478 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-worker-7df9984bd9-6txsf" event={"ID":"92877559-6960-4dbf-890a-fb563f4b0bf8","Type":"ContainerDied","Data":"5c342dad28df34f1d8d92f5a04877af4cb07a57675de3adb965ed98cfe8eaa77"} Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.918591 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7df9984bd9-6txsf" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.923998 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f8cb333-2939-4404-b242-67bcf4e6875b-logs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.924025 5094 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.924037 5094 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.924048 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.924059 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lb6j8\" (UniqueName: \"kubernetes.io/projected/8f8cb333-2939-4404-b242-67bcf4e6875b-kube-api-access-lb6j8\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.924094 5094 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7dd0ff85-ae3a-4035-a096-fea5952b19a7-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc 
kubenswrapper[5094]: I0220 07:11:30.924105 5094 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ca3adffb-7baf-45db-ab16-cc1c63510fec-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.924114 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.924124 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/762a565c-672e-4127-a8c6-90f721eeda81-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.924133 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knph6\" (UniqueName: \"kubernetes.io/projected/7dd0ff85-ae3a-4035-a096-fea5952b19a7-kube-api-access-knph6\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.924142 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7dd0ff85-ae3a-4035-a096-fea5952b19a7-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.938352 5094 generic.go:334] "Generic (PLEG): container finished" podID="1218d679-0e51-4bef-9526-db16c8783d8b" containerID="4073373c0648687276e84e986260ddd367c34f5129c119e3fce3d4726ca86b92" exitCode=0 Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.938396 5094 generic.go:334] "Generic (PLEG): container finished" podID="1218d679-0e51-4bef-9526-db16c8783d8b" containerID="f580b2199d149d5c06b5c9bfb9b0ef745161fcfd54354c074e1bf52afa2fbbe4" exitCode=0 Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.938445 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1218d679-0e51-4bef-9526-db16c8783d8b","Type":"ContainerDied","Data":"4073373c0648687276e84e986260ddd367c34f5129c119e3fce3d4726ca86b92"} Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.938480 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1218d679-0e51-4bef-9526-db16c8783d8b","Type":"ContainerDied","Data":"f580b2199d149d5c06b5c9bfb9b0ef745161fcfd54354c074e1bf52afa2fbbe4"} Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.947075 5094 generic.go:334] "Generic (PLEG): container finished" podID="1fe9db54-4204-4335-a272-c469e0923478" containerID="0fcfeb4fdfabbbcb036fd6cf631f390dac9fcd6e68173a7af3e76810e7a92db9" exitCode=2 Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.947187 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1fe9db54-4204-4335-a272-c469e0923478","Type":"ContainerDied","Data":"0fcfeb4fdfabbbcb036fd6cf631f390dac9fcd6e68173a7af3e76810e7a92db9"} Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.947223 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1fe9db54-4204-4335-a272-c469e0923478","Type":"ContainerDied","Data":"8a0b13fdbdedc5064e8f68c82ce215006ed4f58e7530fd19fcca453a9915c200"} Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.947308 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.954124 5094 scope.go:117] "RemoveContainer" containerID="d397c0695b1183c0c759491ae377b37ffc66032936b01b69dbc1bd7fedbcab31" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.956675 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" event={"ID":"908e2706-d24f-41c9-b481-4c0d5415c5ca","Type":"ContainerDied","Data":"106993ad972a41a70b0e11997dac58bd4e6ab90384569399045d1bedeaba95e2"} Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.956872 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-79d9bcd9d4-9jrtt" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.959889 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "1fe9db54-4204-4335-a272-c469e0923478" (UID: "1fe9db54-4204-4335-a272-c469e0923478"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.970504 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fe9db54-4204-4335-a272-c469e0923478" (UID: "1fe9db54-4204-4335-a272-c469e0923478"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.975416 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-config-data" (OuterVolumeSpecName: "config-data") pod "92877559-6960-4dbf-890a-fb563f4b0bf8" (UID: "92877559-6960-4dbf-890a-fb563f4b0bf8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.977989 5094 generic.go:334] "Generic (PLEG): container finished" podID="7dd0ff85-ae3a-4035-a096-fea5952b19a7" containerID="631fbada038605f5c51cfb450102e200cb6c919e6e8732fe4d271c4b0f19cc19" exitCode=0 Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.978184 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.978226 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"7dd0ff85-ae3a-4035-a096-fea5952b19a7","Type":"ContainerDied","Data":"631fbada038605f5c51cfb450102e200cb6c919e6e8732fe4d271c4b0f19cc19"} Feb 20 07:11:30 crc kubenswrapper[5094]: I0220 07:11:30.978325 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"7dd0ff85-ae3a-4035-a096-fea5952b19a7","Type":"ContainerDied","Data":"c905b3c584b4bdb1a44662bb87e5389e8137126047bfc23039edbbaea024118a"} Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:30.992945 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ca3adffb-7baf-45db-ab16-cc1c63510fec","Type":"ContainerDied","Data":"1139ba82db72b15ac53a4a8cb3e78462a70a685a8a4aaff7d443bdc0aa77867c"} Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:30.993054 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.020324 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f8cb333-2939-4404-b242-67bcf4e6875b" (UID: "8f8cb333-2939-4404-b242-67bcf4e6875b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.026976 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.027571 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92877559-6960-4dbf-890a-fb563f4b0bf8-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.027590 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.027606 5094 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:31 crc kubenswrapper[5094]: E0220 07:11:31.028551 5094 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Feb 20 07:11:31 crc kubenswrapper[5094]: E0220 07:11:31.029161 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data 
podName:a829c6b3-7069-4544-90dc-40ae83aba524 nodeName:}" failed. No retries permitted until 2026-02-20 07:11:39.029138611 +0000 UTC m=+1513.901765322 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data") pod "rabbitmq-cell1-server-0" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524") : configmap "rabbitmq-cell1-config-data" not found Feb 20 07:11:31 crc kubenswrapper[5094]: E0220 07:11:31.029269 5094 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 20 07:11:31 crc kubenswrapper[5094]: E0220 07:11:31.029391 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0330a367-c0c9-42a9-9993-1a3b6775fd3b-operator-scripts podName:0330a367-c0c9-42a9-9993-1a3b6775fd3b nodeName:}" failed. No retries permitted until 2026-02-20 07:11:31.529362816 +0000 UTC m=+1506.401989527 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0330a367-c0c9-42a9-9993-1a3b6775fd3b-operator-scripts") pod "root-account-create-update-5vc6z" (UID: "0330a367-c0c9-42a9-9993-1a3b6775fd3b") : configmap "openstack-scripts" not found Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.031200 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "1fe9db54-4204-4335-a272-c469e0923478" (UID: "1fe9db54-4204-4335-a272-c469e0923478"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.040924 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-config-data" (OuterVolumeSpecName: "config-data") pod "8f8cb333-2939-4404-b242-67bcf4e6875b" (UID: "8f8cb333-2939-4404-b242-67bcf4e6875b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.042975 5094 scope.go:117] "RemoveContainer" containerID="7fc72402d88effbe5b7f66ca244892cffc5d212981c680da190e5ee72f72b92a" Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.046943 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.047224 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-79d9bcd9d4-9jrtt"] Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.047280 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f11aa87b-3964-4a62-871f-bdf7d1ad7848","Type":"ContainerDied","Data":"cf9b5c579a0d68acf1d445e79a8eef5359bd9a83ecb400c0f57115c47e73ceda"} Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.062936 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8f8cb333-2939-4404-b242-67bcf4e6875b" (UID: "8f8cb333-2939-4404-b242-67bcf4e6875b"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:31 crc kubenswrapper[5094]: E0220 07:11:31.078963 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d9599986f47c1f116606b523c4cbef0203820a63f129ca2e7a4866193abc8ce\": container with ID starting with 8d9599986f47c1f116606b523c4cbef0203820a63f129ca2e7a4866193abc8ce not found: ID does not exist" containerID="8d9599986f47c1f116606b523c4cbef0203820a63f129ca2e7a4866193abc8ce" Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.079014 5094 kuberuntime_gc.go:361] "Error getting ContainerStatus for containerID" containerID="8d9599986f47c1f116606b523c4cbef0203820a63f129ca2e7a4866193abc8ce" err="rpc error: code = NotFound desc = could not find container \"8d9599986f47c1f116606b523c4cbef0203820a63f129ca2e7a4866193abc8ce\": container with ID starting with 8d9599986f47c1f116606b523c4cbef0203820a63f129ca2e7a4866193abc8ce not found: ID does not exist" Feb 20 07:11:31 crc kubenswrapper[5094]: E0220 07:11:31.079044 5094 kuberuntime_gc.go:389] "Failed to remove container log dead symlink" err="remove /var/log/containers/root-account-create-update-5vc6z_openstack_mariadb-account-create-update-8d9599986f47c1f116606b523c4cbef0203820a63f129ca2e7a4866193abc8ce.log: no such file or directory" path="/var/log/containers/root-account-create-update-5vc6z_openstack_mariadb-account-create-update-8d9599986f47c1f116606b523c4cbef0203820a63f129ca2e7a4866193abc8ce.log" Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.103252 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-79d9bcd9d4-9jrtt"] Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.103774 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4","Type":"ContainerDied","Data":"85e8b39c918088f619ee9b44d81cb6828488069406841b038890d491ba98168a"} Feb 20 07:11:31 
crc kubenswrapper[5094]: I0220 07:11:31.108015 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.115382 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7677694455-llk7m" podUID="37533bd5-22b5-4b59-8672-35eaa19b9295" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.197:5353: i/o timeout" Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.141188 5094 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.141237 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:31 crc kubenswrapper[5094]: I0220 07:11:31.141248 5094 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fe9db54-4204-4335-a272-c469e0923478-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.244210 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"8f8cb333-2939-4404-b242-67bcf4e6875b","Type":"ContainerDied","Data":"32ffade3cb1f0b317c39a0a7865ad0bdf56c99cb77f0149f0bfd6e89ce114623"} Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.244415 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.246077 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-operator-scripts\") pod \"keystone-6cc2-account-create-update-98lt6\" (UID: \"c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6\") " pod="openstack/keystone-6cc2-account-create-update-98lt6" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.246135 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvlxh\" (UniqueName: \"kubernetes.io/projected/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-kube-api-access-tvlxh\") pod \"keystone-6cc2-account-create-update-98lt6\" (UID: \"c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6\") " pod="openstack/keystone-6cc2-account-create-update-98lt6" Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.246825 5094 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.246881 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-operator-scripts podName:c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6 nodeName:}" failed. No retries permitted until 2026-02-20 07:11:33.246860175 +0000 UTC m=+1508.119486886 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-operator-scripts") pod "keystone-6cc2-account-create-update-98lt6" (UID: "c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6") : configmap "openstack-scripts" not found Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.260320 5094 projected.go:194] Error preparing data for projected volume kube-api-access-tvlxh for pod openstack/keystone-6cc2-account-create-update-98lt6: failed to fetch token: serviceaccounts "galera-openstack" not found Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.260412 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-kube-api-access-tvlxh podName:c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6 nodeName:}" failed. No retries permitted until 2026-02-20 07:11:33.260383289 +0000 UTC m=+1508.133010000 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-tvlxh" (UniqueName: "kubernetes.io/projected/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-kube-api-access-tvlxh") pod "keystone-6cc2-account-create-update-98lt6" (UID: "c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6") : failed to fetch token: serviceaccounts "galera-openstack" not found Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.282621 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"762a565c-672e-4127-a8c6-90f721eeda81","Type":"ContainerDied","Data":"ca1059f5843b1683c2b383612bdd39e42db92c13986bf2a30fa4cc0e0bdde634"} Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.282782 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.303382 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8f8cb333-2939-4404-b242-67bcf4e6875b" (UID: "8f8cb333-2939-4404-b242-67bcf4e6875b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.317875 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dd0ff85-ae3a-4035-a096-fea5952b19a7-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "7dd0ff85-ae3a-4035-a096-fea5952b19a7" (UID: "7dd0ff85-ae3a-4035-a096-fea5952b19a7"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.320138 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6cc2-account-create-update-98lt6" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.320621 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.320753 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7c9ffdc684-v56nc" event={"ID":"5d9f1f40-92cc-4f19-9f3b-49651f56bffb","Type":"ContainerDied","Data":"9f163abc1efd183ebd2809a660db1c44ccc0e92e53e74d9ce4dfa48299f86759"} Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.320844 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-d63e-account-create-update-tvf55" event={"ID":"2a1b6a8a-aefe-4f59-8936-f08aed30d8f7","Type":"ContainerDied","Data":"9e2e1470fd33e88144567dad9b332e30ed6e81a2d129cafee24b4fca5bfc7939"} Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.320935 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-d63e-account-create-update-tvf55" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.371955 5094 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/7dd0ff85-ae3a-4035-a096-fea5952b19a7-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.371990 5094 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8f8cb333-2939-4404-b242-67bcf4e6875b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.395327 5094 scope.go:117] "RemoveContainer" containerID="0fcfeb4fdfabbbcb036fd6cf631f390dac9fcd6e68173a7af3e76810e7a92db9" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.515116 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.532103 5094 scope.go:117] "RemoveContainer" containerID="0fcfeb4fdfabbbcb036fd6cf631f390dac9fcd6e68173a7af3e76810e7a92db9" Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.536234 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fcfeb4fdfabbbcb036fd6cf631f390dac9fcd6e68173a7af3e76810e7a92db9\": container with ID starting with 0fcfeb4fdfabbbcb036fd6cf631f390dac9fcd6e68173a7af3e76810e7a92db9 not found: ID does not exist" containerID="0fcfeb4fdfabbbcb036fd6cf631f390dac9fcd6e68173a7af3e76810e7a92db9" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.536284 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fcfeb4fdfabbbcb036fd6cf631f390dac9fcd6e68173a7af3e76810e7a92db9"} err="failed to get container status \"0fcfeb4fdfabbbcb036fd6cf631f390dac9fcd6e68173a7af3e76810e7a92db9\": rpc error: code = NotFound desc = could not find container \"0fcfeb4fdfabbbcb036fd6cf631f390dac9fcd6e68173a7af3e76810e7a92db9\": container with ID starting with 0fcfeb4fdfabbbcb036fd6cf631f390dac9fcd6e68173a7af3e76810e7a92db9 not found: ID does not exist" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.536313 5094 scope.go:117] "RemoveContainer" containerID="a9359fe0f2ca547fd93214e43c994a9c0538ad96741f69a95262122bb7dc4d7b" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.570207 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6cc2-account-create-update-98lt6" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.574672 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-combined-ca-bundle\") pod \"d01cbaa4-5543-4cd5-b098-7e4600819d32\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.574806 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d01cbaa4-5543-4cd5-b098-7e4600819d32-etc-machine-id\") pod \"d01cbaa4-5543-4cd5-b098-7e4600819d32\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.574907 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-scripts\") pod \"d01cbaa4-5543-4cd5-b098-7e4600819d32\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.574962 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8b4w\" (UniqueName: \"kubernetes.io/projected/d01cbaa4-5543-4cd5-b098-7e4600819d32-kube-api-access-f8b4w\") pod \"d01cbaa4-5543-4cd5-b098-7e4600819d32\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.575012 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-config-data\") pod \"d01cbaa4-5543-4cd5-b098-7e4600819d32\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.575349 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-config-data-custom\") pod \"d01cbaa4-5543-4cd5-b098-7e4600819d32\" (UID: \"d01cbaa4-5543-4cd5-b098-7e4600819d32\") " Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.576372 5094 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.576457 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0330a367-c0c9-42a9-9993-1a3b6775fd3b-operator-scripts podName:0330a367-c0c9-42a9-9993-1a3b6775fd3b nodeName:}" failed. No retries permitted until 2026-02-20 07:11:32.576437787 +0000 UTC m=+1507.449064498 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0330a367-c0c9-42a9-9993-1a3b6775fd3b-operator-scripts") pod "root-account-create-update-5vc6z" (UID: "0330a367-c0c9-42a9-9993-1a3b6775fd3b") : configmap "openstack-scripts" not found Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.578062 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d01cbaa4-5543-4cd5-b098-7e4600819d32-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d01cbaa4-5543-4cd5-b098-7e4600819d32" (UID: "d01cbaa4-5543-4cd5-b098-7e4600819d32"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.593379 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-scripts" (OuterVolumeSpecName: "scripts") pod "d01cbaa4-5543-4cd5-b098-7e4600819d32" (UID: "d01cbaa4-5543-4cd5-b098-7e4600819d32"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.601251 5094 scope.go:117] "RemoveContainer" containerID="459bc2f2d89a8c0984a77dac5551d18970f5268790591ad15bb4ecd73c5d3e57" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.603994 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.605943 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d01cbaa4-5543-4cd5-b098-7e4600819d32-kube-api-access-f8b4w" (OuterVolumeSpecName: "kube-api-access-f8b4w") pod "d01cbaa4-5543-4cd5-b098-7e4600819d32" (UID: "d01cbaa4-5543-4cd5-b098-7e4600819d32"). InnerVolumeSpecName "kube-api-access-f8b4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.616870 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7df9984bd9-6txsf"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.631535 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d01cbaa4-5543-4cd5-b098-7e4600819d32" (UID: "d01cbaa4-5543-4cd5-b098-7e4600819d32"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.649453 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-7df9984bd9-6txsf"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.668498 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.683142 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3caa33a-a0ec-4fdc-876b-266724a5af50-combined-ca-bundle\") pod \"f3caa33a-a0ec-4fdc-876b-266724a5af50\" (UID: \"f3caa33a-a0ec-4fdc-876b-266724a5af50\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.683209 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3caa33a-a0ec-4fdc-876b-266724a5af50-config-data\") pod \"f3caa33a-a0ec-4fdc-876b-266724a5af50\" (UID: \"f3caa33a-a0ec-4fdc-876b-266724a5af50\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.683412 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7snx5\" (UniqueName: \"kubernetes.io/projected/f3caa33a-a0ec-4fdc-876b-266724a5af50-kube-api-access-7snx5\") pod \"f3caa33a-a0ec-4fdc-876b-266724a5af50\" (UID: \"f3caa33a-a0ec-4fdc-876b-266724a5af50\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.683908 5094 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d01cbaa4-5543-4cd5-b098-7e4600819d32-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.683922 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc 
kubenswrapper[5094]: I0220 07:11:31.683934 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8b4w\" (UniqueName: \"kubernetes.io/projected/d01cbaa4-5543-4cd5-b098-7e4600819d32-kube-api-access-f8b4w\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.683947 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.684197 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d01cbaa4-5543-4cd5-b098-7e4600819d32" (UID: "d01cbaa4-5543-4cd5-b098-7e4600819d32"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.690597 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3caa33a-a0ec-4fdc-876b-266724a5af50-kube-api-access-7snx5" (OuterVolumeSpecName: "kube-api-access-7snx5") pod "f3caa33a-a0ec-4fdc-876b-266724a5af50" (UID: "f3caa33a-a0ec-4fdc-876b-266724a5af50"). InnerVolumeSpecName "kube-api-access-7snx5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.692193 5094 scope.go:117] "RemoveContainer" containerID="631fbada038605f5c51cfb450102e200cb6c919e6e8732fe4d271c4b0f19cc19" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.702422 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.717878 5094 scope.go:117] "RemoveContainer" containerID="631fbada038605f5c51cfb450102e200cb6c919e6e8732fe4d271c4b0f19cc19" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.718041 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.718693 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"631fbada038605f5c51cfb450102e200cb6c919e6e8732fe4d271c4b0f19cc19\": container with ID starting with 631fbada038605f5c51cfb450102e200cb6c919e6e8732fe4d271c4b0f19cc19 not found: ID does not exist" containerID="631fbada038605f5c51cfb450102e200cb6c919e6e8732fe4d271c4b0f19cc19" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.718741 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"631fbada038605f5c51cfb450102e200cb6c919e6e8732fe4d271c4b0f19cc19"} err="failed to get container status \"631fbada038605f5c51cfb450102e200cb6c919e6e8732fe4d271c4b0f19cc19\": rpc error: code = NotFound desc = could not find container \"631fbada038605f5c51cfb450102e200cb6c919e6e8732fe4d271c4b0f19cc19\": container with ID starting with 631fbada038605f5c51cfb450102e200cb6c919e6e8732fe4d271c4b0f19cc19 not found: ID does not exist" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.718768 5094 scope.go:117] "RemoveContainer" containerID="92a9942e0db46da8477438409c4da3d36ea534be827434e466a4bec7f4c990ab" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 
07:11:31.741395 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.746867 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-config-data" (OuterVolumeSpecName: "config-data") pod "d01cbaa4-5543-4cd5-b098-7e4600819d32" (UID: "d01cbaa4-5543-4cd5-b098-7e4600819d32"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.753464 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3caa33a-a0ec-4fdc-876b-266724a5af50-config-data" (OuterVolumeSpecName: "config-data") pod "f3caa33a-a0ec-4fdc-876b-266724a5af50" (UID: "f3caa33a-a0ec-4fdc-876b-266724a5af50"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.757481 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3caa33a-a0ec-4fdc-876b-266724a5af50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3caa33a-a0ec-4fdc-876b-266724a5af50" (UID: "f3caa33a-a0ec-4fdc-876b-266724a5af50"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.757523 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.765370 5094 scope.go:117] "RemoveContainer" containerID="c34769ddae56119b12d9b51bd88bf7ef48671807437c5ad0d48869446003d663" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.765574 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.780574 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-d63e-account-create-update-tvf55"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.785926 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.785942 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7snx5\" (UniqueName: \"kubernetes.io/projected/f3caa33a-a0ec-4fdc-876b-266724a5af50-kube-api-access-7snx5\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.785952 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d01cbaa4-5543-4cd5-b098-7e4600819d32-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.785961 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3caa33a-a0ec-4fdc-876b-266724a5af50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.785969 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f3caa33a-a0ec-4fdc-876b-266724a5af50-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.795302 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-d63e-account-create-update-tvf55"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.810484 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.811346 5094 scope.go:117] "RemoveContainer" containerID="50299a2c387cfc7c5adc90764eae8fbdc420d6bc7964e0d04844da05fc246e7d" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.821075 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.827377 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.832668 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.832904 5094 scope.go:117] "RemoveContainer" containerID="229e2aebe241cc126dc9491dbc9e726710e67939495c62f8914e915bce65e45d" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.838757 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.856718 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fe9db54-4204-4335-a272-c469e0923478" path="/var/lib/kubelet/pods/1fe9db54-4204-4335-a272-c469e0923478/volumes" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.856911 5094 scope.go:117] "RemoveContainer" containerID="5c666f7fe1bbd7b65dc971fe96a85660b5d4b37560d1db6e6ea2ac21f9782b5c" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.857623 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="2a1b6a8a-aefe-4f59-8936-f08aed30d8f7" path="/var/lib/kubelet/pods/2a1b6a8a-aefe-4f59-8936-f08aed30d8f7/volumes" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.858327 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f8cb333-2939-4404-b242-67bcf4e6875b" path="/var/lib/kubelet/pods/8f8cb333-2939-4404-b242-67bcf4e6875b/volumes" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.858936 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="908e2706-d24f-41c9-b481-4c0d5415c5ca" path="/var/lib/kubelet/pods/908e2706-d24f-41c9-b481-4c0d5415c5ca/volumes" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.860314 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92877559-6960-4dbf-890a-fb563f4b0bf8" path="/var/lib/kubelet/pods/92877559-6960-4dbf-890a-fb563f4b0bf8/volumes" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.861493 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca3adffb-7baf-45db-ab16-cc1c63510fec" path="/var/lib/kubelet/pods/ca3adffb-7baf-45db-ab16-cc1c63510fec/volumes" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.862444 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" path="/var/lib/kubelet/pods/cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4/volumes" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.863164 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f11aa87b-3964-4a62-871f-bdf7d1ad7848" path="/var/lib/kubelet/pods/f11aa87b-3964-4a62-871f-bdf7d1ad7848/volumes" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.864784 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.864814 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.869763 5094 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.872771 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-7c9ffdc684-v56nc"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.877824 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-7c9ffdc684-v56nc"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.893194 5094 scope.go:117] "RemoveContainer" containerID="0f4330768d5d427a2d77f0009f9dd9c0b2e0c4e46d6fb295f8aa0f285169a62c" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.918404 5094 scope.go:117] "RemoveContainer" containerID="d19cd01d15b92dc7454d6cf3fb5973b463816ef9279ed3d67fe67ccb7ea9cc15" Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.926273 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.928114 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.928315 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not 
found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.928676 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.928798 5094 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tj42x" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovsdb-server" Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.933151 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.938304 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:31.938346 5094 prober.go:104] "Probe errored" err="rpc error: code 
= Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tj42x" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovs-vswitchd" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.953131 5094 scope.go:117] "RemoveContainer" containerID="e83bab219c26de7197a4c8d483fd96f4ccf30f290122d4606fa75843efcfaa32" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.974071 5094 scope.go:117] "RemoveContainer" containerID="d1f6db0391d91d17006e26254d5724c2e1250f967872ca29f449b7f22386e51b" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:31.992486 5094 scope.go:117] "RemoveContainer" containerID="fbee36bb89639ece4abdb6a81c5837582f4784307b3986814aecb8c79e38a15c" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.012177 5094 scope.go:117] "RemoveContainer" containerID="b35c284d78ecbe2f2a62876408599b6c0ef8f8ad565bfbbe236f98afa60d9b08" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.076549 5094 scope.go:117] "RemoveContainer" containerID="c53dec831e54b024a954907e297fb4a37cd3b545f865b7a580f6bd56abb0a90d" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.352077 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.351929 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d01cbaa4-5543-4cd5-b098-7e4600819d32","Type":"ContainerDied","Data":"e629f6202467daa64ea1c5522af0e65990925325c9e7a625f6d5ea287157f10f"} Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.352527 5094 scope.go:117] "RemoveContainer" containerID="0d80098740f3555dc54ef0c65032de273b3ce866905e15dc8682ab9661be5b23" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.378191 5094 generic.go:334] "Generic (PLEG): container finished" podID="219c74d6-9f45-4bf8-8c67-acdea3c0fab3" containerID="74e0d7c23ec3f1be5316db26c770a1e0ec492750a824549bed30f944a01c88b6" exitCode=0 Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.378384 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"219c74d6-9f45-4bf8-8c67-acdea3c0fab3","Type":"ContainerDied","Data":"74e0d7c23ec3f1be5316db26c770a1e0ec492750a824549bed30f944a01c88b6"} Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.397587 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f3caa33a-a0ec-4fdc-876b-266724a5af50","Type":"ContainerDied","Data":"04b7e6ec6e40c5ff8ef2a4fd5c6eaae7045f9b9edc65925744ff48699dca0766"} Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.397845 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.421956 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_cd92d75e-9882-4bb7-a41e-cab9777424e8/ovn-northd/0.log" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.421993 5094 generic.go:334] "Generic (PLEG): container finished" podID="cd92d75e-9882-4bb7-a41e-cab9777424e8" containerID="3f7c742aada7ed25bb815042dbf9602749ab67ca6990bc8a7e5d047b638c6680" exitCode=139 Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.422044 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cd92d75e-9882-4bb7-a41e-cab9777424e8","Type":"ContainerDied","Data":"3f7c742aada7ed25bb815042dbf9602749ab67ca6990bc8a7e5d047b638c6680"} Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.422068 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cd92d75e-9882-4bb7-a41e-cab9777424e8","Type":"ContainerDied","Data":"152906df1c7c73cf9f0f0833ac0e10aba2f396ee2bbcbe89c7d223cc8c01b900"} Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.422081 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="152906df1c7c73cf9f0f0833ac0e10aba2f396ee2bbcbe89c7d223cc8c01b900" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.443953 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6cc2-account-create-update-98lt6" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.552925 5094 scope.go:117] "RemoveContainer" containerID="2d7a76b02a624c041d0a977a291c39fc1dacc4367a61490ce8714745def9a3c7" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.577777 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_cd92d75e-9882-4bb7-a41e-cab9777424e8/ovn-northd/0.log" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.577890 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.588158 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.609869 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-erlang-cookie-secret\") pod \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.609941 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-pod-info\") pod \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.609989 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-metrics-certs-tls-certs\") pod \"cd92d75e-9882-4bb7-a41e-cab9777424e8\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.610025 5094 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd92d75e-9882-4bb7-a41e-cab9777424e8-config\") pod \"cd92d75e-9882-4bb7-a41e-cab9777424e8\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.610102 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-plugins\") pod \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.610125 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cd92d75e-9882-4bb7-a41e-cab9777424e8-ovn-rundir\") pod \"cd92d75e-9882-4bb7-a41e-cab9777424e8\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.610155 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-968ct\" (UniqueName: \"kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-kube-api-access-968ct\") pod \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.610190 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-tls\") pod \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.610255 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-server-conf\") pod \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\" (UID: 
\"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.610300 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fksb\" (UniqueName: \"kubernetes.io/projected/cd92d75e-9882-4bb7-a41e-cab9777424e8-kube-api-access-8fksb\") pod \"cd92d75e-9882-4bb7-a41e-cab9777424e8\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.610334 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-ovn-northd-tls-certs\") pod \"cd92d75e-9882-4bb7-a41e-cab9777424e8\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.610356 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-erlang-cookie\") pod \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.610438 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-combined-ca-bundle\") pod \"cd92d75e-9882-4bb7-a41e-cab9777424e8\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.610474 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd92d75e-9882-4bb7-a41e-cab9777424e8-scripts\") pod \"cd92d75e-9882-4bb7-a41e-cab9777424e8\" (UID: \"cd92d75e-9882-4bb7-a41e-cab9777424e8\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.610508 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.610531 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-confd\") pod \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.610582 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data\") pod \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.610643 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-plugins-conf\") pod \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\" (UID: \"219c74d6-9f45-4bf8-8c67-acdea3c0fab3\") " Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:32.611525 5094 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Feb 20 07:11:32 crc kubenswrapper[5094]: E0220 07:11:32.611595 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0330a367-c0c9-42a9-9993-1a3b6775fd3b-operator-scripts podName:0330a367-c0c9-42a9-9993-1a3b6775fd3b nodeName:}" failed. No retries permitted until 2026-02-20 07:11:34.611575189 +0000 UTC m=+1509.484201900 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/0330a367-c0c9-42a9-9993-1a3b6775fd3b-operator-scripts") pod "root-account-create-update-5vc6z" (UID: "0330a367-c0c9-42a9-9993-1a3b6775fd3b") : configmap "openstack-scripts" not found Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.614733 5094 scope.go:117] "RemoveContainer" containerID="d82cb6bb5b6363e25ca9e64db1ea09c7494c05c93392e549c89755977ef55244" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.615589 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "219c74d6-9f45-4bf8-8c67-acdea3c0fab3" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.619269 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd92d75e-9882-4bb7-a41e-cab9777424e8-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "cd92d75e-9882-4bb7-a41e-cab9777424e8" (UID: "cd92d75e-9882-4bb7-a41e-cab9777424e8"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.621799 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-kube-api-access-968ct" (OuterVolumeSpecName: "kube-api-access-968ct") pod "219c74d6-9f45-4bf8-8c67-acdea3c0fab3" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3"). InnerVolumeSpecName "kube-api-access-968ct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.624294 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "219c74d6-9f45-4bf8-8c67-acdea3c0fab3" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.660003 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd92d75e-9882-4bb7-a41e-cab9777424e8-scripts" (OuterVolumeSpecName: "scripts") pod "cd92d75e-9882-4bb7-a41e-cab9777424e8" (UID: "cd92d75e-9882-4bb7-a41e-cab9777424e8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.661406 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "219c74d6-9f45-4bf8-8c67-acdea3c0fab3" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.661405 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "219c74d6-9f45-4bf8-8c67-acdea3c0fab3" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.661536 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="a829c6b3-7069-4544-90dc-40ae83aba524" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.98:5671: connect: connection refused" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.662816 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6cc2-account-create-update-98lt6"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.663203 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "219c74d6-9f45-4bf8-8c67-acdea3c0fab3" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.663362 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd92d75e-9882-4bb7-a41e-cab9777424e8-config" (OuterVolumeSpecName: "config") pod "cd92d75e-9882-4bb7-a41e-cab9777424e8" (UID: "cd92d75e-9882-4bb7-a41e-cab9777424e8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.666034 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd92d75e-9882-4bb7-a41e-cab9777424e8-kube-api-access-8fksb" (OuterVolumeSpecName: "kube-api-access-8fksb") pod "cd92d75e-9882-4bb7-a41e-cab9777424e8" (UID: "cd92d75e-9882-4bb7-a41e-cab9777424e8"). InnerVolumeSpecName "kube-api-access-8fksb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.669193 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "219c74d6-9f45-4bf8-8c67-acdea3c0fab3" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.693157 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-6cc2-account-create-update-98lt6"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.696155 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-pod-info" (OuterVolumeSpecName: "pod-info") pod "219c74d6-9f45-4bf8-8c67-acdea3c0fab3" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.713958 5094 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.713994 5094 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-pod-info\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.714006 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd92d75e-9882-4bb7-a41e-cab9777424e8-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.714016 5094 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.714026 5094 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cd92d75e-9882-4bb7-a41e-cab9777424e8-ovn-rundir\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.714036 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-968ct\" (UniqueName: \"kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-kube-api-access-968ct\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.714049 5094 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.714057 5094 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fksb\" (UniqueName: \"kubernetes.io/projected/cd92d75e-9882-4bb7-a41e-cab9777424e8-kube-api-access-8fksb\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.714069 5094 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.714080 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.714089 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvlxh\" (UniqueName: \"kubernetes.io/projected/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6-kube-api-access-tvlxh\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.714098 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd92d75e-9882-4bb7-a41e-cab9777424e8-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.714128 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.714144 5094 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.727606 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 07:11:32 crc 
kubenswrapper[5094]: I0220 07:11:32.736854 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.738411 5094 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.752603 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-server-conf" (OuterVolumeSpecName: "server-conf") pod "219c74d6-9f45-4bf8-8c67-acdea3c0fab3" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.752731 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cd92d75e-9882-4bb7-a41e-cab9777424e8" (UID: "cd92d75e-9882-4bb7-a41e-cab9777424e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.760833 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.766544 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.777626 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data" (OuterVolumeSpecName: "config-data") pod "219c74d6-9f45-4bf8-8c67-acdea3c0fab3" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.777680 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "cd92d75e-9882-4bb7-a41e-cab9777424e8" (UID: "cd92d75e-9882-4bb7-a41e-cab9777424e8"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.792910 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "cd92d75e-9882-4bb7-a41e-cab9777424e8" (UID: "cd92d75e-9882-4bb7-a41e-cab9777424e8"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.818918 5094 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.818956 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.818966 5094 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.818975 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-config-data\") on 
node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.818985 5094 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd92d75e-9882-4bb7-a41e-cab9777424e8-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.819001 5094 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-server-conf\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.825993 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "219c74d6-9f45-4bf8-8c67-acdea3c0fab3" (UID: "219c74d6-9f45-4bf8-8c67-acdea3c0fab3"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.830517 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-5vc6z" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.920658 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0330a367-c0c9-42a9-9993-1a3b6775fd3b-operator-scripts\") pod \"0330a367-c0c9-42a9-9993-1a3b6775fd3b\" (UID: \"0330a367-c0c9-42a9-9993-1a3b6775fd3b\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.920989 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfrlw\" (UniqueName: \"kubernetes.io/projected/0330a367-c0c9-42a9-9993-1a3b6775fd3b-kube-api-access-bfrlw\") pod \"0330a367-c0c9-42a9-9993-1a3b6775fd3b\" (UID: \"0330a367-c0c9-42a9-9993-1a3b6775fd3b\") " Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.921424 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0330a367-c0c9-42a9-9993-1a3b6775fd3b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0330a367-c0c9-42a9-9993-1a3b6775fd3b" (UID: "0330a367-c0c9-42a9-9993-1a3b6775fd3b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.923588 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0330a367-c0c9-42a9-9993-1a3b6775fd3b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.923723 5094 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/219c74d6-9f45-4bf8-8c67-acdea3c0fab3-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.926972 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0330a367-c0c9-42a9-9993-1a3b6775fd3b-kube-api-access-bfrlw" (OuterVolumeSpecName: "kube-api-access-bfrlw") pod "0330a367-c0c9-42a9-9993-1a3b6775fd3b" (UID: "0330a367-c0c9-42a9-9993-1a3b6775fd3b"). InnerVolumeSpecName "kube-api-access-bfrlw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:32 crc kubenswrapper[5094]: I0220 07:11:32.987296 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.025140 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3d3ab399-3fc6-47e1-995c-5e855c554e9e-config-data-generated\") pod \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.025225 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-operator-scripts\") pod \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.025529 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.025579 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3ab399-3fc6-47e1-995c-5e855c554e9e-combined-ca-bundle\") pod \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.025767 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-config-data-default\") pod \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.025795 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5mtb\" 
(UniqueName: \"kubernetes.io/projected/3d3ab399-3fc6-47e1-995c-5e855c554e9e-kube-api-access-b5mtb\") pod \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.025872 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d3ab399-3fc6-47e1-995c-5e855c554e9e-galera-tls-certs\") pod \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.025926 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-kolla-config\") pod \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\" (UID: \"3d3ab399-3fc6-47e1-995c-5e855c554e9e\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.026673 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfrlw\" (UniqueName: \"kubernetes.io/projected/0330a367-c0c9-42a9-9993-1a3b6775fd3b-kube-api-access-bfrlw\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.027565 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "3d3ab399-3fc6-47e1-995c-5e855c554e9e" (UID: "3d3ab399-3fc6-47e1-995c-5e855c554e9e"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.028031 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d3ab399-3fc6-47e1-995c-5e855c554e9e-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "3d3ab399-3fc6-47e1-995c-5e855c554e9e" (UID: "3d3ab399-3fc6-47e1-995c-5e855c554e9e"). 
InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.028554 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "3d3ab399-3fc6-47e1-995c-5e855c554e9e" (UID: "3d3ab399-3fc6-47e1-995c-5e855c554e9e"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.032986 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d3ab399-3fc6-47e1-995c-5e855c554e9e-kube-api-access-b5mtb" (OuterVolumeSpecName: "kube-api-access-b5mtb") pod "3d3ab399-3fc6-47e1-995c-5e855c554e9e" (UID: "3d3ab399-3fc6-47e1-995c-5e855c554e9e"). InnerVolumeSpecName "kube-api-access-b5mtb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.033934 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3d3ab399-3fc6-47e1-995c-5e855c554e9e" (UID: "3d3ab399-3fc6-47e1-995c-5e855c554e9e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.085399 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "mysql-db") pod "3d3ab399-3fc6-47e1-995c-5e855c554e9e" (UID: "3d3ab399-3fc6-47e1-995c-5e855c554e9e"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.115370 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d3ab399-3fc6-47e1-995c-5e855c554e9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d3ab399-3fc6-47e1-995c-5e855c554e9e" (UID: "3d3ab399-3fc6-47e1-995c-5e855c554e9e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.121084 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d3ab399-3fc6-47e1-995c-5e855c554e9e-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "3d3ab399-3fc6-47e1-995c-5e855c554e9e" (UID: "3d3ab399-3fc6-47e1-995c-5e855c554e9e"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.131548 5094 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-kolla-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.131585 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3d3ab399-3fc6-47e1-995c-5e855c554e9e-config-data-generated\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.132082 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.132130 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.132142 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3ab399-3fc6-47e1-995c-5e855c554e9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.132151 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3d3ab399-3fc6-47e1-995c-5e855c554e9e-config-data-default\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.132160 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5mtb\" (UniqueName: \"kubernetes.io/projected/3d3ab399-3fc6-47e1-995c-5e855c554e9e-kube-api-access-b5mtb\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.132171 5094 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d3ab399-3fc6-47e1-995c-5e855c554e9e-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.147220 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/keystone-55468cd684-wv6dn" podUID="732b4015-53b2-4422-b7d1-12b65f6e0c92" containerName="keystone-api" probeResult="failure" output="Get \"https://10.217.0.151:5000/v3\": dial tcp 10.217.0.151:5000: connect: connection refused" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.162756 5094 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.237082 5094 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" 
DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.302671 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.338573 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data\") pod \"a829c6b3-7069-4544-90dc-40ae83aba524\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.338636 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-erlang-cookie\") pod \"a829c6b3-7069-4544-90dc-40ae83aba524\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.338690 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxfjr\" (UniqueName: \"kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-kube-api-access-fxfjr\") pod \"a829c6b3-7069-4544-90dc-40ae83aba524\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.338741 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-plugins\") pod \"a829c6b3-7069-4544-90dc-40ae83aba524\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.338802 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"a829c6b3-7069-4544-90dc-40ae83aba524\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " Feb 20 07:11:33 crc 
kubenswrapper[5094]: I0220 07:11:33.338828 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a829c6b3-7069-4544-90dc-40ae83aba524-erlang-cookie-secret\") pod \"a829c6b3-7069-4544-90dc-40ae83aba524\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.338861 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-tls\") pod \"a829c6b3-7069-4544-90dc-40ae83aba524\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.338930 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a829c6b3-7069-4544-90dc-40ae83aba524-pod-info\") pod \"a829c6b3-7069-4544-90dc-40ae83aba524\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.338951 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-server-conf\") pod \"a829c6b3-7069-4544-90dc-40ae83aba524\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.338978 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-confd\") pod \"a829c6b3-7069-4544-90dc-40ae83aba524\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.339039 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-plugins-conf\") pod 
\"a829c6b3-7069-4544-90dc-40ae83aba524\" (UID: \"a829c6b3-7069-4544-90dc-40ae83aba524\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.339908 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a829c6b3-7069-4544-90dc-40ae83aba524" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.342115 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a829c6b3-7069-4544-90dc-40ae83aba524" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.343101 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a829c6b3-7069-4544-90dc-40ae83aba524" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.349725 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a829c6b3-7069-4544-90dc-40ae83aba524-pod-info" (OuterVolumeSpecName: "pod-info") pod "a829c6b3-7069-4544-90dc-40ae83aba524" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.362236 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a829c6b3-7069-4544-90dc-40ae83aba524" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.362958 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a829c6b3-7069-4544-90dc-40ae83aba524-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a829c6b3-7069-4544-90dc-40ae83aba524" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.365558 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "persistence") pod "a829c6b3-7069-4544-90dc-40ae83aba524" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.374489 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-kube-api-access-fxfjr" (OuterVolumeSpecName: "kube-api-access-fxfjr") pod "a829c6b3-7069-4544-90dc-40ae83aba524" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524"). InnerVolumeSpecName "kube-api-access-fxfjr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.390606 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data" (OuterVolumeSpecName: "config-data") pod "a829c6b3-7069-4544-90dc-40ae83aba524" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.411599 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-server-conf" (OuterVolumeSpecName: "server-conf") pod "a829c6b3-7069-4544-90dc-40ae83aba524" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.440888 5094 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a829c6b3-7069-4544-90dc-40ae83aba524-pod-info\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.440937 5094 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-server-conf\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.440951 5094 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.440962 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a829c6b3-7069-4544-90dc-40ae83aba524-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc 
kubenswrapper[5094]: I0220 07:11:33.440973 5094 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.440989 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxfjr\" (UniqueName: \"kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-kube-api-access-fxfjr\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.440998 5094 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.441024 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.441033 5094 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a829c6b3-7069-4544-90dc-40ae83aba524-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.441041 5094 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.460498 5094 generic.go:334] "Generic (PLEG): container finished" podID="a5bbb9ad-deeb-495f-9750-f7012c00061d" containerID="c9d8bef2aa627582c2efca41433b2b24a6bb42ef156ade330dc390cb501211cf" exitCode=0 Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.460562 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-conductor-0" event={"ID":"a5bbb9ad-deeb-495f-9750-f7012c00061d","Type":"ContainerDied","Data":"c9d8bef2aa627582c2efca41433b2b24a6bb42ef156ade330dc390cb501211cf"} Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.462644 5094 generic.go:334] "Generic (PLEG): container finished" podID="732b4015-53b2-4422-b7d1-12b65f6e0c92" containerID="d6784972d1434d152307a7464021eb59a6a508fa634f71c81ae53ac06db7b051" exitCode=0 Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.462684 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-55468cd684-wv6dn" event={"ID":"732b4015-53b2-4422-b7d1-12b65f6e0c92","Type":"ContainerDied","Data":"d6784972d1434d152307a7464021eb59a6a508fa634f71c81ae53ac06db7b051"} Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.466023 5094 generic.go:334] "Generic (PLEG): container finished" podID="3d3ab399-3fc6-47e1-995c-5e855c554e9e" containerID="b512a557484cc85ddc51b76d4e4921f265f1c6139b4d9b37aa3c645ee24c7d27" exitCode=0 Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.466067 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3d3ab399-3fc6-47e1-995c-5e855c554e9e","Type":"ContainerDied","Data":"b512a557484cc85ddc51b76d4e4921f265f1c6139b4d9b37aa3c645ee24c7d27"} Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.466084 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3d3ab399-3fc6-47e1-995c-5e855c554e9e","Type":"ContainerDied","Data":"69690ca4b7abd1bc1955c808ad93fa95a3a579fa31419e9ead102d78d2680915"} Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.466106 5094 scope.go:117] "RemoveContainer" containerID="b512a557484cc85ddc51b76d4e4921f265f1c6139b4d9b37aa3c645ee24c7d27" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.466224 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.469072 5094 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.472666 5094 generic.go:334] "Generic (PLEG): container finished" podID="b1504790-ccaf-42d5-a28a-a25f0cd353c9" containerID="e21d77d7f75d6c24720a691c3dafa0724131a4a216a5cb72213f2651bb77470e" exitCode=0 Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.472870 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b1504790-ccaf-42d5-a28a-a25f0cd353c9","Type":"ContainerDied","Data":"e21d77d7f75d6c24720a691c3dafa0724131a4a216a5cb72213f2651bb77470e"} Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.477416 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5vc6z" event={"ID":"0330a367-c0c9-42a9-9993-1a3b6775fd3b","Type":"ContainerDied","Data":"09536db7b188d6d7e190e80f4a66c3a18be0cf58fe509f674089f0d4cd9626eb"} Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.477497 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5vc6z" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.482655 5094 generic.go:334] "Generic (PLEG): container finished" podID="a829c6b3-7069-4544-90dc-40ae83aba524" containerID="bf7d170cd7c0f8170ef78ab632229324322185632d477a161a83826e71f489e8" exitCode=0 Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.482752 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.482757 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a829c6b3-7069-4544-90dc-40ae83aba524","Type":"ContainerDied","Data":"bf7d170cd7c0f8170ef78ab632229324322185632d477a161a83826e71f489e8"} Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.482836 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a829c6b3-7069-4544-90dc-40ae83aba524","Type":"ContainerDied","Data":"85bfead9f46b6ccf4ed616ed7699233ce7fd6b5f310fe2c58dee08099b82f9b7"} Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.489855 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a829c6b3-7069-4544-90dc-40ae83aba524" (UID: "a829c6b3-7069-4544-90dc-40ae83aba524"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.494262 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.494288 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.494309 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"219c74d6-9f45-4bf8-8c67-acdea3c0fab3","Type":"ContainerDied","Data":"a54c1c15c6f9b215b75b1006c2e0a8430344b718fe1307994740a6fb6ec55a17"} Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.544421 5094 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.544463 5094 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a829c6b3-7069-4544-90dc-40ae83aba524-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.545386 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.555432 5094 scope.go:117] "RemoveContainer" containerID="c2fe8328d6470157d2d40189e9eccecbe7d439584278e384fea09cbd5a11eb25" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.593394 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-5vc6z"] Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.604660 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-5vc6z"] Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.620655 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.621116 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.621442 5094 scope.go:117] "RemoveContainer" containerID="b512a557484cc85ddc51b76d4e4921f265f1c6139b4d9b37aa3c645ee24c7d27" Feb 20 07:11:33 crc kubenswrapper[5094]: E0220 07:11:33.622565 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b512a557484cc85ddc51b76d4e4921f265f1c6139b4d9b37aa3c645ee24c7d27\": container with ID starting with b512a557484cc85ddc51b76d4e4921f265f1c6139b4d9b37aa3c645ee24c7d27 not found: ID does not exist" containerID="b512a557484cc85ddc51b76d4e4921f265f1c6139b4d9b37aa3c645ee24c7d27" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.622607 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b512a557484cc85ddc51b76d4e4921f265f1c6139b4d9b37aa3c645ee24c7d27"} err="failed to get container status \"b512a557484cc85ddc51b76d4e4921f265f1c6139b4d9b37aa3c645ee24c7d27\": rpc error: code = NotFound desc = could not find container \"b512a557484cc85ddc51b76d4e4921f265f1c6139b4d9b37aa3c645ee24c7d27\": container with ID starting with b512a557484cc85ddc51b76d4e4921f265f1c6139b4d9b37aa3c645ee24c7d27 not found: ID does not exist" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.622634 5094 scope.go:117] "RemoveContainer" containerID="c2fe8328d6470157d2d40189e9eccecbe7d439584278e384fea09cbd5a11eb25" Feb 20 07:11:33 crc kubenswrapper[5094]: E0220 07:11:33.622915 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2fe8328d6470157d2d40189e9eccecbe7d439584278e384fea09cbd5a11eb25\": container with ID starting with c2fe8328d6470157d2d40189e9eccecbe7d439584278e384fea09cbd5a11eb25 not found: 
ID does not exist" containerID="c2fe8328d6470157d2d40189e9eccecbe7d439584278e384fea09cbd5a11eb25" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.622941 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2fe8328d6470157d2d40189e9eccecbe7d439584278e384fea09cbd5a11eb25"} err="failed to get container status \"c2fe8328d6470157d2d40189e9eccecbe7d439584278e384fea09cbd5a11eb25\": rpc error: code = NotFound desc = could not find container \"c2fe8328d6470157d2d40189e9eccecbe7d439584278e384fea09cbd5a11eb25\": container with ID starting with c2fe8328d6470157d2d40189e9eccecbe7d439584278e384fea09cbd5a11eb25 not found: ID does not exist" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.622960 5094 scope.go:117] "RemoveContainer" containerID="e29fe67f4e435861ca66b7f08084602af4c30866304a951f1e2cec0fe4e96725" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.630570 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.641552 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.645818 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dw5l\" (UniqueName: \"kubernetes.io/projected/732b4015-53b2-4422-b7d1-12b65f6e0c92-kube-api-access-7dw5l\") pod \"732b4015-53b2-4422-b7d1-12b65f6e0c92\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.645868 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-whfw7\" (UniqueName: \"kubernetes.io/projected/a5bbb9ad-deeb-495f-9750-f7012c00061d-kube-api-access-whfw7\") pod \"a5bbb9ad-deeb-495f-9750-f7012c00061d\" (UID: \"a5bbb9ad-deeb-495f-9750-f7012c00061d\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.645900 5094 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-combined-ca-bundle\") pod \"732b4015-53b2-4422-b7d1-12b65f6e0c92\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.645963 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-fernet-keys\") pod \"732b4015-53b2-4422-b7d1-12b65f6e0c92\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.646029 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5bbb9ad-deeb-495f-9750-f7012c00061d-config-data\") pod \"a5bbb9ad-deeb-495f-9750-f7012c00061d\" (UID: \"a5bbb9ad-deeb-495f-9750-f7012c00061d\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.646095 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-credential-keys\") pod \"732b4015-53b2-4422-b7d1-12b65f6e0c92\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.646128 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-internal-tls-certs\") pod \"732b4015-53b2-4422-b7d1-12b65f6e0c92\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.646150 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-scripts\") pod \"732b4015-53b2-4422-b7d1-12b65f6e0c92\" (UID: 
\"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.646170 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-config-data\") pod \"732b4015-53b2-4422-b7d1-12b65f6e0c92\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.647130 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5bbb9ad-deeb-495f-9750-f7012c00061d-combined-ca-bundle\") pod \"a5bbb9ad-deeb-495f-9750-f7012c00061d\" (UID: \"a5bbb9ad-deeb-495f-9750-f7012c00061d\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.647158 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-public-tls-certs\") pod \"732b4015-53b2-4422-b7d1-12b65f6e0c92\" (UID: \"732b4015-53b2-4422-b7d1-12b65f6e0c92\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.651201 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "732b4015-53b2-4422-b7d1-12b65f6e0c92" (UID: "732b4015-53b2-4422-b7d1-12b65f6e0c92"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.651963 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "732b4015-53b2-4422-b7d1-12b65f6e0c92" (UID: "732b4015-53b2-4422-b7d1-12b65f6e0c92"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.656491 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5bbb9ad-deeb-495f-9750-f7012c00061d-kube-api-access-whfw7" (OuterVolumeSpecName: "kube-api-access-whfw7") pod "a5bbb9ad-deeb-495f-9750-f7012c00061d" (UID: "a5bbb9ad-deeb-495f-9750-f7012c00061d"). InnerVolumeSpecName "kube-api-access-whfw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.658308 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-scripts" (OuterVolumeSpecName: "scripts") pod "732b4015-53b2-4422-b7d1-12b65f6e0c92" (UID: "732b4015-53b2-4422-b7d1-12b65f6e0c92"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.658437 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.662638 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/732b4015-53b2-4422-b7d1-12b65f6e0c92-kube-api-access-7dw5l" (OuterVolumeSpecName: "kube-api-access-7dw5l") pod "732b4015-53b2-4422-b7d1-12b65f6e0c92" (UID: "732b4015-53b2-4422-b7d1-12b65f6e0c92"). InnerVolumeSpecName "kube-api-access-7dw5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.674695 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.690575 5094 scope.go:117] "RemoveContainer" containerID="bf7d170cd7c0f8170ef78ab632229324322185632d477a161a83826e71f489e8" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.691784 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.702870 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.703996 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5bbb9ad-deeb-495f-9750-f7012c00061d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5bbb9ad-deeb-495f-9750-f7012c00061d" (UID: "a5bbb9ad-deeb-495f-9750-f7012c00061d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.739454 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5bbb9ad-deeb-495f-9750-f7012c00061d-config-data" (OuterVolumeSpecName: "config-data") pod "a5bbb9ad-deeb-495f-9750-f7012c00061d" (UID: "a5bbb9ad-deeb-495f-9750-f7012c00061d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.748544 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1504790-ccaf-42d5-a28a-a25f0cd353c9-config-data\") pod \"b1504790-ccaf-42d5-a28a-a25f0cd353c9\" (UID: \"b1504790-ccaf-42d5-a28a-a25f0cd353c9\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.748660 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpdrv\" (UniqueName: \"kubernetes.io/projected/b1504790-ccaf-42d5-a28a-a25f0cd353c9-kube-api-access-fpdrv\") pod \"b1504790-ccaf-42d5-a28a-a25f0cd353c9\" (UID: \"b1504790-ccaf-42d5-a28a-a25f0cd353c9\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.748943 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1504790-ccaf-42d5-a28a-a25f0cd353c9-combined-ca-bundle\") pod \"b1504790-ccaf-42d5-a28a-a25f0cd353c9\" (UID: \"b1504790-ccaf-42d5-a28a-a25f0cd353c9\") " Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.749418 5094 scope.go:117] "RemoveContainer" containerID="24c496b6fb0954ae89602f61cc9646541943f3eefe9945f4d983f81e20a69e1c" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.749473 5094 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.749489 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5bbb9ad-deeb-495f-9750-f7012c00061d-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.749503 5094 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.749515 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.749531 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5bbb9ad-deeb-495f-9750-f7012c00061d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.749542 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dw5l\" (UniqueName: \"kubernetes.io/projected/732b4015-53b2-4422-b7d1-12b65f6e0c92-kube-api-access-7dw5l\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.749554 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-whfw7\" (UniqueName: \"kubernetes.io/projected/a5bbb9ad-deeb-495f-9750-f7012c00061d-kube-api-access-whfw7\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.753403 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1504790-ccaf-42d5-a28a-a25f0cd353c9-kube-api-access-fpdrv" (OuterVolumeSpecName: "kube-api-access-fpdrv") pod "b1504790-ccaf-42d5-a28a-a25f0cd353c9" (UID: "b1504790-ccaf-42d5-a28a-a25f0cd353c9"). InnerVolumeSpecName "kube-api-access-fpdrv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.769293 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "732b4015-53b2-4422-b7d1-12b65f6e0c92" (UID: "732b4015-53b2-4422-b7d1-12b65f6e0c92"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.772314 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "732b4015-53b2-4422-b7d1-12b65f6e0c92" (UID: "732b4015-53b2-4422-b7d1-12b65f6e0c92"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.774136 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-config-data" (OuterVolumeSpecName: "config-data") pod "732b4015-53b2-4422-b7d1-12b65f6e0c92" (UID: "732b4015-53b2-4422-b7d1-12b65f6e0c92"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.778915 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1504790-ccaf-42d5-a28a-a25f0cd353c9-config-data" (OuterVolumeSpecName: "config-data") pod "b1504790-ccaf-42d5-a28a-a25f0cd353c9" (UID: "b1504790-ccaf-42d5-a28a-a25f0cd353c9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.785726 5094 scope.go:117] "RemoveContainer" containerID="bf7d170cd7c0f8170ef78ab632229324322185632d477a161a83826e71f489e8" Feb 20 07:11:33 crc kubenswrapper[5094]: E0220 07:11:33.786285 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf7d170cd7c0f8170ef78ab632229324322185632d477a161a83826e71f489e8\": container with ID starting with bf7d170cd7c0f8170ef78ab632229324322185632d477a161a83826e71f489e8 not found: ID does not exist" containerID="bf7d170cd7c0f8170ef78ab632229324322185632d477a161a83826e71f489e8" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.786325 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf7d170cd7c0f8170ef78ab632229324322185632d477a161a83826e71f489e8"} err="failed to get container status \"bf7d170cd7c0f8170ef78ab632229324322185632d477a161a83826e71f489e8\": rpc error: code = NotFound desc = could not find container \"bf7d170cd7c0f8170ef78ab632229324322185632d477a161a83826e71f489e8\": container with ID starting with bf7d170cd7c0f8170ef78ab632229324322185632d477a161a83826e71f489e8 not found: ID does not exist" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.786356 5094 scope.go:117] "RemoveContainer" containerID="24c496b6fb0954ae89602f61cc9646541943f3eefe9945f4d983f81e20a69e1c" Feb 20 07:11:33 crc kubenswrapper[5094]: E0220 07:11:33.787371 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24c496b6fb0954ae89602f61cc9646541943f3eefe9945f4d983f81e20a69e1c\": container with ID starting with 24c496b6fb0954ae89602f61cc9646541943f3eefe9945f4d983f81e20a69e1c not found: ID does not exist" containerID="24c496b6fb0954ae89602f61cc9646541943f3eefe9945f4d983f81e20a69e1c" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.787405 
5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24c496b6fb0954ae89602f61cc9646541943f3eefe9945f4d983f81e20a69e1c"} err="failed to get container status \"24c496b6fb0954ae89602f61cc9646541943f3eefe9945f4d983f81e20a69e1c\": rpc error: code = NotFound desc = could not find container \"24c496b6fb0954ae89602f61cc9646541943f3eefe9945f4d983f81e20a69e1c\": container with ID starting with 24c496b6fb0954ae89602f61cc9646541943f3eefe9945f4d983f81e20a69e1c not found: ID does not exist" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.787424 5094 scope.go:117] "RemoveContainer" containerID="74e0d7c23ec3f1be5316db26c770a1e0ec492750a824549bed30f944a01c88b6" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.787550 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1504790-ccaf-42d5-a28a-a25f0cd353c9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1504790-ccaf-42d5-a28a-a25f0cd353c9" (UID: "b1504790-ccaf-42d5-a28a-a25f0cd353c9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.801278 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "732b4015-53b2-4422-b7d1-12b65f6e0c92" (UID: "732b4015-53b2-4422-b7d1-12b65f6e0c92"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.806799 5094 scope.go:117] "RemoveContainer" containerID="0b471ed3a22d48304ba36ebcf4e6acbda6b19074d6e3a72832e1dda4c6f2f145" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.822585 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.828291 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.850296 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0330a367-c0c9-42a9-9993-1a3b6775fd3b" path="/var/lib/kubelet/pods/0330a367-c0c9-42a9-9993-1a3b6775fd3b/volumes" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.851223 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="219c74d6-9f45-4bf8-8c67-acdea3c0fab3" path="/var/lib/kubelet/pods/219c74d6-9f45-4bf8-8c67-acdea3c0fab3/volumes" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.851630 5094 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.851654 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b1504790-ccaf-42d5-a28a-a25f0cd353c9-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.851674 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.851687 5094 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.851714 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpdrv\" (UniqueName: \"kubernetes.io/projected/b1504790-ccaf-42d5-a28a-a25f0cd353c9-kube-api-access-fpdrv\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.851725 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/732b4015-53b2-4422-b7d1-12b65f6e0c92-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.851734 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1504790-ccaf-42d5-a28a-a25f0cd353c9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.853595 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d3ab399-3fc6-47e1-995c-5e855c554e9e" path="/var/lib/kubelet/pods/3d3ab399-3fc6-47e1-995c-5e855c554e9e/volumes" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.854401 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d9f1f40-92cc-4f19-9f3b-49651f56bffb" path="/var/lib/kubelet/pods/5d9f1f40-92cc-4f19-9f3b-49651f56bffb/volumes" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.856660 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="762a565c-672e-4127-a8c6-90f721eeda81" path="/var/lib/kubelet/pods/762a565c-672e-4127-a8c6-90f721eeda81/volumes" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.858740 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dd0ff85-ae3a-4035-a096-fea5952b19a7" path="/var/lib/kubelet/pods/7dd0ff85-ae3a-4035-a096-fea5952b19a7/volumes" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 
07:11:33.859494 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a829c6b3-7069-4544-90dc-40ae83aba524" path="/var/lib/kubelet/pods/a829c6b3-7069-4544-90dc-40ae83aba524/volumes" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.860437 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6" path="/var/lib/kubelet/pods/c2f1f244-d0f0-44ca-ab5a-496ba0f59aa6/volumes" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.860805 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd92d75e-9882-4bb7-a41e-cab9777424e8" path="/var/lib/kubelet/pods/cd92d75e-9882-4bb7-a41e-cab9777424e8/volumes" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.861352 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d01cbaa4-5543-4cd5-b098-7e4600819d32" path="/var/lib/kubelet/pods/d01cbaa4-5543-4cd5-b098-7e4600819d32/volumes" Feb 20 07:11:33 crc kubenswrapper[5094]: I0220 07:11:33.862351 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3caa33a-a0ec-4fdc-876b-266724a5af50" path="/var/lib/kubelet/pods/f3caa33a-a0ec-4fdc-876b-266724a5af50/volumes" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.015672 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.054014 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-combined-ca-bundle\") pod \"1218d679-0e51-4bef-9526-db16c8783d8b\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.054085 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-config-data\") pod \"1218d679-0e51-4bef-9526-db16c8783d8b\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.054141 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-ceilometer-tls-certs\") pod \"1218d679-0e51-4bef-9526-db16c8783d8b\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.054187 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1218d679-0e51-4bef-9526-db16c8783d8b-run-httpd\") pod \"1218d679-0e51-4bef-9526-db16c8783d8b\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.054222 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-scripts\") pod \"1218d679-0e51-4bef-9526-db16c8783d8b\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.054303 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmtnp\" (UniqueName: 
\"kubernetes.io/projected/1218d679-0e51-4bef-9526-db16c8783d8b-kube-api-access-rmtnp\") pod \"1218d679-0e51-4bef-9526-db16c8783d8b\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.054337 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1218d679-0e51-4bef-9526-db16c8783d8b-log-httpd\") pod \"1218d679-0e51-4bef-9526-db16c8783d8b\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.054361 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-sg-core-conf-yaml\") pod \"1218d679-0e51-4bef-9526-db16c8783d8b\" (UID: \"1218d679-0e51-4bef-9526-db16c8783d8b\") " Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.055016 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1218d679-0e51-4bef-9526-db16c8783d8b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1218d679-0e51-4bef-9526-db16c8783d8b" (UID: "1218d679-0e51-4bef-9526-db16c8783d8b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.055955 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1218d679-0e51-4bef-9526-db16c8783d8b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1218d679-0e51-4bef-9526-db16c8783d8b" (UID: "1218d679-0e51-4bef-9526-db16c8783d8b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.063158 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-scripts" (OuterVolumeSpecName: "scripts") pod "1218d679-0e51-4bef-9526-db16c8783d8b" (UID: "1218d679-0e51-4bef-9526-db16c8783d8b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.071961 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1218d679-0e51-4bef-9526-db16c8783d8b-kube-api-access-rmtnp" (OuterVolumeSpecName: "kube-api-access-rmtnp") pod "1218d679-0e51-4bef-9526-db16c8783d8b" (UID: "1218d679-0e51-4bef-9526-db16c8783d8b"). InnerVolumeSpecName "kube-api-access-rmtnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.125797 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1218d679-0e51-4bef-9526-db16c8783d8b" (UID: "1218d679-0e51-4bef-9526-db16c8783d8b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.138900 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1218d679-0e51-4bef-9526-db16c8783d8b" (UID: "1218d679-0e51-4bef-9526-db16c8783d8b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.155929 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "1218d679-0e51-4bef-9526-db16c8783d8b" (UID: "1218d679-0e51-4bef-9526-db16c8783d8b"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.161152 5094 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.161191 5094 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1218d679-0e51-4bef-9526-db16c8783d8b-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.161202 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.161214 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmtnp\" (UniqueName: \"kubernetes.io/projected/1218d679-0e51-4bef-9526-db16c8783d8b-kube-api-access-rmtnp\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.161232 5094 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1218d679-0e51-4bef-9526-db16c8783d8b-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.161242 5094 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.161252 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.174964 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-config-data" (OuterVolumeSpecName: "config-data") pod "1218d679-0e51-4bef-9526-db16c8783d8b" (UID: "1218d679-0e51-4bef-9526-db16c8783d8b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.262728 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1218d679-0e51-4bef-9526-db16c8783d8b-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.510976 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b1504790-ccaf-42d5-a28a-a25f0cd353c9","Type":"ContainerDied","Data":"15996ec151f9ac116c6912aa4e992bb9af3fc72485808d76d5e14b93da12f57f"} Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.511043 5094 scope.go:117] "RemoveContainer" containerID="e21d77d7f75d6c24720a691c3dafa0724131a4a216a5cb72213f2651bb77470e" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.511002 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.524986 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a5bbb9ad-deeb-495f-9750-f7012c00061d","Type":"ContainerDied","Data":"8fedbe0d4bad6502fc4850eabe0e6dd2cb68cdc0ac0b36644634a1ea4c48d937"} Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.525069 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.537933 5094 scope.go:117] "RemoveContainer" containerID="c9d8bef2aa627582c2efca41433b2b24a6bb42ef156ade330dc390cb501211cf" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.543558 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.548211 5094 generic.go:334] "Generic (PLEG): container finished" podID="1218d679-0e51-4bef-9526-db16c8783d8b" containerID="b54b09ff79204e43835570c3e6dd4b61080c0d99a2704ffb7699787efee4998c" exitCode=0 Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.548291 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1218d679-0e51-4bef-9526-db16c8783d8b","Type":"ContainerDied","Data":"b54b09ff79204e43835570c3e6dd4b61080c0d99a2704ffb7699787efee4998c"} Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.548325 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1218d679-0e51-4bef-9526-db16c8783d8b","Type":"ContainerDied","Data":"98363b14b517b7143e52cebeb4cbb5e2929e2cad785f4ba33d41cbc51fc7ecfd"} Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.548401 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.552860 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-55468cd684-wv6dn" event={"ID":"732b4015-53b2-4422-b7d1-12b65f6e0c92","Type":"ContainerDied","Data":"0e6de3c16ee5f3004f5d74169204f09eb1abb0191c70b76d1a09ff44b2f07e6d"} Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.552923 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-55468cd684-wv6dn" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.562875 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.569182 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.569236 5094 scope.go:117] "RemoveContainer" containerID="4073373c0648687276e84e986260ddd367c34f5129c119e3fce3d4726ca86b92" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.584813 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.596189 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-55468cd684-wv6dn"] Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.606741 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-55468cd684-wv6dn"] Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.611339 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.615737 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.627243 5094 scope.go:117] "RemoveContainer" containerID="0ea214df3349d0a0166b8f5bd5c44fd7b531097f01bbd476b5f80570546b706a" Feb 
20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.649446 5094 scope.go:117] "RemoveContainer" containerID="b54b09ff79204e43835570c3e6dd4b61080c0d99a2704ffb7699787efee4998c" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.679026 5094 scope.go:117] "RemoveContainer" containerID="f580b2199d149d5c06b5c9bfb9b0ef745161fcfd54354c074e1bf52afa2fbbe4" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.706377 5094 scope.go:117] "RemoveContainer" containerID="4073373c0648687276e84e986260ddd367c34f5129c119e3fce3d4726ca86b92" Feb 20 07:11:34 crc kubenswrapper[5094]: E0220 07:11:34.707323 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4073373c0648687276e84e986260ddd367c34f5129c119e3fce3d4726ca86b92\": container with ID starting with 4073373c0648687276e84e986260ddd367c34f5129c119e3fce3d4726ca86b92 not found: ID does not exist" containerID="4073373c0648687276e84e986260ddd367c34f5129c119e3fce3d4726ca86b92" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.707382 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4073373c0648687276e84e986260ddd367c34f5129c119e3fce3d4726ca86b92"} err="failed to get container status \"4073373c0648687276e84e986260ddd367c34f5129c119e3fce3d4726ca86b92\": rpc error: code = NotFound desc = could not find container \"4073373c0648687276e84e986260ddd367c34f5129c119e3fce3d4726ca86b92\": container with ID starting with 4073373c0648687276e84e986260ddd367c34f5129c119e3fce3d4726ca86b92 not found: ID does not exist" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.707417 5094 scope.go:117] "RemoveContainer" containerID="0ea214df3349d0a0166b8f5bd5c44fd7b531097f01bbd476b5f80570546b706a" Feb 20 07:11:34 crc kubenswrapper[5094]: E0220 07:11:34.708135 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0ea214df3349d0a0166b8f5bd5c44fd7b531097f01bbd476b5f80570546b706a\": container with ID starting with 0ea214df3349d0a0166b8f5bd5c44fd7b531097f01bbd476b5f80570546b706a not found: ID does not exist" containerID="0ea214df3349d0a0166b8f5bd5c44fd7b531097f01bbd476b5f80570546b706a" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.708184 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ea214df3349d0a0166b8f5bd5c44fd7b531097f01bbd476b5f80570546b706a"} err="failed to get container status \"0ea214df3349d0a0166b8f5bd5c44fd7b531097f01bbd476b5f80570546b706a\": rpc error: code = NotFound desc = could not find container \"0ea214df3349d0a0166b8f5bd5c44fd7b531097f01bbd476b5f80570546b706a\": container with ID starting with 0ea214df3349d0a0166b8f5bd5c44fd7b531097f01bbd476b5f80570546b706a not found: ID does not exist" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.708226 5094 scope.go:117] "RemoveContainer" containerID="b54b09ff79204e43835570c3e6dd4b61080c0d99a2704ffb7699787efee4998c" Feb 20 07:11:34 crc kubenswrapper[5094]: E0220 07:11:34.708609 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b54b09ff79204e43835570c3e6dd4b61080c0d99a2704ffb7699787efee4998c\": container with ID starting with b54b09ff79204e43835570c3e6dd4b61080c0d99a2704ffb7699787efee4998c not found: ID does not exist" containerID="b54b09ff79204e43835570c3e6dd4b61080c0d99a2704ffb7699787efee4998c" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.708635 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b54b09ff79204e43835570c3e6dd4b61080c0d99a2704ffb7699787efee4998c"} err="failed to get container status \"b54b09ff79204e43835570c3e6dd4b61080c0d99a2704ffb7699787efee4998c\": rpc error: code = NotFound desc = could not find container \"b54b09ff79204e43835570c3e6dd4b61080c0d99a2704ffb7699787efee4998c\": container with ID 
starting with b54b09ff79204e43835570c3e6dd4b61080c0d99a2704ffb7699787efee4998c not found: ID does not exist" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.708653 5094 scope.go:117] "RemoveContainer" containerID="f580b2199d149d5c06b5c9bfb9b0ef745161fcfd54354c074e1bf52afa2fbbe4" Feb 20 07:11:34 crc kubenswrapper[5094]: E0220 07:11:34.708981 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f580b2199d149d5c06b5c9bfb9b0ef745161fcfd54354c074e1bf52afa2fbbe4\": container with ID starting with f580b2199d149d5c06b5c9bfb9b0ef745161fcfd54354c074e1bf52afa2fbbe4 not found: ID does not exist" containerID="f580b2199d149d5c06b5c9bfb9b0ef745161fcfd54354c074e1bf52afa2fbbe4" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.709003 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f580b2199d149d5c06b5c9bfb9b0ef745161fcfd54354c074e1bf52afa2fbbe4"} err="failed to get container status \"f580b2199d149d5c06b5c9bfb9b0ef745161fcfd54354c074e1bf52afa2fbbe4\": rpc error: code = NotFound desc = could not find container \"f580b2199d149d5c06b5c9bfb9b0ef745161fcfd54354c074e1bf52afa2fbbe4\": container with ID starting with f580b2199d149d5c06b5c9bfb9b0ef745161fcfd54354c074e1bf52afa2fbbe4 not found: ID does not exist" Feb 20 07:11:34 crc kubenswrapper[5094]: I0220 07:11:34.709018 5094 scope.go:117] "RemoveContainer" containerID="d6784972d1434d152307a7464021eb59a6a508fa634f71c81ae53ac06db7b051" Feb 20 07:11:35 crc kubenswrapper[5094]: I0220 07:11:35.862546 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" path="/var/lib/kubelet/pods/1218d679-0e51-4bef-9526-db16c8783d8b/volumes" Feb 20 07:11:35 crc kubenswrapper[5094]: I0220 07:11:35.863916 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="732b4015-53b2-4422-b7d1-12b65f6e0c92" 
path="/var/lib/kubelet/pods/732b4015-53b2-4422-b7d1-12b65f6e0c92/volumes" Feb 20 07:11:35 crc kubenswrapper[5094]: I0220 07:11:35.864572 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5bbb9ad-deeb-495f-9750-f7012c00061d" path="/var/lib/kubelet/pods/a5bbb9ad-deeb-495f-9750-f7012c00061d/volumes" Feb 20 07:11:35 crc kubenswrapper[5094]: I0220 07:11:35.869162 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1504790-ccaf-42d5-a28a-a25f0cd353c9" path="/var/lib/kubelet/pods/b1504790-ccaf-42d5-a28a-a25f0cd353c9/volumes" Feb 20 07:11:36 crc kubenswrapper[5094]: E0220 07:11:36.925885 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:36 crc kubenswrapper[5094]: E0220 07:11:36.926326 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:36 crc kubenswrapper[5094]: E0220 07:11:36.926589 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" 
cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:36 crc kubenswrapper[5094]: E0220 07:11:36.926619 5094 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tj42x" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovsdb-server" Feb 20 07:11:36 crc kubenswrapper[5094]: E0220 07:11:36.927358 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:36 crc kubenswrapper[5094]: E0220 07:11:36.928647 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:36 crc kubenswrapper[5094]: E0220 07:11:36.936529 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:36 crc kubenswrapper[5094]: E0220 07:11:36.936566 5094 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" 
pod="openstack/ovn-controller-ovs-tj42x" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovs-vswitchd" Feb 20 07:11:41 crc kubenswrapper[5094]: E0220 07:11:41.928118 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:41 crc kubenswrapper[5094]: E0220 07:11:41.928355 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:41 crc kubenswrapper[5094]: E0220 07:11:41.932005 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:41 crc kubenswrapper[5094]: E0220 07:11:41.935390 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:41 crc kubenswrapper[5094]: E0220 
07:11:41.935464 5094 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tj42x" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovsdb-server" Feb 20 07:11:41 crc kubenswrapper[5094]: E0220 07:11:41.935492 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:41 crc kubenswrapper[5094]: E0220 07:11:41.937642 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:41 crc kubenswrapper[5094]: E0220 07:11:41.937719 5094 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tj42x" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovs-vswitchd" Feb 20 07:11:42 crc kubenswrapper[5094]: I0220 07:11:42.669838 5094 generic.go:334] "Generic (PLEG): container finished" podID="530069d2-7146-46eb-9c88-056cc8a583b2" containerID="2d88bbf120f5f105b4c231e353a818cac99e32ccbc4f6e26a6b8f4bf8f8c6db4" exitCode=0 Feb 20 07:11:42 crc kubenswrapper[5094]: I0220 07:11:42.669914 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54bd68f77-fkqmr" 
event={"ID":"530069d2-7146-46eb-9c88-056cc8a583b2","Type":"ContainerDied","Data":"2d88bbf120f5f105b4c231e353a818cac99e32ccbc4f6e26a6b8f4bf8f8c6db4"} Feb 20 07:11:42 crc kubenswrapper[5094]: I0220 07:11:42.841336 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:11:42 crc kubenswrapper[5094]: E0220 07:11:42.842158 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:11:42 crc kubenswrapper[5094]: I0220 07:11:42.953597 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.149934 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-public-tls-certs\") pod \"530069d2-7146-46eb-9c88-056cc8a583b2\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.150041 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-internal-tls-certs\") pod \"530069d2-7146-46eb-9c88-056cc8a583b2\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.150158 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-config\") pod 
\"530069d2-7146-46eb-9c88-056cc8a583b2\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.151104 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-combined-ca-bundle\") pod \"530069d2-7146-46eb-9c88-056cc8a583b2\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.151314 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmlpd\" (UniqueName: \"kubernetes.io/projected/530069d2-7146-46eb-9c88-056cc8a583b2-kube-api-access-wmlpd\") pod \"530069d2-7146-46eb-9c88-056cc8a583b2\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.151538 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-ovndb-tls-certs\") pod \"530069d2-7146-46eb-9c88-056cc8a583b2\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.151657 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-httpd-config\") pod \"530069d2-7146-46eb-9c88-056cc8a583b2\" (UID: \"530069d2-7146-46eb-9c88-056cc8a583b2\") " Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.169752 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/530069d2-7146-46eb-9c88-056cc8a583b2-kube-api-access-wmlpd" (OuterVolumeSpecName: "kube-api-access-wmlpd") pod "530069d2-7146-46eb-9c88-056cc8a583b2" (UID: "530069d2-7146-46eb-9c88-056cc8a583b2"). InnerVolumeSpecName "kube-api-access-wmlpd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.175928 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "530069d2-7146-46eb-9c88-056cc8a583b2" (UID: "530069d2-7146-46eb-9c88-056cc8a583b2"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.211391 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "530069d2-7146-46eb-9c88-056cc8a583b2" (UID: "530069d2-7146-46eb-9c88-056cc8a583b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.218242 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "530069d2-7146-46eb-9c88-056cc8a583b2" (UID: "530069d2-7146-46eb-9c88-056cc8a583b2"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.220021 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-config" (OuterVolumeSpecName: "config") pod "530069d2-7146-46eb-9c88-056cc8a583b2" (UID: "530069d2-7146-46eb-9c88-056cc8a583b2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.225550 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "530069d2-7146-46eb-9c88-056cc8a583b2" (UID: "530069d2-7146-46eb-9c88-056cc8a583b2"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.231460 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "530069d2-7146-46eb-9c88-056cc8a583b2" (UID: "530069d2-7146-46eb-9c88-056cc8a583b2"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.255553 5094 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.255603 5094 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.255623 5094 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.255645 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-config\") on node \"crc\" DevicePath \"\"" Feb 20 
07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.255666 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.255685 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmlpd\" (UniqueName: \"kubernetes.io/projected/530069d2-7146-46eb-9c88-056cc8a583b2-kube-api-access-wmlpd\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.255725 5094 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/530069d2-7146-46eb-9c88-056cc8a583b2-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.684758 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54bd68f77-fkqmr" event={"ID":"530069d2-7146-46eb-9c88-056cc8a583b2","Type":"ContainerDied","Data":"bd1a8216869bf24d8d0b2b58cd3351c7ec8fb6113dac3dda70cf8b0a5e81d368"} Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.684857 5094 scope.go:117] "RemoveContainer" containerID="5cb136f403a952c3f992bdf43a3607fce1325d571764a30723172fab59ca2ce3" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.684883 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-54bd68f77-fkqmr" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.722004 5094 scope.go:117] "RemoveContainer" containerID="2d88bbf120f5f105b4c231e353a818cac99e32ccbc4f6e26a6b8f4bf8f8c6db4" Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.745463 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-54bd68f77-fkqmr"] Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.753295 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-54bd68f77-fkqmr"] Feb 20 07:11:43 crc kubenswrapper[5094]: I0220 07:11:43.853299 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="530069d2-7146-46eb-9c88-056cc8a583b2" path="/var/lib/kubelet/pods/530069d2-7146-46eb-9c88-056cc8a583b2/volumes" Feb 20 07:11:46 crc kubenswrapper[5094]: E0220 07:11:46.924791 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:46 crc kubenswrapper[5094]: E0220 07:11:46.927093 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:46 crc kubenswrapper[5094]: E0220 07:11:46.927090 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: 
container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:46 crc kubenswrapper[5094]: E0220 07:11:46.927899 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:46 crc kubenswrapper[5094]: E0220 07:11:46.928052 5094 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tj42x" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovsdb-server" Feb 20 07:11:46 crc kubenswrapper[5094]: E0220 07:11:46.929861 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:46 crc kubenswrapper[5094]: E0220 07:11:46.933853 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:46 crc kubenswrapper[5094]: E0220 07:11:46.933932 5094 prober.go:104] "Probe 
errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tj42x" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovs-vswitchd" Feb 20 07:11:51 crc kubenswrapper[5094]: E0220 07:11:51.924924 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:51 crc kubenswrapper[5094]: E0220 07:11:51.926250 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:51 crc kubenswrapper[5094]: E0220 07:11:51.926444 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:51 crc kubenswrapper[5094]: E0220 07:11:51.926747 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" 
containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Feb 20 07:11:51 crc kubenswrapper[5094]: E0220 07:11:51.926804 5094 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-tj42x" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovsdb-server" Feb 20 07:11:51 crc kubenswrapper[5094]: E0220 07:11:51.927857 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:51 crc kubenswrapper[5094]: E0220 07:11:51.930355 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Feb 20 07:11:51 crc kubenswrapper[5094]: E0220 07:11:51.930393 5094 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-tj42x" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovs-vswitchd" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.533086 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tj42x_07969dc9-1a07-455c-b6c4-6b5f3bb23cb9/ovs-vswitchd/0.log" Feb 20 
07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.534393 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.697503 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-run\") pod \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.697640 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-run" (OuterVolumeSpecName: "var-run") pod "07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" (UID: "07969dc9-1a07-455c-b6c4-6b5f3bb23cb9"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.698073 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-lib\") pod \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.698125 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88866\" (UniqueName: \"kubernetes.io/projected/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-kube-api-access-88866\") pod \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.698165 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-etc-ovs\") pod \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " Feb 20 
07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.698221 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-log\") pod \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.698168 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-lib" (OuterVolumeSpecName: "var-lib") pod "07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" (UID: "07969dc9-1a07-455c-b6c4-6b5f3bb23cb9"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.698273 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" (UID: "07969dc9-1a07-455c-b6c4-6b5f3bb23cb9"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.698319 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-scripts\") pod \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\" (UID: \"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9\") " Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.698353 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-log" (OuterVolumeSpecName: "var-log") pod "07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" (UID: "07969dc9-1a07-455c-b6c4-6b5f3bb23cb9"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.699041 5094 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-etc-ovs\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.699089 5094 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-log\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.699106 5094 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-run\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.699118 5094 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-var-lib\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.699607 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-scripts" (OuterVolumeSpecName: "scripts") pod "07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" (UID: "07969dc9-1a07-455c-b6c4-6b5f3bb23cb9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.715192 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-kube-api-access-88866" (OuterVolumeSpecName: "kube-api-access-88866") pod "07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" (UID: "07969dc9-1a07-455c-b6c4-6b5f3bb23cb9"). InnerVolumeSpecName "kube-api-access-88866". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.799696 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88866\" (UniqueName: \"kubernetes.io/projected/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-kube-api-access-88866\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.799758 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.831652 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-tj42x_07969dc9-1a07-455c-b6c4-6b5f3bb23cb9/ovs-vswitchd/0.log" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.832407 5094 generic.go:334] "Generic (PLEG): container finished" podID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" exitCode=137 Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.832449 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tj42x" event={"ID":"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9","Type":"ContainerDied","Data":"381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a"} Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.832482 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-tj42x" event={"ID":"07969dc9-1a07-455c-b6c4-6b5f3bb23cb9","Type":"ContainerDied","Data":"6ed1fedc1eeb0edbe644ddd0c1faadaf13e34c06761ccbb5563c487421983aa5"} Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.832502 5094 scope.go:117] "RemoveContainer" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.832693 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-tj42x" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.902485 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-tj42x"] Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.923000 5094 scope.go:117] "RemoveContainer" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.929306 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-tj42x"] Feb 20 07:11:54 crc kubenswrapper[5094]: I0220 07:11:54.966356 5094 scope.go:117] "RemoveContainer" containerID="35c773a054d35ff2b9950832498531c8e991fc4664138ffa116e29aba081b5bf" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.001683 5094 scope.go:117] "RemoveContainer" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" Feb 20 07:11:55 crc kubenswrapper[5094]: E0220 07:11:55.002285 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a\": container with ID starting with 381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a not found: ID does not exist" containerID="381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.002331 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a"} err="failed to get container status \"381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a\": rpc error: code = NotFound desc = could not find container \"381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a\": container with ID starting with 381757d382be9fd923f828020faf621c1e6d620cfaa9e1c29ee52df45402996a not found: ID does not exist" Feb 20 07:11:55 
crc kubenswrapper[5094]: I0220 07:11:55.002363 5094 scope.go:117] "RemoveContainer" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" Feb 20 07:11:55 crc kubenswrapper[5094]: E0220 07:11:55.002851 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d\": container with ID starting with ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d not found: ID does not exist" containerID="ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.002915 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d"} err="failed to get container status \"ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d\": rpc error: code = NotFound desc = could not find container \"ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d\": container with ID starting with ef09497839c744577bc14c1d8d7142e728fadbbae043acf75d5515f40675b66d not found: ID does not exist" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.002949 5094 scope.go:117] "RemoveContainer" containerID="35c773a054d35ff2b9950832498531c8e991fc4664138ffa116e29aba081b5bf" Feb 20 07:11:55 crc kubenswrapper[5094]: E0220 07:11:55.003279 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35c773a054d35ff2b9950832498531c8e991fc4664138ffa116e29aba081b5bf\": container with ID starting with 35c773a054d35ff2b9950832498531c8e991fc4664138ffa116e29aba081b5bf not found: ID does not exist" containerID="35c773a054d35ff2b9950832498531c8e991fc4664138ffa116e29aba081b5bf" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.003307 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"35c773a054d35ff2b9950832498531c8e991fc4664138ffa116e29aba081b5bf"} err="failed to get container status \"35c773a054d35ff2b9950832498531c8e991fc4664138ffa116e29aba081b5bf\": rpc error: code = NotFound desc = could not find container \"35c773a054d35ff2b9950832498531c8e991fc4664138ffa116e29aba081b5bf\": container with ID starting with 35c773a054d35ff2b9950832498531c8e991fc4664138ffa116e29aba081b5bf not found: ID does not exist" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.287806 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.412301 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-combined-ca-bundle\") pod \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.412687 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-lock\") pod \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.412793 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.412865 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-cache\") pod \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " Feb 20 
07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.412918 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift\") pod \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.412959 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5g7x\" (UniqueName: \"kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-kube-api-access-n5g7x\") pod \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\" (UID: \"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3\") " Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.413616 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-lock" (OuterVolumeSpecName: "lock") pod "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" (UID: "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.413611 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-cache" (OuterVolumeSpecName: "cache") pod "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" (UID: "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.418243 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" (UID: "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.420950 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-kube-api-access-n5g7x" (OuterVolumeSpecName: "kube-api-access-n5g7x") pod "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" (UID: "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3"). InnerVolumeSpecName "kube-api-access-n5g7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.425827 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "swift") pod "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" (UID: "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.514236 5094 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-cache\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.514456 5094 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.514544 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5g7x\" (UniqueName: \"kubernetes.io/projected/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-kube-api-access-n5g7x\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.514600 5094 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-lock\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:55 crc 
kubenswrapper[5094]: I0220 07:11:55.514672 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.528497 5094 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.616733 5094 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.654743 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" (UID: "b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.717513 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.849974 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" path="/var/lib/kubelet/pods/07969dc9-1a07-455c-b6c4-6b5f3bb23cb9/volumes" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.850696 5094 generic.go:334] "Generic (PLEG): container finished" podID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerID="694f121dca03e8e25df5cfb69cf088e987e8cad432e283e4814356d5f0848991" exitCode=137 Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.851017 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.851151 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"694f121dca03e8e25df5cfb69cf088e987e8cad432e283e4814356d5f0848991"} Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.852252 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3","Type":"ContainerDied","Data":"b318a2984af52699dbcc87bf8935047ececfd11a736630826d02b012b12ef5e4"} Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.852309 5094 scope.go:117] "RemoveContainer" containerID="694f121dca03e8e25df5cfb69cf088e987e8cad432e283e4814356d5f0848991" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.889972 5094 scope.go:117] "RemoveContainer" containerID="d95b17f09f786d48faa3531b254622503fd28358bef08fa635117e335717ec76" Feb 20 07:11:55 crc 
kubenswrapper[5094]: I0220 07:11:55.910509 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.918560 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.928653 5094 scope.go:117] "RemoveContainer" containerID="ec9511e4e89788b48919a84e31eb85bf09ba8ef7f2fd3f486aacf3733fac08ce" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.964107 5094 scope.go:117] "RemoveContainer" containerID="997b5f8501f8553cfdec1b566ef6a822ad0fc847b09fa07489c9cc4bdaa9307c" Feb 20 07:11:55 crc kubenswrapper[5094]: I0220 07:11:55.980624 5094 scope.go:117] "RemoveContainer" containerID="c948d686dc04dce290ff45631530d5f46650b2f0c7f924c52cff39ae5c18105d" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.003127 5094 scope.go:117] "RemoveContainer" containerID="b29a2c6ccb9cf15778496c1427d8d36bd92976f1a511632fe77d63af46238ab2" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.021117 5094 scope.go:117] "RemoveContainer" containerID="6b2e14e49e3a0c6e84ab2a157f1e34b73c7517b9c172442d8359cdf3a71c1601" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.040460 5094 scope.go:117] "RemoveContainer" containerID="ecea14d9b3d89c636aa8573b40d00132c5867dc6539f43130d1e69748ccb2dd3" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.088800 5094 scope.go:117] "RemoveContainer" containerID="da22674644cab0e6f810eb224627362470513debe3726c5711ef469be1db004e" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.107506 5094 scope.go:117] "RemoveContainer" containerID="798e86534a684e476081f370e9263cf67a92cf7ab886af18bececee01f397813" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.124199 5094 scope.go:117] "RemoveContainer" containerID="13486685a923bf2693233900e7c25d8638639ebc4de0d1f20f27c486791e4f53" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.146269 5094 scope.go:117] "RemoveContainer" 
containerID="876ede354cfdccb940d2003cb9a673f9c21d14f390f0a6c414edee229821334b" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.168287 5094 scope.go:117] "RemoveContainer" containerID="87d22bfd10a9fea3abdf5984327defada78a003b41f7d6829b04b81bf140871a" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.191116 5094 scope.go:117] "RemoveContainer" containerID="4eef6c37c6d679f44709f4c17ae202933f7640edde1e1c08cde85643a4c659b9" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.210323 5094 scope.go:117] "RemoveContainer" containerID="77762b2b7b3381830ea1b52f38b5c149eed591367db41376905ffbc17a991c9c" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.230897 5094 scope.go:117] "RemoveContainer" containerID="694f121dca03e8e25df5cfb69cf088e987e8cad432e283e4814356d5f0848991" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.231563 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"694f121dca03e8e25df5cfb69cf088e987e8cad432e283e4814356d5f0848991\": container with ID starting with 694f121dca03e8e25df5cfb69cf088e987e8cad432e283e4814356d5f0848991 not found: ID does not exist" containerID="694f121dca03e8e25df5cfb69cf088e987e8cad432e283e4814356d5f0848991" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.231600 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"694f121dca03e8e25df5cfb69cf088e987e8cad432e283e4814356d5f0848991"} err="failed to get container status \"694f121dca03e8e25df5cfb69cf088e987e8cad432e283e4814356d5f0848991\": rpc error: code = NotFound desc = could not find container \"694f121dca03e8e25df5cfb69cf088e987e8cad432e283e4814356d5f0848991\": container with ID starting with 694f121dca03e8e25df5cfb69cf088e987e8cad432e283e4814356d5f0848991 not found: ID does not exist" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.231626 5094 scope.go:117] "RemoveContainer" 
containerID="d95b17f09f786d48faa3531b254622503fd28358bef08fa635117e335717ec76" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.232107 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d95b17f09f786d48faa3531b254622503fd28358bef08fa635117e335717ec76\": container with ID starting with d95b17f09f786d48faa3531b254622503fd28358bef08fa635117e335717ec76 not found: ID does not exist" containerID="d95b17f09f786d48faa3531b254622503fd28358bef08fa635117e335717ec76" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.232190 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d95b17f09f786d48faa3531b254622503fd28358bef08fa635117e335717ec76"} err="failed to get container status \"d95b17f09f786d48faa3531b254622503fd28358bef08fa635117e335717ec76\": rpc error: code = NotFound desc = could not find container \"d95b17f09f786d48faa3531b254622503fd28358bef08fa635117e335717ec76\": container with ID starting with d95b17f09f786d48faa3531b254622503fd28358bef08fa635117e335717ec76 not found: ID does not exist" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.232263 5094 scope.go:117] "RemoveContainer" containerID="ec9511e4e89788b48919a84e31eb85bf09ba8ef7f2fd3f486aacf3733fac08ce" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.232806 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec9511e4e89788b48919a84e31eb85bf09ba8ef7f2fd3f486aacf3733fac08ce\": container with ID starting with ec9511e4e89788b48919a84e31eb85bf09ba8ef7f2fd3f486aacf3733fac08ce not found: ID does not exist" containerID="ec9511e4e89788b48919a84e31eb85bf09ba8ef7f2fd3f486aacf3733fac08ce" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.233190 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ec9511e4e89788b48919a84e31eb85bf09ba8ef7f2fd3f486aacf3733fac08ce"} err="failed to get container status \"ec9511e4e89788b48919a84e31eb85bf09ba8ef7f2fd3f486aacf3733fac08ce\": rpc error: code = NotFound desc = could not find container \"ec9511e4e89788b48919a84e31eb85bf09ba8ef7f2fd3f486aacf3733fac08ce\": container with ID starting with ec9511e4e89788b48919a84e31eb85bf09ba8ef7f2fd3f486aacf3733fac08ce not found: ID does not exist" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.233257 5094 scope.go:117] "RemoveContainer" containerID="997b5f8501f8553cfdec1b566ef6a822ad0fc847b09fa07489c9cc4bdaa9307c" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.234160 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"997b5f8501f8553cfdec1b566ef6a822ad0fc847b09fa07489c9cc4bdaa9307c\": container with ID starting with 997b5f8501f8553cfdec1b566ef6a822ad0fc847b09fa07489c9cc4bdaa9307c not found: ID does not exist" containerID="997b5f8501f8553cfdec1b566ef6a822ad0fc847b09fa07489c9cc4bdaa9307c" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.234251 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"997b5f8501f8553cfdec1b566ef6a822ad0fc847b09fa07489c9cc4bdaa9307c"} err="failed to get container status \"997b5f8501f8553cfdec1b566ef6a822ad0fc847b09fa07489c9cc4bdaa9307c\": rpc error: code = NotFound desc = could not find container \"997b5f8501f8553cfdec1b566ef6a822ad0fc847b09fa07489c9cc4bdaa9307c\": container with ID starting with 997b5f8501f8553cfdec1b566ef6a822ad0fc847b09fa07489c9cc4bdaa9307c not found: ID does not exist" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.234315 5094 scope.go:117] "RemoveContainer" containerID="c948d686dc04dce290ff45631530d5f46650b2f0c7f924c52cff39ae5c18105d" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.234717 5094 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c948d686dc04dce290ff45631530d5f46650b2f0c7f924c52cff39ae5c18105d\": container with ID starting with c948d686dc04dce290ff45631530d5f46650b2f0c7f924c52cff39ae5c18105d not found: ID does not exist" containerID="c948d686dc04dce290ff45631530d5f46650b2f0c7f924c52cff39ae5c18105d" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.234745 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c948d686dc04dce290ff45631530d5f46650b2f0c7f924c52cff39ae5c18105d"} err="failed to get container status \"c948d686dc04dce290ff45631530d5f46650b2f0c7f924c52cff39ae5c18105d\": rpc error: code = NotFound desc = could not find container \"c948d686dc04dce290ff45631530d5f46650b2f0c7f924c52cff39ae5c18105d\": container with ID starting with c948d686dc04dce290ff45631530d5f46650b2f0c7f924c52cff39ae5c18105d not found: ID does not exist" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.234764 5094 scope.go:117] "RemoveContainer" containerID="b29a2c6ccb9cf15778496c1427d8d36bd92976f1a511632fe77d63af46238ab2" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.235067 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b29a2c6ccb9cf15778496c1427d8d36bd92976f1a511632fe77d63af46238ab2\": container with ID starting with b29a2c6ccb9cf15778496c1427d8d36bd92976f1a511632fe77d63af46238ab2 not found: ID does not exist" containerID="b29a2c6ccb9cf15778496c1427d8d36bd92976f1a511632fe77d63af46238ab2" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.235116 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b29a2c6ccb9cf15778496c1427d8d36bd92976f1a511632fe77d63af46238ab2"} err="failed to get container status \"b29a2c6ccb9cf15778496c1427d8d36bd92976f1a511632fe77d63af46238ab2\": rpc error: code = NotFound desc = could not find container 
\"b29a2c6ccb9cf15778496c1427d8d36bd92976f1a511632fe77d63af46238ab2\": container with ID starting with b29a2c6ccb9cf15778496c1427d8d36bd92976f1a511632fe77d63af46238ab2 not found: ID does not exist" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.235144 5094 scope.go:117] "RemoveContainer" containerID="6b2e14e49e3a0c6e84ab2a157f1e34b73c7517b9c172442d8359cdf3a71c1601" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.235622 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b2e14e49e3a0c6e84ab2a157f1e34b73c7517b9c172442d8359cdf3a71c1601\": container with ID starting with 6b2e14e49e3a0c6e84ab2a157f1e34b73c7517b9c172442d8359cdf3a71c1601 not found: ID does not exist" containerID="6b2e14e49e3a0c6e84ab2a157f1e34b73c7517b9c172442d8359cdf3a71c1601" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.235645 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b2e14e49e3a0c6e84ab2a157f1e34b73c7517b9c172442d8359cdf3a71c1601"} err="failed to get container status \"6b2e14e49e3a0c6e84ab2a157f1e34b73c7517b9c172442d8359cdf3a71c1601\": rpc error: code = NotFound desc = could not find container \"6b2e14e49e3a0c6e84ab2a157f1e34b73c7517b9c172442d8359cdf3a71c1601\": container with ID starting with 6b2e14e49e3a0c6e84ab2a157f1e34b73c7517b9c172442d8359cdf3a71c1601 not found: ID does not exist" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.235657 5094 scope.go:117] "RemoveContainer" containerID="ecea14d9b3d89c636aa8573b40d00132c5867dc6539f43130d1e69748ccb2dd3" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.235937 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecea14d9b3d89c636aa8573b40d00132c5867dc6539f43130d1e69748ccb2dd3\": container with ID starting with ecea14d9b3d89c636aa8573b40d00132c5867dc6539f43130d1e69748ccb2dd3 not found: ID does not exist" 
containerID="ecea14d9b3d89c636aa8573b40d00132c5867dc6539f43130d1e69748ccb2dd3" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.236009 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecea14d9b3d89c636aa8573b40d00132c5867dc6539f43130d1e69748ccb2dd3"} err="failed to get container status \"ecea14d9b3d89c636aa8573b40d00132c5867dc6539f43130d1e69748ccb2dd3\": rpc error: code = NotFound desc = could not find container \"ecea14d9b3d89c636aa8573b40d00132c5867dc6539f43130d1e69748ccb2dd3\": container with ID starting with ecea14d9b3d89c636aa8573b40d00132c5867dc6539f43130d1e69748ccb2dd3 not found: ID does not exist" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.236076 5094 scope.go:117] "RemoveContainer" containerID="da22674644cab0e6f810eb224627362470513debe3726c5711ef469be1db004e" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.236654 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da22674644cab0e6f810eb224627362470513debe3726c5711ef469be1db004e\": container with ID starting with da22674644cab0e6f810eb224627362470513debe3726c5711ef469be1db004e not found: ID does not exist" containerID="da22674644cab0e6f810eb224627362470513debe3726c5711ef469be1db004e" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.236738 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da22674644cab0e6f810eb224627362470513debe3726c5711ef469be1db004e"} err="failed to get container status \"da22674644cab0e6f810eb224627362470513debe3726c5711ef469be1db004e\": rpc error: code = NotFound desc = could not find container \"da22674644cab0e6f810eb224627362470513debe3726c5711ef469be1db004e\": container with ID starting with da22674644cab0e6f810eb224627362470513debe3726c5711ef469be1db004e not found: ID does not exist" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.236797 5094 scope.go:117] 
"RemoveContainer" containerID="798e86534a684e476081f370e9263cf67a92cf7ab886af18bececee01f397813" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.237149 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"798e86534a684e476081f370e9263cf67a92cf7ab886af18bececee01f397813\": container with ID starting with 798e86534a684e476081f370e9263cf67a92cf7ab886af18bececee01f397813 not found: ID does not exist" containerID="798e86534a684e476081f370e9263cf67a92cf7ab886af18bececee01f397813" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.237199 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"798e86534a684e476081f370e9263cf67a92cf7ab886af18bececee01f397813"} err="failed to get container status \"798e86534a684e476081f370e9263cf67a92cf7ab886af18bececee01f397813\": rpc error: code = NotFound desc = could not find container \"798e86534a684e476081f370e9263cf67a92cf7ab886af18bececee01f397813\": container with ID starting with 798e86534a684e476081f370e9263cf67a92cf7ab886af18bececee01f397813 not found: ID does not exist" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.237229 5094 scope.go:117] "RemoveContainer" containerID="13486685a923bf2693233900e7c25d8638639ebc4de0d1f20f27c486791e4f53" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.237569 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13486685a923bf2693233900e7c25d8638639ebc4de0d1f20f27c486791e4f53\": container with ID starting with 13486685a923bf2693233900e7c25d8638639ebc4de0d1f20f27c486791e4f53 not found: ID does not exist" containerID="13486685a923bf2693233900e7c25d8638639ebc4de0d1f20f27c486791e4f53" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.237595 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"13486685a923bf2693233900e7c25d8638639ebc4de0d1f20f27c486791e4f53"} err="failed to get container status \"13486685a923bf2693233900e7c25d8638639ebc4de0d1f20f27c486791e4f53\": rpc error: code = NotFound desc = could not find container \"13486685a923bf2693233900e7c25d8638639ebc4de0d1f20f27c486791e4f53\": container with ID starting with 13486685a923bf2693233900e7c25d8638639ebc4de0d1f20f27c486791e4f53 not found: ID does not exist" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.237613 5094 scope.go:117] "RemoveContainer" containerID="876ede354cfdccb940d2003cb9a673f9c21d14f390f0a6c414edee229821334b" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.237971 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"876ede354cfdccb940d2003cb9a673f9c21d14f390f0a6c414edee229821334b\": container with ID starting with 876ede354cfdccb940d2003cb9a673f9c21d14f390f0a6c414edee229821334b not found: ID does not exist" containerID="876ede354cfdccb940d2003cb9a673f9c21d14f390f0a6c414edee229821334b" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.238016 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"876ede354cfdccb940d2003cb9a673f9c21d14f390f0a6c414edee229821334b"} err="failed to get container status \"876ede354cfdccb940d2003cb9a673f9c21d14f390f0a6c414edee229821334b\": rpc error: code = NotFound desc = could not find container \"876ede354cfdccb940d2003cb9a673f9c21d14f390f0a6c414edee229821334b\": container with ID starting with 876ede354cfdccb940d2003cb9a673f9c21d14f390f0a6c414edee229821334b not found: ID does not exist" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.238046 5094 scope.go:117] "RemoveContainer" containerID="87d22bfd10a9fea3abdf5984327defada78a003b41f7d6829b04b81bf140871a" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.238377 5094 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"87d22bfd10a9fea3abdf5984327defada78a003b41f7d6829b04b81bf140871a\": container with ID starting with 87d22bfd10a9fea3abdf5984327defada78a003b41f7d6829b04b81bf140871a not found: ID does not exist" containerID="87d22bfd10a9fea3abdf5984327defada78a003b41f7d6829b04b81bf140871a" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.238405 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87d22bfd10a9fea3abdf5984327defada78a003b41f7d6829b04b81bf140871a"} err="failed to get container status \"87d22bfd10a9fea3abdf5984327defada78a003b41f7d6829b04b81bf140871a\": rpc error: code = NotFound desc = could not find container \"87d22bfd10a9fea3abdf5984327defada78a003b41f7d6829b04b81bf140871a\": container with ID starting with 87d22bfd10a9fea3abdf5984327defada78a003b41f7d6829b04b81bf140871a not found: ID does not exist" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.238420 5094 scope.go:117] "RemoveContainer" containerID="4eef6c37c6d679f44709f4c17ae202933f7640edde1e1c08cde85643a4c659b9" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.238903 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4eef6c37c6d679f44709f4c17ae202933f7640edde1e1c08cde85643a4c659b9\": container with ID starting with 4eef6c37c6d679f44709f4c17ae202933f7640edde1e1c08cde85643a4c659b9 not found: ID does not exist" containerID="4eef6c37c6d679f44709f4c17ae202933f7640edde1e1c08cde85643a4c659b9" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.238930 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eef6c37c6d679f44709f4c17ae202933f7640edde1e1c08cde85643a4c659b9"} err="failed to get container status \"4eef6c37c6d679f44709f4c17ae202933f7640edde1e1c08cde85643a4c659b9\": rpc error: code = NotFound desc = could not find container 
\"4eef6c37c6d679f44709f4c17ae202933f7640edde1e1c08cde85643a4c659b9\": container with ID starting with 4eef6c37c6d679f44709f4c17ae202933f7640edde1e1c08cde85643a4c659b9 not found: ID does not exist" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.238964 5094 scope.go:117] "RemoveContainer" containerID="77762b2b7b3381830ea1b52f38b5c149eed591367db41376905ffbc17a991c9c" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.239231 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77762b2b7b3381830ea1b52f38b5c149eed591367db41376905ffbc17a991c9c\": container with ID starting with 77762b2b7b3381830ea1b52f38b5c149eed591367db41376905ffbc17a991c9c not found: ID does not exist" containerID="77762b2b7b3381830ea1b52f38b5c149eed591367db41376905ffbc17a991c9c" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.239281 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77762b2b7b3381830ea1b52f38b5c149eed591367db41376905ffbc17a991c9c"} err="failed to get container status \"77762b2b7b3381830ea1b52f38b5c149eed591367db41376905ffbc17a991c9c\": rpc error: code = NotFound desc = could not find container \"77762b2b7b3381830ea1b52f38b5c149eed591367db41376905ffbc17a991c9c\": container with ID starting with 77762b2b7b3381830ea1b52f38b5c149eed591367db41376905ffbc17a991c9c not found: ID does not exist" Feb 20 07:11:56 crc kubenswrapper[5094]: I0220 07:11:56.840220 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:11:56 crc kubenswrapper[5094]: E0220 07:11:56.840942 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:11:57 crc kubenswrapper[5094]: I0220 07:11:57.849246 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" path="/var/lib/kubelet/pods/b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3/volumes" Feb 20 07:12:08 crc kubenswrapper[5094]: I0220 07:12:08.841014 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:12:08 crc kubenswrapper[5094]: E0220 07:12:08.841834 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:12:22 crc kubenswrapper[5094]: I0220 07:12:22.840560 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:12:22 crc kubenswrapper[5094]: E0220 07:12:22.841854 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:12:36 crc kubenswrapper[5094]: I0220 07:12:36.840664 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:12:36 crc kubenswrapper[5094]: E0220 07:12:36.841999 5094 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:12:47 crc kubenswrapper[5094]: I0220 07:12:47.839936 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:12:47 crc kubenswrapper[5094]: E0220 07:12:47.841024 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:12:58 crc kubenswrapper[5094]: I0220 07:12:58.840198 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:12:58 crc kubenswrapper[5094]: E0220 07:12:58.841860 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:13:12 crc kubenswrapper[5094]: I0220 07:13:12.841484 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:13:12 crc kubenswrapper[5094]: E0220 07:13:12.844248 5094 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:13:27 crc kubenswrapper[5094]: I0220 07:13:27.841034 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:13:27 crc kubenswrapper[5094]: E0220 07:13:27.842447 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:13:31 crc kubenswrapper[5094]: I0220 07:13:31.716890 5094 scope.go:117] "RemoveContainer" containerID="d9a07c98406e23d72c5a2bc3d04e8964b30bc89dab757f6e64abbd3de62c1272" Feb 20 07:13:31 crc kubenswrapper[5094]: I0220 07:13:31.761574 5094 scope.go:117] "RemoveContainer" containerID="378b26e1e0650ae576632665d611910465c17369e442435b9765cd97f7bbf4b7" Feb 20 07:13:31 crc kubenswrapper[5094]: I0220 07:13:31.799572 5094 scope.go:117] "RemoveContainer" containerID="627e60d9c9677f6a945a638d36a6e55c653ad3df560faf93d76ad3fae2820334" Feb 20 07:13:31 crc kubenswrapper[5094]: I0220 07:13:31.829819 5094 scope.go:117] "RemoveContainer" containerID="7c8241aa612d986c2efd3e576b0082b1361568858d2a7098f35d783948c494f3" Feb 20 07:13:31 crc kubenswrapper[5094]: I0220 07:13:31.867131 5094 scope.go:117] "RemoveContainer" containerID="b30b7402d8eb93fe27dd4eeb5df1c58c1d66056e0ef8f55ed4b6d91fb78c16c7" Feb 20 07:13:31 crc 
kubenswrapper[5094]: I0220 07:13:31.933528 5094 scope.go:117] "RemoveContainer" containerID="75b382b7cacb24f534d4ec56b90d8492c32e928df430e4a3f769babf549e4aa5" Feb 20 07:13:31 crc kubenswrapper[5094]: I0220 07:13:31.957945 5094 scope.go:117] "RemoveContainer" containerID="3f7c742aada7ed25bb815042dbf9602749ab67ca6990bc8a7e5d047b638c6680" Feb 20 07:13:31 crc kubenswrapper[5094]: I0220 07:13:31.998213 5094 scope.go:117] "RemoveContainer" containerID="cef36671b09afd9d82ea9087a220ba378848b8caf63bdb34c5ff82372929ee6f" Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.039445 5094 scope.go:117] "RemoveContainer" containerID="aae8dde1864d6aaeb77909eb96b990c39e162c1e7c99e123f9bcf832ed144feb" Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.071139 5094 scope.go:117] "RemoveContainer" containerID="59cc73fd6408558710efa6324658cf301b0cc15eed3c78c0c37707c5d008b54e" Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.103881 5094 scope.go:117] "RemoveContainer" containerID="faa3a4c7f983e78d891220a937038d927c9ef5f317cdc8774b24154cf98f3466" Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.165121 5094 scope.go:117] "RemoveContainer" containerID="b09eaa7d98442981b4e7ce37eedd93a2e3a6cd66a6970eb460a847a861e69caa" Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.191088 5094 scope.go:117] "RemoveContainer" containerID="c8a3057121d16618bfdbd39860a04679adcd72a905e063ca9af153f1f199e6f2" Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.228213 5094 scope.go:117] "RemoveContainer" containerID="bec04cc90bea00c99a6cbeb35313f8c3b75e668698426b61e7cbceb44b686553" Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.296747 5094 scope.go:117] "RemoveContainer" containerID="484f3fb839183cec10038487f86ef12f28aad48e989d27e0f371b4836997c9c1" Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.323499 5094 scope.go:117] "RemoveContainer" containerID="b37e6501ae2ac6b9c9a4901b1e8b894900b9c70d4214f260a2bb15e75fba5205" Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 
07:13:32.351255 5094 scope.go:117] "RemoveContainer" containerID="a9068c6d92101548d5c981bed911d9bb32d305a4eac5c47442863a89e8e65fe9" Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.384072 5094 scope.go:117] "RemoveContainer" containerID="50b02908599fab0b56ac49b8dfc4de2ac6a680f5927a195974880e894fd05f07" Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.416334 5094 scope.go:117] "RemoveContainer" containerID="0ce2ed20e689517e7e40aa9a1d41aa5a5a7a220e9e24137c5b2fd77c89b217f6" Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.455510 5094 scope.go:117] "RemoveContainer" containerID="5fbdea48cb9017b90d8f206860f008a7a92776227fe74ba390e642bcf9bceabc" Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.477953 5094 scope.go:117] "RemoveContainer" containerID="f18af370a6085a561175edbcf861dea18a9b31989555b80d968ac0b83530959d" Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.514846 5094 scope.go:117] "RemoveContainer" containerID="2a24e00dad1ec7884b816153f26b2812e819c54c4c8093cf48001992ec89df96" Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.542668 5094 scope.go:117] "RemoveContainer" containerID="12ead5ac0eb56e9815a9bc9d890ea3167a7072689f354e4d8ffab5e114525ab7" Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.596047 5094 scope.go:117] "RemoveContainer" containerID="652be6b8a19337e61493b053c834d7ff10c694e4819ba0dbe010139f1aba0575" Feb 20 07:13:32 crc kubenswrapper[5094]: I0220 07:13:32.621559 5094 scope.go:117] "RemoveContainer" containerID="ba5029a86f52015ae26ae9c4af241df191f71a5df81010d7bab393d3d450c913" Feb 20 07:13:38 crc kubenswrapper[5094]: I0220 07:13:38.841176 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:13:38 crc kubenswrapper[5094]: E0220 07:13:38.842089 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:13:52 crc kubenswrapper[5094]: I0220 07:13:52.841414 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:13:52 crc kubenswrapper[5094]: E0220 07:13:52.842329 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:14:06 crc kubenswrapper[5094]: I0220 07:14:06.841070 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:14:06 crc kubenswrapper[5094]: E0220 07:14:06.842621 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:14:21 crc kubenswrapper[5094]: I0220 07:14:21.840449 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:14:21 crc kubenswrapper[5094]: E0220 07:14:21.841077 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:14:33 crc kubenswrapper[5094]: I0220 07:14:33.126406 5094 scope.go:117] "RemoveContainer" containerID="8e52dfc26f0d8eee3b5547850bbe5713b909b7c78a2c9a2bf1d9cded5250f6d8" Feb 20 07:14:33 crc kubenswrapper[5094]: I0220 07:14:33.184830 5094 scope.go:117] "RemoveContainer" containerID="8c0a9ed7e7a708519d24ca670cb89abe814680e00b43aecd0756535bb04fcacc" Feb 20 07:14:33 crc kubenswrapper[5094]: I0220 07:14:33.222370 5094 scope.go:117] "RemoveContainer" containerID="2e821859400e46f67b2a4c22980c8831875df8a723826e86fdac889d47a73a82" Feb 20 07:14:33 crc kubenswrapper[5094]: I0220 07:14:33.292167 5094 scope.go:117] "RemoveContainer" containerID="d8a84b3a4264179ed1969ed49bb2b66b1b9094412f8430af00f5618a35872170" Feb 20 07:14:33 crc kubenswrapper[5094]: I0220 07:14:33.840867 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:14:33 crc kubenswrapper[5094]: E0220 07:14:33.841287 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:14:46 crc kubenswrapper[5094]: I0220 07:14:46.840936 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:14:46 crc kubenswrapper[5094]: E0220 07:14:46.842325 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.166139 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs"] Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.167807 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0330a367-c0c9-42a9-9993-1a3b6775fd3b" containerName="mariadb-account-create-update" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.167832 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0330a367-c0c9-42a9-9993-1a3b6775fd3b" containerName="mariadb-account-create-update" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.167859 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a829c6b3-7069-4544-90dc-40ae83aba524" containerName="rabbitmq" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.167872 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a829c6b3-7069-4544-90dc-40ae83aba524" containerName="rabbitmq" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.167885 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="proxy-httpd" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.167898 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="proxy-httpd" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.167920 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92877559-6960-4dbf-890a-fb563f4b0bf8" containerName="barbican-worker" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.167933 5094 
state_mem.go:107] "Deleted CPUSet assignment" podUID="92877559-6960-4dbf-890a-fb563f4b0bf8" containerName="barbican-worker" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.167959 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d3ab399-3fc6-47e1-995c-5e855c554e9e" containerName="galera" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.167971 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d3ab399-3fc6-47e1-995c-5e855c554e9e" containerName="galera" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.167987 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d01cbaa4-5543-4cd5-b098-7e4600819d32" containerName="cinder-scheduler" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.167999 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d01cbaa4-5543-4cd5-b098-7e4600819d32" containerName="cinder-scheduler" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168021 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3caa33a-a0ec-4fdc-876b-266724a5af50" containerName="nova-cell1-conductor-conductor" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168033 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3caa33a-a0ec-4fdc-876b-266724a5af50" containerName="nova-cell1-conductor-conductor" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168047 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f8cb333-2939-4404-b242-67bcf4e6875b" containerName="cinder-api-log" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168059 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f8cb333-2939-4404-b242-67bcf4e6875b" containerName="cinder-api-log" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168072 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-auditor" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168084 5094 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-auditor" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168104 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="sg-core" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168119 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="sg-core" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168138 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-updater" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168151 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-updater" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168169 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="762a565c-672e-4127-a8c6-90f721eeda81" containerName="glance-httpd" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168182 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="762a565c-672e-4127-a8c6-90f721eeda81" containerName="glance-httpd" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168198 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="ceilometer-central-agent" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168210 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="ceilometer-central-agent" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168232 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11aa87b-3964-4a62-871f-bdf7d1ad7848" containerName="nova-metadata-metadata" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168245 5094 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f11aa87b-3964-4a62-871f-bdf7d1ad7848" containerName="nova-metadata-metadata" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168266 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd92d75e-9882-4bb7-a41e-cab9777424e8" containerName="ovn-northd" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168278 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd92d75e-9882-4bb7-a41e-cab9777424e8" containerName="ovn-northd" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168314 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="219c74d6-9f45-4bf8-8c67-acdea3c0fab3" containerName="rabbitmq" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168326 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="219c74d6-9f45-4bf8-8c67-acdea3c0fab3" containerName="rabbitmq" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168349 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-server" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168364 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-server" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168386 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="908e2706-d24f-41c9-b481-4c0d5415c5ca" containerName="barbican-api-log" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168399 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="908e2706-d24f-41c9-b481-4c0d5415c5ca" containerName="barbican-api-log" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168424 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1504790-ccaf-42d5-a28a-a25f0cd353c9" containerName="nova-scheduler-scheduler" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168437 5094 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="b1504790-ccaf-42d5-a28a-a25f0cd353c9" containerName="nova-scheduler-scheduler" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168451 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovsdb-server" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168463 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovsdb-server" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168476 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="rsync" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168487 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="rsync" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168509 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92877559-6960-4dbf-890a-fb563f4b0bf8" containerName="barbican-worker-log" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168521 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="92877559-6960-4dbf-890a-fb563f4b0bf8" containerName="barbican-worker-log" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168542 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca3adffb-7baf-45db-ab16-cc1c63510fec" containerName="nova-api-log" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168554 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca3adffb-7baf-45db-ab16-cc1c63510fec" containerName="nova-api-log" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168569 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-replicator" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168581 5094 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-replicator" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168597 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-updater" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168609 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-updater" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168623 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dd0ff85-ae3a-4035-a096-fea5952b19a7" containerName="memcached" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168634 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd0ff85-ae3a-4035-a096-fea5952b19a7" containerName="memcached" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168652 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovs-vswitchd" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168665 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovs-vswitchd" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168678 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d9f1f40-92cc-4f19-9f3b-49651f56bffb" containerName="barbican-keystone-listener" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168691 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d9f1f40-92cc-4f19-9f3b-49651f56bffb" containerName="barbican-keystone-listener" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168732 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="219c74d6-9f45-4bf8-8c67-acdea3c0fab3" containerName="setup-container" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168757 5094 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="219c74d6-9f45-4bf8-8c67-acdea3c0fab3" containerName="setup-container" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168779 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-auditor" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168793 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-auditor" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168818 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="530069d2-7146-46eb-9c88-056cc8a583b2" containerName="neutron-api" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168829 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="530069d2-7146-46eb-9c88-056cc8a583b2" containerName="neutron-api" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168852 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd92d75e-9882-4bb7-a41e-cab9777424e8" containerName="openstack-network-exporter" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168865 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd92d75e-9882-4bb7-a41e-cab9777424e8" containerName="openstack-network-exporter" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168879 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" containerName="glance-log" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168891 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" containerName="glance-log" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168907 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" containerName="glance-httpd" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168919 5094 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" containerName="glance-httpd" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168936 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="762a565c-672e-4127-a8c6-90f721eeda81" containerName="glance-log" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168951 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="762a565c-672e-4127-a8c6-90f721eeda81" containerName="glance-log" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.168967 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-server" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.168980 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-server" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169000 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovsdb-server-init" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169013 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovsdb-server-init" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169034 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d3ab399-3fc6-47e1-995c-5e855c554e9e" containerName="mysql-bootstrap" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169046 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d3ab399-3fc6-47e1-995c-5e855c554e9e" containerName="mysql-bootstrap" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169060 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca3adffb-7baf-45db-ab16-cc1c63510fec" containerName="nova-api-api" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169073 5094 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ca3adffb-7baf-45db-ab16-cc1c63510fec" containerName="nova-api-api" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169095 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d01cbaa4-5543-4cd5-b098-7e4600819d32" containerName="probe" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169106 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d01cbaa4-5543-4cd5-b098-7e4600819d32" containerName="probe" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169121 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d9f1f40-92cc-4f19-9f3b-49651f56bffb" containerName="barbican-keystone-listener-log" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169134 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d9f1f40-92cc-4f19-9f3b-49651f56bffb" containerName="barbican-keystone-listener-log" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169149 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="ceilometer-notification-agent" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169165 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="ceilometer-notification-agent" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169185 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f8cb333-2939-4404-b242-67bcf4e6875b" containerName="cinder-api" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169197 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f8cb333-2939-4404-b242-67bcf4e6875b" containerName="cinder-api" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169214 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a829c6b3-7069-4544-90dc-40ae83aba524" containerName="setup-container" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169226 5094 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="a829c6b3-7069-4544-90dc-40ae83aba524" containerName="setup-container" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169245 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-replicator" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169258 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-replicator" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169274 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="swift-recon-cron" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169286 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="swift-recon-cron" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169317 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-server" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169329 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-server" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169343 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe9db54-4204-4335-a272-c469e0923478" containerName="kube-state-metrics" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169356 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe9db54-4204-4335-a272-c469e0923478" containerName="kube-state-metrics" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169377 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-reaper" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169390 5094 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-reaper" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169403 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5bbb9ad-deeb-495f-9750-f7012c00061d" containerName="nova-cell0-conductor-conductor" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169416 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5bbb9ad-deeb-495f-9750-f7012c00061d" containerName="nova-cell0-conductor-conductor" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169432 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-auditor" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169444 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-auditor" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169458 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="732b4015-53b2-4422-b7d1-12b65f6e0c92" containerName="keystone-api" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169470 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="732b4015-53b2-4422-b7d1-12b65f6e0c92" containerName="keystone-api" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169489 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-expirer" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169501 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-expirer" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169520 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-replicator" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169532 5094 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-replicator" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169544 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11aa87b-3964-4a62-871f-bdf7d1ad7848" containerName="nova-metadata-log" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169560 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11aa87b-3964-4a62-871f-bdf7d1ad7848" containerName="nova-metadata-log" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169578 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="530069d2-7146-46eb-9c88-056cc8a583b2" containerName="neutron-httpd" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169590 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="530069d2-7146-46eb-9c88-056cc8a583b2" containerName="neutron-httpd" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.169611 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="908e2706-d24f-41c9-b481-4c0d5415c5ca" containerName="barbican-api" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169623 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="908e2706-d24f-41c9-b481-4c0d5415c5ca" containerName="barbican-api" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169901 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="908e2706-d24f-41c9-b481-4c0d5415c5ca" containerName="barbican-api" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169920 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovs-vswitchd" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169933 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f8cb333-2939-4404-b242-67bcf4e6875b" containerName="cinder-api-log" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169946 5094 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="0330a367-c0c9-42a9-9993-1a3b6775fd3b" containerName="mariadb-account-create-update" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169965 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" containerName="glance-log" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169982 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="92877559-6960-4dbf-890a-fb563f4b0bf8" containerName="barbican-worker-log" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.169998 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a829c6b3-7069-4544-90dc-40ae83aba524" containerName="rabbitmq" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170014 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d9f1f40-92cc-4f19-9f3b-49651f56bffb" containerName="barbican-keystone-listener-log" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170030 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca3adffb-7baf-45db-ab16-cc1c63510fec" containerName="nova-api-api" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170048 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-updater" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170067 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="762a565c-672e-4127-a8c6-90f721eeda81" containerName="glance-log" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170082 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-server" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170098 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="530069d2-7146-46eb-9c88-056cc8a583b2" containerName="neutron-api" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 
07:15:00.170115 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd92d75e-9882-4bb7-a41e-cab9777424e8" containerName="openstack-network-exporter" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170131 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="rsync" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170150 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf1f9ca7-ba3b-47c1-bb1f-a3280135c4e4" containerName="glance-httpd" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170164 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="219c74d6-9f45-4bf8-8c67-acdea3c0fab3" containerName="rabbitmq" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170177 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="762a565c-672e-4127-a8c6-90f721eeda81" containerName="glance-httpd" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170194 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="07969dc9-1a07-455c-b6c4-6b5f3bb23cb9" containerName="ovsdb-server" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170211 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-server" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170230 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-replicator" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170246 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="proxy-httpd" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170269 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-auditor" Feb 20 07:15:00 crc 
kubenswrapper[5094]: I0220 07:15:00.170291 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="0330a367-c0c9-42a9-9993-1a3b6775fd3b" containerName="mariadb-account-create-update" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170312 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="732b4015-53b2-4422-b7d1-12b65f6e0c92" containerName="keystone-api" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170326 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f8cb333-2939-4404-b242-67bcf4e6875b" containerName="cinder-api" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170345 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3caa33a-a0ec-4fdc-876b-266724a5af50" containerName="nova-cell1-conductor-conductor" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170359 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd92d75e-9882-4bb7-a41e-cab9777424e8" containerName="ovn-northd" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170376 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-auditor" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170391 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-server" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170417 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="swift-recon-cron" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170432 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-replicator" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170450 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="908e2706-d24f-41c9-b481-4c0d5415c5ca" 
containerName="barbican-api-log" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170463 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="ceilometer-central-agent" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170477 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="sg-core" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170493 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d9f1f40-92cc-4f19-9f3b-49651f56bffb" containerName="barbican-keystone-listener" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170505 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="530069d2-7146-46eb-9c88-056cc8a583b2" containerName="neutron-httpd" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170521 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5bbb9ad-deeb-495f-9750-f7012c00061d" containerName="nova-cell0-conductor-conductor" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170534 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="92877559-6960-4dbf-890a-fb563f4b0bf8" containerName="barbican-worker" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170550 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d01cbaa4-5543-4cd5-b098-7e4600819d32" containerName="cinder-scheduler" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170567 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-expirer" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170584 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dd0ff85-ae3a-4035-a096-fea5952b19a7" containerName="memcached" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170598 5094 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="3d3ab399-3fc6-47e1-995c-5e855c554e9e" containerName="galera" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170613 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="container-replicator" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170626 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1504790-ccaf-42d5-a28a-a25f0cd353c9" containerName="nova-scheduler-scheduler" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170645 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="1218d679-0e51-4bef-9526-db16c8783d8b" containerName="ceilometer-notification-agent" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170664 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f11aa87b-3964-4a62-871f-bdf7d1ad7848" containerName="nova-metadata-log" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170678 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-auditor" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170697 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="object-updater" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170743 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d01cbaa4-5543-4cd5-b098-7e4600819d32" containerName="probe" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170760 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe9db54-4204-4335-a272-c469e0923478" containerName="kube-state-metrics" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170779 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f11aa87b-3964-4a62-871f-bdf7d1ad7848" containerName="nova-metadata-metadata" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170800 5094 
memory_manager.go:354] "RemoveStaleState removing state" podUID="ca3adffb-7baf-45db-ab16-cc1c63510fec" containerName="nova-api-log" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.170839 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cb140a-6a34-42b3-ac3f-6ce5e03a42c3" containerName="account-reaper" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.171635 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.175167 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.176461 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.189336 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs"] Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.286226 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lrb4\" (UniqueName: \"kubernetes.io/projected/74085586-b345-46e6-9367-d3b5243312a4-kube-api-access-5lrb4\") pod \"collect-profiles-29526195-mm4vs\" (UID: \"74085586-b345-46e6-9367-d3b5243312a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.286477 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74085586-b345-46e6-9367-d3b5243312a4-config-volume\") pod \"collect-profiles-29526195-mm4vs\" (UID: \"74085586-b345-46e6-9367-d3b5243312a4\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.286580 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74085586-b345-46e6-9367-d3b5243312a4-secret-volume\") pod \"collect-profiles-29526195-mm4vs\" (UID: \"74085586-b345-46e6-9367-d3b5243312a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.387956 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74085586-b345-46e6-9367-d3b5243312a4-secret-volume\") pod \"collect-profiles-29526195-mm4vs\" (UID: \"74085586-b345-46e6-9367-d3b5243312a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.388094 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lrb4\" (UniqueName: \"kubernetes.io/projected/74085586-b345-46e6-9367-d3b5243312a4-kube-api-access-5lrb4\") pod \"collect-profiles-29526195-mm4vs\" (UID: \"74085586-b345-46e6-9367-d3b5243312a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.388146 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74085586-b345-46e6-9367-d3b5243312a4-config-volume\") pod \"collect-profiles-29526195-mm4vs\" (UID: \"74085586-b345-46e6-9367-d3b5243312a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.389100 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/74085586-b345-46e6-9367-d3b5243312a4-config-volume\") pod \"collect-profiles-29526195-mm4vs\" (UID: \"74085586-b345-46e6-9367-d3b5243312a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.399598 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74085586-b345-46e6-9367-d3b5243312a4-secret-volume\") pod \"collect-profiles-29526195-mm4vs\" (UID: \"74085586-b345-46e6-9367-d3b5243312a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.414302 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lrb4\" (UniqueName: \"kubernetes.io/projected/74085586-b345-46e6-9367-d3b5243312a4-kube-api-access-5lrb4\") pod \"collect-profiles-29526195-mm4vs\" (UID: \"74085586-b345-46e6-9367-d3b5243312a4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.501035 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.779255 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs"] Feb 20 07:15:00 crc kubenswrapper[5094]: I0220 07:15:00.841433 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:15:00 crc kubenswrapper[5094]: E0220 07:15:00.841991 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:15:01 crc kubenswrapper[5094]: I0220 07:15:01.025859 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" event={"ID":"74085586-b345-46e6-9367-d3b5243312a4","Type":"ContainerStarted","Data":"c476764b7efa3f5032f50b9bb9a9da3e47b6999aab2c3a50d607379d62e94686"} Feb 20 07:15:01 crc kubenswrapper[5094]: I0220 07:15:01.025926 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" event={"ID":"74085586-b345-46e6-9367-d3b5243312a4","Type":"ContainerStarted","Data":"c03a4bde583c80e36a47cfc23ed3195915e04446ab8cdd2137f1d6df7125216f"} Feb 20 07:15:01 crc kubenswrapper[5094]: I0220 07:15:01.064364 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" podStartSLOduration=1.06433217 podStartE2EDuration="1.06433217s" podCreationTimestamp="2026-02-20 07:15:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:15:01.056519482 +0000 UTC m=+1715.929146203" watchObservedRunningTime="2026-02-20 07:15:01.06433217 +0000 UTC m=+1715.936958881" Feb 20 07:15:02 crc kubenswrapper[5094]: I0220 07:15:02.038746 5094 generic.go:334] "Generic (PLEG): container finished" podID="74085586-b345-46e6-9367-d3b5243312a4" containerID="c476764b7efa3f5032f50b9bb9a9da3e47b6999aab2c3a50d607379d62e94686" exitCode=0 Feb 20 07:15:02 crc kubenswrapper[5094]: I0220 07:15:02.038919 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" event={"ID":"74085586-b345-46e6-9367-d3b5243312a4","Type":"ContainerDied","Data":"c476764b7efa3f5032f50b9bb9a9da3e47b6999aab2c3a50d607379d62e94686"} Feb 20 07:15:03 crc kubenswrapper[5094]: I0220 07:15:03.455223 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" Feb 20 07:15:03 crc kubenswrapper[5094]: I0220 07:15:03.554339 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lrb4\" (UniqueName: \"kubernetes.io/projected/74085586-b345-46e6-9367-d3b5243312a4-kube-api-access-5lrb4\") pod \"74085586-b345-46e6-9367-d3b5243312a4\" (UID: \"74085586-b345-46e6-9367-d3b5243312a4\") " Feb 20 07:15:03 crc kubenswrapper[5094]: I0220 07:15:03.554447 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74085586-b345-46e6-9367-d3b5243312a4-config-volume\") pod \"74085586-b345-46e6-9367-d3b5243312a4\" (UID: \"74085586-b345-46e6-9367-d3b5243312a4\") " Feb 20 07:15:03 crc kubenswrapper[5094]: I0220 07:15:03.554513 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/74085586-b345-46e6-9367-d3b5243312a4-secret-volume\") pod \"74085586-b345-46e6-9367-d3b5243312a4\" (UID: \"74085586-b345-46e6-9367-d3b5243312a4\") " Feb 20 07:15:03 crc kubenswrapper[5094]: I0220 07:15:03.555417 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74085586-b345-46e6-9367-d3b5243312a4-config-volume" (OuterVolumeSpecName: "config-volume") pod "74085586-b345-46e6-9367-d3b5243312a4" (UID: "74085586-b345-46e6-9367-d3b5243312a4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:15:03 crc kubenswrapper[5094]: I0220 07:15:03.561954 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74085586-b345-46e6-9367-d3b5243312a4-kube-api-access-5lrb4" (OuterVolumeSpecName: "kube-api-access-5lrb4") pod "74085586-b345-46e6-9367-d3b5243312a4" (UID: "74085586-b345-46e6-9367-d3b5243312a4"). InnerVolumeSpecName "kube-api-access-5lrb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:15:03 crc kubenswrapper[5094]: I0220 07:15:03.563247 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74085586-b345-46e6-9367-d3b5243312a4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "74085586-b345-46e6-9367-d3b5243312a4" (UID: "74085586-b345-46e6-9367-d3b5243312a4"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:15:03 crc kubenswrapper[5094]: I0220 07:15:03.656240 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lrb4\" (UniqueName: \"kubernetes.io/projected/74085586-b345-46e6-9367-d3b5243312a4-kube-api-access-5lrb4\") on node \"crc\" DevicePath \"\"" Feb 20 07:15:03 crc kubenswrapper[5094]: I0220 07:15:03.656272 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/74085586-b345-46e6-9367-d3b5243312a4-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 07:15:03 crc kubenswrapper[5094]: I0220 07:15:03.656284 5094 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/74085586-b345-46e6-9367-d3b5243312a4-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 07:15:04 crc kubenswrapper[5094]: I0220 07:15:04.063443 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" event={"ID":"74085586-b345-46e6-9367-d3b5243312a4","Type":"ContainerDied","Data":"c03a4bde583c80e36a47cfc23ed3195915e04446ab8cdd2137f1d6df7125216f"} Feb 20 07:15:04 crc kubenswrapper[5094]: I0220 07:15:04.063512 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c03a4bde583c80e36a47cfc23ed3195915e04446ab8cdd2137f1d6df7125216f" Feb 20 07:15:04 crc kubenswrapper[5094]: I0220 07:15:04.063594 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs" Feb 20 07:15:14 crc kubenswrapper[5094]: I0220 07:15:14.844754 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:15:14 crc kubenswrapper[5094]: E0220 07:15:14.846338 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:15:25 crc kubenswrapper[5094]: I0220 07:15:25.849342 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:15:25 crc kubenswrapper[5094]: E0220 07:15:25.850998 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:15:33 crc kubenswrapper[5094]: I0220 07:15:33.419993 5094 scope.go:117] "RemoveContainer" containerID="1616ce3dec6e69416e60df4bd5866fdeda9b06b012b17e0a3ceddd1a84ee25c5" Feb 20 07:15:33 crc kubenswrapper[5094]: I0220 07:15:33.461634 5094 scope.go:117] "RemoveContainer" containerID="87c896e6058562619d75ebeaacab8cf5ef47914c9fd0a960cbfc01098b412e02" Feb 20 07:15:33 crc kubenswrapper[5094]: I0220 07:15:33.512624 5094 scope.go:117] "RemoveContainer" containerID="514774461bdf76f918de93cfcbabf0b67e1bca119186db34bd24f1a423cf7e05" Feb 20 07:15:33 
crc kubenswrapper[5094]: I0220 07:15:33.544983 5094 scope.go:117] "RemoveContainer" containerID="f74e9fd620d60bb7c55d8ca9b94a45b983b355d7aa77e6d394eb827e69cef1af" Feb 20 07:15:33 crc kubenswrapper[5094]: I0220 07:15:33.578616 5094 scope.go:117] "RemoveContainer" containerID="6d3b6790676924518ae410dca9464ac17e8adb8be7c1c0809abd3e37c9afadec" Feb 20 07:15:33 crc kubenswrapper[5094]: I0220 07:15:33.635650 5094 scope.go:117] "RemoveContainer" containerID="a7698ff70c057c639a740972f6d9a16bf4a7bb2202bca0a958050e7b6bf16b01" Feb 20 07:15:33 crc kubenswrapper[5094]: I0220 07:15:33.698280 5094 scope.go:117] "RemoveContainer" containerID="5c648081449bf27866a97d9ce26a3b96bf18b0b2aa6ccd28cfd7c308a8ad471b" Feb 20 07:15:36 crc kubenswrapper[5094]: I0220 07:15:36.841344 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:15:36 crc kubenswrapper[5094]: E0220 07:15:36.842676 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:15:47 crc kubenswrapper[5094]: I0220 07:15:47.840496 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:15:47 crc kubenswrapper[5094]: E0220 07:15:47.842034 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:15:59 crc kubenswrapper[5094]: I0220 07:15:59.840208 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:15:59 crc kubenswrapper[5094]: E0220 07:15:59.841051 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:16:13 crc kubenswrapper[5094]: I0220 07:16:13.841621 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310" Feb 20 07:16:14 crc kubenswrapper[5094]: I0220 07:16:14.931448 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"38e73fba76bbd62d29e5f1f0936fd251ad4017c1b3ae50e879b97b18fd5f82df"} Feb 20 07:16:23 crc kubenswrapper[5094]: I0220 07:16:23.610551 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fxgmt"] Feb 20 07:16:23 crc kubenswrapper[5094]: E0220 07:16:23.612412 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74085586-b345-46e6-9367-d3b5243312a4" containerName="collect-profiles" Feb 20 07:16:23 crc kubenswrapper[5094]: I0220 07:16:23.612432 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="74085586-b345-46e6-9367-d3b5243312a4" containerName="collect-profiles" Feb 20 07:16:23 crc kubenswrapper[5094]: E0220 07:16:23.612465 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0330a367-c0c9-42a9-9993-1a3b6775fd3b" 
containerName="mariadb-account-create-update" Feb 20 07:16:23 crc kubenswrapper[5094]: I0220 07:16:23.612474 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0330a367-c0c9-42a9-9993-1a3b6775fd3b" containerName="mariadb-account-create-update" Feb 20 07:16:23 crc kubenswrapper[5094]: I0220 07:16:23.612665 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="74085586-b345-46e6-9367-d3b5243312a4" containerName="collect-profiles" Feb 20 07:16:23 crc kubenswrapper[5094]: I0220 07:16:23.614313 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fxgmt" Feb 20 07:16:23 crc kubenswrapper[5094]: I0220 07:16:23.644536 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fxgmt"] Feb 20 07:16:23 crc kubenswrapper[5094]: I0220 07:16:23.704640 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfj9q\" (UniqueName: \"kubernetes.io/projected/fc330ce9-f173-403a-a659-c3c7326a8ae5-kube-api-access-kfj9q\") pod \"redhat-operators-fxgmt\" (UID: \"fc330ce9-f173-403a-a659-c3c7326a8ae5\") " pod="openshift-marketplace/redhat-operators-fxgmt" Feb 20 07:16:23 crc kubenswrapper[5094]: I0220 07:16:23.704930 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc330ce9-f173-403a-a659-c3c7326a8ae5-catalog-content\") pod \"redhat-operators-fxgmt\" (UID: \"fc330ce9-f173-403a-a659-c3c7326a8ae5\") " pod="openshift-marketplace/redhat-operators-fxgmt" Feb 20 07:16:23 crc kubenswrapper[5094]: I0220 07:16:23.705181 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc330ce9-f173-403a-a659-c3c7326a8ae5-utilities\") pod \"redhat-operators-fxgmt\" (UID: \"fc330ce9-f173-403a-a659-c3c7326a8ae5\") " 
pod="openshift-marketplace/redhat-operators-fxgmt" Feb 20 07:16:23 crc kubenswrapper[5094]: I0220 07:16:23.806770 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc330ce9-f173-403a-a659-c3c7326a8ae5-utilities\") pod \"redhat-operators-fxgmt\" (UID: \"fc330ce9-f173-403a-a659-c3c7326a8ae5\") " pod="openshift-marketplace/redhat-operators-fxgmt" Feb 20 07:16:23 crc kubenswrapper[5094]: I0220 07:16:23.806867 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfj9q\" (UniqueName: \"kubernetes.io/projected/fc330ce9-f173-403a-a659-c3c7326a8ae5-kube-api-access-kfj9q\") pod \"redhat-operators-fxgmt\" (UID: \"fc330ce9-f173-403a-a659-c3c7326a8ae5\") " pod="openshift-marketplace/redhat-operators-fxgmt" Feb 20 07:16:23 crc kubenswrapper[5094]: I0220 07:16:23.806922 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc330ce9-f173-403a-a659-c3c7326a8ae5-catalog-content\") pod \"redhat-operators-fxgmt\" (UID: \"fc330ce9-f173-403a-a659-c3c7326a8ae5\") " pod="openshift-marketplace/redhat-operators-fxgmt" Feb 20 07:16:23 crc kubenswrapper[5094]: I0220 07:16:23.807412 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc330ce9-f173-403a-a659-c3c7326a8ae5-catalog-content\") pod \"redhat-operators-fxgmt\" (UID: \"fc330ce9-f173-403a-a659-c3c7326a8ae5\") " pod="openshift-marketplace/redhat-operators-fxgmt" Feb 20 07:16:23 crc kubenswrapper[5094]: I0220 07:16:23.807563 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc330ce9-f173-403a-a659-c3c7326a8ae5-utilities\") pod \"redhat-operators-fxgmt\" (UID: \"fc330ce9-f173-403a-a659-c3c7326a8ae5\") " pod="openshift-marketplace/redhat-operators-fxgmt" Feb 20 07:16:23 crc 
kubenswrapper[5094]: I0220 07:16:23.832791 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfj9q\" (UniqueName: \"kubernetes.io/projected/fc330ce9-f173-403a-a659-c3c7326a8ae5-kube-api-access-kfj9q\") pod \"redhat-operators-fxgmt\" (UID: \"fc330ce9-f173-403a-a659-c3c7326a8ae5\") " pod="openshift-marketplace/redhat-operators-fxgmt" Feb 20 07:16:23 crc kubenswrapper[5094]: I0220 07:16:23.942064 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fxgmt" Feb 20 07:16:24 crc kubenswrapper[5094]: I0220 07:16:24.333087 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fxgmt"] Feb 20 07:16:25 crc kubenswrapper[5094]: I0220 07:16:25.028996 5094 generic.go:334] "Generic (PLEG): container finished" podID="fc330ce9-f173-403a-a659-c3c7326a8ae5" containerID="839c0f255954bd2ecc5395fae7edd7e494e820f2caf00dccf86f613d6f0aa240" exitCode=0 Feb 20 07:16:25 crc kubenswrapper[5094]: I0220 07:16:25.029069 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxgmt" event={"ID":"fc330ce9-f173-403a-a659-c3c7326a8ae5","Type":"ContainerDied","Data":"839c0f255954bd2ecc5395fae7edd7e494e820f2caf00dccf86f613d6f0aa240"} Feb 20 07:16:25 crc kubenswrapper[5094]: I0220 07:16:25.029148 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxgmt" event={"ID":"fc330ce9-f173-403a-a659-c3c7326a8ae5","Type":"ContainerStarted","Data":"2aba90f111620ddf577aab25aaa8798b21a91cff9c03d53b756128d812853f94"} Feb 20 07:16:25 crc kubenswrapper[5094]: I0220 07:16:25.032799 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 07:16:26 crc kubenswrapper[5094]: I0220 07:16:26.045038 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxgmt" 
event={"ID":"fc330ce9-f173-403a-a659-c3c7326a8ae5","Type":"ContainerStarted","Data":"08cf71aa7ef86aa96ab4a725e3a786c9f214125d1e6aaebd458da7b349136d2e"} Feb 20 07:16:26 crc kubenswrapper[5094]: I0220 07:16:26.607558 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bpgwq"] Feb 20 07:16:26 crc kubenswrapper[5094]: I0220 07:16:26.610591 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bpgwq" Feb 20 07:16:26 crc kubenswrapper[5094]: I0220 07:16:26.649293 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bpgwq"] Feb 20 07:16:26 crc kubenswrapper[5094]: I0220 07:16:26.669016 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edef768c-5542-4656-a22c-61559bee852a-catalog-content\") pod \"community-operators-bpgwq\" (UID: \"edef768c-5542-4656-a22c-61559bee852a\") " pod="openshift-marketplace/community-operators-bpgwq" Feb 20 07:16:26 crc kubenswrapper[5094]: I0220 07:16:26.669095 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2fnl\" (UniqueName: \"kubernetes.io/projected/edef768c-5542-4656-a22c-61559bee852a-kube-api-access-p2fnl\") pod \"community-operators-bpgwq\" (UID: \"edef768c-5542-4656-a22c-61559bee852a\") " pod="openshift-marketplace/community-operators-bpgwq" Feb 20 07:16:26 crc kubenswrapper[5094]: I0220 07:16:26.669151 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edef768c-5542-4656-a22c-61559bee852a-utilities\") pod \"community-operators-bpgwq\" (UID: \"edef768c-5542-4656-a22c-61559bee852a\") " pod="openshift-marketplace/community-operators-bpgwq" Feb 20 07:16:26 crc kubenswrapper[5094]: I0220 07:16:26.770637 
5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edef768c-5542-4656-a22c-61559bee852a-utilities\") pod \"community-operators-bpgwq\" (UID: \"edef768c-5542-4656-a22c-61559bee852a\") " pod="openshift-marketplace/community-operators-bpgwq"
Feb 20 07:16:26 crc kubenswrapper[5094]: I0220 07:16:26.770761 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edef768c-5542-4656-a22c-61559bee852a-catalog-content\") pod \"community-operators-bpgwq\" (UID: \"edef768c-5542-4656-a22c-61559bee852a\") " pod="openshift-marketplace/community-operators-bpgwq"
Feb 20 07:16:26 crc kubenswrapper[5094]: I0220 07:16:26.770796 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2fnl\" (UniqueName: \"kubernetes.io/projected/edef768c-5542-4656-a22c-61559bee852a-kube-api-access-p2fnl\") pod \"community-operators-bpgwq\" (UID: \"edef768c-5542-4656-a22c-61559bee852a\") " pod="openshift-marketplace/community-operators-bpgwq"
Feb 20 07:16:26 crc kubenswrapper[5094]: I0220 07:16:26.771593 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edef768c-5542-4656-a22c-61559bee852a-catalog-content\") pod \"community-operators-bpgwq\" (UID: \"edef768c-5542-4656-a22c-61559bee852a\") " pod="openshift-marketplace/community-operators-bpgwq"
Feb 20 07:16:26 crc kubenswrapper[5094]: I0220 07:16:26.771595 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edef768c-5542-4656-a22c-61559bee852a-utilities\") pod \"community-operators-bpgwq\" (UID: \"edef768c-5542-4656-a22c-61559bee852a\") " pod="openshift-marketplace/community-operators-bpgwq"
Feb 20 07:16:26 crc kubenswrapper[5094]: I0220 07:16:26.806409 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2fnl\" (UniqueName: \"kubernetes.io/projected/edef768c-5542-4656-a22c-61559bee852a-kube-api-access-p2fnl\") pod \"community-operators-bpgwq\" (UID: \"edef768c-5542-4656-a22c-61559bee852a\") " pod="openshift-marketplace/community-operators-bpgwq"
Feb 20 07:16:26 crc kubenswrapper[5094]: I0220 07:16:26.929402 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bpgwq"
Feb 20 07:16:27 crc kubenswrapper[5094]: I0220 07:16:27.082539 5094 generic.go:334] "Generic (PLEG): container finished" podID="fc330ce9-f173-403a-a659-c3c7326a8ae5" containerID="08cf71aa7ef86aa96ab4a725e3a786c9f214125d1e6aaebd458da7b349136d2e" exitCode=0
Feb 20 07:16:27 crc kubenswrapper[5094]: I0220 07:16:27.082596 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxgmt" event={"ID":"fc330ce9-f173-403a-a659-c3c7326a8ae5","Type":"ContainerDied","Data":"08cf71aa7ef86aa96ab4a725e3a786c9f214125d1e6aaebd458da7b349136d2e"}
Feb 20 07:16:27 crc kubenswrapper[5094]: I0220 07:16:27.245265 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bpgwq"]
Feb 20 07:16:28 crc kubenswrapper[5094]: I0220 07:16:28.100849 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxgmt" event={"ID":"fc330ce9-f173-403a-a659-c3c7326a8ae5","Type":"ContainerStarted","Data":"372985f5d8ae355c4b45a4d003d44e7bcabe22f67b2da53047bd46bbae4ca521"}
Feb 20 07:16:28 crc kubenswrapper[5094]: I0220 07:16:28.102965 5094 generic.go:334] "Generic (PLEG): container finished" podID="edef768c-5542-4656-a22c-61559bee852a" containerID="445c378b533d8cda17dde1a3261e7c35527956eb919d6b68028d1be9060fcba6" exitCode=0
Feb 20 07:16:28 crc kubenswrapper[5094]: I0220 07:16:28.103067 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpgwq" event={"ID":"edef768c-5542-4656-a22c-61559bee852a","Type":"ContainerDied","Data":"445c378b533d8cda17dde1a3261e7c35527956eb919d6b68028d1be9060fcba6"}
Feb 20 07:16:28 crc kubenswrapper[5094]: I0220 07:16:28.103138 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpgwq" event={"ID":"edef768c-5542-4656-a22c-61559bee852a","Type":"ContainerStarted","Data":"7c54b7f653705f50f4a7768ba3bc601f95148cb9ef2c08121132b654c1dacb31"}
Feb 20 07:16:28 crc kubenswrapper[5094]: I0220 07:16:28.140795 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fxgmt" podStartSLOduration=2.694275211 podStartE2EDuration="5.140768737s" podCreationTimestamp="2026-02-20 07:16:23 +0000 UTC" firstStartedPulling="2026-02-20 07:16:25.032509259 +0000 UTC m=+1799.905135970" lastFinishedPulling="2026-02-20 07:16:27.479002785 +0000 UTC m=+1802.351629496" observedRunningTime="2026-02-20 07:16:28.139316212 +0000 UTC m=+1803.011942943" watchObservedRunningTime="2026-02-20 07:16:28.140768737 +0000 UTC m=+1803.013395458"
Feb 20 07:16:29 crc kubenswrapper[5094]: I0220 07:16:29.114484 5094 generic.go:334] "Generic (PLEG): container finished" podID="edef768c-5542-4656-a22c-61559bee852a" containerID="86d63d89448664d061af9377adb2ecbd115c5a9e9c513ff7ecb7c1633f03b880" exitCode=0
Feb 20 07:16:29 crc kubenswrapper[5094]: I0220 07:16:29.114607 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpgwq" event={"ID":"edef768c-5542-4656-a22c-61559bee852a","Type":"ContainerDied","Data":"86d63d89448664d061af9377adb2ecbd115c5a9e9c513ff7ecb7c1633f03b880"}
Feb 20 07:16:30 crc kubenswrapper[5094]: I0220 07:16:30.125123 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpgwq" event={"ID":"edef768c-5542-4656-a22c-61559bee852a","Type":"ContainerStarted","Data":"4dfed148a8d236fc70447485880962be98056daf836753eac14634d846d47301"}
Feb 20 07:16:30 crc kubenswrapper[5094]: I0220 07:16:30.148111 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bpgwq" podStartSLOduration=2.77341787 podStartE2EDuration="4.14808542s" podCreationTimestamp="2026-02-20 07:16:26 +0000 UTC" firstStartedPulling="2026-02-20 07:16:28.105929631 +0000 UTC m=+1802.978556342" lastFinishedPulling="2026-02-20 07:16:29.480597161 +0000 UTC m=+1804.353223892" observedRunningTime="2026-02-20 07:16:30.141786368 +0000 UTC m=+1805.014413079" watchObservedRunningTime="2026-02-20 07:16:30.14808542 +0000 UTC m=+1805.020712161"
Feb 20 07:16:33 crc kubenswrapper[5094]: I0220 07:16:33.888785 5094 scope.go:117] "RemoveContainer" containerID="ef54747f742ee43a493f0fa824a214fb6267fc5169fba7034a316b9cd14cd93e"
Feb 20 07:16:33 crc kubenswrapper[5094]: I0220 07:16:33.929195 5094 scope.go:117] "RemoveContainer" containerID="20a41a106027a121889419317d3bee9adb413422fa7d51549986b6bb65338151"
Feb 20 07:16:33 crc kubenswrapper[5094]: I0220 07:16:33.944119 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fxgmt"
Feb 20 07:16:33 crc kubenswrapper[5094]: I0220 07:16:33.944984 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fxgmt"
Feb 20 07:16:33 crc kubenswrapper[5094]: I0220 07:16:33.995019 5094 scope.go:117] "RemoveContainer" containerID="f1120a67726b82c35452578214de053dba930e494ccdb044a45f2d20acdc8e01"
Feb 20 07:16:34 crc kubenswrapper[5094]: I0220 07:16:34.023191 5094 scope.go:117] "RemoveContainer" containerID="f4dba9092d5d591bec2aa8645bd29d545b24d0697fb34f266f49cf7c75f5e2e2"
Feb 20 07:16:34 crc kubenswrapper[5094]: I0220 07:16:34.070282 5094 scope.go:117] "RemoveContainer" containerID="f58aa2d0199fdb5db167c5aef1bd053858fb706513548fbeaab67aa8e11ddcd5"
Feb 20 07:16:34 crc kubenswrapper[5094]: I0220 07:16:34.096637 5094 scope.go:117] "RemoveContainer" containerID="3b84d0ad082d2f55f39f713582ba0b77187948e8fdabcb2633a95c7f3cd76484"
Feb 20 07:16:34 crc kubenswrapper[5094]: I0220 07:16:34.120577 5094 scope.go:117] "RemoveContainer" containerID="8fc9ffae76a3123cc0d37835acdbe04a2f9a8e499d724ece0380a09483cd0561"
Feb 20 07:16:34 crc kubenswrapper[5094]: I0220 07:16:34.147105 5094 scope.go:117] "RemoveContainer" containerID="1dec1e1d1106c53f0481c59fed514648b5c6714553079cbf04bed733473a6862"
Feb 20 07:16:35 crc kubenswrapper[5094]: I0220 07:16:35.015090 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fxgmt" podUID="fc330ce9-f173-403a-a659-c3c7326a8ae5" containerName="registry-server" probeResult="failure" output=<
Feb 20 07:16:35 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s
Feb 20 07:16:35 crc kubenswrapper[5094]: >
Feb 20 07:16:36 crc kubenswrapper[5094]: I0220 07:16:36.930370 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bpgwq"
Feb 20 07:16:36 crc kubenswrapper[5094]: I0220 07:16:36.930852 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bpgwq"
Feb 20 07:16:37 crc kubenswrapper[5094]: I0220 07:16:37.002300 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bpgwq"
Feb 20 07:16:37 crc kubenswrapper[5094]: I0220 07:16:37.242693 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bpgwq"
Feb 20 07:16:40 crc kubenswrapper[5094]: I0220 07:16:40.766741 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bpgwq"]
Feb 20 07:16:40 crc kubenswrapper[5094]: I0220 07:16:40.768613 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bpgwq" podUID="edef768c-5542-4656-a22c-61559bee852a" containerName="registry-server" containerID="cri-o://4dfed148a8d236fc70447485880962be98056daf836753eac14634d846d47301" gracePeriod=2
Feb 20 07:16:41 crc kubenswrapper[5094]: I0220 07:16:41.231862 5094 generic.go:334] "Generic (PLEG): container finished" podID="edef768c-5542-4656-a22c-61559bee852a" containerID="4dfed148a8d236fc70447485880962be98056daf836753eac14634d846d47301" exitCode=0
Feb 20 07:16:41 crc kubenswrapper[5094]: I0220 07:16:41.231976 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpgwq" event={"ID":"edef768c-5542-4656-a22c-61559bee852a","Type":"ContainerDied","Data":"4dfed148a8d236fc70447485880962be98056daf836753eac14634d846d47301"}
Feb 20 07:16:41 crc kubenswrapper[5094]: I0220 07:16:41.301017 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bpgwq"
Feb 20 07:16:41 crc kubenswrapper[5094]: I0220 07:16:41.350902 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2fnl\" (UniqueName: \"kubernetes.io/projected/edef768c-5542-4656-a22c-61559bee852a-kube-api-access-p2fnl\") pod \"edef768c-5542-4656-a22c-61559bee852a\" (UID: \"edef768c-5542-4656-a22c-61559bee852a\") "
Feb 20 07:16:41 crc kubenswrapper[5094]: I0220 07:16:41.351023 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edef768c-5542-4656-a22c-61559bee852a-catalog-content\") pod \"edef768c-5542-4656-a22c-61559bee852a\" (UID: \"edef768c-5542-4656-a22c-61559bee852a\") "
Feb 20 07:16:41 crc kubenswrapper[5094]: I0220 07:16:41.351142 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edef768c-5542-4656-a22c-61559bee852a-utilities\") pod \"edef768c-5542-4656-a22c-61559bee852a\" (UID: \"edef768c-5542-4656-a22c-61559bee852a\") "
Feb 20 07:16:41 crc kubenswrapper[5094]: I0220 07:16:41.352731 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edef768c-5542-4656-a22c-61559bee852a-utilities" (OuterVolumeSpecName: "utilities") pod "edef768c-5542-4656-a22c-61559bee852a" (UID: "edef768c-5542-4656-a22c-61559bee852a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:16:41 crc kubenswrapper[5094]: I0220 07:16:41.361273 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edef768c-5542-4656-a22c-61559bee852a-kube-api-access-p2fnl" (OuterVolumeSpecName: "kube-api-access-p2fnl") pod "edef768c-5542-4656-a22c-61559bee852a" (UID: "edef768c-5542-4656-a22c-61559bee852a"). InnerVolumeSpecName "kube-api-access-p2fnl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:16:41 crc kubenswrapper[5094]: I0220 07:16:41.419503 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edef768c-5542-4656-a22c-61559bee852a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "edef768c-5542-4656-a22c-61559bee852a" (UID: "edef768c-5542-4656-a22c-61559bee852a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:16:41 crc kubenswrapper[5094]: I0220 07:16:41.454048 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2fnl\" (UniqueName: \"kubernetes.io/projected/edef768c-5542-4656-a22c-61559bee852a-kube-api-access-p2fnl\") on node \"crc\" DevicePath \"\""
Feb 20 07:16:41 crc kubenswrapper[5094]: I0220 07:16:41.454112 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/edef768c-5542-4656-a22c-61559bee852a-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 07:16:41 crc kubenswrapper[5094]: I0220 07:16:41.454131 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/edef768c-5542-4656-a22c-61559bee852a-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 07:16:42 crc kubenswrapper[5094]: I0220 07:16:42.249768 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bpgwq" event={"ID":"edef768c-5542-4656-a22c-61559bee852a","Type":"ContainerDied","Data":"7c54b7f653705f50f4a7768ba3bc601f95148cb9ef2c08121132b654c1dacb31"}
Feb 20 07:16:42 crc kubenswrapper[5094]: I0220 07:16:42.249887 5094 scope.go:117] "RemoveContainer" containerID="4dfed148a8d236fc70447485880962be98056daf836753eac14634d846d47301"
Feb 20 07:16:42 crc kubenswrapper[5094]: I0220 07:16:42.250215 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bpgwq"
Feb 20 07:16:42 crc kubenswrapper[5094]: I0220 07:16:42.299854 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bpgwq"]
Feb 20 07:16:42 crc kubenswrapper[5094]: I0220 07:16:42.314080 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bpgwq"]
Feb 20 07:16:42 crc kubenswrapper[5094]: I0220 07:16:42.315994 5094 scope.go:117] "RemoveContainer" containerID="86d63d89448664d061af9377adb2ecbd115c5a9e9c513ff7ecb7c1633f03b880"
Feb 20 07:16:42 crc kubenswrapper[5094]: I0220 07:16:42.346526 5094 scope.go:117] "RemoveContainer" containerID="445c378b533d8cda17dde1a3261e7c35527956eb919d6b68028d1be9060fcba6"
Feb 20 07:16:43 crc kubenswrapper[5094]: I0220 07:16:43.859051 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edef768c-5542-4656-a22c-61559bee852a" path="/var/lib/kubelet/pods/edef768c-5542-4656-a22c-61559bee852a/volumes"
Feb 20 07:16:44 crc kubenswrapper[5094]: I0220 07:16:44.006505 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fxgmt"
Feb 20 07:16:44 crc kubenswrapper[5094]: I0220 07:16:44.074182 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fxgmt"
Feb 20 07:16:44 crc kubenswrapper[5094]: I0220 07:16:44.968849 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fxgmt"]
Feb 20 07:16:45 crc kubenswrapper[5094]: I0220 07:16:45.278851 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fxgmt" podUID="fc330ce9-f173-403a-a659-c3c7326a8ae5" containerName="registry-server" containerID="cri-o://372985f5d8ae355c4b45a4d003d44e7bcabe22f67b2da53047bd46bbae4ca521" gracePeriod=2
Feb 20 07:16:45 crc kubenswrapper[5094]: I0220 07:16:45.865244 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fxgmt"
Feb 20 07:16:45 crc kubenswrapper[5094]: I0220 07:16:45.936418 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc330ce9-f173-403a-a659-c3c7326a8ae5-catalog-content\") pod \"fc330ce9-f173-403a-a659-c3c7326a8ae5\" (UID: \"fc330ce9-f173-403a-a659-c3c7326a8ae5\") "
Feb 20 07:16:45 crc kubenswrapper[5094]: I0220 07:16:45.936561 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc330ce9-f173-403a-a659-c3c7326a8ae5-utilities\") pod \"fc330ce9-f173-403a-a659-c3c7326a8ae5\" (UID: \"fc330ce9-f173-403a-a659-c3c7326a8ae5\") "
Feb 20 07:16:45 crc kubenswrapper[5094]: I0220 07:16:45.936674 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfj9q\" (UniqueName: \"kubernetes.io/projected/fc330ce9-f173-403a-a659-c3c7326a8ae5-kube-api-access-kfj9q\") pod \"fc330ce9-f173-403a-a659-c3c7326a8ae5\" (UID: \"fc330ce9-f173-403a-a659-c3c7326a8ae5\") "
Feb 20 07:16:45 crc kubenswrapper[5094]: I0220 07:16:45.937653 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc330ce9-f173-403a-a659-c3c7326a8ae5-utilities" (OuterVolumeSpecName: "utilities") pod "fc330ce9-f173-403a-a659-c3c7326a8ae5" (UID: "fc330ce9-f173-403a-a659-c3c7326a8ae5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:16:45 crc kubenswrapper[5094]: I0220 07:16:45.947334 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc330ce9-f173-403a-a659-c3c7326a8ae5-kube-api-access-kfj9q" (OuterVolumeSpecName: "kube-api-access-kfj9q") pod "fc330ce9-f173-403a-a659-c3c7326a8ae5" (UID: "fc330ce9-f173-403a-a659-c3c7326a8ae5"). InnerVolumeSpecName "kube-api-access-kfj9q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.039613 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc330ce9-f173-403a-a659-c3c7326a8ae5-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.039668 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfj9q\" (UniqueName: \"kubernetes.io/projected/fc330ce9-f173-403a-a659-c3c7326a8ae5-kube-api-access-kfj9q\") on node \"crc\" DevicePath \"\""
Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.125827 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc330ce9-f173-403a-a659-c3c7326a8ae5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc330ce9-f173-403a-a659-c3c7326a8ae5" (UID: "fc330ce9-f173-403a-a659-c3c7326a8ae5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.141403 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc330ce9-f173-403a-a659-c3c7326a8ae5-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.293434 5094 generic.go:334] "Generic (PLEG): container finished" podID="fc330ce9-f173-403a-a659-c3c7326a8ae5" containerID="372985f5d8ae355c4b45a4d003d44e7bcabe22f67b2da53047bd46bbae4ca521" exitCode=0
Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.293535 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxgmt" event={"ID":"fc330ce9-f173-403a-a659-c3c7326a8ae5","Type":"ContainerDied","Data":"372985f5d8ae355c4b45a4d003d44e7bcabe22f67b2da53047bd46bbae4ca521"}
Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.293545 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fxgmt"
Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.293587 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxgmt" event={"ID":"fc330ce9-f173-403a-a659-c3c7326a8ae5","Type":"ContainerDied","Data":"2aba90f111620ddf577aab25aaa8798b21a91cff9c03d53b756128d812853f94"}
Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.293628 5094 scope.go:117] "RemoveContainer" containerID="372985f5d8ae355c4b45a4d003d44e7bcabe22f67b2da53047bd46bbae4ca521"
Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.336138 5094 scope.go:117] "RemoveContainer" containerID="08cf71aa7ef86aa96ab4a725e3a786c9f214125d1e6aaebd458da7b349136d2e"
Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.367565 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fxgmt"]
Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.375069 5094 scope.go:117] "RemoveContainer" containerID="839c0f255954bd2ecc5395fae7edd7e494e820f2caf00dccf86f613d6f0aa240"
Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.380888 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fxgmt"]
Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.410689 5094 scope.go:117] "RemoveContainer" containerID="372985f5d8ae355c4b45a4d003d44e7bcabe22f67b2da53047bd46bbae4ca521"
Feb 20 07:16:46 crc kubenswrapper[5094]: E0220 07:16:46.411198 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"372985f5d8ae355c4b45a4d003d44e7bcabe22f67b2da53047bd46bbae4ca521\": container with ID starting with 372985f5d8ae355c4b45a4d003d44e7bcabe22f67b2da53047bd46bbae4ca521 not found: ID does not exist" containerID="372985f5d8ae355c4b45a4d003d44e7bcabe22f67b2da53047bd46bbae4ca521"
Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.411382 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"372985f5d8ae355c4b45a4d003d44e7bcabe22f67b2da53047bd46bbae4ca521"} err="failed to get container status \"372985f5d8ae355c4b45a4d003d44e7bcabe22f67b2da53047bd46bbae4ca521\": rpc error: code = NotFound desc = could not find container \"372985f5d8ae355c4b45a4d003d44e7bcabe22f67b2da53047bd46bbae4ca521\": container with ID starting with 372985f5d8ae355c4b45a4d003d44e7bcabe22f67b2da53047bd46bbae4ca521 not found: ID does not exist"
Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.411431 5094 scope.go:117] "RemoveContainer" containerID="08cf71aa7ef86aa96ab4a725e3a786c9f214125d1e6aaebd458da7b349136d2e"
Feb 20 07:16:46 crc kubenswrapper[5094]: E0220 07:16:46.412022 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08cf71aa7ef86aa96ab4a725e3a786c9f214125d1e6aaebd458da7b349136d2e\": container with ID starting with 08cf71aa7ef86aa96ab4a725e3a786c9f214125d1e6aaebd458da7b349136d2e not found: ID does not exist" containerID="08cf71aa7ef86aa96ab4a725e3a786c9f214125d1e6aaebd458da7b349136d2e"
Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.412055 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08cf71aa7ef86aa96ab4a725e3a786c9f214125d1e6aaebd458da7b349136d2e"} err="failed to get container status \"08cf71aa7ef86aa96ab4a725e3a786c9f214125d1e6aaebd458da7b349136d2e\": rpc error: code = NotFound desc = could not find container \"08cf71aa7ef86aa96ab4a725e3a786c9f214125d1e6aaebd458da7b349136d2e\": container with ID starting with 08cf71aa7ef86aa96ab4a725e3a786c9f214125d1e6aaebd458da7b349136d2e not found: ID does not exist"
Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.412075 5094 scope.go:117] "RemoveContainer" containerID="839c0f255954bd2ecc5395fae7edd7e494e820f2caf00dccf86f613d6f0aa240"
Feb 20 07:16:46 crc kubenswrapper[5094]: E0220 07:16:46.412358 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"839c0f255954bd2ecc5395fae7edd7e494e820f2caf00dccf86f613d6f0aa240\": container with ID starting with 839c0f255954bd2ecc5395fae7edd7e494e820f2caf00dccf86f613d6f0aa240 not found: ID does not exist" containerID="839c0f255954bd2ecc5395fae7edd7e494e820f2caf00dccf86f613d6f0aa240"
Feb 20 07:16:46 crc kubenswrapper[5094]: I0220 07:16:46.412387 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"839c0f255954bd2ecc5395fae7edd7e494e820f2caf00dccf86f613d6f0aa240"} err="failed to get container status \"839c0f255954bd2ecc5395fae7edd7e494e820f2caf00dccf86f613d6f0aa240\": rpc error: code = NotFound desc = could not find container \"839c0f255954bd2ecc5395fae7edd7e494e820f2caf00dccf86f613d6f0aa240\": container with ID starting with 839c0f255954bd2ecc5395fae7edd7e494e820f2caf00dccf86f613d6f0aa240 not found: ID does not exist"
Feb 20 07:16:47 crc kubenswrapper[5094]: I0220 07:16:47.855610 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc330ce9-f173-403a-a659-c3c7326a8ae5" path="/var/lib/kubelet/pods/fc330ce9-f173-403a-a659-c3c7326a8ae5/volumes"
Feb 20 07:18:34 crc kubenswrapper[5094]: I0220 07:18:34.107665 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 07:18:34 crc kubenswrapper[5094]: I0220 07:18:34.108662 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 07:19:04 crc kubenswrapper[5094]: I0220 07:19:04.107357 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 07:19:04 crc kubenswrapper[5094]: I0220 07:19:04.108553 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.572867 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-97lzn"]
Feb 20 07:19:27 crc kubenswrapper[5094]: E0220 07:19:27.574449 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edef768c-5542-4656-a22c-61559bee852a" containerName="extract-content"
Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.574481 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="edef768c-5542-4656-a22c-61559bee852a" containerName="extract-content"
Feb 20 07:19:27 crc kubenswrapper[5094]: E0220 07:19:27.574510 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edef768c-5542-4656-a22c-61559bee852a" containerName="extract-utilities"
Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.574527 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="edef768c-5542-4656-a22c-61559bee852a" containerName="extract-utilities"
Feb 20 07:19:27 crc kubenswrapper[5094]: E0220 07:19:27.574558 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc330ce9-f173-403a-a659-c3c7326a8ae5" containerName="extract-utilities"
Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.574578 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc330ce9-f173-403a-a659-c3c7326a8ae5" containerName="extract-utilities"
Feb 20 07:19:27 crc kubenswrapper[5094]: E0220 07:19:27.574617 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc330ce9-f173-403a-a659-c3c7326a8ae5" containerName="extract-content"
Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.574634 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc330ce9-f173-403a-a659-c3c7326a8ae5" containerName="extract-content"
Feb 20 07:19:27 crc kubenswrapper[5094]: E0220 07:19:27.574663 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edef768c-5542-4656-a22c-61559bee852a" containerName="registry-server"
Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.574680 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="edef768c-5542-4656-a22c-61559bee852a" containerName="registry-server"
Feb 20 07:19:27 crc kubenswrapper[5094]: E0220 07:19:27.574748 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc330ce9-f173-403a-a659-c3c7326a8ae5" containerName="registry-server"
Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.574766 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc330ce9-f173-403a-a659-c3c7326a8ae5" containerName="registry-server"
Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.575082 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc330ce9-f173-403a-a659-c3c7326a8ae5" containerName="registry-server"
Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.575112 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="edef768c-5542-4656-a22c-61559bee852a" containerName="registry-server"
Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.577012 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-97lzn"
Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.613751 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-97lzn"]
Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.662160 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dn62\" (UniqueName: \"kubernetes.io/projected/97983ca9-1559-417f-9d3d-876f4dc9301a-kube-api-access-4dn62\") pod \"redhat-marketplace-97lzn\" (UID: \"97983ca9-1559-417f-9d3d-876f4dc9301a\") " pod="openshift-marketplace/redhat-marketplace-97lzn"
Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.662257 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97983ca9-1559-417f-9d3d-876f4dc9301a-catalog-content\") pod \"redhat-marketplace-97lzn\" (UID: \"97983ca9-1559-417f-9d3d-876f4dc9301a\") " pod="openshift-marketplace/redhat-marketplace-97lzn"
Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.662336 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97983ca9-1559-417f-9d3d-876f4dc9301a-utilities\") pod \"redhat-marketplace-97lzn\" (UID: \"97983ca9-1559-417f-9d3d-876f4dc9301a\") " pod="openshift-marketplace/redhat-marketplace-97lzn"
Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.764360 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97983ca9-1559-417f-9d3d-876f4dc9301a-utilities\") pod \"redhat-marketplace-97lzn\" (UID: \"97983ca9-1559-417f-9d3d-876f4dc9301a\") " pod="openshift-marketplace/redhat-marketplace-97lzn"
Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.764446 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dn62\" (UniqueName: \"kubernetes.io/projected/97983ca9-1559-417f-9d3d-876f4dc9301a-kube-api-access-4dn62\") pod \"redhat-marketplace-97lzn\" (UID: \"97983ca9-1559-417f-9d3d-876f4dc9301a\") " pod="openshift-marketplace/redhat-marketplace-97lzn"
Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.764510 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97983ca9-1559-417f-9d3d-876f4dc9301a-catalog-content\") pod \"redhat-marketplace-97lzn\" (UID: \"97983ca9-1559-417f-9d3d-876f4dc9301a\") " pod="openshift-marketplace/redhat-marketplace-97lzn"
Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.765465 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97983ca9-1559-417f-9d3d-876f4dc9301a-catalog-content\") pod \"redhat-marketplace-97lzn\" (UID: \"97983ca9-1559-417f-9d3d-876f4dc9301a\") " pod="openshift-marketplace/redhat-marketplace-97lzn"
Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.765915 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97983ca9-1559-417f-9d3d-876f4dc9301a-utilities\") pod \"redhat-marketplace-97lzn\" (UID: \"97983ca9-1559-417f-9d3d-876f4dc9301a\") " pod="openshift-marketplace/redhat-marketplace-97lzn"
Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.790115 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dn62\" (UniqueName: \"kubernetes.io/projected/97983ca9-1559-417f-9d3d-876f4dc9301a-kube-api-access-4dn62\") pod \"redhat-marketplace-97lzn\" (UID: \"97983ca9-1559-417f-9d3d-876f4dc9301a\") " pod="openshift-marketplace/redhat-marketplace-97lzn"
Feb 20 07:19:27 crc kubenswrapper[5094]: I0220 07:19:27.915260 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-97lzn"
Feb 20 07:19:28 crc kubenswrapper[5094]: I0220 07:19:28.167234 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-97lzn"]
Feb 20 07:19:29 crc kubenswrapper[5094]: I0220 07:19:29.016824 5094 generic.go:334] "Generic (PLEG): container finished" podID="97983ca9-1559-417f-9d3d-876f4dc9301a" containerID="919dfcc7f250bd3e3599094bc494141651def23aa2461960fe0fad45431e684c" exitCode=0
Feb 20 07:19:29 crc kubenswrapper[5094]: I0220 07:19:29.017130 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97lzn" event={"ID":"97983ca9-1559-417f-9d3d-876f4dc9301a","Type":"ContainerDied","Data":"919dfcc7f250bd3e3599094bc494141651def23aa2461960fe0fad45431e684c"}
Feb 20 07:19:29 crc kubenswrapper[5094]: I0220 07:19:29.017388 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97lzn" event={"ID":"97983ca9-1559-417f-9d3d-876f4dc9301a","Type":"ContainerStarted","Data":"83c2439b2b22253659c1167a4f6763f69e16ecce665b8c681073b07ed24d0fda"}
Feb 20 07:19:30 crc kubenswrapper[5094]: I0220 07:19:30.026392 5094 generic.go:334] "Generic (PLEG): container finished" podID="97983ca9-1559-417f-9d3d-876f4dc9301a" containerID="f2fc254af1f6f42e8ded1fcc5cc5bf7d80c82de1e467dd8a5830d513f713bbd6" exitCode=0
Feb 20 07:19:30 crc kubenswrapper[5094]: I0220 07:19:30.026538 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97lzn" event={"ID":"97983ca9-1559-417f-9d3d-876f4dc9301a","Type":"ContainerDied","Data":"f2fc254af1f6f42e8ded1fcc5cc5bf7d80c82de1e467dd8a5830d513f713bbd6"}
Feb 20 07:19:31 crc kubenswrapper[5094]: I0220 07:19:31.035481 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97lzn" event={"ID":"97983ca9-1559-417f-9d3d-876f4dc9301a","Type":"ContainerStarted","Data":"fdc32efb3bc3d4c0e97cd8e6e4eebccb3a9a3a5fb028c32c4814a64109c85647"}
Feb 20 07:19:34 crc kubenswrapper[5094]: I0220 07:19:34.107193 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 07:19:34 crc kubenswrapper[5094]: I0220 07:19:34.107832 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 07:19:34 crc kubenswrapper[5094]: I0220 07:19:34.107911 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq"
Feb 20 07:19:34 crc kubenswrapper[5094]: I0220 07:19:34.109046 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"38e73fba76bbd62d29e5f1f0936fd251ad4017c1b3ae50e879b97b18fd5f82df"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 20 07:19:34 crc kubenswrapper[5094]: I0220 07:19:34.109189 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://38e73fba76bbd62d29e5f1f0936fd251ad4017c1b3ae50e879b97b18fd5f82df" gracePeriod=600
Feb 20 07:19:35 crc kubenswrapper[5094]: I0220 07:19:35.077665 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="38e73fba76bbd62d29e5f1f0936fd251ad4017c1b3ae50e879b97b18fd5f82df" exitCode=0
Feb 20 07:19:35 crc kubenswrapper[5094]: I0220 07:19:35.077720 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"38e73fba76bbd62d29e5f1f0936fd251ad4017c1b3ae50e879b97b18fd5f82df"}
Feb 20 07:19:35 crc kubenswrapper[5094]: I0220 07:19:35.078523 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942"}
Feb 20 07:19:35 crc kubenswrapper[5094]: I0220 07:19:35.078565 5094 scope.go:117] "RemoveContainer" containerID="7e0de49971e77b5c012df2ac39c43ac03799b8d63c2a62bec73e3cbad7043310"
Feb 20 07:19:35 crc kubenswrapper[5094]: I0220 07:19:35.102995 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-97lzn" podStartSLOduration=6.718236209 podStartE2EDuration="8.102892999s" podCreationTimestamp="2026-02-20 07:19:27 +0000 UTC" firstStartedPulling="2026-02-20 07:19:29.019419173 +0000 UTC m=+1983.892045924" lastFinishedPulling="2026-02-20 07:19:30.404075973 +0000 UTC m=+1985.276702714" observedRunningTime="2026-02-20 07:19:31.061048259 +0000 UTC m=+1985.933674970" watchObservedRunningTime="2026-02-20 07:19:35.102892999 +0000 UTC m=+1989.975519710"
Feb 20 07:19:37 crc kubenswrapper[5094]: I0220 07:19:37.916382 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-97lzn"
Feb 20 07:19:37 crc kubenswrapper[5094]: I0220 07:19:37.916891 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup"
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-97lzn" Feb 20 07:19:37 crc kubenswrapper[5094]: I0220 07:19:37.989251 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-97lzn" Feb 20 07:19:38 crc kubenswrapper[5094]: I0220 07:19:38.178782 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-97lzn" Feb 20 07:19:38 crc kubenswrapper[5094]: I0220 07:19:38.254059 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-97lzn"] Feb 20 07:19:40 crc kubenswrapper[5094]: I0220 07:19:40.127907 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-97lzn" podUID="97983ca9-1559-417f-9d3d-876f4dc9301a" containerName="registry-server" containerID="cri-o://fdc32efb3bc3d4c0e97cd8e6e4eebccb3a9a3a5fb028c32c4814a64109c85647" gracePeriod=2 Feb 20 07:19:41 crc kubenswrapper[5094]: I0220 07:19:41.143064 5094 generic.go:334] "Generic (PLEG): container finished" podID="97983ca9-1559-417f-9d3d-876f4dc9301a" containerID="fdc32efb3bc3d4c0e97cd8e6e4eebccb3a9a3a5fb028c32c4814a64109c85647" exitCode=0 Feb 20 07:19:41 crc kubenswrapper[5094]: I0220 07:19:41.143134 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97lzn" event={"ID":"97983ca9-1559-417f-9d3d-876f4dc9301a","Type":"ContainerDied","Data":"fdc32efb3bc3d4c0e97cd8e6e4eebccb3a9a3a5fb028c32c4814a64109c85647"} Feb 20 07:19:41 crc kubenswrapper[5094]: I0220 07:19:41.143520 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-97lzn" event={"ID":"97983ca9-1559-417f-9d3d-876f4dc9301a","Type":"ContainerDied","Data":"83c2439b2b22253659c1167a4f6763f69e16ecce665b8c681073b07ed24d0fda"} Feb 20 07:19:41 crc kubenswrapper[5094]: I0220 07:19:41.143544 5094 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="83c2439b2b22253659c1167a4f6763f69e16ecce665b8c681073b07ed24d0fda" Feb 20 07:19:41 crc kubenswrapper[5094]: I0220 07:19:41.143157 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-97lzn" Feb 20 07:19:41 crc kubenswrapper[5094]: I0220 07:19:41.329254 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97983ca9-1559-417f-9d3d-876f4dc9301a-utilities\") pod \"97983ca9-1559-417f-9d3d-876f4dc9301a\" (UID: \"97983ca9-1559-417f-9d3d-876f4dc9301a\") " Feb 20 07:19:41 crc kubenswrapper[5094]: I0220 07:19:41.329357 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97983ca9-1559-417f-9d3d-876f4dc9301a-catalog-content\") pod \"97983ca9-1559-417f-9d3d-876f4dc9301a\" (UID: \"97983ca9-1559-417f-9d3d-876f4dc9301a\") " Feb 20 07:19:41 crc kubenswrapper[5094]: I0220 07:19:41.329404 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dn62\" (UniqueName: \"kubernetes.io/projected/97983ca9-1559-417f-9d3d-876f4dc9301a-kube-api-access-4dn62\") pod \"97983ca9-1559-417f-9d3d-876f4dc9301a\" (UID: \"97983ca9-1559-417f-9d3d-876f4dc9301a\") " Feb 20 07:19:41 crc kubenswrapper[5094]: I0220 07:19:41.331024 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97983ca9-1559-417f-9d3d-876f4dc9301a-utilities" (OuterVolumeSpecName: "utilities") pod "97983ca9-1559-417f-9d3d-876f4dc9301a" (UID: "97983ca9-1559-417f-9d3d-876f4dc9301a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:19:41 crc kubenswrapper[5094]: I0220 07:19:41.342067 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97983ca9-1559-417f-9d3d-876f4dc9301a-kube-api-access-4dn62" (OuterVolumeSpecName: "kube-api-access-4dn62") pod "97983ca9-1559-417f-9d3d-876f4dc9301a" (UID: "97983ca9-1559-417f-9d3d-876f4dc9301a"). InnerVolumeSpecName "kube-api-access-4dn62". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:19:41 crc kubenswrapper[5094]: I0220 07:19:41.361906 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97983ca9-1559-417f-9d3d-876f4dc9301a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "97983ca9-1559-417f-9d3d-876f4dc9301a" (UID: "97983ca9-1559-417f-9d3d-876f4dc9301a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:19:41 crc kubenswrapper[5094]: I0220 07:19:41.432480 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/97983ca9-1559-417f-9d3d-876f4dc9301a-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 07:19:41 crc kubenswrapper[5094]: I0220 07:19:41.433031 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/97983ca9-1559-417f-9d3d-876f4dc9301a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:19:41 crc kubenswrapper[5094]: I0220 07:19:41.433226 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dn62\" (UniqueName: \"kubernetes.io/projected/97983ca9-1559-417f-9d3d-876f4dc9301a-kube-api-access-4dn62\") on node \"crc\" DevicePath \"\"" Feb 20 07:19:42 crc kubenswrapper[5094]: I0220 07:19:42.152446 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-97lzn" Feb 20 07:19:42 crc kubenswrapper[5094]: I0220 07:19:42.177592 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-97lzn"] Feb 20 07:19:42 crc kubenswrapper[5094]: I0220 07:19:42.193114 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-97lzn"] Feb 20 07:19:43 crc kubenswrapper[5094]: I0220 07:19:43.850041 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97983ca9-1559-417f-9d3d-876f4dc9301a" path="/var/lib/kubelet/pods/97983ca9-1559-417f-9d3d-876f4dc9301a/volumes" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.462882 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2nsvk"] Feb 20 07:21:05 crc kubenswrapper[5094]: E0220 07:21:05.464303 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97983ca9-1559-417f-9d3d-876f4dc9301a" containerName="registry-server" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.464327 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="97983ca9-1559-417f-9d3d-876f4dc9301a" containerName="registry-server" Feb 20 07:21:05 crc kubenswrapper[5094]: E0220 07:21:05.464386 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97983ca9-1559-417f-9d3d-876f4dc9301a" containerName="extract-content" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.464400 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="97983ca9-1559-417f-9d3d-876f4dc9301a" containerName="extract-content" Feb 20 07:21:05 crc kubenswrapper[5094]: E0220 07:21:05.464437 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97983ca9-1559-417f-9d3d-876f4dc9301a" containerName="extract-utilities" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.464453 5094 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="97983ca9-1559-417f-9d3d-876f4dc9301a" containerName="extract-utilities" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.464765 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="97983ca9-1559-417f-9d3d-876f4dc9301a" containerName="registry-server" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.466785 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.491089 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2nsvk"] Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.557471 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdz52\" (UniqueName: \"kubernetes.io/projected/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-kube-api-access-fdz52\") pod \"certified-operators-2nsvk\" (UID: \"5a1b1538-58d2-448e-8a39-9ec2dac98a3e\") " pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.557618 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-utilities\") pod \"certified-operators-2nsvk\" (UID: \"5a1b1538-58d2-448e-8a39-9ec2dac98a3e\") " pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.557815 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-catalog-content\") pod \"certified-operators-2nsvk\" (UID: \"5a1b1538-58d2-448e-8a39-9ec2dac98a3e\") " pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.659933 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-catalog-content\") pod \"certified-operators-2nsvk\" (UID: \"5a1b1538-58d2-448e-8a39-9ec2dac98a3e\") " pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.660122 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdz52\" (UniqueName: \"kubernetes.io/projected/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-kube-api-access-fdz52\") pod \"certified-operators-2nsvk\" (UID: \"5a1b1538-58d2-448e-8a39-9ec2dac98a3e\") " pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.660158 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-utilities\") pod \"certified-operators-2nsvk\" (UID: \"5a1b1538-58d2-448e-8a39-9ec2dac98a3e\") " pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.660798 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-catalog-content\") pod \"certified-operators-2nsvk\" (UID: \"5a1b1538-58d2-448e-8a39-9ec2dac98a3e\") " pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.661258 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-utilities\") pod \"certified-operators-2nsvk\" (UID: \"5a1b1538-58d2-448e-8a39-9ec2dac98a3e\") " pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.681778 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fdz52\" (UniqueName: \"kubernetes.io/projected/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-kube-api-access-fdz52\") pod \"certified-operators-2nsvk\" (UID: \"5a1b1538-58d2-448e-8a39-9ec2dac98a3e\") " pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:05 crc kubenswrapper[5094]: I0220 07:21:05.803979 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:06 crc kubenswrapper[5094]: I0220 07:21:06.093663 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2nsvk"] Feb 20 07:21:06 crc kubenswrapper[5094]: I0220 07:21:06.985366 5094 generic.go:334] "Generic (PLEG): container finished" podID="5a1b1538-58d2-448e-8a39-9ec2dac98a3e" containerID="cd368ec9f4e03f9946c8fa62094be1a97b1488bfc915477e1dc38ac037b9642d" exitCode=0 Feb 20 07:21:06 crc kubenswrapper[5094]: I0220 07:21:06.985489 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2nsvk" event={"ID":"5a1b1538-58d2-448e-8a39-9ec2dac98a3e","Type":"ContainerDied","Data":"cd368ec9f4e03f9946c8fa62094be1a97b1488bfc915477e1dc38ac037b9642d"} Feb 20 07:21:06 crc kubenswrapper[5094]: I0220 07:21:06.985973 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2nsvk" event={"ID":"5a1b1538-58d2-448e-8a39-9ec2dac98a3e","Type":"ContainerStarted","Data":"3f86ead75ecd7c62888f8f57bd59f88b3baa96e25bd27fb0c40378046f28994b"} Feb 20 07:21:08 crc kubenswrapper[5094]: I0220 07:21:08.000017 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2nsvk" event={"ID":"5a1b1538-58d2-448e-8a39-9ec2dac98a3e","Type":"ContainerStarted","Data":"19cc6b58ad6d5203a786adb8549d85e9adfa140f0a07d4288e451077fe611ffa"} Feb 20 07:21:09 crc kubenswrapper[5094]: I0220 07:21:09.027633 5094 generic.go:334] "Generic (PLEG): container finished" 
podID="5a1b1538-58d2-448e-8a39-9ec2dac98a3e" containerID="19cc6b58ad6d5203a786adb8549d85e9adfa140f0a07d4288e451077fe611ffa" exitCode=0 Feb 20 07:21:09 crc kubenswrapper[5094]: I0220 07:21:09.027748 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2nsvk" event={"ID":"5a1b1538-58d2-448e-8a39-9ec2dac98a3e","Type":"ContainerDied","Data":"19cc6b58ad6d5203a786adb8549d85e9adfa140f0a07d4288e451077fe611ffa"} Feb 20 07:21:10 crc kubenswrapper[5094]: I0220 07:21:10.038779 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2nsvk" event={"ID":"5a1b1538-58d2-448e-8a39-9ec2dac98a3e","Type":"ContainerStarted","Data":"9b0e2a585d47175ec5335934394eaca3a88ae9406ac2a5bc56d7d5d3cb0894d5"} Feb 20 07:21:10 crc kubenswrapper[5094]: I0220 07:21:10.066969 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2nsvk" podStartSLOduration=2.617735178 podStartE2EDuration="5.066939731s" podCreationTimestamp="2026-02-20 07:21:05 +0000 UTC" firstStartedPulling="2026-02-20 07:21:06.989136402 +0000 UTC m=+2081.861763123" lastFinishedPulling="2026-02-20 07:21:09.438340965 +0000 UTC m=+2084.310967676" observedRunningTime="2026-02-20 07:21:10.062797122 +0000 UTC m=+2084.935423853" watchObservedRunningTime="2026-02-20 07:21:10.066939731 +0000 UTC m=+2084.939566452" Feb 20 07:21:15 crc kubenswrapper[5094]: I0220 07:21:15.804921 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:15 crc kubenswrapper[5094]: I0220 07:21:15.805545 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:15 crc kubenswrapper[5094]: I0220 07:21:15.890966 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 
07:21:16 crc kubenswrapper[5094]: I0220 07:21:16.151312 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:16 crc kubenswrapper[5094]: I0220 07:21:16.222245 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2nsvk"] Feb 20 07:21:18 crc kubenswrapper[5094]: I0220 07:21:18.115234 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2nsvk" podUID="5a1b1538-58d2-448e-8a39-9ec2dac98a3e" containerName="registry-server" containerID="cri-o://9b0e2a585d47175ec5335934394eaca3a88ae9406ac2a5bc56d7d5d3cb0894d5" gracePeriod=2 Feb 20 07:21:18 crc kubenswrapper[5094]: I0220 07:21:18.563377 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:18 crc kubenswrapper[5094]: I0220 07:21:18.627788 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdz52\" (UniqueName: \"kubernetes.io/projected/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-kube-api-access-fdz52\") pod \"5a1b1538-58d2-448e-8a39-9ec2dac98a3e\" (UID: \"5a1b1538-58d2-448e-8a39-9ec2dac98a3e\") " Feb 20 07:21:18 crc kubenswrapper[5094]: I0220 07:21:18.627909 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-catalog-content\") pod \"5a1b1538-58d2-448e-8a39-9ec2dac98a3e\" (UID: \"5a1b1538-58d2-448e-8a39-9ec2dac98a3e\") " Feb 20 07:21:18 crc kubenswrapper[5094]: I0220 07:21:18.627955 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-utilities\") pod \"5a1b1538-58d2-448e-8a39-9ec2dac98a3e\" (UID: \"5a1b1538-58d2-448e-8a39-9ec2dac98a3e\") " Feb 
20 07:21:18 crc kubenswrapper[5094]: I0220 07:21:18.629792 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-utilities" (OuterVolumeSpecName: "utilities") pod "5a1b1538-58d2-448e-8a39-9ec2dac98a3e" (UID: "5a1b1538-58d2-448e-8a39-9ec2dac98a3e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:21:18 crc kubenswrapper[5094]: I0220 07:21:18.637999 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-kube-api-access-fdz52" (OuterVolumeSpecName: "kube-api-access-fdz52") pod "5a1b1538-58d2-448e-8a39-9ec2dac98a3e" (UID: "5a1b1538-58d2-448e-8a39-9ec2dac98a3e"). InnerVolumeSpecName "kube-api-access-fdz52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:21:18 crc kubenswrapper[5094]: I0220 07:21:18.728954 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 07:21:18 crc kubenswrapper[5094]: I0220 07:21:18.728990 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdz52\" (UniqueName: \"kubernetes.io/projected/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-kube-api-access-fdz52\") on node \"crc\" DevicePath \"\"" Feb 20 07:21:18 crc kubenswrapper[5094]: I0220 07:21:18.849137 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a1b1538-58d2-448e-8a39-9ec2dac98a3e" (UID: "5a1b1538-58d2-448e-8a39-9ec2dac98a3e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:21:18 crc kubenswrapper[5094]: I0220 07:21:18.932210 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a1b1538-58d2-448e-8a39-9ec2dac98a3e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.131039 5094 generic.go:334] "Generic (PLEG): container finished" podID="5a1b1538-58d2-448e-8a39-9ec2dac98a3e" containerID="9b0e2a585d47175ec5335934394eaca3a88ae9406ac2a5bc56d7d5d3cb0894d5" exitCode=0 Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.131136 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2nsvk" event={"ID":"5a1b1538-58d2-448e-8a39-9ec2dac98a3e","Type":"ContainerDied","Data":"9b0e2a585d47175ec5335934394eaca3a88ae9406ac2a5bc56d7d5d3cb0894d5"} Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.131198 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2nsvk" event={"ID":"5a1b1538-58d2-448e-8a39-9ec2dac98a3e","Type":"ContainerDied","Data":"3f86ead75ecd7c62888f8f57bd59f88b3baa96e25bd27fb0c40378046f28994b"} Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.131229 5094 scope.go:117] "RemoveContainer" containerID="9b0e2a585d47175ec5335934394eaca3a88ae9406ac2a5bc56d7d5d3cb0894d5" Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.131457 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2nsvk" Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.164538 5094 scope.go:117] "RemoveContainer" containerID="19cc6b58ad6d5203a786adb8549d85e9adfa140f0a07d4288e451077fe611ffa" Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.197617 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2nsvk"] Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.230747 5094 scope.go:117] "RemoveContainer" containerID="cd368ec9f4e03f9946c8fa62094be1a97b1488bfc915477e1dc38ac037b9642d" Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.233260 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2nsvk"] Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.259979 5094 scope.go:117] "RemoveContainer" containerID="9b0e2a585d47175ec5335934394eaca3a88ae9406ac2a5bc56d7d5d3cb0894d5" Feb 20 07:21:19 crc kubenswrapper[5094]: E0220 07:21:19.260810 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b0e2a585d47175ec5335934394eaca3a88ae9406ac2a5bc56d7d5d3cb0894d5\": container with ID starting with 9b0e2a585d47175ec5335934394eaca3a88ae9406ac2a5bc56d7d5d3cb0894d5 not found: ID does not exist" containerID="9b0e2a585d47175ec5335934394eaca3a88ae9406ac2a5bc56d7d5d3cb0894d5" Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.260868 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b0e2a585d47175ec5335934394eaca3a88ae9406ac2a5bc56d7d5d3cb0894d5"} err="failed to get container status \"9b0e2a585d47175ec5335934394eaca3a88ae9406ac2a5bc56d7d5d3cb0894d5\": rpc error: code = NotFound desc = could not find container \"9b0e2a585d47175ec5335934394eaca3a88ae9406ac2a5bc56d7d5d3cb0894d5\": container with ID starting with 9b0e2a585d47175ec5335934394eaca3a88ae9406ac2a5bc56d7d5d3cb0894d5 not 
found: ID does not exist" Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.260909 5094 scope.go:117] "RemoveContainer" containerID="19cc6b58ad6d5203a786adb8549d85e9adfa140f0a07d4288e451077fe611ffa" Feb 20 07:21:19 crc kubenswrapper[5094]: E0220 07:21:19.261640 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19cc6b58ad6d5203a786adb8549d85e9adfa140f0a07d4288e451077fe611ffa\": container with ID starting with 19cc6b58ad6d5203a786adb8549d85e9adfa140f0a07d4288e451077fe611ffa not found: ID does not exist" containerID="19cc6b58ad6d5203a786adb8549d85e9adfa140f0a07d4288e451077fe611ffa" Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.261697 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19cc6b58ad6d5203a786adb8549d85e9adfa140f0a07d4288e451077fe611ffa"} err="failed to get container status \"19cc6b58ad6d5203a786adb8549d85e9adfa140f0a07d4288e451077fe611ffa\": rpc error: code = NotFound desc = could not find container \"19cc6b58ad6d5203a786adb8549d85e9adfa140f0a07d4288e451077fe611ffa\": container with ID starting with 19cc6b58ad6d5203a786adb8549d85e9adfa140f0a07d4288e451077fe611ffa not found: ID does not exist" Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.261778 5094 scope.go:117] "RemoveContainer" containerID="cd368ec9f4e03f9946c8fa62094be1a97b1488bfc915477e1dc38ac037b9642d" Feb 20 07:21:19 crc kubenswrapper[5094]: E0220 07:21:19.262211 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd368ec9f4e03f9946c8fa62094be1a97b1488bfc915477e1dc38ac037b9642d\": container with ID starting with cd368ec9f4e03f9946c8fa62094be1a97b1488bfc915477e1dc38ac037b9642d not found: ID does not exist" containerID="cd368ec9f4e03f9946c8fa62094be1a97b1488bfc915477e1dc38ac037b9642d" Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.262252 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd368ec9f4e03f9946c8fa62094be1a97b1488bfc915477e1dc38ac037b9642d"} err="failed to get container status \"cd368ec9f4e03f9946c8fa62094be1a97b1488bfc915477e1dc38ac037b9642d\": rpc error: code = NotFound desc = could not find container \"cd368ec9f4e03f9946c8fa62094be1a97b1488bfc915477e1dc38ac037b9642d\": container with ID starting with cd368ec9f4e03f9946c8fa62094be1a97b1488bfc915477e1dc38ac037b9642d not found: ID does not exist" Feb 20 07:21:19 crc kubenswrapper[5094]: I0220 07:21:19.853131 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a1b1538-58d2-448e-8a39-9ec2dac98a3e" path="/var/lib/kubelet/pods/5a1b1538-58d2-448e-8a39-9ec2dac98a3e/volumes" Feb 20 07:21:34 crc kubenswrapper[5094]: I0220 07:21:34.107872 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:21:34 crc kubenswrapper[5094]: I0220 07:21:34.108848 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:22:04 crc kubenswrapper[5094]: I0220 07:22:04.106642 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:22:04 crc kubenswrapper[5094]: I0220 07:22:04.107810 5094 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:22:34 crc kubenswrapper[5094]: I0220 07:22:34.106525 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:22:34 crc kubenswrapper[5094]: I0220 07:22:34.107424 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:22:34 crc kubenswrapper[5094]: I0220 07:22:34.107506 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 07:22:34 crc kubenswrapper[5094]: I0220 07:22:34.108589 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 07:22:34 crc kubenswrapper[5094]: I0220 07:22:34.108699 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" 
containerID="cri-o://c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" gracePeriod=600 Feb 20 07:22:34 crc kubenswrapper[5094]: E0220 07:22:34.237370 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:22:34 crc kubenswrapper[5094]: I0220 07:22:34.946032 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" exitCode=0 Feb 20 07:22:34 crc kubenswrapper[5094]: I0220 07:22:34.946166 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942"} Feb 20 07:22:34 crc kubenswrapper[5094]: I0220 07:22:34.946783 5094 scope.go:117] "RemoveContainer" containerID="38e73fba76bbd62d29e5f1f0936fd251ad4017c1b3ae50e879b97b18fd5f82df" Feb 20 07:22:34 crc kubenswrapper[5094]: I0220 07:22:34.948734 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:22:34 crc kubenswrapper[5094]: E0220 07:22:34.949331 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:22:46 crc kubenswrapper[5094]: I0220 07:22:46.840382 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:22:46 crc kubenswrapper[5094]: E0220 07:22:46.841483 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:23:00 crc kubenswrapper[5094]: I0220 07:23:00.840993 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:23:00 crc kubenswrapper[5094]: E0220 07:23:00.841986 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:23:11 crc kubenswrapper[5094]: I0220 07:23:11.840737 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:23:11 crc kubenswrapper[5094]: E0220 07:23:11.842892 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:23:25 crc kubenswrapper[5094]: I0220 07:23:25.849674 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:23:25 crc kubenswrapper[5094]: E0220 07:23:25.851536 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:23:36 crc kubenswrapper[5094]: I0220 07:23:36.840938 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:23:36 crc kubenswrapper[5094]: E0220 07:23:36.844734 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:23:47 crc kubenswrapper[5094]: I0220 07:23:47.841276 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:23:47 crc kubenswrapper[5094]: E0220 07:23:47.842539 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:24:01 crc kubenswrapper[5094]: I0220 07:24:01.840582 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:24:01 crc kubenswrapper[5094]: E0220 07:24:01.843583 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:24:14 crc kubenswrapper[5094]: I0220 07:24:14.840424 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:24:14 crc kubenswrapper[5094]: E0220 07:24:14.841304 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:24:29 crc kubenswrapper[5094]: I0220 07:24:29.840858 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:24:29 crc kubenswrapper[5094]: E0220 07:24:29.841844 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:24:41 crc kubenswrapper[5094]: I0220 07:24:41.840551 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:24:41 crc kubenswrapper[5094]: E0220 07:24:41.841957 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:24:56 crc kubenswrapper[5094]: I0220 07:24:56.841506 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:24:56 crc kubenswrapper[5094]: E0220 07:24:56.842830 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:25:08 crc kubenswrapper[5094]: I0220 07:25:08.841595 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:25:08 crc kubenswrapper[5094]: E0220 07:25:08.842603 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:25:21 crc kubenswrapper[5094]: I0220 07:25:21.841584 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:25:21 crc kubenswrapper[5094]: E0220 07:25:21.842987 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:25:34 crc kubenswrapper[5094]: I0220 07:25:34.546767 5094 scope.go:117] "RemoveContainer" containerID="919dfcc7f250bd3e3599094bc494141651def23aa2461960fe0fad45431e684c" Feb 20 07:25:34 crc kubenswrapper[5094]: I0220 07:25:34.599518 5094 scope.go:117] "RemoveContainer" containerID="f2fc254af1f6f42e8ded1fcc5cc5bf7d80c82de1e467dd8a5830d513f713bbd6" Feb 20 07:25:34 crc kubenswrapper[5094]: I0220 07:25:34.643216 5094 scope.go:117] "RemoveContainer" containerID="fdc32efb3bc3d4c0e97cd8e6e4eebccb3a9a3a5fb028c32c4814a64109c85647" Feb 20 07:25:34 crc kubenswrapper[5094]: I0220 07:25:34.841070 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:25:34 crc kubenswrapper[5094]: E0220 07:25:34.842209 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:25:46 crc kubenswrapper[5094]: I0220 07:25:46.841019 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:25:46 crc kubenswrapper[5094]: E0220 07:25:46.842143 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:25:57 crc kubenswrapper[5094]: I0220 07:25:57.843447 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:25:57 crc kubenswrapper[5094]: E0220 07:25:57.845294 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:26:10 crc kubenswrapper[5094]: I0220 07:26:10.841010 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:26:10 crc kubenswrapper[5094]: E0220 07:26:10.842221 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:26:22 crc kubenswrapper[5094]: I0220 07:26:22.841896 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:26:22 crc kubenswrapper[5094]: E0220 07:26:22.845764 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.694288 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-h7xgp"] Feb 20 07:26:32 crc kubenswrapper[5094]: E0220 07:26:32.695374 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a1b1538-58d2-448e-8a39-9ec2dac98a3e" containerName="extract-utilities" Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.695394 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1b1538-58d2-448e-8a39-9ec2dac98a3e" containerName="extract-utilities" Feb 20 07:26:32 crc kubenswrapper[5094]: E0220 07:26:32.695437 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a1b1538-58d2-448e-8a39-9ec2dac98a3e" containerName="extract-content" Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.695452 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1b1538-58d2-448e-8a39-9ec2dac98a3e" containerName="extract-content" Feb 20 07:26:32 crc kubenswrapper[5094]: E0220 07:26:32.695472 5094 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="5a1b1538-58d2-448e-8a39-9ec2dac98a3e" containerName="registry-server" Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.695481 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a1b1538-58d2-448e-8a39-9ec2dac98a3e" containerName="registry-server" Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.695670 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a1b1538-58d2-448e-8a39-9ec2dac98a3e" containerName="registry-server" Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.697160 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.715146 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-catalog-content\") pod \"redhat-operators-h7xgp\" (UID: \"8d9d8b9a-3617-4295-bea3-3339d6dc7b88\") " pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.715224 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2gsc\" (UniqueName: \"kubernetes.io/projected/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-kube-api-access-b2gsc\") pod \"redhat-operators-h7xgp\" (UID: \"8d9d8b9a-3617-4295-bea3-3339d6dc7b88\") " pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.715299 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-utilities\") pod \"redhat-operators-h7xgp\" (UID: \"8d9d8b9a-3617-4295-bea3-3339d6dc7b88\") " pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.715938 5094 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h7xgp"] Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.817615 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-catalog-content\") pod \"redhat-operators-h7xgp\" (UID: \"8d9d8b9a-3617-4295-bea3-3339d6dc7b88\") " pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.817735 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2gsc\" (UniqueName: \"kubernetes.io/projected/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-kube-api-access-b2gsc\") pod \"redhat-operators-h7xgp\" (UID: \"8d9d8b9a-3617-4295-bea3-3339d6dc7b88\") " pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.817803 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-utilities\") pod \"redhat-operators-h7xgp\" (UID: \"8d9d8b9a-3617-4295-bea3-3339d6dc7b88\") " pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.818242 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-catalog-content\") pod \"redhat-operators-h7xgp\" (UID: \"8d9d8b9a-3617-4295-bea3-3339d6dc7b88\") " pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.818599 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-utilities\") pod \"redhat-operators-h7xgp\" (UID: \"8d9d8b9a-3617-4295-bea3-3339d6dc7b88\") " 
pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:32 crc kubenswrapper[5094]: I0220 07:26:32.845998 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2gsc\" (UniqueName: \"kubernetes.io/projected/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-kube-api-access-b2gsc\") pod \"redhat-operators-h7xgp\" (UID: \"8d9d8b9a-3617-4295-bea3-3339d6dc7b88\") " pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:33 crc kubenswrapper[5094]: I0220 07:26:33.020884 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:33 crc kubenswrapper[5094]: I0220 07:26:33.478968 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h7xgp"] Feb 20 07:26:33 crc kubenswrapper[5094]: I0220 07:26:33.841551 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:26:33 crc kubenswrapper[5094]: E0220 07:26:33.842359 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:26:34 crc kubenswrapper[5094]: I0220 07:26:34.455309 5094 generic.go:334] "Generic (PLEG): container finished" podID="8d9d8b9a-3617-4295-bea3-3339d6dc7b88" containerID="2f0a1b6aa6ccdc312d21765181ced4b72706d82dcb309f3bea989ecd2d0916c9" exitCode=0 Feb 20 07:26:34 crc kubenswrapper[5094]: I0220 07:26:34.455387 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7xgp" 
event={"ID":"8d9d8b9a-3617-4295-bea3-3339d6dc7b88","Type":"ContainerDied","Data":"2f0a1b6aa6ccdc312d21765181ced4b72706d82dcb309f3bea989ecd2d0916c9"} Feb 20 07:26:34 crc kubenswrapper[5094]: I0220 07:26:34.455462 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7xgp" event={"ID":"8d9d8b9a-3617-4295-bea3-3339d6dc7b88","Type":"ContainerStarted","Data":"9d79db86dab6a517e10d93843c540d48efb57163ba63e6df9bf20cab17a7726a"} Feb 20 07:26:34 crc kubenswrapper[5094]: I0220 07:26:34.458092 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 07:26:35 crc kubenswrapper[5094]: I0220 07:26:35.468968 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7xgp" event={"ID":"8d9d8b9a-3617-4295-bea3-3339d6dc7b88","Type":"ContainerStarted","Data":"aab72c0eb6de3e2be159e5463fec369c127364c1f8c91363519fd5c5dc0a8c01"} Feb 20 07:26:36 crc kubenswrapper[5094]: I0220 07:26:36.481571 5094 generic.go:334] "Generic (PLEG): container finished" podID="8d9d8b9a-3617-4295-bea3-3339d6dc7b88" containerID="aab72c0eb6de3e2be159e5463fec369c127364c1f8c91363519fd5c5dc0a8c01" exitCode=0 Feb 20 07:26:36 crc kubenswrapper[5094]: I0220 07:26:36.481651 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7xgp" event={"ID":"8d9d8b9a-3617-4295-bea3-3339d6dc7b88","Type":"ContainerDied","Data":"aab72c0eb6de3e2be159e5463fec369c127364c1f8c91363519fd5c5dc0a8c01"} Feb 20 07:26:37 crc kubenswrapper[5094]: I0220 07:26:37.495024 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7xgp" event={"ID":"8d9d8b9a-3617-4295-bea3-3339d6dc7b88","Type":"ContainerStarted","Data":"bb4ba2daf05a507d629a231d540f975a77bf3c296d57ebb79a07ed3f7ff01b24"} Feb 20 07:26:37 crc kubenswrapper[5094]: I0220 07:26:37.533652 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-h7xgp" podStartSLOduration=3.11470225 podStartE2EDuration="5.533625635s" podCreationTimestamp="2026-02-20 07:26:32 +0000 UTC" firstStartedPulling="2026-02-20 07:26:34.457813416 +0000 UTC m=+2409.330440127" lastFinishedPulling="2026-02-20 07:26:36.876736801 +0000 UTC m=+2411.749363512" observedRunningTime="2026-02-20 07:26:37.531101975 +0000 UTC m=+2412.403728706" watchObservedRunningTime="2026-02-20 07:26:37.533625635 +0000 UTC m=+2412.406252376" Feb 20 07:26:43 crc kubenswrapper[5094]: I0220 07:26:43.021558 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:43 crc kubenswrapper[5094]: I0220 07:26:43.022077 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:44 crc kubenswrapper[5094]: I0220 07:26:44.086222 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-h7xgp" podUID="8d9d8b9a-3617-4295-bea3-3339d6dc7b88" containerName="registry-server" probeResult="failure" output=< Feb 20 07:26:44 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 07:26:44 crc kubenswrapper[5094]: > Feb 20 07:26:45 crc kubenswrapper[5094]: I0220 07:26:45.850633 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:26:45 crc kubenswrapper[5094]: E0220 07:26:45.850958 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:26:53 crc kubenswrapper[5094]: 
I0220 07:26:53.107889 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:53 crc kubenswrapper[5094]: I0220 07:26:53.167758 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:53 crc kubenswrapper[5094]: I0220 07:26:53.359744 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h7xgp"] Feb 20 07:26:54 crc kubenswrapper[5094]: I0220 07:26:54.670367 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h7xgp" podUID="8d9d8b9a-3617-4295-bea3-3339d6dc7b88" containerName="registry-server" containerID="cri-o://bb4ba2daf05a507d629a231d540f975a77bf3c296d57ebb79a07ed3f7ff01b24" gracePeriod=2 Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.130297 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.170639 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2gsc\" (UniqueName: \"kubernetes.io/projected/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-kube-api-access-b2gsc\") pod \"8d9d8b9a-3617-4295-bea3-3339d6dc7b88\" (UID: \"8d9d8b9a-3617-4295-bea3-3339d6dc7b88\") " Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.170822 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-catalog-content\") pod \"8d9d8b9a-3617-4295-bea3-3339d6dc7b88\" (UID: \"8d9d8b9a-3617-4295-bea3-3339d6dc7b88\") " Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.170936 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-utilities\") pod \"8d9d8b9a-3617-4295-bea3-3339d6dc7b88\" (UID: \"8d9d8b9a-3617-4295-bea3-3339d6dc7b88\") " Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.173298 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-utilities" (OuterVolumeSpecName: "utilities") pod "8d9d8b9a-3617-4295-bea3-3339d6dc7b88" (UID: "8d9d8b9a-3617-4295-bea3-3339d6dc7b88"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.192043 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-kube-api-access-b2gsc" (OuterVolumeSpecName: "kube-api-access-b2gsc") pod "8d9d8b9a-3617-4295-bea3-3339d6dc7b88" (UID: "8d9d8b9a-3617-4295-bea3-3339d6dc7b88"). InnerVolumeSpecName "kube-api-access-b2gsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.272233 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.272272 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2gsc\" (UniqueName: \"kubernetes.io/projected/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-kube-api-access-b2gsc\") on node \"crc\" DevicePath \"\"" Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.415589 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8d9d8b9a-3617-4295-bea3-3339d6dc7b88" (UID: "8d9d8b9a-3617-4295-bea3-3339d6dc7b88"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.476651 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d9d8b9a-3617-4295-bea3-3339d6dc7b88-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.679300 5094 generic.go:334] "Generic (PLEG): container finished" podID="8d9d8b9a-3617-4295-bea3-3339d6dc7b88" containerID="bb4ba2daf05a507d629a231d540f975a77bf3c296d57ebb79a07ed3f7ff01b24" exitCode=0 Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.679348 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7xgp" event={"ID":"8d9d8b9a-3617-4295-bea3-3339d6dc7b88","Type":"ContainerDied","Data":"bb4ba2daf05a507d629a231d540f975a77bf3c296d57ebb79a07ed3f7ff01b24"} Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.679383 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h7xgp" event={"ID":"8d9d8b9a-3617-4295-bea3-3339d6dc7b88","Type":"ContainerDied","Data":"9d79db86dab6a517e10d93843c540d48efb57163ba63e6df9bf20cab17a7726a"} Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.679403 5094 scope.go:117] "RemoveContainer" containerID="bb4ba2daf05a507d629a231d540f975a77bf3c296d57ebb79a07ed3f7ff01b24" Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.679438 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h7xgp" Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.702462 5094 scope.go:117] "RemoveContainer" containerID="aab72c0eb6de3e2be159e5463fec369c127364c1f8c91363519fd5c5dc0a8c01" Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.719506 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h7xgp"] Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.724397 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-h7xgp"] Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.729494 5094 scope.go:117] "RemoveContainer" containerID="2f0a1b6aa6ccdc312d21765181ced4b72706d82dcb309f3bea989ecd2d0916c9" Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.757790 5094 scope.go:117] "RemoveContainer" containerID="bb4ba2daf05a507d629a231d540f975a77bf3c296d57ebb79a07ed3f7ff01b24" Feb 20 07:26:55 crc kubenswrapper[5094]: E0220 07:26:55.758266 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb4ba2daf05a507d629a231d540f975a77bf3c296d57ebb79a07ed3f7ff01b24\": container with ID starting with bb4ba2daf05a507d629a231d540f975a77bf3c296d57ebb79a07ed3f7ff01b24 not found: ID does not exist" containerID="bb4ba2daf05a507d629a231d540f975a77bf3c296d57ebb79a07ed3f7ff01b24" Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.758346 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb4ba2daf05a507d629a231d540f975a77bf3c296d57ebb79a07ed3f7ff01b24"} err="failed to get container status \"bb4ba2daf05a507d629a231d540f975a77bf3c296d57ebb79a07ed3f7ff01b24\": rpc error: code = NotFound desc = could not find container \"bb4ba2daf05a507d629a231d540f975a77bf3c296d57ebb79a07ed3f7ff01b24\": container with ID starting with bb4ba2daf05a507d629a231d540f975a77bf3c296d57ebb79a07ed3f7ff01b24 not found: ID does 
not exist" Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.758392 5094 scope.go:117] "RemoveContainer" containerID="aab72c0eb6de3e2be159e5463fec369c127364c1f8c91363519fd5c5dc0a8c01" Feb 20 07:26:55 crc kubenswrapper[5094]: E0220 07:26:55.759090 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aab72c0eb6de3e2be159e5463fec369c127364c1f8c91363519fd5c5dc0a8c01\": container with ID starting with aab72c0eb6de3e2be159e5463fec369c127364c1f8c91363519fd5c5dc0a8c01 not found: ID does not exist" containerID="aab72c0eb6de3e2be159e5463fec369c127364c1f8c91363519fd5c5dc0a8c01" Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.759156 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aab72c0eb6de3e2be159e5463fec369c127364c1f8c91363519fd5c5dc0a8c01"} err="failed to get container status \"aab72c0eb6de3e2be159e5463fec369c127364c1f8c91363519fd5c5dc0a8c01\": rpc error: code = NotFound desc = could not find container \"aab72c0eb6de3e2be159e5463fec369c127364c1f8c91363519fd5c5dc0a8c01\": container with ID starting with aab72c0eb6de3e2be159e5463fec369c127364c1f8c91363519fd5c5dc0a8c01 not found: ID does not exist" Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.759197 5094 scope.go:117] "RemoveContainer" containerID="2f0a1b6aa6ccdc312d21765181ced4b72706d82dcb309f3bea989ecd2d0916c9" Feb 20 07:26:55 crc kubenswrapper[5094]: E0220 07:26:55.759593 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f0a1b6aa6ccdc312d21765181ced4b72706d82dcb309f3bea989ecd2d0916c9\": container with ID starting with 2f0a1b6aa6ccdc312d21765181ced4b72706d82dcb309f3bea989ecd2d0916c9 not found: ID does not exist" containerID="2f0a1b6aa6ccdc312d21765181ced4b72706d82dcb309f3bea989ecd2d0916c9" Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.759640 5094 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f0a1b6aa6ccdc312d21765181ced4b72706d82dcb309f3bea989ecd2d0916c9"} err="failed to get container status \"2f0a1b6aa6ccdc312d21765181ced4b72706d82dcb309f3bea989ecd2d0916c9\": rpc error: code = NotFound desc = could not find container \"2f0a1b6aa6ccdc312d21765181ced4b72706d82dcb309f3bea989ecd2d0916c9\": container with ID starting with 2f0a1b6aa6ccdc312d21765181ced4b72706d82dcb309f3bea989ecd2d0916c9 not found: ID does not exist" Feb 20 07:26:55 crc kubenswrapper[5094]: I0220 07:26:55.851811 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d9d8b9a-3617-4295-bea3-3339d6dc7b88" path="/var/lib/kubelet/pods/8d9d8b9a-3617-4295-bea3-3339d6dc7b88/volumes" Feb 20 07:26:58 crc kubenswrapper[5094]: I0220 07:26:58.841411 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:26:58 crc kubenswrapper[5094]: E0220 07:26:58.842560 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:27:12 crc kubenswrapper[5094]: I0220 07:27:12.841134 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:27:12 crc kubenswrapper[5094]: E0220 07:27:12.842461 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:27:23 crc kubenswrapper[5094]: I0220 07:27:23.840991 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:27:23 crc kubenswrapper[5094]: E0220 07:27:23.842583 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.261660 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p5hgb"] Feb 20 07:27:27 crc kubenswrapper[5094]: E0220 07:27:27.263215 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d9d8b9a-3617-4295-bea3-3339d6dc7b88" containerName="extract-utilities" Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.263246 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d9d8b9a-3617-4295-bea3-3339d6dc7b88" containerName="extract-utilities" Feb 20 07:27:27 crc kubenswrapper[5094]: E0220 07:27:27.263283 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d9d8b9a-3617-4295-bea3-3339d6dc7b88" containerName="registry-server" Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.263293 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d9d8b9a-3617-4295-bea3-3339d6dc7b88" containerName="registry-server" Feb 20 07:27:27 crc kubenswrapper[5094]: E0220 07:27:27.263317 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d9d8b9a-3617-4295-bea3-3339d6dc7b88" containerName="extract-content" Feb 20 07:27:27 crc kubenswrapper[5094]: 
I0220 07:27:27.263327 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d9d8b9a-3617-4295-bea3-3339d6dc7b88" containerName="extract-content" Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.263553 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d9d8b9a-3617-4295-bea3-3339d6dc7b88" containerName="registry-server" Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.265395 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p5hgb" Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.293054 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p5hgb"] Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.466219 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szkzx\" (UniqueName: \"kubernetes.io/projected/9110846b-830e-4e8f-a471-34dea42873ff-kube-api-access-szkzx\") pod \"community-operators-p5hgb\" (UID: \"9110846b-830e-4e8f-a471-34dea42873ff\") " pod="openshift-marketplace/community-operators-p5hgb" Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.466294 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9110846b-830e-4e8f-a471-34dea42873ff-utilities\") pod \"community-operators-p5hgb\" (UID: \"9110846b-830e-4e8f-a471-34dea42873ff\") " pod="openshift-marketplace/community-operators-p5hgb" Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.466378 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9110846b-830e-4e8f-a471-34dea42873ff-catalog-content\") pod \"community-operators-p5hgb\" (UID: \"9110846b-830e-4e8f-a471-34dea42873ff\") " pod="openshift-marketplace/community-operators-p5hgb" Feb 20 07:27:27 crc 
kubenswrapper[5094]: I0220 07:27:27.567949 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szkzx\" (UniqueName: \"kubernetes.io/projected/9110846b-830e-4e8f-a471-34dea42873ff-kube-api-access-szkzx\") pod \"community-operators-p5hgb\" (UID: \"9110846b-830e-4e8f-a471-34dea42873ff\") " pod="openshift-marketplace/community-operators-p5hgb" Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.568007 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9110846b-830e-4e8f-a471-34dea42873ff-utilities\") pod \"community-operators-p5hgb\" (UID: \"9110846b-830e-4e8f-a471-34dea42873ff\") " pod="openshift-marketplace/community-operators-p5hgb" Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.568079 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9110846b-830e-4e8f-a471-34dea42873ff-catalog-content\") pod \"community-operators-p5hgb\" (UID: \"9110846b-830e-4e8f-a471-34dea42873ff\") " pod="openshift-marketplace/community-operators-p5hgb" Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.568728 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9110846b-830e-4e8f-a471-34dea42873ff-catalog-content\") pod \"community-operators-p5hgb\" (UID: \"9110846b-830e-4e8f-a471-34dea42873ff\") " pod="openshift-marketplace/community-operators-p5hgb" Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.569033 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9110846b-830e-4e8f-a471-34dea42873ff-utilities\") pod \"community-operators-p5hgb\" (UID: \"9110846b-830e-4e8f-a471-34dea42873ff\") " pod="openshift-marketplace/community-operators-p5hgb" Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.600540 
5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szkzx\" (UniqueName: \"kubernetes.io/projected/9110846b-830e-4e8f-a471-34dea42873ff-kube-api-access-szkzx\") pod \"community-operators-p5hgb\" (UID: \"9110846b-830e-4e8f-a471-34dea42873ff\") " pod="openshift-marketplace/community-operators-p5hgb" Feb 20 07:27:27 crc kubenswrapper[5094]: I0220 07:27:27.887033 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p5hgb" Feb 20 07:27:28 crc kubenswrapper[5094]: I0220 07:27:28.191848 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p5hgb"] Feb 20 07:27:29 crc kubenswrapper[5094]: I0220 07:27:29.094965 5094 generic.go:334] "Generic (PLEG): container finished" podID="9110846b-830e-4e8f-a471-34dea42873ff" containerID="ddb97b2ad2e49c250ac96b91b9d112739dd6263d370a29d6fe43a819df35affd" exitCode=0 Feb 20 07:27:29 crc kubenswrapper[5094]: I0220 07:27:29.095431 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p5hgb" event={"ID":"9110846b-830e-4e8f-a471-34dea42873ff","Type":"ContainerDied","Data":"ddb97b2ad2e49c250ac96b91b9d112739dd6263d370a29d6fe43a819df35affd"} Feb 20 07:27:29 crc kubenswrapper[5094]: I0220 07:27:29.095478 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p5hgb" event={"ID":"9110846b-830e-4e8f-a471-34dea42873ff","Type":"ContainerStarted","Data":"3888aa249479a0fe72c5da8b03cb1d89510858b5eefdfb659b65a7a6857906df"} Feb 20 07:27:30 crc kubenswrapper[5094]: I0220 07:27:30.108513 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p5hgb" event={"ID":"9110846b-830e-4e8f-a471-34dea42873ff","Type":"ContainerStarted","Data":"01faa754532731ea409f577ed20393ac8371b02a53270a34ed7d9796c8811278"} Feb 20 07:27:31 crc kubenswrapper[5094]: I0220 07:27:31.123801 5094 
generic.go:334] "Generic (PLEG): container finished" podID="9110846b-830e-4e8f-a471-34dea42873ff" containerID="01faa754532731ea409f577ed20393ac8371b02a53270a34ed7d9796c8811278" exitCode=0 Feb 20 07:27:31 crc kubenswrapper[5094]: I0220 07:27:31.123886 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p5hgb" event={"ID":"9110846b-830e-4e8f-a471-34dea42873ff","Type":"ContainerDied","Data":"01faa754532731ea409f577ed20393ac8371b02a53270a34ed7d9796c8811278"} Feb 20 07:27:32 crc kubenswrapper[5094]: I0220 07:27:32.140169 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p5hgb" event={"ID":"9110846b-830e-4e8f-a471-34dea42873ff","Type":"ContainerStarted","Data":"4b4bb133e97c256b64947d6bad0eab28408ddc4503007b44ed04ed730a6277b3"} Feb 20 07:27:32 crc kubenswrapper[5094]: I0220 07:27:32.175042 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p5hgb" podStartSLOduration=2.735628534 podStartE2EDuration="5.17501731s" podCreationTimestamp="2026-02-20 07:27:27 +0000 UTC" firstStartedPulling="2026-02-20 07:27:29.099450285 +0000 UTC m=+2463.972077046" lastFinishedPulling="2026-02-20 07:27:31.538839091 +0000 UTC m=+2466.411465822" observedRunningTime="2026-02-20 07:27:32.168915913 +0000 UTC m=+2467.041542644" watchObservedRunningTime="2026-02-20 07:27:32.17501731 +0000 UTC m=+2467.047644031" Feb 20 07:27:35 crc kubenswrapper[5094]: I0220 07:27:35.848602 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:27:36 crc kubenswrapper[5094]: I0220 07:27:36.186854 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"d490ad7f0df91dbec2d268bb4e610d1c74d888b0923d10ec319d040c9dea3ddc"} Feb 20 07:27:37 
crc kubenswrapper[5094]: I0220 07:27:37.888234 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p5hgb" Feb 20 07:27:37 crc kubenswrapper[5094]: I0220 07:27:37.889427 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p5hgb" Feb 20 07:27:37 crc kubenswrapper[5094]: I0220 07:27:37.982609 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p5hgb" Feb 20 07:27:38 crc kubenswrapper[5094]: I0220 07:27:38.274288 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p5hgb" Feb 20 07:27:38 crc kubenswrapper[5094]: I0220 07:27:38.386253 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p5hgb"] Feb 20 07:27:40 crc kubenswrapper[5094]: I0220 07:27:40.228382 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p5hgb" podUID="9110846b-830e-4e8f-a471-34dea42873ff" containerName="registry-server" containerID="cri-o://4b4bb133e97c256b64947d6bad0eab28408ddc4503007b44ed04ed730a6277b3" gracePeriod=2 Feb 20 07:27:40 crc kubenswrapper[5094]: I0220 07:27:40.836278 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p5hgb" Feb 20 07:27:40 crc kubenswrapper[5094]: I0220 07:27:40.930422 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9110846b-830e-4e8f-a471-34dea42873ff-utilities\") pod \"9110846b-830e-4e8f-a471-34dea42873ff\" (UID: \"9110846b-830e-4e8f-a471-34dea42873ff\") " Feb 20 07:27:40 crc kubenswrapper[5094]: I0220 07:27:40.930532 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szkzx\" (UniqueName: \"kubernetes.io/projected/9110846b-830e-4e8f-a471-34dea42873ff-kube-api-access-szkzx\") pod \"9110846b-830e-4e8f-a471-34dea42873ff\" (UID: \"9110846b-830e-4e8f-a471-34dea42873ff\") " Feb 20 07:27:40 crc kubenswrapper[5094]: I0220 07:27:40.930763 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9110846b-830e-4e8f-a471-34dea42873ff-catalog-content\") pod \"9110846b-830e-4e8f-a471-34dea42873ff\" (UID: \"9110846b-830e-4e8f-a471-34dea42873ff\") " Feb 20 07:27:40 crc kubenswrapper[5094]: I0220 07:27:40.932449 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9110846b-830e-4e8f-a471-34dea42873ff-utilities" (OuterVolumeSpecName: "utilities") pod "9110846b-830e-4e8f-a471-34dea42873ff" (UID: "9110846b-830e-4e8f-a471-34dea42873ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:27:40 crc kubenswrapper[5094]: I0220 07:27:40.940954 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9110846b-830e-4e8f-a471-34dea42873ff-kube-api-access-szkzx" (OuterVolumeSpecName: "kube-api-access-szkzx") pod "9110846b-830e-4e8f-a471-34dea42873ff" (UID: "9110846b-830e-4e8f-a471-34dea42873ff"). InnerVolumeSpecName "kube-api-access-szkzx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.017899 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9110846b-830e-4e8f-a471-34dea42873ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9110846b-830e-4e8f-a471-34dea42873ff" (UID: "9110846b-830e-4e8f-a471-34dea42873ff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.033908 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9110846b-830e-4e8f-a471-34dea42873ff-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.033981 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szkzx\" (UniqueName: \"kubernetes.io/projected/9110846b-830e-4e8f-a471-34dea42873ff-kube-api-access-szkzx\") on node \"crc\" DevicePath \"\"" Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.034002 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9110846b-830e-4e8f-a471-34dea42873ff-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.241796 5094 generic.go:334] "Generic (PLEG): container finished" podID="9110846b-830e-4e8f-a471-34dea42873ff" containerID="4b4bb133e97c256b64947d6bad0eab28408ddc4503007b44ed04ed730a6277b3" exitCode=0 Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.241874 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p5hgb" event={"ID":"9110846b-830e-4e8f-a471-34dea42873ff","Type":"ContainerDied","Data":"4b4bb133e97c256b64947d6bad0eab28408ddc4503007b44ed04ed730a6277b3"} Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.242427 5094 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-p5hgb" event={"ID":"9110846b-830e-4e8f-a471-34dea42873ff","Type":"ContainerDied","Data":"3888aa249479a0fe72c5da8b03cb1d89510858b5eefdfb659b65a7a6857906df"} Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.242473 5094 scope.go:117] "RemoveContainer" containerID="4b4bb133e97c256b64947d6bad0eab28408ddc4503007b44ed04ed730a6277b3" Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.242055 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p5hgb" Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.296208 5094 scope.go:117] "RemoveContainer" containerID="01faa754532731ea409f577ed20393ac8371b02a53270a34ed7d9796c8811278" Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.310772 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p5hgb"] Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.314924 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p5hgb"] Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.331183 5094 scope.go:117] "RemoveContainer" containerID="ddb97b2ad2e49c250ac96b91b9d112739dd6263d370a29d6fe43a819df35affd" Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.369228 5094 scope.go:117] "RemoveContainer" containerID="4b4bb133e97c256b64947d6bad0eab28408ddc4503007b44ed04ed730a6277b3" Feb 20 07:27:41 crc kubenswrapper[5094]: E0220 07:27:41.369775 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b4bb133e97c256b64947d6bad0eab28408ddc4503007b44ed04ed730a6277b3\": container with ID starting with 4b4bb133e97c256b64947d6bad0eab28408ddc4503007b44ed04ed730a6277b3 not found: ID does not exist" containerID="4b4bb133e97c256b64947d6bad0eab28408ddc4503007b44ed04ed730a6277b3" Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 
07:27:41.369835 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b4bb133e97c256b64947d6bad0eab28408ddc4503007b44ed04ed730a6277b3"} err="failed to get container status \"4b4bb133e97c256b64947d6bad0eab28408ddc4503007b44ed04ed730a6277b3\": rpc error: code = NotFound desc = could not find container \"4b4bb133e97c256b64947d6bad0eab28408ddc4503007b44ed04ed730a6277b3\": container with ID starting with 4b4bb133e97c256b64947d6bad0eab28408ddc4503007b44ed04ed730a6277b3 not found: ID does not exist" Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.369876 5094 scope.go:117] "RemoveContainer" containerID="01faa754532731ea409f577ed20393ac8371b02a53270a34ed7d9796c8811278" Feb 20 07:27:41 crc kubenswrapper[5094]: E0220 07:27:41.370546 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01faa754532731ea409f577ed20393ac8371b02a53270a34ed7d9796c8811278\": container with ID starting with 01faa754532731ea409f577ed20393ac8371b02a53270a34ed7d9796c8811278 not found: ID does not exist" containerID="01faa754532731ea409f577ed20393ac8371b02a53270a34ed7d9796c8811278" Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.370609 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01faa754532731ea409f577ed20393ac8371b02a53270a34ed7d9796c8811278"} err="failed to get container status \"01faa754532731ea409f577ed20393ac8371b02a53270a34ed7d9796c8811278\": rpc error: code = NotFound desc = could not find container \"01faa754532731ea409f577ed20393ac8371b02a53270a34ed7d9796c8811278\": container with ID starting with 01faa754532731ea409f577ed20393ac8371b02a53270a34ed7d9796c8811278 not found: ID does not exist" Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.370634 5094 scope.go:117] "RemoveContainer" containerID="ddb97b2ad2e49c250ac96b91b9d112739dd6263d370a29d6fe43a819df35affd" Feb 20 07:27:41 crc 
kubenswrapper[5094]: E0220 07:27:41.371574 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddb97b2ad2e49c250ac96b91b9d112739dd6263d370a29d6fe43a819df35affd\": container with ID starting with ddb97b2ad2e49c250ac96b91b9d112739dd6263d370a29d6fe43a819df35affd not found: ID does not exist" containerID="ddb97b2ad2e49c250ac96b91b9d112739dd6263d370a29d6fe43a819df35affd" Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.371615 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddb97b2ad2e49c250ac96b91b9d112739dd6263d370a29d6fe43a819df35affd"} err="failed to get container status \"ddb97b2ad2e49c250ac96b91b9d112739dd6263d370a29d6fe43a819df35affd\": rpc error: code = NotFound desc = could not find container \"ddb97b2ad2e49c250ac96b91b9d112739dd6263d370a29d6fe43a819df35affd\": container with ID starting with ddb97b2ad2e49c250ac96b91b9d112739dd6263d370a29d6fe43a819df35affd not found: ID does not exist" Feb 20 07:27:41 crc kubenswrapper[5094]: I0220 07:27:41.857818 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9110846b-830e-4e8f-a471-34dea42873ff" path="/var/lib/kubelet/pods/9110846b-830e-4e8f-a471-34dea42873ff/volumes" Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.175081 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw"] Feb 20 07:30:00 crc kubenswrapper[5094]: E0220 07:30:00.176456 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9110846b-830e-4e8f-a471-34dea42873ff" containerName="registry-server" Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.176480 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="9110846b-830e-4e8f-a471-34dea42873ff" containerName="registry-server" Feb 20 07:30:00 crc kubenswrapper[5094]: E0220 07:30:00.176523 5094 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="9110846b-830e-4e8f-a471-34dea42873ff" containerName="extract-content" Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.176538 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="9110846b-830e-4e8f-a471-34dea42873ff" containerName="extract-content" Feb 20 07:30:00 crc kubenswrapper[5094]: E0220 07:30:00.176565 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9110846b-830e-4e8f-a471-34dea42873ff" containerName="extract-utilities" Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.176578 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="9110846b-830e-4e8f-a471-34dea42873ff" containerName="extract-utilities" Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.177313 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="9110846b-830e-4e8f-a471-34dea42873ff" containerName="registry-server" Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.178914 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw" Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.183803 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.186275 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.211355 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw"] Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.212816 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-secret-volume\") pod 
\"collect-profiles-29526210-hwdfw\" (UID: \"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw" Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.212909 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-config-volume\") pod \"collect-profiles-29526210-hwdfw\" (UID: \"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw" Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.213153 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg8sw\" (UniqueName: \"kubernetes.io/projected/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-kube-api-access-mg8sw\") pod \"collect-profiles-29526210-hwdfw\" (UID: \"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw" Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.315241 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-secret-volume\") pod \"collect-profiles-29526210-hwdfw\" (UID: \"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw" Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.315332 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-config-volume\") pod \"collect-profiles-29526210-hwdfw\" (UID: \"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw" Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.315461 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mg8sw\" (UniqueName: \"kubernetes.io/projected/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-kube-api-access-mg8sw\") pod \"collect-profiles-29526210-hwdfw\" (UID: \"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw" Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.317282 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-config-volume\") pod \"collect-profiles-29526210-hwdfw\" (UID: \"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw" Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.331091 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-secret-volume\") pod \"collect-profiles-29526210-hwdfw\" (UID: \"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw" Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.339967 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg8sw\" (UniqueName: \"kubernetes.io/projected/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-kube-api-access-mg8sw\") pod \"collect-profiles-29526210-hwdfw\" (UID: \"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw" Feb 20 07:30:00 crc kubenswrapper[5094]: I0220 07:30:00.525623 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw" Feb 20 07:30:01 crc kubenswrapper[5094]: I0220 07:30:01.045050 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw"] Feb 20 07:30:01 crc kubenswrapper[5094]: I0220 07:30:01.748808 5094 generic.go:334] "Generic (PLEG): container finished" podID="e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d" containerID="07ea8e807e5436859467c750ef51269844eba966788ab09e687b71868fdd8b31" exitCode=0 Feb 20 07:30:01 crc kubenswrapper[5094]: I0220 07:30:01.748934 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw" event={"ID":"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d","Type":"ContainerDied","Data":"07ea8e807e5436859467c750ef51269844eba966788ab09e687b71868fdd8b31"} Feb 20 07:30:01 crc kubenswrapper[5094]: I0220 07:30:01.749203 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw" event={"ID":"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d","Type":"ContainerStarted","Data":"75598ee0137b474f71854526287c35843ae94a087c015ae47398f1b6cd665787"} Feb 20 07:30:03 crc kubenswrapper[5094]: I0220 07:30:03.149319 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw" Feb 20 07:30:03 crc kubenswrapper[5094]: I0220 07:30:03.281836 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-config-volume\") pod \"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d\" (UID: \"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d\") " Feb 20 07:30:03 crc kubenswrapper[5094]: I0220 07:30:03.281928 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-secret-volume\") pod \"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d\" (UID: \"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d\") " Feb 20 07:30:03 crc kubenswrapper[5094]: I0220 07:30:03.282087 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg8sw\" (UniqueName: \"kubernetes.io/projected/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-kube-api-access-mg8sw\") pod \"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d\" (UID: \"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d\") " Feb 20 07:30:03 crc kubenswrapper[5094]: I0220 07:30:03.283529 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-config-volume" (OuterVolumeSpecName: "config-volume") pod "e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d" (UID: "e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:30:03 crc kubenswrapper[5094]: I0220 07:30:03.290496 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-kube-api-access-mg8sw" (OuterVolumeSpecName: "kube-api-access-mg8sw") pod "e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d" (UID: "e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d"). 
InnerVolumeSpecName "kube-api-access-mg8sw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:30:03 crc kubenswrapper[5094]: I0220 07:30:03.291972 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d" (UID: "e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:30:03 crc kubenswrapper[5094]: I0220 07:30:03.385041 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg8sw\" (UniqueName: \"kubernetes.io/projected/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-kube-api-access-mg8sw\") on node \"crc\" DevicePath \"\"" Feb 20 07:30:03 crc kubenswrapper[5094]: I0220 07:30:03.385124 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 07:30:03 crc kubenswrapper[5094]: I0220 07:30:03.385146 5094 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 07:30:03 crc kubenswrapper[5094]: I0220 07:30:03.780104 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw" event={"ID":"e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d","Type":"ContainerDied","Data":"75598ee0137b474f71854526287c35843ae94a087c015ae47398f1b6cd665787"} Feb 20 07:30:03 crc kubenswrapper[5094]: I0220 07:30:03.780183 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75598ee0137b474f71854526287c35843ae94a087c015ae47398f1b6cd665787" Feb 20 07:30:03 crc kubenswrapper[5094]: I0220 07:30:03.780259 5094 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw" Feb 20 07:30:04 crc kubenswrapper[5094]: I0220 07:30:04.106655 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:30:04 crc kubenswrapper[5094]: I0220 07:30:04.107898 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:30:04 crc kubenswrapper[5094]: I0220 07:30:04.248005 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4"] Feb 20 07:30:04 crc kubenswrapper[5094]: I0220 07:30:04.256931 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526165-tdww4"] Feb 20 07:30:05 crc kubenswrapper[5094]: I0220 07:30:05.859080 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="806ba791-714c-4d13-b595-d4f6ccf06aea" path="/var/lib/kubelet/pods/806ba791-714c-4d13-b595-d4f6ccf06aea/volumes" Feb 20 07:30:23 crc kubenswrapper[5094]: I0220 07:30:23.479921 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zb9lv"] Feb 20 07:30:23 crc kubenswrapper[5094]: E0220 07:30:23.481401 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d" containerName="collect-profiles" Feb 20 07:30:23 crc kubenswrapper[5094]: I0220 07:30:23.481435 5094 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d" containerName="collect-profiles" Feb 20 07:30:23 crc kubenswrapper[5094]: I0220 07:30:23.481763 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d" containerName="collect-profiles" Feb 20 07:30:23 crc kubenswrapper[5094]: I0220 07:30:23.487334 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:23 crc kubenswrapper[5094]: I0220 07:30:23.505838 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zb9lv"] Feb 20 07:30:23 crc kubenswrapper[5094]: I0220 07:30:23.593464 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3427dbbc-6f50-4576-9871-52bd5a127484-catalog-content\") pod \"redhat-marketplace-zb9lv\" (UID: \"3427dbbc-6f50-4576-9871-52bd5a127484\") " pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:23 crc kubenswrapper[5094]: I0220 07:30:23.593544 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3427dbbc-6f50-4576-9871-52bd5a127484-utilities\") pod \"redhat-marketplace-zb9lv\" (UID: \"3427dbbc-6f50-4576-9871-52bd5a127484\") " pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:23 crc kubenswrapper[5094]: I0220 07:30:23.593571 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxl44\" (UniqueName: \"kubernetes.io/projected/3427dbbc-6f50-4576-9871-52bd5a127484-kube-api-access-qxl44\") pod \"redhat-marketplace-zb9lv\" (UID: \"3427dbbc-6f50-4576-9871-52bd5a127484\") " pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:23 crc kubenswrapper[5094]: I0220 07:30:23.696461 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3427dbbc-6f50-4576-9871-52bd5a127484-utilities\") pod \"redhat-marketplace-zb9lv\" (UID: \"3427dbbc-6f50-4576-9871-52bd5a127484\") " pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:23 crc kubenswrapper[5094]: I0220 07:30:23.696557 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxl44\" (UniqueName: \"kubernetes.io/projected/3427dbbc-6f50-4576-9871-52bd5a127484-kube-api-access-qxl44\") pod \"redhat-marketplace-zb9lv\" (UID: \"3427dbbc-6f50-4576-9871-52bd5a127484\") " pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:23 crc kubenswrapper[5094]: I0220 07:30:23.696776 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3427dbbc-6f50-4576-9871-52bd5a127484-catalog-content\") pod \"redhat-marketplace-zb9lv\" (UID: \"3427dbbc-6f50-4576-9871-52bd5a127484\") " pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:23 crc kubenswrapper[5094]: I0220 07:30:23.698028 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3427dbbc-6f50-4576-9871-52bd5a127484-utilities\") pod \"redhat-marketplace-zb9lv\" (UID: \"3427dbbc-6f50-4576-9871-52bd5a127484\") " pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:23 crc kubenswrapper[5094]: I0220 07:30:23.698047 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3427dbbc-6f50-4576-9871-52bd5a127484-catalog-content\") pod \"redhat-marketplace-zb9lv\" (UID: \"3427dbbc-6f50-4576-9871-52bd5a127484\") " pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:23 crc kubenswrapper[5094]: I0220 07:30:23.724917 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qxl44\" (UniqueName: \"kubernetes.io/projected/3427dbbc-6f50-4576-9871-52bd5a127484-kube-api-access-qxl44\") pod \"redhat-marketplace-zb9lv\" (UID: \"3427dbbc-6f50-4576-9871-52bd5a127484\") " pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:23 crc kubenswrapper[5094]: I0220 07:30:23.810328 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:24 crc kubenswrapper[5094]: I0220 07:30:24.164863 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zb9lv"] Feb 20 07:30:25 crc kubenswrapper[5094]: I0220 07:30:25.017898 5094 generic.go:334] "Generic (PLEG): container finished" podID="3427dbbc-6f50-4576-9871-52bd5a127484" containerID="856db63d99e72886311798f9bcc337791adb04e8a897975b65be61f9118b9855" exitCode=0 Feb 20 07:30:25 crc kubenswrapper[5094]: I0220 07:30:25.018273 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zb9lv" event={"ID":"3427dbbc-6f50-4576-9871-52bd5a127484","Type":"ContainerDied","Data":"856db63d99e72886311798f9bcc337791adb04e8a897975b65be61f9118b9855"} Feb 20 07:30:25 crc kubenswrapper[5094]: I0220 07:30:25.018408 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zb9lv" event={"ID":"3427dbbc-6f50-4576-9871-52bd5a127484","Type":"ContainerStarted","Data":"64477b0fb18d73d2b6d5e2bce6c185ea99cd22a3e25012ecc5859d23259c3a85"} Feb 20 07:30:26 crc kubenswrapper[5094]: I0220 07:30:26.026439 5094 generic.go:334] "Generic (PLEG): container finished" podID="3427dbbc-6f50-4576-9871-52bd5a127484" containerID="6db3a6fda00d983f944b7c17a0874460f1560455d19e5117c4d0307427156295" exitCode=0 Feb 20 07:30:26 crc kubenswrapper[5094]: I0220 07:30:26.026588 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zb9lv" 
event={"ID":"3427dbbc-6f50-4576-9871-52bd5a127484","Type":"ContainerDied","Data":"6db3a6fda00d983f944b7c17a0874460f1560455d19e5117c4d0307427156295"} Feb 20 07:30:27 crc kubenswrapper[5094]: I0220 07:30:27.041215 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zb9lv" event={"ID":"3427dbbc-6f50-4576-9871-52bd5a127484","Type":"ContainerStarted","Data":"5e1745b21ed11314dd9bd34e5fd2f91c250a53111792f9a404cce5cba1f01f42"} Feb 20 07:30:27 crc kubenswrapper[5094]: I0220 07:30:27.071910 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zb9lv" podStartSLOduration=2.532102259 podStartE2EDuration="4.071883548s" podCreationTimestamp="2026-02-20 07:30:23 +0000 UTC" firstStartedPulling="2026-02-20 07:30:25.020881948 +0000 UTC m=+2639.893508699" lastFinishedPulling="2026-02-20 07:30:26.560663267 +0000 UTC m=+2641.433289988" observedRunningTime="2026-02-20 07:30:27.069098681 +0000 UTC m=+2641.941725392" watchObservedRunningTime="2026-02-20 07:30:27.071883548 +0000 UTC m=+2641.944510269" Feb 20 07:30:33 crc kubenswrapper[5094]: I0220 07:30:33.811263 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:33 crc kubenswrapper[5094]: I0220 07:30:33.812231 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:33 crc kubenswrapper[5094]: I0220 07:30:33.882652 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:34 crc kubenswrapper[5094]: I0220 07:30:34.107221 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Feb 20 07:30:34 crc kubenswrapper[5094]: I0220 07:30:34.107548 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:30:34 crc kubenswrapper[5094]: I0220 07:30:34.146502 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:34 crc kubenswrapper[5094]: I0220 07:30:34.216096 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zb9lv"] Feb 20 07:30:34 crc kubenswrapper[5094]: I0220 07:30:34.842275 5094 scope.go:117] "RemoveContainer" containerID="97bfbe799eec96272d8e9a6b7afb335847d35846686c6b7385ed8fc78aa4aec5" Feb 20 07:30:36 crc kubenswrapper[5094]: I0220 07:30:36.111045 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zb9lv" podUID="3427dbbc-6f50-4576-9871-52bd5a127484" containerName="registry-server" containerID="cri-o://5e1745b21ed11314dd9bd34e5fd2f91c250a53111792f9a404cce5cba1f01f42" gracePeriod=2 Feb 20 07:30:36 crc kubenswrapper[5094]: I0220 07:30:36.588617 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:36 crc kubenswrapper[5094]: I0220 07:30:36.724734 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3427dbbc-6f50-4576-9871-52bd5a127484-catalog-content\") pod \"3427dbbc-6f50-4576-9871-52bd5a127484\" (UID: \"3427dbbc-6f50-4576-9871-52bd5a127484\") " Feb 20 07:30:36 crc kubenswrapper[5094]: I0220 07:30:36.724885 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxl44\" (UniqueName: \"kubernetes.io/projected/3427dbbc-6f50-4576-9871-52bd5a127484-kube-api-access-qxl44\") pod \"3427dbbc-6f50-4576-9871-52bd5a127484\" (UID: \"3427dbbc-6f50-4576-9871-52bd5a127484\") " Feb 20 07:30:36 crc kubenswrapper[5094]: I0220 07:30:36.724927 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3427dbbc-6f50-4576-9871-52bd5a127484-utilities\") pod \"3427dbbc-6f50-4576-9871-52bd5a127484\" (UID: \"3427dbbc-6f50-4576-9871-52bd5a127484\") " Feb 20 07:30:36 crc kubenswrapper[5094]: I0220 07:30:36.725859 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3427dbbc-6f50-4576-9871-52bd5a127484-utilities" (OuterVolumeSpecName: "utilities") pod "3427dbbc-6f50-4576-9871-52bd5a127484" (UID: "3427dbbc-6f50-4576-9871-52bd5a127484"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:30:36 crc kubenswrapper[5094]: I0220 07:30:36.732436 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3427dbbc-6f50-4576-9871-52bd5a127484-kube-api-access-qxl44" (OuterVolumeSpecName: "kube-api-access-qxl44") pod "3427dbbc-6f50-4576-9871-52bd5a127484" (UID: "3427dbbc-6f50-4576-9871-52bd5a127484"). InnerVolumeSpecName "kube-api-access-qxl44". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:30:36 crc kubenswrapper[5094]: I0220 07:30:36.748189 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3427dbbc-6f50-4576-9871-52bd5a127484-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3427dbbc-6f50-4576-9871-52bd5a127484" (UID: "3427dbbc-6f50-4576-9871-52bd5a127484"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:30:36 crc kubenswrapper[5094]: I0220 07:30:36.826484 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxl44\" (UniqueName: \"kubernetes.io/projected/3427dbbc-6f50-4576-9871-52bd5a127484-kube-api-access-qxl44\") on node \"crc\" DevicePath \"\"" Feb 20 07:30:36 crc kubenswrapper[5094]: I0220 07:30:36.827664 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3427dbbc-6f50-4576-9871-52bd5a127484-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 07:30:36 crc kubenswrapper[5094]: I0220 07:30:36.827688 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3427dbbc-6f50-4576-9871-52bd5a127484-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.121275 5094 generic.go:334] "Generic (PLEG): container finished" podID="3427dbbc-6f50-4576-9871-52bd5a127484" containerID="5e1745b21ed11314dd9bd34e5fd2f91c250a53111792f9a404cce5cba1f01f42" exitCode=0 Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.121351 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zb9lv" event={"ID":"3427dbbc-6f50-4576-9871-52bd5a127484","Type":"ContainerDied","Data":"5e1745b21ed11314dd9bd34e5fd2f91c250a53111792f9a404cce5cba1f01f42"} Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.121384 5094 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zb9lv" Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.121889 5094 scope.go:117] "RemoveContainer" containerID="5e1745b21ed11314dd9bd34e5fd2f91c250a53111792f9a404cce5cba1f01f42" Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.135867 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zb9lv" event={"ID":"3427dbbc-6f50-4576-9871-52bd5a127484","Type":"ContainerDied","Data":"64477b0fb18d73d2b6d5e2bce6c185ea99cd22a3e25012ecc5859d23259c3a85"} Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.166387 5094 scope.go:117] "RemoveContainer" containerID="6db3a6fda00d983f944b7c17a0874460f1560455d19e5117c4d0307427156295" Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.168866 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zb9lv"] Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.178144 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zb9lv"] Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.192226 5094 scope.go:117] "RemoveContainer" containerID="856db63d99e72886311798f9bcc337791adb04e8a897975b65be61f9118b9855" Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.221467 5094 scope.go:117] "RemoveContainer" containerID="5e1745b21ed11314dd9bd34e5fd2f91c250a53111792f9a404cce5cba1f01f42" Feb 20 07:30:37 crc kubenswrapper[5094]: E0220 07:30:37.221993 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e1745b21ed11314dd9bd34e5fd2f91c250a53111792f9a404cce5cba1f01f42\": container with ID starting with 5e1745b21ed11314dd9bd34e5fd2f91c250a53111792f9a404cce5cba1f01f42 not found: ID does not exist" containerID="5e1745b21ed11314dd9bd34e5fd2f91c250a53111792f9a404cce5cba1f01f42" Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.222028 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e1745b21ed11314dd9bd34e5fd2f91c250a53111792f9a404cce5cba1f01f42"} err="failed to get container status \"5e1745b21ed11314dd9bd34e5fd2f91c250a53111792f9a404cce5cba1f01f42\": rpc error: code = NotFound desc = could not find container \"5e1745b21ed11314dd9bd34e5fd2f91c250a53111792f9a404cce5cba1f01f42\": container with ID starting with 5e1745b21ed11314dd9bd34e5fd2f91c250a53111792f9a404cce5cba1f01f42 not found: ID does not exist" Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.222052 5094 scope.go:117] "RemoveContainer" containerID="6db3a6fda00d983f944b7c17a0874460f1560455d19e5117c4d0307427156295" Feb 20 07:30:37 crc kubenswrapper[5094]: E0220 07:30:37.222406 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6db3a6fda00d983f944b7c17a0874460f1560455d19e5117c4d0307427156295\": container with ID starting with 6db3a6fda00d983f944b7c17a0874460f1560455d19e5117c4d0307427156295 not found: ID does not exist" containerID="6db3a6fda00d983f944b7c17a0874460f1560455d19e5117c4d0307427156295" Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.222479 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6db3a6fda00d983f944b7c17a0874460f1560455d19e5117c4d0307427156295"} err="failed to get container status \"6db3a6fda00d983f944b7c17a0874460f1560455d19e5117c4d0307427156295\": rpc error: code = NotFound desc = could not find container \"6db3a6fda00d983f944b7c17a0874460f1560455d19e5117c4d0307427156295\": container with ID starting with 6db3a6fda00d983f944b7c17a0874460f1560455d19e5117c4d0307427156295 not found: ID does not exist" Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.222520 5094 scope.go:117] "RemoveContainer" containerID="856db63d99e72886311798f9bcc337791adb04e8a897975b65be61f9118b9855" Feb 20 07:30:37 crc kubenswrapper[5094]: E0220 
07:30:37.222966 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"856db63d99e72886311798f9bcc337791adb04e8a897975b65be61f9118b9855\": container with ID starting with 856db63d99e72886311798f9bcc337791adb04e8a897975b65be61f9118b9855 not found: ID does not exist" containerID="856db63d99e72886311798f9bcc337791adb04e8a897975b65be61f9118b9855" Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.223023 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"856db63d99e72886311798f9bcc337791adb04e8a897975b65be61f9118b9855"} err="failed to get container status \"856db63d99e72886311798f9bcc337791adb04e8a897975b65be61f9118b9855\": rpc error: code = NotFound desc = could not find container \"856db63d99e72886311798f9bcc337791adb04e8a897975b65be61f9118b9855\": container with ID starting with 856db63d99e72886311798f9bcc337791adb04e8a897975b65be61f9118b9855 not found: ID does not exist" Feb 20 07:30:37 crc kubenswrapper[5094]: I0220 07:30:37.855910 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3427dbbc-6f50-4576-9871-52bd5a127484" path="/var/lib/kubelet/pods/3427dbbc-6f50-4576-9871-52bd5a127484/volumes" Feb 20 07:31:04 crc kubenswrapper[5094]: I0220 07:31:04.107059 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:31:04 crc kubenswrapper[5094]: I0220 07:31:04.107871 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 20 07:31:04 crc kubenswrapper[5094]: I0220 07:31:04.108055 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 07:31:04 crc kubenswrapper[5094]: I0220 07:31:04.108940 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d490ad7f0df91dbec2d268bb4e610d1c74d888b0923d10ec319d040c9dea3ddc"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 07:31:04 crc kubenswrapper[5094]: I0220 07:31:04.109006 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://d490ad7f0df91dbec2d268bb4e610d1c74d888b0923d10ec319d040c9dea3ddc" gracePeriod=600 Feb 20 07:31:04 crc kubenswrapper[5094]: I0220 07:31:04.405923 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="d490ad7f0df91dbec2d268bb4e610d1c74d888b0923d10ec319d040c9dea3ddc" exitCode=0 Feb 20 07:31:04 crc kubenswrapper[5094]: I0220 07:31:04.405978 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"d490ad7f0df91dbec2d268bb4e610d1c74d888b0923d10ec319d040c9dea3ddc"} Feb 20 07:31:04 crc kubenswrapper[5094]: I0220 07:31:04.406027 5094 scope.go:117] "RemoveContainer" containerID="c5220c2e2af0bf5a13d4992e89ffd079eea798484764eba37ab7b8428d552942" Feb 20 07:31:05 crc kubenswrapper[5094]: I0220 07:31:05.429413 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c"} Feb 20 07:32:01 crc kubenswrapper[5094]: I0220 07:32:01.884006 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rjxrj"] Feb 20 07:32:01 crc kubenswrapper[5094]: E0220 07:32:01.885246 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3427dbbc-6f50-4576-9871-52bd5a127484" containerName="extract-content" Feb 20 07:32:01 crc kubenswrapper[5094]: I0220 07:32:01.885263 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3427dbbc-6f50-4576-9871-52bd5a127484" containerName="extract-content" Feb 20 07:32:01 crc kubenswrapper[5094]: E0220 07:32:01.885295 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3427dbbc-6f50-4576-9871-52bd5a127484" containerName="registry-server" Feb 20 07:32:01 crc kubenswrapper[5094]: I0220 07:32:01.885302 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3427dbbc-6f50-4576-9871-52bd5a127484" containerName="registry-server" Feb 20 07:32:01 crc kubenswrapper[5094]: E0220 07:32:01.885335 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3427dbbc-6f50-4576-9871-52bd5a127484" containerName="extract-utilities" Feb 20 07:32:01 crc kubenswrapper[5094]: I0220 07:32:01.885343 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3427dbbc-6f50-4576-9871-52bd5a127484" containerName="extract-utilities" Feb 20 07:32:01 crc kubenswrapper[5094]: I0220 07:32:01.885538 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="3427dbbc-6f50-4576-9871-52bd5a127484" containerName="registry-server" Feb 20 07:32:01 crc kubenswrapper[5094]: I0220 07:32:01.886632 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:01 crc kubenswrapper[5094]: I0220 07:32:01.907961 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rjxrj"] Feb 20 07:32:01 crc kubenswrapper[5094]: I0220 07:32:01.941636 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d651298e-32ea-456a-ac1e-45aae2ab365f-catalog-content\") pod \"certified-operators-rjxrj\" (UID: \"d651298e-32ea-456a-ac1e-45aae2ab365f\") " pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:01 crc kubenswrapper[5094]: I0220 07:32:01.941773 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d651298e-32ea-456a-ac1e-45aae2ab365f-utilities\") pod \"certified-operators-rjxrj\" (UID: \"d651298e-32ea-456a-ac1e-45aae2ab365f\") " pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:01 crc kubenswrapper[5094]: I0220 07:32:01.941907 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktbrp\" (UniqueName: \"kubernetes.io/projected/d651298e-32ea-456a-ac1e-45aae2ab365f-kube-api-access-ktbrp\") pod \"certified-operators-rjxrj\" (UID: \"d651298e-32ea-456a-ac1e-45aae2ab365f\") " pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:02 crc kubenswrapper[5094]: I0220 07:32:02.043336 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d651298e-32ea-456a-ac1e-45aae2ab365f-catalog-content\") pod \"certified-operators-rjxrj\" (UID: \"d651298e-32ea-456a-ac1e-45aae2ab365f\") " pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:02 crc kubenswrapper[5094]: I0220 07:32:02.043830 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d651298e-32ea-456a-ac1e-45aae2ab365f-utilities\") pod \"certified-operators-rjxrj\" (UID: \"d651298e-32ea-456a-ac1e-45aae2ab365f\") " pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:02 crc kubenswrapper[5094]: I0220 07:32:02.043892 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktbrp\" (UniqueName: \"kubernetes.io/projected/d651298e-32ea-456a-ac1e-45aae2ab365f-kube-api-access-ktbrp\") pod \"certified-operators-rjxrj\" (UID: \"d651298e-32ea-456a-ac1e-45aae2ab365f\") " pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:02 crc kubenswrapper[5094]: I0220 07:32:02.044678 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d651298e-32ea-456a-ac1e-45aae2ab365f-catalog-content\") pod \"certified-operators-rjxrj\" (UID: \"d651298e-32ea-456a-ac1e-45aae2ab365f\") " pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:02 crc kubenswrapper[5094]: I0220 07:32:02.044929 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d651298e-32ea-456a-ac1e-45aae2ab365f-utilities\") pod \"certified-operators-rjxrj\" (UID: \"d651298e-32ea-456a-ac1e-45aae2ab365f\") " pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:02 crc kubenswrapper[5094]: I0220 07:32:02.070081 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktbrp\" (UniqueName: \"kubernetes.io/projected/d651298e-32ea-456a-ac1e-45aae2ab365f-kube-api-access-ktbrp\") pod \"certified-operators-rjxrj\" (UID: \"d651298e-32ea-456a-ac1e-45aae2ab365f\") " pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:02 crc kubenswrapper[5094]: I0220 07:32:02.212200 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:02 crc kubenswrapper[5094]: I0220 07:32:02.751873 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rjxrj"] Feb 20 07:32:02 crc kubenswrapper[5094]: I0220 07:32:02.997233 5094 generic.go:334] "Generic (PLEG): container finished" podID="d651298e-32ea-456a-ac1e-45aae2ab365f" containerID="5f6720f2f59675a53d1c3e15548c768221d2db6314a2271d1f4b8ffd512c3a0e" exitCode=0 Feb 20 07:32:02 crc kubenswrapper[5094]: I0220 07:32:02.997301 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjxrj" event={"ID":"d651298e-32ea-456a-ac1e-45aae2ab365f","Type":"ContainerDied","Data":"5f6720f2f59675a53d1c3e15548c768221d2db6314a2271d1f4b8ffd512c3a0e"} Feb 20 07:32:02 crc kubenswrapper[5094]: I0220 07:32:02.997346 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjxrj" event={"ID":"d651298e-32ea-456a-ac1e-45aae2ab365f","Type":"ContainerStarted","Data":"0f699d5d028fd0e043ec3e175912b3dd63456e67297d084d0a7054349caab64f"} Feb 20 07:32:03 crc kubenswrapper[5094]: I0220 07:32:03.000114 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 07:32:04 crc kubenswrapper[5094]: I0220 07:32:04.010327 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjxrj" event={"ID":"d651298e-32ea-456a-ac1e-45aae2ab365f","Type":"ContainerStarted","Data":"ebd1b1d5cfefa2fe80557b7d7e5c76e3a22ae284204a9144996853e2a1cd3399"} Feb 20 07:32:05 crc kubenswrapper[5094]: I0220 07:32:05.021023 5094 generic.go:334] "Generic (PLEG): container finished" podID="d651298e-32ea-456a-ac1e-45aae2ab365f" containerID="ebd1b1d5cfefa2fe80557b7d7e5c76e3a22ae284204a9144996853e2a1cd3399" exitCode=0 Feb 20 07:32:05 crc kubenswrapper[5094]: I0220 07:32:05.021124 5094 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-rjxrj" event={"ID":"d651298e-32ea-456a-ac1e-45aae2ab365f","Type":"ContainerDied","Data":"ebd1b1d5cfefa2fe80557b7d7e5c76e3a22ae284204a9144996853e2a1cd3399"} Feb 20 07:32:06 crc kubenswrapper[5094]: I0220 07:32:06.048952 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjxrj" event={"ID":"d651298e-32ea-456a-ac1e-45aae2ab365f","Type":"ContainerStarted","Data":"d25d9975591ae947b1302119c4f8681703e4577e32c7b55a34983143f6a6b29a"} Feb 20 07:32:06 crc kubenswrapper[5094]: I0220 07:32:06.088649 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rjxrj" podStartSLOduration=2.679409058 podStartE2EDuration="5.088610051s" podCreationTimestamp="2026-02-20 07:32:01 +0000 UTC" firstStartedPulling="2026-02-20 07:32:02.999644631 +0000 UTC m=+2737.872271342" lastFinishedPulling="2026-02-20 07:32:05.408845594 +0000 UTC m=+2740.281472335" observedRunningTime="2026-02-20 07:32:06.077540606 +0000 UTC m=+2740.950167347" watchObservedRunningTime="2026-02-20 07:32:06.088610051 +0000 UTC m=+2740.961236792" Feb 20 07:32:12 crc kubenswrapper[5094]: I0220 07:32:12.213431 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:12 crc kubenswrapper[5094]: I0220 07:32:12.215742 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:12 crc kubenswrapper[5094]: I0220 07:32:12.290514 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:13 crc kubenswrapper[5094]: I0220 07:32:13.215853 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:13 crc kubenswrapper[5094]: I0220 
07:32:13.282404 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rjxrj"] Feb 20 07:32:15 crc kubenswrapper[5094]: I0220 07:32:15.146791 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rjxrj" podUID="d651298e-32ea-456a-ac1e-45aae2ab365f" containerName="registry-server" containerID="cri-o://d25d9975591ae947b1302119c4f8681703e4577e32c7b55a34983143f6a6b29a" gracePeriod=2 Feb 20 07:32:15 crc kubenswrapper[5094]: I0220 07:32:15.736401 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:15 crc kubenswrapper[5094]: I0220 07:32:15.915625 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktbrp\" (UniqueName: \"kubernetes.io/projected/d651298e-32ea-456a-ac1e-45aae2ab365f-kube-api-access-ktbrp\") pod \"d651298e-32ea-456a-ac1e-45aae2ab365f\" (UID: \"d651298e-32ea-456a-ac1e-45aae2ab365f\") " Feb 20 07:32:15 crc kubenswrapper[5094]: I0220 07:32:15.915780 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d651298e-32ea-456a-ac1e-45aae2ab365f-catalog-content\") pod \"d651298e-32ea-456a-ac1e-45aae2ab365f\" (UID: \"d651298e-32ea-456a-ac1e-45aae2ab365f\") " Feb 20 07:32:15 crc kubenswrapper[5094]: I0220 07:32:15.915942 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d651298e-32ea-456a-ac1e-45aae2ab365f-utilities\") pod \"d651298e-32ea-456a-ac1e-45aae2ab365f\" (UID: \"d651298e-32ea-456a-ac1e-45aae2ab365f\") " Feb 20 07:32:15 crc kubenswrapper[5094]: I0220 07:32:15.917249 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d651298e-32ea-456a-ac1e-45aae2ab365f-utilities" (OuterVolumeSpecName: 
"utilities") pod "d651298e-32ea-456a-ac1e-45aae2ab365f" (UID: "d651298e-32ea-456a-ac1e-45aae2ab365f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:32:15 crc kubenswrapper[5094]: I0220 07:32:15.924845 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d651298e-32ea-456a-ac1e-45aae2ab365f-kube-api-access-ktbrp" (OuterVolumeSpecName: "kube-api-access-ktbrp") pod "d651298e-32ea-456a-ac1e-45aae2ab365f" (UID: "d651298e-32ea-456a-ac1e-45aae2ab365f"). InnerVolumeSpecName "kube-api-access-ktbrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:32:15 crc kubenswrapper[5094]: I0220 07:32:15.968529 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d651298e-32ea-456a-ac1e-45aae2ab365f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d651298e-32ea-456a-ac1e-45aae2ab365f" (UID: "d651298e-32ea-456a-ac1e-45aae2ab365f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.018187 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktbrp\" (UniqueName: \"kubernetes.io/projected/d651298e-32ea-456a-ac1e-45aae2ab365f-kube-api-access-ktbrp\") on node \"crc\" DevicePath \"\"" Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.018725 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d651298e-32ea-456a-ac1e-45aae2ab365f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.018745 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d651298e-32ea-456a-ac1e-45aae2ab365f-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.165267 5094 generic.go:334] "Generic (PLEG): container finished" podID="d651298e-32ea-456a-ac1e-45aae2ab365f" containerID="d25d9975591ae947b1302119c4f8681703e4577e32c7b55a34983143f6a6b29a" exitCode=0 Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.165346 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjxrj" event={"ID":"d651298e-32ea-456a-ac1e-45aae2ab365f","Type":"ContainerDied","Data":"d25d9975591ae947b1302119c4f8681703e4577e32c7b55a34983143f6a6b29a"} Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.165398 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rjxrj" event={"ID":"d651298e-32ea-456a-ac1e-45aae2ab365f","Type":"ContainerDied","Data":"0f699d5d028fd0e043ec3e175912b3dd63456e67297d084d0a7054349caab64f"} Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.165453 5094 scope.go:117] "RemoveContainer" containerID="d25d9975591ae947b1302119c4f8681703e4577e32c7b55a34983143f6a6b29a" Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 
07:32:16.165672 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rjxrj" Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.206408 5094 scope.go:117] "RemoveContainer" containerID="ebd1b1d5cfefa2fe80557b7d7e5c76e3a22ae284204a9144996853e2a1cd3399" Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.225995 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rjxrj"] Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.241932 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rjxrj"] Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.251811 5094 scope.go:117] "RemoveContainer" containerID="5f6720f2f59675a53d1c3e15548c768221d2db6314a2271d1f4b8ffd512c3a0e" Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.291015 5094 scope.go:117] "RemoveContainer" containerID="d25d9975591ae947b1302119c4f8681703e4577e32c7b55a34983143f6a6b29a" Feb 20 07:32:16 crc kubenswrapper[5094]: E0220 07:32:16.292185 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d25d9975591ae947b1302119c4f8681703e4577e32c7b55a34983143f6a6b29a\": container with ID starting with d25d9975591ae947b1302119c4f8681703e4577e32c7b55a34983143f6a6b29a not found: ID does not exist" containerID="d25d9975591ae947b1302119c4f8681703e4577e32c7b55a34983143f6a6b29a" Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.292259 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d25d9975591ae947b1302119c4f8681703e4577e32c7b55a34983143f6a6b29a"} err="failed to get container status \"d25d9975591ae947b1302119c4f8681703e4577e32c7b55a34983143f6a6b29a\": rpc error: code = NotFound desc = could not find container \"d25d9975591ae947b1302119c4f8681703e4577e32c7b55a34983143f6a6b29a\": container with ID starting with 
d25d9975591ae947b1302119c4f8681703e4577e32c7b55a34983143f6a6b29a not found: ID does not exist" Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.292311 5094 scope.go:117] "RemoveContainer" containerID="ebd1b1d5cfefa2fe80557b7d7e5c76e3a22ae284204a9144996853e2a1cd3399" Feb 20 07:32:16 crc kubenswrapper[5094]: E0220 07:32:16.293054 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebd1b1d5cfefa2fe80557b7d7e5c76e3a22ae284204a9144996853e2a1cd3399\": container with ID starting with ebd1b1d5cfefa2fe80557b7d7e5c76e3a22ae284204a9144996853e2a1cd3399 not found: ID does not exist" containerID="ebd1b1d5cfefa2fe80557b7d7e5c76e3a22ae284204a9144996853e2a1cd3399" Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.293087 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebd1b1d5cfefa2fe80557b7d7e5c76e3a22ae284204a9144996853e2a1cd3399"} err="failed to get container status \"ebd1b1d5cfefa2fe80557b7d7e5c76e3a22ae284204a9144996853e2a1cd3399\": rpc error: code = NotFound desc = could not find container \"ebd1b1d5cfefa2fe80557b7d7e5c76e3a22ae284204a9144996853e2a1cd3399\": container with ID starting with ebd1b1d5cfefa2fe80557b7d7e5c76e3a22ae284204a9144996853e2a1cd3399 not found: ID does not exist" Feb 20 07:32:16 crc kubenswrapper[5094]: I0220 07:32:16.293108 5094 scope.go:117] "RemoveContainer" containerID="5f6720f2f59675a53d1c3e15548c768221d2db6314a2271d1f4b8ffd512c3a0e" Feb 20 07:32:16 crc kubenswrapper[5094]: E0220 07:32:16.293728 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f6720f2f59675a53d1c3e15548c768221d2db6314a2271d1f4b8ffd512c3a0e\": container with ID starting with 5f6720f2f59675a53d1c3e15548c768221d2db6314a2271d1f4b8ffd512c3a0e not found: ID does not exist" containerID="5f6720f2f59675a53d1c3e15548c768221d2db6314a2271d1f4b8ffd512c3a0e" Feb 20 07:32:16 crc 
kubenswrapper[5094]: I0220 07:32:16.293812 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f6720f2f59675a53d1c3e15548c768221d2db6314a2271d1f4b8ffd512c3a0e"} err="failed to get container status \"5f6720f2f59675a53d1c3e15548c768221d2db6314a2271d1f4b8ffd512c3a0e\": rpc error: code = NotFound desc = could not find container \"5f6720f2f59675a53d1c3e15548c768221d2db6314a2271d1f4b8ffd512c3a0e\": container with ID starting with 5f6720f2f59675a53d1c3e15548c768221d2db6314a2271d1f4b8ffd512c3a0e not found: ID does not exist" Feb 20 07:32:17 crc kubenswrapper[5094]: I0220 07:32:17.853348 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d651298e-32ea-456a-ac1e-45aae2ab365f" path="/var/lib/kubelet/pods/d651298e-32ea-456a-ac1e-45aae2ab365f/volumes" Feb 20 07:33:04 crc kubenswrapper[5094]: I0220 07:33:04.107274 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:33:04 crc kubenswrapper[5094]: I0220 07:33:04.108363 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:33:34 crc kubenswrapper[5094]: I0220 07:33:34.107074 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:33:34 crc kubenswrapper[5094]: I0220 07:33:34.108037 5094 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:34:04 crc kubenswrapper[5094]: I0220 07:34:04.107025 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:34:04 crc kubenswrapper[5094]: I0220 07:34:04.108105 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:34:04 crc kubenswrapper[5094]: I0220 07:34:04.108191 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 07:34:04 crc kubenswrapper[5094]: I0220 07:34:04.109508 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 07:34:04 crc kubenswrapper[5094]: I0220 07:34:04.109620 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" 
containerName="machine-config-daemon" containerID="cri-o://66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" gracePeriod=600 Feb 20 07:34:04 crc kubenswrapper[5094]: E0220 07:34:04.246214 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:34:04 crc kubenswrapper[5094]: I0220 07:34:04.299389 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" exitCode=0 Feb 20 07:34:04 crc kubenswrapper[5094]: I0220 07:34:04.299512 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c"} Feb 20 07:34:04 crc kubenswrapper[5094]: I0220 07:34:04.299652 5094 scope.go:117] "RemoveContainer" containerID="d490ad7f0df91dbec2d268bb4e610d1c74d888b0923d10ec319d040c9dea3ddc" Feb 20 07:34:04 crc kubenswrapper[5094]: I0220 07:34:04.301825 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:34:04 crc kubenswrapper[5094]: E0220 07:34:04.302459 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:34:18 crc kubenswrapper[5094]: I0220 07:34:18.840991 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:34:18 crc kubenswrapper[5094]: E0220 07:34:18.842492 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:34:33 crc kubenswrapper[5094]: I0220 07:34:33.841234 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:34:33 crc kubenswrapper[5094]: E0220 07:34:33.842612 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:34:45 crc kubenswrapper[5094]: I0220 07:34:45.853885 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:34:45 crc kubenswrapper[5094]: E0220 07:34:45.855065 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:34:59 crc kubenswrapper[5094]: I0220 07:34:59.840634 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:34:59 crc kubenswrapper[5094]: E0220 07:34:59.841580 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:35:13 crc kubenswrapper[5094]: I0220 07:35:13.840659 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:35:13 crc kubenswrapper[5094]: E0220 07:35:13.841974 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:35:27 crc kubenswrapper[5094]: I0220 07:35:27.843293 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:35:27 crc kubenswrapper[5094]: E0220 07:35:27.844120 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:35:39 crc kubenswrapper[5094]: I0220 07:35:39.840842 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:35:39 crc kubenswrapper[5094]: E0220 07:35:39.841715 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:35:50 crc kubenswrapper[5094]: I0220 07:35:50.841157 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:35:50 crc kubenswrapper[5094]: E0220 07:35:50.842637 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:36:04 crc kubenswrapper[5094]: I0220 07:36:04.839965 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:36:04 crc kubenswrapper[5094]: E0220 07:36:04.841191 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:36:18 crc kubenswrapper[5094]: I0220 07:36:18.841443 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:36:18 crc kubenswrapper[5094]: E0220 07:36:18.842674 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:36:30 crc kubenswrapper[5094]: I0220 07:36:30.845446 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:36:30 crc kubenswrapper[5094]: E0220 07:36:30.846971 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:36:45 crc kubenswrapper[5094]: I0220 07:36:45.844382 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:36:45 crc kubenswrapper[5094]: E0220 07:36:45.845294 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:36:56 crc kubenswrapper[5094]: I0220 07:36:56.841946 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:36:56 crc kubenswrapper[5094]: E0220 07:36:56.843371 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:37:08 crc kubenswrapper[5094]: I0220 07:37:08.841177 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:37:08 crc kubenswrapper[5094]: E0220 07:37:08.842460 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:37:20 crc kubenswrapper[5094]: I0220 07:37:20.841430 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:37:20 crc kubenswrapper[5094]: E0220 07:37:20.842669 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 07:37:33 crc kubenswrapper[5094]: I0220 07:37:33.842324 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c"
Feb 20 07:37:33 crc kubenswrapper[5094]: E0220 07:37:33.845342 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.135309 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z4qm5"]
Feb 20 07:37:46 crc kubenswrapper[5094]: E0220 07:37:46.137574 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d651298e-32ea-456a-ac1e-45aae2ab365f" containerName="extract-utilities"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.137640 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d651298e-32ea-456a-ac1e-45aae2ab365f" containerName="extract-utilities"
Feb 20 07:37:46 crc kubenswrapper[5094]: E0220 07:37:46.137682 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d651298e-32ea-456a-ac1e-45aae2ab365f" containerName="extract-content"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.137695 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d651298e-32ea-456a-ac1e-45aae2ab365f" containerName="extract-content"
Feb 20 07:37:46 crc kubenswrapper[5094]: E0220 07:37:46.137734 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d651298e-32ea-456a-ac1e-45aae2ab365f" containerName="registry-server"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.137744 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d651298e-32ea-456a-ac1e-45aae2ab365f" containerName="registry-server"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.138349 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d651298e-32ea-456a-ac1e-45aae2ab365f" containerName="registry-server"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.142795 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.156483 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q972\" (UniqueName: \"kubernetes.io/projected/5e5334f3-7266-4687-b9ba-9574b54c9a29-kube-api-access-2q972\") pod \"redhat-operators-z4qm5\" (UID: \"5e5334f3-7266-4687-b9ba-9574b54c9a29\") " pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.156682 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z4qm5"]
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.156677 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e5334f3-7266-4687-b9ba-9574b54c9a29-catalog-content\") pod \"redhat-operators-z4qm5\" (UID: \"5e5334f3-7266-4687-b9ba-9574b54c9a29\") " pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.156840 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e5334f3-7266-4687-b9ba-9574b54c9a29-utilities\") pod \"redhat-operators-z4qm5\" (UID: \"5e5334f3-7266-4687-b9ba-9574b54c9a29\") " pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.258804 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e5334f3-7266-4687-b9ba-9574b54c9a29-utilities\") pod \"redhat-operators-z4qm5\" (UID: \"5e5334f3-7266-4687-b9ba-9574b54c9a29\") " pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.258934 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q972\" (UniqueName: \"kubernetes.io/projected/5e5334f3-7266-4687-b9ba-9574b54c9a29-kube-api-access-2q972\") pod \"redhat-operators-z4qm5\" (UID: \"5e5334f3-7266-4687-b9ba-9574b54c9a29\") " pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.258991 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e5334f3-7266-4687-b9ba-9574b54c9a29-catalog-content\") pod \"redhat-operators-z4qm5\" (UID: \"5e5334f3-7266-4687-b9ba-9574b54c9a29\") " pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.259437 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e5334f3-7266-4687-b9ba-9574b54c9a29-catalog-content\") pod \"redhat-operators-z4qm5\" (UID: \"5e5334f3-7266-4687-b9ba-9574b54c9a29\") " pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.259518 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e5334f3-7266-4687-b9ba-9574b54c9a29-utilities\") pod \"redhat-operators-z4qm5\" (UID: \"5e5334f3-7266-4687-b9ba-9574b54c9a29\") " pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.283803 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q972\" (UniqueName: \"kubernetes.io/projected/5e5334f3-7266-4687-b9ba-9574b54c9a29-kube-api-access-2q972\") pod \"redhat-operators-z4qm5\" (UID: \"5e5334f3-7266-4687-b9ba-9574b54c9a29\") " pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.484523 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:37:46 crc kubenswrapper[5094]: I0220 07:37:46.956101 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z4qm5"]
Feb 20 07:37:47 crc kubenswrapper[5094]: I0220 07:37:47.578027 5094 generic.go:334] "Generic (PLEG): container finished" podID="5e5334f3-7266-4687-b9ba-9574b54c9a29" containerID="96812e61d21b8233b73669426f1d067f0b54883330c1735f586c81e3cc39b367" exitCode=0
Feb 20 07:37:47 crc kubenswrapper[5094]: I0220 07:37:47.578090 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4qm5" event={"ID":"5e5334f3-7266-4687-b9ba-9574b54c9a29","Type":"ContainerDied","Data":"96812e61d21b8233b73669426f1d067f0b54883330c1735f586c81e3cc39b367"}
Feb 20 07:37:47 crc kubenswrapper[5094]: I0220 07:37:47.578132 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4qm5" event={"ID":"5e5334f3-7266-4687-b9ba-9574b54c9a29","Type":"ContainerStarted","Data":"89c9e252d537af174af4376fa5796e176641ea798d855baa2c860628688dd9e3"}
Feb 20 07:37:47 crc kubenswrapper[5094]: I0220 07:37:47.581611 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 20 07:37:47 crc kubenswrapper[5094]: I0220 07:37:47.840829 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c"
Feb 20 07:37:47 crc kubenswrapper[5094]: E0220 07:37:47.841071 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 07:37:48 crc kubenswrapper[5094]: I0220 07:37:48.596817 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4qm5" event={"ID":"5e5334f3-7266-4687-b9ba-9574b54c9a29","Type":"ContainerStarted","Data":"cc157226f40f2c371a50404a0f66c2791e56253bdeeb60a3172855ec95badfb0"}
Feb 20 07:37:49 crc kubenswrapper[5094]: I0220 07:37:49.611234 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4qm5" event={"ID":"5e5334f3-7266-4687-b9ba-9574b54c9a29","Type":"ContainerDied","Data":"cc157226f40f2c371a50404a0f66c2791e56253bdeeb60a3172855ec95badfb0"}
Feb 20 07:37:49 crc kubenswrapper[5094]: I0220 07:37:49.612799 5094 generic.go:334] "Generic (PLEG): container finished" podID="5e5334f3-7266-4687-b9ba-9574b54c9a29" containerID="cc157226f40f2c371a50404a0f66c2791e56253bdeeb60a3172855ec95badfb0" exitCode=0
Feb 20 07:37:50 crc kubenswrapper[5094]: I0220 07:37:50.625929 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4qm5" event={"ID":"5e5334f3-7266-4687-b9ba-9574b54c9a29","Type":"ContainerStarted","Data":"48b1f3caf6c0a4a92f9bd9434bbade328a3a706049e01e48a1c0f40d6cf2fe22"}
Feb 20 07:37:50 crc kubenswrapper[5094]: I0220 07:37:50.654237 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z4qm5" podStartSLOduration=2.000894997 podStartE2EDuration="4.654214586s" podCreationTimestamp="2026-02-20 07:37:46 +0000 UTC" firstStartedPulling="2026-02-20 07:37:47.581173628 +0000 UTC m=+3082.453800369" lastFinishedPulling="2026-02-20 07:37:50.234493217 +0000 UTC m=+3085.107119958" observedRunningTime="2026-02-20 07:37:50.651160533 +0000 UTC m=+3085.523787254" watchObservedRunningTime="2026-02-20 07:37:50.654214586 +0000 UTC m=+3085.526841297"
Feb 20 07:37:54 crc kubenswrapper[5094]: I0220 07:37:54.860890 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-85zvz"]
Feb 20 07:37:54 crc kubenswrapper[5094]: I0220 07:37:54.864336 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:37:54 crc kubenswrapper[5094]: I0220 07:37:54.877564 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-85zvz"]
Feb 20 07:37:55 crc kubenswrapper[5094]: I0220 07:37:55.030212 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-utilities\") pod \"community-operators-85zvz\" (UID: \"aa5d2e9e-7d8a-48bf-a074-eb34159551ed\") " pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:37:55 crc kubenswrapper[5094]: I0220 07:37:55.030358 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-catalog-content\") pod \"community-operators-85zvz\" (UID: \"aa5d2e9e-7d8a-48bf-a074-eb34159551ed\") " pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:37:55 crc kubenswrapper[5094]: I0220 07:37:55.030422 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9gwv\" (UniqueName: \"kubernetes.io/projected/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-kube-api-access-v9gwv\") pod \"community-operators-85zvz\" (UID: \"aa5d2e9e-7d8a-48bf-a074-eb34159551ed\") " pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:37:55 crc kubenswrapper[5094]: I0220 07:37:55.132434 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-utilities\") pod \"community-operators-85zvz\" (UID: \"aa5d2e9e-7d8a-48bf-a074-eb34159551ed\") " pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:37:55 crc kubenswrapper[5094]: I0220 07:37:55.132541 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-catalog-content\") pod \"community-operators-85zvz\" (UID: \"aa5d2e9e-7d8a-48bf-a074-eb34159551ed\") " pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:37:55 crc kubenswrapper[5094]: I0220 07:37:55.132590 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9gwv\" (UniqueName: \"kubernetes.io/projected/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-kube-api-access-v9gwv\") pod \"community-operators-85zvz\" (UID: \"aa5d2e9e-7d8a-48bf-a074-eb34159551ed\") " pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:37:55 crc kubenswrapper[5094]: I0220 07:37:55.133114 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-utilities\") pod \"community-operators-85zvz\" (UID: \"aa5d2e9e-7d8a-48bf-a074-eb34159551ed\") " pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:37:55 crc kubenswrapper[5094]: I0220 07:37:55.133332 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-catalog-content\") pod \"community-operators-85zvz\" (UID: \"aa5d2e9e-7d8a-48bf-a074-eb34159551ed\") " pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:37:55 crc kubenswrapper[5094]: I0220 07:37:55.153016 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9gwv\" (UniqueName: \"kubernetes.io/projected/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-kube-api-access-v9gwv\") pod \"community-operators-85zvz\" (UID: \"aa5d2e9e-7d8a-48bf-a074-eb34159551ed\") " pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:37:55 crc kubenswrapper[5094]: I0220 07:37:55.208134 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:37:55 crc kubenswrapper[5094]: I0220 07:37:55.717482 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-85zvz"]
Feb 20 07:37:56 crc kubenswrapper[5094]: I0220 07:37:56.485333 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:37:56 crc kubenswrapper[5094]: I0220 07:37:56.485894 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:37:56 crc kubenswrapper[5094]: I0220 07:37:56.697851 5094 generic.go:334] "Generic (PLEG): container finished" podID="aa5d2e9e-7d8a-48bf-a074-eb34159551ed" containerID="7829db94cf14a731cf0d4a4debe8e412745f5515a5c8ef696b29616ff6379529" exitCode=0
Feb 20 07:37:56 crc kubenswrapper[5094]: I0220 07:37:56.697942 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85zvz" event={"ID":"aa5d2e9e-7d8a-48bf-a074-eb34159551ed","Type":"ContainerDied","Data":"7829db94cf14a731cf0d4a4debe8e412745f5515a5c8ef696b29616ff6379529"}
Feb 20 07:37:56 crc kubenswrapper[5094]: I0220 07:37:56.698015 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85zvz" event={"ID":"aa5d2e9e-7d8a-48bf-a074-eb34159551ed","Type":"ContainerStarted","Data":"db1d2b274b0420325bf7c8dcf9ca8777991c781f0ee71c959f5ecabbef484364"}
Feb 20 07:37:57 crc kubenswrapper[5094]: I0220 07:37:57.557754 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z4qm5" podUID="5e5334f3-7266-4687-b9ba-9574b54c9a29" containerName="registry-server" probeResult="failure" output=<
Feb 20 07:37:57 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s
Feb 20 07:37:57 crc kubenswrapper[5094]: >
Feb 20 07:37:57 crc kubenswrapper[5094]: I0220 07:37:57.713551 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85zvz" event={"ID":"aa5d2e9e-7d8a-48bf-a074-eb34159551ed","Type":"ContainerStarted","Data":"9c2b26b5816acc01824f40ce5b391535738ae133404c9be800dc0e2134006ef8"}
Feb 20 07:37:58 crc kubenswrapper[5094]: I0220 07:37:58.727116 5094 generic.go:334] "Generic (PLEG): container finished" podID="aa5d2e9e-7d8a-48bf-a074-eb34159551ed" containerID="9c2b26b5816acc01824f40ce5b391535738ae133404c9be800dc0e2134006ef8" exitCode=0
Feb 20 07:37:58 crc kubenswrapper[5094]: I0220 07:37:58.727192 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85zvz" event={"ID":"aa5d2e9e-7d8a-48bf-a074-eb34159551ed","Type":"ContainerDied","Data":"9c2b26b5816acc01824f40ce5b391535738ae133404c9be800dc0e2134006ef8"}
Feb 20 07:37:59 crc kubenswrapper[5094]: I0220 07:37:59.738973 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85zvz" event={"ID":"aa5d2e9e-7d8a-48bf-a074-eb34159551ed","Type":"ContainerStarted","Data":"6745393131f6a6566fa43ddaa71dd46bd7c4aa4b80b6f03dc0d4589a8d454a12"}
Feb 20 07:37:59 crc kubenswrapper[5094]: I0220 07:37:59.765902 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-85zvz" podStartSLOduration=3.304419015 podStartE2EDuration="5.765851761s" podCreationTimestamp="2026-02-20 07:37:54 +0000 UTC" firstStartedPulling="2026-02-20 07:37:56.701079751 +0000 UTC m=+3091.573706472" lastFinishedPulling="2026-02-20 07:37:59.162512497 +0000 UTC m=+3094.035139218" observedRunningTime="2026-02-20 07:37:59.759784395 +0000 UTC m=+3094.632411156" watchObservedRunningTime="2026-02-20 07:37:59.765851761 +0000 UTC m=+3094.638478482"
Feb 20 07:38:00 crc kubenswrapper[5094]: I0220 07:38:00.841334 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c"
Feb 20 07:38:00 crc kubenswrapper[5094]: E0220 07:38:00.841929 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 07:38:05 crc kubenswrapper[5094]: I0220 07:38:05.209973 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:38:05 crc kubenswrapper[5094]: I0220 07:38:05.210986 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:38:05 crc kubenswrapper[5094]: I0220 07:38:05.424950 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:38:05 crc kubenswrapper[5094]: I0220 07:38:05.914404 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:38:06 crc kubenswrapper[5094]: I0220 07:38:06.563506 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:38:06 crc kubenswrapper[5094]: I0220 07:38:06.653225 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:38:08 crc kubenswrapper[5094]: I0220 07:38:08.873621 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-85zvz"]
Feb 20 07:38:08 crc kubenswrapper[5094]: I0220 07:38:08.874589 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-85zvz" podUID="aa5d2e9e-7d8a-48bf-a074-eb34159551ed" containerName="registry-server" containerID="cri-o://6745393131f6a6566fa43ddaa71dd46bd7c4aa4b80b6f03dc0d4589a8d454a12" gracePeriod=2
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.347232 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.419466 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-catalog-content\") pod \"aa5d2e9e-7d8a-48bf-a074-eb34159551ed\" (UID: \"aa5d2e9e-7d8a-48bf-a074-eb34159551ed\") "
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.419603 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9gwv\" (UniqueName: \"kubernetes.io/projected/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-kube-api-access-v9gwv\") pod \"aa5d2e9e-7d8a-48bf-a074-eb34159551ed\" (UID: \"aa5d2e9e-7d8a-48bf-a074-eb34159551ed\") "
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.419893 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-utilities\") pod \"aa5d2e9e-7d8a-48bf-a074-eb34159551ed\" (UID: \"aa5d2e9e-7d8a-48bf-a074-eb34159551ed\") "
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.421110 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-utilities" (OuterVolumeSpecName: "utilities") pod "aa5d2e9e-7d8a-48bf-a074-eb34159551ed" (UID: "aa5d2e9e-7d8a-48bf-a074-eb34159551ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.431331 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-kube-api-access-v9gwv" (OuterVolumeSpecName: "kube-api-access-v9gwv") pod "aa5d2e9e-7d8a-48bf-a074-eb34159551ed" (UID: "aa5d2e9e-7d8a-48bf-a074-eb34159551ed"). InnerVolumeSpecName "kube-api-access-v9gwv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.500995 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa5d2e9e-7d8a-48bf-a074-eb34159551ed" (UID: "aa5d2e9e-7d8a-48bf-a074-eb34159551ed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.522202 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.522243 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.522260 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9gwv\" (UniqueName: \"kubernetes.io/projected/aa5d2e9e-7d8a-48bf-a074-eb34159551ed-kube-api-access-v9gwv\") on node \"crc\" DevicePath \"\""
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.891624 5094 generic.go:334] "Generic (PLEG): container finished" podID="aa5d2e9e-7d8a-48bf-a074-eb34159551ed" containerID="6745393131f6a6566fa43ddaa71dd46bd7c4aa4b80b6f03dc0d4589a8d454a12" exitCode=0
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.891774 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85zvz" event={"ID":"aa5d2e9e-7d8a-48bf-a074-eb34159551ed","Type":"ContainerDied","Data":"6745393131f6a6566fa43ddaa71dd46bd7c4aa4b80b6f03dc0d4589a8d454a12"}
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.891845 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85zvz" event={"ID":"aa5d2e9e-7d8a-48bf-a074-eb34159551ed","Type":"ContainerDied","Data":"db1d2b274b0420325bf7c8dcf9ca8777991c781f0ee71c959f5ecabbef484364"}
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.891889 5094 scope.go:117] "RemoveContainer" containerID="6745393131f6a6566fa43ddaa71dd46bd7c4aa4b80b6f03dc0d4589a8d454a12"
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.893011 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-85zvz"
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.937673 5094 scope.go:117] "RemoveContainer" containerID="9c2b26b5816acc01824f40ce5b391535738ae133404c9be800dc0e2134006ef8"
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.939012 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-85zvz"]
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.945424 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-85zvz"]
Feb 20 07:38:09 crc kubenswrapper[5094]: I0220 07:38:09.966825 5094 scope.go:117] "RemoveContainer" containerID="7829db94cf14a731cf0d4a4debe8e412745f5515a5c8ef696b29616ff6379529"
Feb 20 07:38:10 crc kubenswrapper[5094]: I0220 07:38:10.012587 5094 scope.go:117] "RemoveContainer" containerID="6745393131f6a6566fa43ddaa71dd46bd7c4aa4b80b6f03dc0d4589a8d454a12"
Feb 20 07:38:10 crc kubenswrapper[5094]: E0220 07:38:10.013201 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6745393131f6a6566fa43ddaa71dd46bd7c4aa4b80b6f03dc0d4589a8d454a12\": container with ID starting with 6745393131f6a6566fa43ddaa71dd46bd7c4aa4b80b6f03dc0d4589a8d454a12 not found: ID does not exist" containerID="6745393131f6a6566fa43ddaa71dd46bd7c4aa4b80b6f03dc0d4589a8d454a12"
Feb 20 07:38:10 crc kubenswrapper[5094]: I0220 07:38:10.013260 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6745393131f6a6566fa43ddaa71dd46bd7c4aa4b80b6f03dc0d4589a8d454a12"} err="failed to get container status \"6745393131f6a6566fa43ddaa71dd46bd7c4aa4b80b6f03dc0d4589a8d454a12\": rpc error: code = NotFound desc = could not find container \"6745393131f6a6566fa43ddaa71dd46bd7c4aa4b80b6f03dc0d4589a8d454a12\": container with ID starting with 6745393131f6a6566fa43ddaa71dd46bd7c4aa4b80b6f03dc0d4589a8d454a12 not found: ID does not exist"
Feb 20 07:38:10 crc kubenswrapper[5094]: I0220 07:38:10.013298 5094 scope.go:117] "RemoveContainer" containerID="9c2b26b5816acc01824f40ce5b391535738ae133404c9be800dc0e2134006ef8"
Feb 20 07:38:10 crc kubenswrapper[5094]: E0220 07:38:10.014027 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c2b26b5816acc01824f40ce5b391535738ae133404c9be800dc0e2134006ef8\": container with ID starting with 9c2b26b5816acc01824f40ce5b391535738ae133404c9be800dc0e2134006ef8 not found: ID does not exist" containerID="9c2b26b5816acc01824f40ce5b391535738ae133404c9be800dc0e2134006ef8"
Feb 20 07:38:10 crc kubenswrapper[5094]: I0220 07:38:10.014114 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c2b26b5816acc01824f40ce5b391535738ae133404c9be800dc0e2134006ef8"} err="failed to get container status \"9c2b26b5816acc01824f40ce5b391535738ae133404c9be800dc0e2134006ef8\": rpc error: code = NotFound desc = could not find container \"9c2b26b5816acc01824f40ce5b391535738ae133404c9be800dc0e2134006ef8\": container with ID starting with 9c2b26b5816acc01824f40ce5b391535738ae133404c9be800dc0e2134006ef8 not found: ID does not exist"
Feb 20 07:38:10 crc kubenswrapper[5094]: I0220 07:38:10.014169 5094 scope.go:117] "RemoveContainer" containerID="7829db94cf14a731cf0d4a4debe8e412745f5515a5c8ef696b29616ff6379529"
Feb 20 07:38:10 crc kubenswrapper[5094]: E0220 07:38:10.015176 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7829db94cf14a731cf0d4a4debe8e412745f5515a5c8ef696b29616ff6379529\": container with ID starting with 7829db94cf14a731cf0d4a4debe8e412745f5515a5c8ef696b29616ff6379529 not found: ID does not exist" containerID="7829db94cf14a731cf0d4a4debe8e412745f5515a5c8ef696b29616ff6379529"
Feb 20 07:38:10 crc kubenswrapper[5094]: I0220 07:38:10.015250 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7829db94cf14a731cf0d4a4debe8e412745f5515a5c8ef696b29616ff6379529"} err="failed to get container status \"7829db94cf14a731cf0d4a4debe8e412745f5515a5c8ef696b29616ff6379529\": rpc error: code = NotFound desc = could not find container \"7829db94cf14a731cf0d4a4debe8e412745f5515a5c8ef696b29616ff6379529\": container with ID starting with 7829db94cf14a731cf0d4a4debe8e412745f5515a5c8ef696b29616ff6379529 not found: ID does not exist"
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.274453 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z4qm5"]
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.275010 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z4qm5" podUID="5e5334f3-7266-4687-b9ba-9574b54c9a29" containerName="registry-server" containerID="cri-o://48b1f3caf6c0a4a92f9bd9434bbade328a3a706049e01e48a1c0f40d6cf2fe22" gracePeriod=2
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.797575 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.842163 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c"
Feb 20 07:38:11 crc kubenswrapper[5094]: E0220 07:38:11.842624 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.853623 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa5d2e9e-7d8a-48bf-a074-eb34159551ed" path="/var/lib/kubelet/pods/aa5d2e9e-7d8a-48bf-a074-eb34159551ed/volumes"
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.874541 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e5334f3-7266-4687-b9ba-9574b54c9a29-utilities\") pod \"5e5334f3-7266-4687-b9ba-9574b54c9a29\" (UID: \"5e5334f3-7266-4687-b9ba-9574b54c9a29\") "
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.874652 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q972\" (UniqueName: \"kubernetes.io/projected/5e5334f3-7266-4687-b9ba-9574b54c9a29-kube-api-access-2q972\") pod \"5e5334f3-7266-4687-b9ba-9574b54c9a29\" (UID: \"5e5334f3-7266-4687-b9ba-9574b54c9a29\") "
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.874761 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e5334f3-7266-4687-b9ba-9574b54c9a29-catalog-content\") pod \"5e5334f3-7266-4687-b9ba-9574b54c9a29\" (UID: \"5e5334f3-7266-4687-b9ba-9574b54c9a29\") "
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.875582 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e5334f3-7266-4687-b9ba-9574b54c9a29-utilities" (OuterVolumeSpecName: "utilities") pod "5e5334f3-7266-4687-b9ba-9574b54c9a29" (UID: "5e5334f3-7266-4687-b9ba-9574b54c9a29"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.882943 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e5334f3-7266-4687-b9ba-9574b54c9a29-kube-api-access-2q972" (OuterVolumeSpecName: "kube-api-access-2q972") pod "5e5334f3-7266-4687-b9ba-9574b54c9a29" (UID: "5e5334f3-7266-4687-b9ba-9574b54c9a29"). InnerVolumeSpecName "kube-api-access-2q972". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.918284 5094 generic.go:334] "Generic (PLEG): container finished" podID="5e5334f3-7266-4687-b9ba-9574b54c9a29" containerID="48b1f3caf6c0a4a92f9bd9434bbade328a3a706049e01e48a1c0f40d6cf2fe22" exitCode=0
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.918336 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4qm5" event={"ID":"5e5334f3-7266-4687-b9ba-9574b54c9a29","Type":"ContainerDied","Data":"48b1f3caf6c0a4a92f9bd9434bbade328a3a706049e01e48a1c0f40d6cf2fe22"}
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.918370 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z4qm5" event={"ID":"5e5334f3-7266-4687-b9ba-9574b54c9a29","Type":"ContainerDied","Data":"89c9e252d537af174af4376fa5796e176641ea798d855baa2c860628688dd9e3"}
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.918393 5094 scope.go:117] "RemoveContainer" containerID="48b1f3caf6c0a4a92f9bd9434bbade328a3a706049e01e48a1c0f40d6cf2fe22"
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.918512 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z4qm5"
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.957972 5094 scope.go:117] "RemoveContainer" containerID="cc157226f40f2c371a50404a0f66c2791e56253bdeeb60a3172855ec95badfb0"
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.979393 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e5334f3-7266-4687-b9ba-9574b54c9a29-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.979515 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q972\" (UniqueName: \"kubernetes.io/projected/5e5334f3-7266-4687-b9ba-9574b54c9a29-kube-api-access-2q972\") on node \"crc\" DevicePath \"\""
Feb 20 07:38:11 crc kubenswrapper[5094]: I0220 07:38:11.993002 5094 scope.go:117] "RemoveContainer" containerID="96812e61d21b8233b73669426f1d067f0b54883330c1735f586c81e3cc39b367"
Feb 20 07:38:12 crc kubenswrapper[5094]: I0220 07:38:12.025372 5094 scope.go:117] "RemoveContainer" containerID="48b1f3caf6c0a4a92f9bd9434bbade328a3a706049e01e48a1c0f40d6cf2fe22"
Feb 20 07:38:12 crc kubenswrapper[5094]: E0220 07:38:12.025931 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48b1f3caf6c0a4a92f9bd9434bbade328a3a706049e01e48a1c0f40d6cf2fe22\": container with ID starting with 48b1f3caf6c0a4a92f9bd9434bbade328a3a706049e01e48a1c0f40d6cf2fe22 not found: ID does not exist" containerID="48b1f3caf6c0a4a92f9bd9434bbade328a3a706049e01e48a1c0f40d6cf2fe22"
Feb 20 07:38:12 crc kubenswrapper[5094]: I0220 07:38:12.026034 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48b1f3caf6c0a4a92f9bd9434bbade328a3a706049e01e48a1c0f40d6cf2fe22"} err="failed to get container status \"48b1f3caf6c0a4a92f9bd9434bbade328a3a706049e01e48a1c0f40d6cf2fe22\": rpc error: code = NotFound desc = could not find container \"48b1f3caf6c0a4a92f9bd9434bbade328a3a706049e01e48a1c0f40d6cf2fe22\": container with ID starting with 48b1f3caf6c0a4a92f9bd9434bbade328a3a706049e01e48a1c0f40d6cf2fe22 not found: ID does not exist"
Feb 20 07:38:12 crc kubenswrapper[5094]: I0220 07:38:12.026117 5094 scope.go:117] "RemoveContainer" containerID="cc157226f40f2c371a50404a0f66c2791e56253bdeeb60a3172855ec95badfb0"
Feb 20 07:38:12 crc kubenswrapper[5094]: E0220 07:38:12.026570 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc157226f40f2c371a50404a0f66c2791e56253bdeeb60a3172855ec95badfb0\": container with ID starting with cc157226f40f2c371a50404a0f66c2791e56253bdeeb60a3172855ec95badfb0 not found: ID does not exist" containerID="cc157226f40f2c371a50404a0f66c2791e56253bdeeb60a3172855ec95badfb0"
Feb 20 07:38:12 crc kubenswrapper[5094]: I0220 07:38:12.026649 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc157226f40f2c371a50404a0f66c2791e56253bdeeb60a3172855ec95badfb0"} err="failed to get container status \"cc157226f40f2c371a50404a0f66c2791e56253bdeeb60a3172855ec95badfb0\": rpc error: code = NotFound desc = could not find container \"cc157226f40f2c371a50404a0f66c2791e56253bdeeb60a3172855ec95badfb0\": container with ID starting with cc157226f40f2c371a50404a0f66c2791e56253bdeeb60a3172855ec95badfb0 not found: ID does not exist"
Feb 20 07:38:12 crc kubenswrapper[5094]: I0220 07:38:12.026733 5094 scope.go:117] "RemoveContainer" containerID="96812e61d21b8233b73669426f1d067f0b54883330c1735f586c81e3cc39b367"
Feb 20 07:38:12 crc kubenswrapper[5094]: E0220 07:38:12.027176 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96812e61d21b8233b73669426f1d067f0b54883330c1735f586c81e3cc39b367\": container with ID starting with 96812e61d21b8233b73669426f1d067f0b54883330c1735f586c81e3cc39b367 not found: ID does not exist" containerID="96812e61d21b8233b73669426f1d067f0b54883330c1735f586c81e3cc39b367"
Feb 20 07:38:12 crc kubenswrapper[5094]: I0220 07:38:12.027251 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96812e61d21b8233b73669426f1d067f0b54883330c1735f586c81e3cc39b367"} err="failed to get container status \"96812e61d21b8233b73669426f1d067f0b54883330c1735f586c81e3cc39b367\": rpc error: code = NotFound desc = could not find container \"96812e61d21b8233b73669426f1d067f0b54883330c1735f586c81e3cc39b367\": container with ID starting with 96812e61d21b8233b73669426f1d067f0b54883330c1735f586c81e3cc39b367 not found: ID does not exist"
Feb 20 07:38:12 crc kubenswrapper[5094]: I0220 07:38:12.029483 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e5334f3-7266-4687-b9ba-9574b54c9a29-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e5334f3-7266-4687-b9ba-9574b54c9a29" (UID: "5e5334f3-7266-4687-b9ba-9574b54c9a29"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:38:12 crc kubenswrapper[5094]: I0220 07:38:12.081956 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e5334f3-7266-4687-b9ba-9574b54c9a29-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:38:12 crc kubenswrapper[5094]: I0220 07:38:12.278251 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z4qm5"] Feb 20 07:38:12 crc kubenswrapper[5094]: I0220 07:38:12.285469 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z4qm5"] Feb 20 07:38:13 crc kubenswrapper[5094]: I0220 07:38:13.857616 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e5334f3-7266-4687-b9ba-9574b54c9a29" path="/var/lib/kubelet/pods/5e5334f3-7266-4687-b9ba-9574b54c9a29/volumes" Feb 20 07:38:24 crc kubenswrapper[5094]: I0220 07:38:24.841042 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:38:24 crc kubenswrapper[5094]: E0220 07:38:24.841632 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:38:36 crc kubenswrapper[5094]: I0220 07:38:36.840483 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:38:36 crc kubenswrapper[5094]: E0220 07:38:36.843654 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:38:49 crc kubenswrapper[5094]: I0220 07:38:49.841334 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:38:49 crc kubenswrapper[5094]: E0220 07:38:49.843137 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:39:01 crc kubenswrapper[5094]: I0220 07:39:01.840801 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:39:01 crc kubenswrapper[5094]: E0220 07:39:01.841880 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:39:12 crc kubenswrapper[5094]: I0220 07:39:12.840418 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:39:13 crc kubenswrapper[5094]: I0220 07:39:13.522302 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"75ecd0d14bd85ee39b5bd2719875383b1d61189bd8817e3a936faa65a117f070"} Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.513214 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5zpzv"] Feb 20 07:40:52 crc kubenswrapper[5094]: E0220 07:40:52.514536 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa5d2e9e-7d8a-48bf-a074-eb34159551ed" containerName="extract-utilities" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.514564 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5d2e9e-7d8a-48bf-a074-eb34159551ed" containerName="extract-utilities" Feb 20 07:40:52 crc kubenswrapper[5094]: E0220 07:40:52.514589 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e5334f3-7266-4687-b9ba-9574b54c9a29" containerName="extract-content" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.514601 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e5334f3-7266-4687-b9ba-9574b54c9a29" containerName="extract-content" Feb 20 07:40:52 crc kubenswrapper[5094]: E0220 07:40:52.514625 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa5d2e9e-7d8a-48bf-a074-eb34159551ed" containerName="extract-content" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.514636 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5d2e9e-7d8a-48bf-a074-eb34159551ed" containerName="extract-content" Feb 20 07:40:52 crc kubenswrapper[5094]: E0220 07:40:52.514652 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e5334f3-7266-4687-b9ba-9574b54c9a29" containerName="registry-server" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.514662 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e5334f3-7266-4687-b9ba-9574b54c9a29" containerName="registry-server" Feb 20 07:40:52 crc kubenswrapper[5094]: E0220 07:40:52.514681 5094 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e5334f3-7266-4687-b9ba-9574b54c9a29" containerName="extract-utilities" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.514691 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e5334f3-7266-4687-b9ba-9574b54c9a29" containerName="extract-utilities" Feb 20 07:40:52 crc kubenswrapper[5094]: E0220 07:40:52.514748 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa5d2e9e-7d8a-48bf-a074-eb34159551ed" containerName="registry-server" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.514762 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa5d2e9e-7d8a-48bf-a074-eb34159551ed" containerName="registry-server" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.515015 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa5d2e9e-7d8a-48bf-a074-eb34159551ed" containerName="registry-server" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.515053 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e5334f3-7266-4687-b9ba-9574b54c9a29" containerName="registry-server" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.516884 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.539500 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5zpzv"] Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.671668 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc53884f-e926-4579-956a-0c9719af5d1e-catalog-content\") pod \"redhat-marketplace-5zpzv\" (UID: \"fc53884f-e926-4579-956a-0c9719af5d1e\") " pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.671859 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc53884f-e926-4579-956a-0c9719af5d1e-utilities\") pod \"redhat-marketplace-5zpzv\" (UID: \"fc53884f-e926-4579-956a-0c9719af5d1e\") " pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.672117 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czt8d\" (UniqueName: \"kubernetes.io/projected/fc53884f-e926-4579-956a-0c9719af5d1e-kube-api-access-czt8d\") pod \"redhat-marketplace-5zpzv\" (UID: \"fc53884f-e926-4579-956a-0c9719af5d1e\") " pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.774121 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc53884f-e926-4579-956a-0c9719af5d1e-catalog-content\") pod \"redhat-marketplace-5zpzv\" (UID: \"fc53884f-e926-4579-956a-0c9719af5d1e\") " pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.774295 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc53884f-e926-4579-956a-0c9719af5d1e-utilities\") pod \"redhat-marketplace-5zpzv\" (UID: \"fc53884f-e926-4579-956a-0c9719af5d1e\") " pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.774402 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czt8d\" (UniqueName: \"kubernetes.io/projected/fc53884f-e926-4579-956a-0c9719af5d1e-kube-api-access-czt8d\") pod \"redhat-marketplace-5zpzv\" (UID: \"fc53884f-e926-4579-956a-0c9719af5d1e\") " pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.774770 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc53884f-e926-4579-956a-0c9719af5d1e-catalog-content\") pod \"redhat-marketplace-5zpzv\" (UID: \"fc53884f-e926-4579-956a-0c9719af5d1e\") " pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.774953 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc53884f-e926-4579-956a-0c9719af5d1e-utilities\") pod \"redhat-marketplace-5zpzv\" (UID: \"fc53884f-e926-4579-956a-0c9719af5d1e\") " pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.796378 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czt8d\" (UniqueName: \"kubernetes.io/projected/fc53884f-e926-4579-956a-0c9719af5d1e-kube-api-access-czt8d\") pod \"redhat-marketplace-5zpzv\" (UID: \"fc53884f-e926-4579-956a-0c9719af5d1e\") " pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:40:52 crc kubenswrapper[5094]: I0220 07:40:52.855248 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:40:53 crc kubenswrapper[5094]: I0220 07:40:53.377389 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5zpzv"] Feb 20 07:40:53 crc kubenswrapper[5094]: I0220 07:40:53.620357 5094 generic.go:334] "Generic (PLEG): container finished" podID="fc53884f-e926-4579-956a-0c9719af5d1e" containerID="c5708f17c871b826ce6a820ca60584c66b7256d5db35ca619fd1c4b2123c3bcf" exitCode=0 Feb 20 07:40:53 crc kubenswrapper[5094]: I0220 07:40:53.620416 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5zpzv" event={"ID":"fc53884f-e926-4579-956a-0c9719af5d1e","Type":"ContainerDied","Data":"c5708f17c871b826ce6a820ca60584c66b7256d5db35ca619fd1c4b2123c3bcf"} Feb 20 07:40:53 crc kubenswrapper[5094]: I0220 07:40:53.620451 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5zpzv" event={"ID":"fc53884f-e926-4579-956a-0c9719af5d1e","Type":"ContainerStarted","Data":"29f8cb5dcecf22a8c0b8b2443cdd71954f5dcbf716659229fa65a0f279b47e16"} Feb 20 07:40:54 crc kubenswrapper[5094]: I0220 07:40:54.634595 5094 generic.go:334] "Generic (PLEG): container finished" podID="fc53884f-e926-4579-956a-0c9719af5d1e" containerID="16452559f322a735a8317b7a294e5189e9d7b6b8045c95aa3d711b6779032c89" exitCode=0 Feb 20 07:40:54 crc kubenswrapper[5094]: I0220 07:40:54.634735 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5zpzv" event={"ID":"fc53884f-e926-4579-956a-0c9719af5d1e","Type":"ContainerDied","Data":"16452559f322a735a8317b7a294e5189e9d7b6b8045c95aa3d711b6779032c89"} Feb 20 07:40:55 crc kubenswrapper[5094]: I0220 07:40:55.646030 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5zpzv" 
event={"ID":"fc53884f-e926-4579-956a-0c9719af5d1e","Type":"ContainerStarted","Data":"a9a00fd798c4e21cff5dbd987e0c9c633172ed17f009cc1b9114dc3f412b6f91"} Feb 20 07:40:55 crc kubenswrapper[5094]: I0220 07:40:55.684957 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5zpzv" podStartSLOduration=2.250436356 podStartE2EDuration="3.684933057s" podCreationTimestamp="2026-02-20 07:40:52 +0000 UTC" firstStartedPulling="2026-02-20 07:40:53.623370363 +0000 UTC m=+3268.495997074" lastFinishedPulling="2026-02-20 07:40:55.057867054 +0000 UTC m=+3269.930493775" observedRunningTime="2026-02-20 07:40:55.677654513 +0000 UTC m=+3270.550281294" watchObservedRunningTime="2026-02-20 07:40:55.684933057 +0000 UTC m=+3270.557559768" Feb 20 07:41:02 crc kubenswrapper[5094]: I0220 07:41:02.856037 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:41:02 crc kubenswrapper[5094]: I0220 07:41:02.857831 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:41:02 crc kubenswrapper[5094]: I0220 07:41:02.921873 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:41:03 crc kubenswrapper[5094]: I0220 07:41:03.770476 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:41:03 crc kubenswrapper[5094]: I0220 07:41:03.838156 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5zpzv"] Feb 20 07:41:05 crc kubenswrapper[5094]: I0220 07:41:05.731547 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5zpzv" podUID="fc53884f-e926-4579-956a-0c9719af5d1e" containerName="registry-server" 
containerID="cri-o://a9a00fd798c4e21cff5dbd987e0c9c633172ed17f009cc1b9114dc3f412b6f91" gracePeriod=2 Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.216225 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.324440 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc53884f-e926-4579-956a-0c9719af5d1e-utilities\") pod \"fc53884f-e926-4579-956a-0c9719af5d1e\" (UID: \"fc53884f-e926-4579-956a-0c9719af5d1e\") " Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.324497 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc53884f-e926-4579-956a-0c9719af5d1e-catalog-content\") pod \"fc53884f-e926-4579-956a-0c9719af5d1e\" (UID: \"fc53884f-e926-4579-956a-0c9719af5d1e\") " Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.324575 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czt8d\" (UniqueName: \"kubernetes.io/projected/fc53884f-e926-4579-956a-0c9719af5d1e-kube-api-access-czt8d\") pod \"fc53884f-e926-4579-956a-0c9719af5d1e\" (UID: \"fc53884f-e926-4579-956a-0c9719af5d1e\") " Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.325868 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc53884f-e926-4579-956a-0c9719af5d1e-utilities" (OuterVolumeSpecName: "utilities") pod "fc53884f-e926-4579-956a-0c9719af5d1e" (UID: "fc53884f-e926-4579-956a-0c9719af5d1e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.331229 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc53884f-e926-4579-956a-0c9719af5d1e-kube-api-access-czt8d" (OuterVolumeSpecName: "kube-api-access-czt8d") pod "fc53884f-e926-4579-956a-0c9719af5d1e" (UID: "fc53884f-e926-4579-956a-0c9719af5d1e"). InnerVolumeSpecName "kube-api-access-czt8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.351735 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc53884f-e926-4579-956a-0c9719af5d1e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc53884f-e926-4579-956a-0c9719af5d1e" (UID: "fc53884f-e926-4579-956a-0c9719af5d1e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.426121 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc53884f-e926-4579-956a-0c9719af5d1e-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.426159 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc53884f-e926-4579-956a-0c9719af5d1e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.426172 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czt8d\" (UniqueName: \"kubernetes.io/projected/fc53884f-e926-4579-956a-0c9719af5d1e-kube-api-access-czt8d\") on node \"crc\" DevicePath \"\"" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.763385 5094 generic.go:334] "Generic (PLEG): container finished" podID="fc53884f-e926-4579-956a-0c9719af5d1e" 
containerID="a9a00fd798c4e21cff5dbd987e0c9c633172ed17f009cc1b9114dc3f412b6f91" exitCode=0 Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.763460 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5zpzv" event={"ID":"fc53884f-e926-4579-956a-0c9719af5d1e","Type":"ContainerDied","Data":"a9a00fd798c4e21cff5dbd987e0c9c633172ed17f009cc1b9114dc3f412b6f91"} Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.763519 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5zpzv" event={"ID":"fc53884f-e926-4579-956a-0c9719af5d1e","Type":"ContainerDied","Data":"29f8cb5dcecf22a8c0b8b2443cdd71954f5dcbf716659229fa65a0f279b47e16"} Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.763552 5094 scope.go:117] "RemoveContainer" containerID="a9a00fd798c4e21cff5dbd987e0c9c633172ed17f009cc1b9114dc3f412b6f91" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.763554 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5zpzv" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.801217 5094 scope.go:117] "RemoveContainer" containerID="16452559f322a735a8317b7a294e5189e9d7b6b8045c95aa3d711b6779032c89" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.828063 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5zpzv"] Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.841064 5094 scope.go:117] "RemoveContainer" containerID="c5708f17c871b826ce6a820ca60584c66b7256d5db35ca619fd1c4b2123c3bcf" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.846065 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5zpzv"] Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.865294 5094 scope.go:117] "RemoveContainer" containerID="a9a00fd798c4e21cff5dbd987e0c9c633172ed17f009cc1b9114dc3f412b6f91" Feb 20 07:41:06 crc kubenswrapper[5094]: E0220 07:41:06.865844 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9a00fd798c4e21cff5dbd987e0c9c633172ed17f009cc1b9114dc3f412b6f91\": container with ID starting with a9a00fd798c4e21cff5dbd987e0c9c633172ed17f009cc1b9114dc3f412b6f91 not found: ID does not exist" containerID="a9a00fd798c4e21cff5dbd987e0c9c633172ed17f009cc1b9114dc3f412b6f91" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.865884 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9a00fd798c4e21cff5dbd987e0c9c633172ed17f009cc1b9114dc3f412b6f91"} err="failed to get container status \"a9a00fd798c4e21cff5dbd987e0c9c633172ed17f009cc1b9114dc3f412b6f91\": rpc error: code = NotFound desc = could not find container \"a9a00fd798c4e21cff5dbd987e0c9c633172ed17f009cc1b9114dc3f412b6f91\": container with ID starting with a9a00fd798c4e21cff5dbd987e0c9c633172ed17f009cc1b9114dc3f412b6f91 not found: 
ID does not exist" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.865914 5094 scope.go:117] "RemoveContainer" containerID="16452559f322a735a8317b7a294e5189e9d7b6b8045c95aa3d711b6779032c89" Feb 20 07:41:06 crc kubenswrapper[5094]: E0220 07:41:06.866434 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16452559f322a735a8317b7a294e5189e9d7b6b8045c95aa3d711b6779032c89\": container with ID starting with 16452559f322a735a8317b7a294e5189e9d7b6b8045c95aa3d711b6779032c89 not found: ID does not exist" containerID="16452559f322a735a8317b7a294e5189e9d7b6b8045c95aa3d711b6779032c89" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.866463 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16452559f322a735a8317b7a294e5189e9d7b6b8045c95aa3d711b6779032c89"} err="failed to get container status \"16452559f322a735a8317b7a294e5189e9d7b6b8045c95aa3d711b6779032c89\": rpc error: code = NotFound desc = could not find container \"16452559f322a735a8317b7a294e5189e9d7b6b8045c95aa3d711b6779032c89\": container with ID starting with 16452559f322a735a8317b7a294e5189e9d7b6b8045c95aa3d711b6779032c89 not found: ID does not exist" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.866482 5094 scope.go:117] "RemoveContainer" containerID="c5708f17c871b826ce6a820ca60584c66b7256d5db35ca619fd1c4b2123c3bcf" Feb 20 07:41:06 crc kubenswrapper[5094]: E0220 07:41:06.866875 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5708f17c871b826ce6a820ca60584c66b7256d5db35ca619fd1c4b2123c3bcf\": container with ID starting with c5708f17c871b826ce6a820ca60584c66b7256d5db35ca619fd1c4b2123c3bcf not found: ID does not exist" containerID="c5708f17c871b826ce6a820ca60584c66b7256d5db35ca619fd1c4b2123c3bcf" Feb 20 07:41:06 crc kubenswrapper[5094]: I0220 07:41:06.866902 5094 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5708f17c871b826ce6a820ca60584c66b7256d5db35ca619fd1c4b2123c3bcf"} err="failed to get container status \"c5708f17c871b826ce6a820ca60584c66b7256d5db35ca619fd1c4b2123c3bcf\": rpc error: code = NotFound desc = could not find container \"c5708f17c871b826ce6a820ca60584c66b7256d5db35ca619fd1c4b2123c3bcf\": container with ID starting with c5708f17c871b826ce6a820ca60584c66b7256d5db35ca619fd1c4b2123c3bcf not found: ID does not exist" Feb 20 07:41:07 crc kubenswrapper[5094]: I0220 07:41:07.856476 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc53884f-e926-4579-956a-0c9719af5d1e" path="/var/lib/kubelet/pods/fc53884f-e926-4579-956a-0c9719af5d1e/volumes" Feb 20 07:41:34 crc kubenswrapper[5094]: I0220 07:41:34.107317 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:41:34 crc kubenswrapper[5094]: I0220 07:41:34.108088 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:42:04 crc kubenswrapper[5094]: I0220 07:42:04.106954 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:42:04 crc kubenswrapper[5094]: I0220 07:42:04.108031 5094 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:42:34 crc kubenswrapper[5094]: I0220 07:42:34.106643 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:42:34 crc kubenswrapper[5094]: I0220 07:42:34.107501 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:42:34 crc kubenswrapper[5094]: I0220 07:42:34.107581 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 07:42:34 crc kubenswrapper[5094]: I0220 07:42:34.108619 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"75ecd0d14bd85ee39b5bd2719875383b1d61189bd8817e3a936faa65a117f070"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 07:42:34 crc kubenswrapper[5094]: I0220 07:42:34.108765 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" 
containerID="cri-o://75ecd0d14bd85ee39b5bd2719875383b1d61189bd8817e3a936faa65a117f070" gracePeriod=600 Feb 20 07:42:34 crc kubenswrapper[5094]: I0220 07:42:34.666019 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="75ecd0d14bd85ee39b5bd2719875383b1d61189bd8817e3a936faa65a117f070" exitCode=0 Feb 20 07:42:34 crc kubenswrapper[5094]: I0220 07:42:34.666055 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"75ecd0d14bd85ee39b5bd2719875383b1d61189bd8817e3a936faa65a117f070"} Feb 20 07:42:34 crc kubenswrapper[5094]: I0220 07:42:34.666579 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949"} Feb 20 07:42:34 crc kubenswrapper[5094]: I0220 07:42:34.666607 5094 scope.go:117] "RemoveContainer" containerID="66e673e2a79c6f1dde87c9e7a68a4265391a0d5f787a8992bbf680614214815c" Feb 20 07:44:34 crc kubenswrapper[5094]: I0220 07:44:34.107458 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:44:34 crc kubenswrapper[5094]: I0220 07:44:34.109988 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:45:00 crc kubenswrapper[5094]: 
I0220 07:45:00.173343 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz"] Feb 20 07:45:00 crc kubenswrapper[5094]: E0220 07:45:00.174940 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc53884f-e926-4579-956a-0c9719af5d1e" containerName="extract-utilities" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.174967 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc53884f-e926-4579-956a-0c9719af5d1e" containerName="extract-utilities" Feb 20 07:45:00 crc kubenswrapper[5094]: E0220 07:45:00.174988 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc53884f-e926-4579-956a-0c9719af5d1e" containerName="registry-server" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.175000 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc53884f-e926-4579-956a-0c9719af5d1e" containerName="registry-server" Feb 20 07:45:00 crc kubenswrapper[5094]: E0220 07:45:00.175033 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc53884f-e926-4579-956a-0c9719af5d1e" containerName="extract-content" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.175044 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc53884f-e926-4579-956a-0c9719af5d1e" containerName="extract-content" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.175264 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc53884f-e926-4579-956a-0c9719af5d1e" containerName="registry-server" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.176025 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.179789 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.180140 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz"] Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.180978 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.258614 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f8zp\" (UniqueName: \"kubernetes.io/projected/1c1d2dad-446d-40c2-aceb-de13411f5c93-kube-api-access-9f8zp\") pod \"collect-profiles-29526225-wq7wz\" (UID: \"1c1d2dad-446d-40c2-aceb-de13411f5c93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.259006 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c1d2dad-446d-40c2-aceb-de13411f5c93-secret-volume\") pod \"collect-profiles-29526225-wq7wz\" (UID: \"1c1d2dad-446d-40c2-aceb-de13411f5c93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.259051 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c1d2dad-446d-40c2-aceb-de13411f5c93-config-volume\") pod \"collect-profiles-29526225-wq7wz\" (UID: \"1c1d2dad-446d-40c2-aceb-de13411f5c93\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.360889 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f8zp\" (UniqueName: \"kubernetes.io/projected/1c1d2dad-446d-40c2-aceb-de13411f5c93-kube-api-access-9f8zp\") pod \"collect-profiles-29526225-wq7wz\" (UID: \"1c1d2dad-446d-40c2-aceb-de13411f5c93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.361311 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c1d2dad-446d-40c2-aceb-de13411f5c93-secret-volume\") pod \"collect-profiles-29526225-wq7wz\" (UID: \"1c1d2dad-446d-40c2-aceb-de13411f5c93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.361360 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c1d2dad-446d-40c2-aceb-de13411f5c93-config-volume\") pod \"collect-profiles-29526225-wq7wz\" (UID: \"1c1d2dad-446d-40c2-aceb-de13411f5c93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.364518 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c1d2dad-446d-40c2-aceb-de13411f5c93-config-volume\") pod \"collect-profiles-29526225-wq7wz\" (UID: \"1c1d2dad-446d-40c2-aceb-de13411f5c93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.374002 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/1c1d2dad-446d-40c2-aceb-de13411f5c93-secret-volume\") pod \"collect-profiles-29526225-wq7wz\" (UID: \"1c1d2dad-446d-40c2-aceb-de13411f5c93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.395660 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f8zp\" (UniqueName: \"kubernetes.io/projected/1c1d2dad-446d-40c2-aceb-de13411f5c93-kube-api-access-9f8zp\") pod \"collect-profiles-29526225-wq7wz\" (UID: \"1c1d2dad-446d-40c2-aceb-de13411f5c93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.515675 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" Feb 20 07:45:00 crc kubenswrapper[5094]: I0220 07:45:00.807773 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz"] Feb 20 07:45:01 crc kubenswrapper[5094]: I0220 07:45:01.139921 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" event={"ID":"1c1d2dad-446d-40c2-aceb-de13411f5c93","Type":"ContainerStarted","Data":"362eb9911b24b0b4b251a0b53f012f2ce0407b0cfe5ced9cb86222665657d8eb"} Feb 20 07:45:01 crc kubenswrapper[5094]: I0220 07:45:01.140433 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" event={"ID":"1c1d2dad-446d-40c2-aceb-de13411f5c93","Type":"ContainerStarted","Data":"0590d2f3731c18e064c43a35704333d310e3d17c842b6353c78f230a2bbbb357"} Feb 20 07:45:01 crc kubenswrapper[5094]: I0220 07:45:01.171101 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" 
podStartSLOduration=1.169931681 podStartE2EDuration="1.169931681s" podCreationTimestamp="2026-02-20 07:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 07:45:01.163945316 +0000 UTC m=+3516.036572067" watchObservedRunningTime="2026-02-20 07:45:01.169931681 +0000 UTC m=+3516.042558402" Feb 20 07:45:02 crc kubenswrapper[5094]: I0220 07:45:02.155607 5094 generic.go:334] "Generic (PLEG): container finished" podID="1c1d2dad-446d-40c2-aceb-de13411f5c93" containerID="362eb9911b24b0b4b251a0b53f012f2ce0407b0cfe5ced9cb86222665657d8eb" exitCode=0 Feb 20 07:45:02 crc kubenswrapper[5094]: I0220 07:45:02.155697 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" event={"ID":"1c1d2dad-446d-40c2-aceb-de13411f5c93","Type":"ContainerDied","Data":"362eb9911b24b0b4b251a0b53f012f2ce0407b0cfe5ced9cb86222665657d8eb"} Feb 20 07:45:03 crc kubenswrapper[5094]: I0220 07:45:03.567238 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" Feb 20 07:45:03 crc kubenswrapper[5094]: I0220 07:45:03.724240 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f8zp\" (UniqueName: \"kubernetes.io/projected/1c1d2dad-446d-40c2-aceb-de13411f5c93-kube-api-access-9f8zp\") pod \"1c1d2dad-446d-40c2-aceb-de13411f5c93\" (UID: \"1c1d2dad-446d-40c2-aceb-de13411f5c93\") " Feb 20 07:45:03 crc kubenswrapper[5094]: I0220 07:45:03.724406 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c1d2dad-446d-40c2-aceb-de13411f5c93-config-volume\") pod \"1c1d2dad-446d-40c2-aceb-de13411f5c93\" (UID: \"1c1d2dad-446d-40c2-aceb-de13411f5c93\") " Feb 20 07:45:03 crc kubenswrapper[5094]: I0220 07:45:03.724452 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c1d2dad-446d-40c2-aceb-de13411f5c93-secret-volume\") pod \"1c1d2dad-446d-40c2-aceb-de13411f5c93\" (UID: \"1c1d2dad-446d-40c2-aceb-de13411f5c93\") " Feb 20 07:45:03 crc kubenswrapper[5094]: I0220 07:45:03.725515 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c1d2dad-446d-40c2-aceb-de13411f5c93-config-volume" (OuterVolumeSpecName: "config-volume") pod "1c1d2dad-446d-40c2-aceb-de13411f5c93" (UID: "1c1d2dad-446d-40c2-aceb-de13411f5c93"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 07:45:03 crc kubenswrapper[5094]: I0220 07:45:03.725842 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c1d2dad-446d-40c2-aceb-de13411f5c93-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 07:45:03 crc kubenswrapper[5094]: I0220 07:45:03.731537 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c1d2dad-446d-40c2-aceb-de13411f5c93-kube-api-access-9f8zp" (OuterVolumeSpecName: "kube-api-access-9f8zp") pod "1c1d2dad-446d-40c2-aceb-de13411f5c93" (UID: "1c1d2dad-446d-40c2-aceb-de13411f5c93"). InnerVolumeSpecName "kube-api-access-9f8zp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:45:03 crc kubenswrapper[5094]: I0220 07:45:03.735033 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c1d2dad-446d-40c2-aceb-de13411f5c93-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1c1d2dad-446d-40c2-aceb-de13411f5c93" (UID: "1c1d2dad-446d-40c2-aceb-de13411f5c93"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 07:45:03 crc kubenswrapper[5094]: I0220 07:45:03.827288 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9f8zp\" (UniqueName: \"kubernetes.io/projected/1c1d2dad-446d-40c2-aceb-de13411f5c93-kube-api-access-9f8zp\") on node \"crc\" DevicePath \"\"" Feb 20 07:45:03 crc kubenswrapper[5094]: I0220 07:45:03.827333 5094 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1c1d2dad-446d-40c2-aceb-de13411f5c93-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 07:45:04 crc kubenswrapper[5094]: I0220 07:45:04.106892 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:45:04 crc kubenswrapper[5094]: I0220 07:45:04.107028 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:45:04 crc kubenswrapper[5094]: I0220 07:45:04.178275 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" event={"ID":"1c1d2dad-446d-40c2-aceb-de13411f5c93","Type":"ContainerDied","Data":"0590d2f3731c18e064c43a35704333d310e3d17c842b6353c78f230a2bbbb357"} Feb 20 07:45:04 crc kubenswrapper[5094]: I0220 07:45:04.178347 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0590d2f3731c18e064c43a35704333d310e3d17c842b6353c78f230a2bbbb357" Feb 20 07:45:04 crc kubenswrapper[5094]: I0220 07:45:04.178401 5094 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz" Feb 20 07:45:04 crc kubenswrapper[5094]: I0220 07:45:04.257423 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5"] Feb 20 07:45:04 crc kubenswrapper[5094]: I0220 07:45:04.269625 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526180-74wt5"] Feb 20 07:45:05 crc kubenswrapper[5094]: I0220 07:45:05.851334 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76c5d78f-89fe-4ed3-b64e-52de9f0dec4d" path="/var/lib/kubelet/pods/76c5d78f-89fe-4ed3-b64e-52de9f0dec4d/volumes" Feb 20 07:45:34 crc kubenswrapper[5094]: I0220 07:45:34.106648 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:45:34 crc kubenswrapper[5094]: I0220 07:45:34.107756 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:45:34 crc kubenswrapper[5094]: I0220 07:45:34.107888 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 07:45:34 crc kubenswrapper[5094]: I0220 07:45:34.109160 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949"} 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 07:45:34 crc kubenswrapper[5094]: I0220 07:45:34.109273 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" gracePeriod=600 Feb 20 07:45:34 crc kubenswrapper[5094]: E0220 07:45:34.247347 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:45:34 crc kubenswrapper[5094]: I0220 07:45:34.521260 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" exitCode=0 Feb 20 07:45:34 crc kubenswrapper[5094]: I0220 07:45:34.521699 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949"} Feb 20 07:45:34 crc kubenswrapper[5094]: I0220 07:45:34.522057 5094 scope.go:117] "RemoveContainer" containerID="75ecd0d14bd85ee39b5bd2719875383b1d61189bd8817e3a936faa65a117f070" Feb 20 07:45:34 crc kubenswrapper[5094]: I0220 07:45:34.523053 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 
20 07:45:34 crc kubenswrapper[5094]: E0220 07:45:34.523494 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:45:35 crc kubenswrapper[5094]: I0220 07:45:35.268464 5094 scope.go:117] "RemoveContainer" containerID="a3c7984448d7f3db690223dff864550548436ad39114dac24772a87d3288c8ea" Feb 20 07:45:45 crc kubenswrapper[5094]: I0220 07:45:45.849589 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:45:45 crc kubenswrapper[5094]: E0220 07:45:45.850921 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:45:56 crc kubenswrapper[5094]: I0220 07:45:56.842413 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:45:56 crc kubenswrapper[5094]: E0220 07:45:56.843980 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:46:09 crc kubenswrapper[5094]: I0220 07:46:09.840985 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:46:09 crc kubenswrapper[5094]: E0220 07:46:09.842164 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:46:22 crc kubenswrapper[5094]: I0220 07:46:22.841122 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:46:22 crc kubenswrapper[5094]: E0220 07:46:22.842676 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:46:35 crc kubenswrapper[5094]: I0220 07:46:35.841137 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:46:35 crc kubenswrapper[5094]: E0220 07:46:35.843199 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:46:46 crc kubenswrapper[5094]: I0220 07:46:46.840444 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:46:46 crc kubenswrapper[5094]: E0220 07:46:46.841591 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:46:59 crc kubenswrapper[5094]: I0220 07:46:59.840129 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:46:59 crc kubenswrapper[5094]: E0220 07:46:59.841121 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:47:14 crc kubenswrapper[5094]: I0220 07:47:14.841976 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:47:14 crc kubenswrapper[5094]: E0220 07:47:14.842992 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:47:28 crc kubenswrapper[5094]: I0220 07:47:28.840230 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:47:28 crc kubenswrapper[5094]: E0220 07:47:28.841310 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:47:39 crc kubenswrapper[5094]: I0220 07:47:39.840589 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:47:39 crc kubenswrapper[5094]: E0220 07:47:39.841987 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:47:50 crc kubenswrapper[5094]: I0220 07:47:50.842222 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:47:50 crc kubenswrapper[5094]: E0220 07:47:50.843749 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.444893 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-npwbj"] Feb 20 07:48:02 crc kubenswrapper[5094]: E0220 07:48:02.446178 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c1d2dad-446d-40c2-aceb-de13411f5c93" containerName="collect-profiles" Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.446198 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c1d2dad-446d-40c2-aceb-de13411f5c93" containerName="collect-profiles" Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.446418 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c1d2dad-446d-40c2-aceb-de13411f5c93" containerName="collect-profiles" Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.447809 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.461049 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-npwbj"] Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.559408 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vj9s\" (UniqueName: \"kubernetes.io/projected/acf90a1a-02eb-43e2-9533-ca348b502c35-kube-api-access-8vj9s\") pod \"redhat-operators-npwbj\" (UID: \"acf90a1a-02eb-43e2-9533-ca348b502c35\") " pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.559734 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acf90a1a-02eb-43e2-9533-ca348b502c35-utilities\") pod \"redhat-operators-npwbj\" (UID: \"acf90a1a-02eb-43e2-9533-ca348b502c35\") " pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.559798 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acf90a1a-02eb-43e2-9533-ca348b502c35-catalog-content\") pod \"redhat-operators-npwbj\" (UID: \"acf90a1a-02eb-43e2-9533-ca348b502c35\") " pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.661337 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vj9s\" (UniqueName: \"kubernetes.io/projected/acf90a1a-02eb-43e2-9533-ca348b502c35-kube-api-access-8vj9s\") pod \"redhat-operators-npwbj\" (UID: \"acf90a1a-02eb-43e2-9533-ca348b502c35\") " pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.661483 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acf90a1a-02eb-43e2-9533-ca348b502c35-utilities\") pod \"redhat-operators-npwbj\" (UID: \"acf90a1a-02eb-43e2-9533-ca348b502c35\") " pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.661518 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acf90a1a-02eb-43e2-9533-ca348b502c35-catalog-content\") pod \"redhat-operators-npwbj\" (UID: \"acf90a1a-02eb-43e2-9533-ca348b502c35\") " pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.662318 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acf90a1a-02eb-43e2-9533-ca348b502c35-catalog-content\") pod \"redhat-operators-npwbj\" (UID: \"acf90a1a-02eb-43e2-9533-ca348b502c35\") " pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.662598 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acf90a1a-02eb-43e2-9533-ca348b502c35-utilities\") pod \"redhat-operators-npwbj\" (UID: \"acf90a1a-02eb-43e2-9533-ca348b502c35\") " pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.698727 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vj9s\" (UniqueName: \"kubernetes.io/projected/acf90a1a-02eb-43e2-9533-ca348b502c35-kube-api-access-8vj9s\") pod \"redhat-operators-npwbj\" (UID: \"acf90a1a-02eb-43e2-9533-ca348b502c35\") " pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.778644 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:02 crc kubenswrapper[5094]: I0220 07:48:02.840413 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:48:02 crc kubenswrapper[5094]: E0220 07:48:02.841046 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:48:03 crc kubenswrapper[5094]: I0220 07:48:03.358513 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-npwbj"] Feb 20 07:48:04 crc kubenswrapper[5094]: I0220 07:48:04.077497 5094 generic.go:334] "Generic (PLEG): container finished" podID="acf90a1a-02eb-43e2-9533-ca348b502c35" containerID="2c24953b240ddf964b6379fe0d758314d8b25c77572be0dea4c9689428fe0745" exitCode=0 Feb 20 07:48:04 crc kubenswrapper[5094]: I0220 07:48:04.077584 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npwbj" event={"ID":"acf90a1a-02eb-43e2-9533-ca348b502c35","Type":"ContainerDied","Data":"2c24953b240ddf964b6379fe0d758314d8b25c77572be0dea4c9689428fe0745"} Feb 20 07:48:04 crc kubenswrapper[5094]: I0220 07:48:04.077988 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npwbj" event={"ID":"acf90a1a-02eb-43e2-9533-ca348b502c35","Type":"ContainerStarted","Data":"c0599642c8d98050e1f7e47d03328b62c53a4b4fd04a8ac2ddf65e7ae4f3f5e8"} Feb 20 07:48:04 crc kubenswrapper[5094]: I0220 07:48:04.081064 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 07:48:05 crc 
kubenswrapper[5094]: I0220 07:48:05.091120 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npwbj" event={"ID":"acf90a1a-02eb-43e2-9533-ca348b502c35","Type":"ContainerStarted","Data":"9a86fc11356d6af5a7cedfc80ec43da9122b830459a8c0070ff2d5aebd9d1717"} Feb 20 07:48:06 crc kubenswrapper[5094]: I0220 07:48:06.101647 5094 generic.go:334] "Generic (PLEG): container finished" podID="acf90a1a-02eb-43e2-9533-ca348b502c35" containerID="9a86fc11356d6af5a7cedfc80ec43da9122b830459a8c0070ff2d5aebd9d1717" exitCode=0 Feb 20 07:48:06 crc kubenswrapper[5094]: I0220 07:48:06.101749 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npwbj" event={"ID":"acf90a1a-02eb-43e2-9533-ca348b502c35","Type":"ContainerDied","Data":"9a86fc11356d6af5a7cedfc80ec43da9122b830459a8c0070ff2d5aebd9d1717"} Feb 20 07:48:07 crc kubenswrapper[5094]: I0220 07:48:07.114521 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npwbj" event={"ID":"acf90a1a-02eb-43e2-9533-ca348b502c35","Type":"ContainerStarted","Data":"7961c8adb678afe9de644fbcd5dcc9c1fe579a81300a9f219bfc31cc94e5ed4e"} Feb 20 07:48:07 crc kubenswrapper[5094]: I0220 07:48:07.154463 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-npwbj" podStartSLOduration=2.736642839 podStartE2EDuration="5.154434659s" podCreationTimestamp="2026-02-20 07:48:02 +0000 UTC" firstStartedPulling="2026-02-20 07:48:04.080784637 +0000 UTC m=+3698.953411358" lastFinishedPulling="2026-02-20 07:48:06.498576457 +0000 UTC m=+3701.371203178" observedRunningTime="2026-02-20 07:48:07.14485962 +0000 UTC m=+3702.017486371" watchObservedRunningTime="2026-02-20 07:48:07.154434659 +0000 UTC m=+3702.027061410" Feb 20 07:48:12 crc kubenswrapper[5094]: I0220 07:48:12.779180 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:12 crc kubenswrapper[5094]: I0220 07:48:12.780955 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:13 crc kubenswrapper[5094]: I0220 07:48:13.852596 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-npwbj" podUID="acf90a1a-02eb-43e2-9533-ca348b502c35" containerName="registry-server" probeResult="failure" output=< Feb 20 07:48:13 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 07:48:13 crc kubenswrapper[5094]: > Feb 20 07:48:14 crc kubenswrapper[5094]: I0220 07:48:14.841358 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:48:14 crc kubenswrapper[5094]: E0220 07:48:14.842186 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:48:22 crc kubenswrapper[5094]: I0220 07:48:22.854633 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:22 crc kubenswrapper[5094]: I0220 07:48:22.933276 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:23 crc kubenswrapper[5094]: I0220 07:48:23.105830 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-npwbj"] Feb 20 07:48:24 crc kubenswrapper[5094]: I0220 07:48:24.325364 5094 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/redhat-operators-npwbj" podUID="acf90a1a-02eb-43e2-9533-ca348b502c35" containerName="registry-server" containerID="cri-o://7961c8adb678afe9de644fbcd5dcc9c1fe579a81300a9f219bfc31cc94e5ed4e" gracePeriod=2 Feb 20 07:48:24 crc kubenswrapper[5094]: I0220 07:48:24.850780 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:24 crc kubenswrapper[5094]: I0220 07:48:24.975971 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acf90a1a-02eb-43e2-9533-ca348b502c35-utilities\") pod \"acf90a1a-02eb-43e2-9533-ca348b502c35\" (UID: \"acf90a1a-02eb-43e2-9533-ca348b502c35\") " Feb 20 07:48:24 crc kubenswrapper[5094]: I0220 07:48:24.976131 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vj9s\" (UniqueName: \"kubernetes.io/projected/acf90a1a-02eb-43e2-9533-ca348b502c35-kube-api-access-8vj9s\") pod \"acf90a1a-02eb-43e2-9533-ca348b502c35\" (UID: \"acf90a1a-02eb-43e2-9533-ca348b502c35\") " Feb 20 07:48:24 crc kubenswrapper[5094]: I0220 07:48:24.976236 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acf90a1a-02eb-43e2-9533-ca348b502c35-catalog-content\") pod \"acf90a1a-02eb-43e2-9533-ca348b502c35\" (UID: \"acf90a1a-02eb-43e2-9533-ca348b502c35\") " Feb 20 07:48:24 crc kubenswrapper[5094]: I0220 07:48:24.978111 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acf90a1a-02eb-43e2-9533-ca348b502c35-utilities" (OuterVolumeSpecName: "utilities") pod "acf90a1a-02eb-43e2-9533-ca348b502c35" (UID: "acf90a1a-02eb-43e2-9533-ca348b502c35"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:48:24 crc kubenswrapper[5094]: I0220 07:48:24.987095 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acf90a1a-02eb-43e2-9533-ca348b502c35-kube-api-access-8vj9s" (OuterVolumeSpecName: "kube-api-access-8vj9s") pod "acf90a1a-02eb-43e2-9533-ca348b502c35" (UID: "acf90a1a-02eb-43e2-9533-ca348b502c35"). InnerVolumeSpecName "kube-api-access-8vj9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.078790 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acf90a1a-02eb-43e2-9533-ca348b502c35-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.078847 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vj9s\" (UniqueName: \"kubernetes.io/projected/acf90a1a-02eb-43e2-9533-ca348b502c35-kube-api-access-8vj9s\") on node \"crc\" DevicePath \"\"" Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.154597 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acf90a1a-02eb-43e2-9533-ca348b502c35-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "acf90a1a-02eb-43e2-9533-ca348b502c35" (UID: "acf90a1a-02eb-43e2-9533-ca348b502c35"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.180565 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acf90a1a-02eb-43e2-9533-ca348b502c35-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.338559 5094 generic.go:334] "Generic (PLEG): container finished" podID="acf90a1a-02eb-43e2-9533-ca348b502c35" containerID="7961c8adb678afe9de644fbcd5dcc9c1fe579a81300a9f219bfc31cc94e5ed4e" exitCode=0 Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.338633 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npwbj" event={"ID":"acf90a1a-02eb-43e2-9533-ca348b502c35","Type":"ContainerDied","Data":"7961c8adb678afe9de644fbcd5dcc9c1fe579a81300a9f219bfc31cc94e5ed4e"} Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.338657 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-npwbj" Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.338776 5094 scope.go:117] "RemoveContainer" containerID="7961c8adb678afe9de644fbcd5dcc9c1fe579a81300a9f219bfc31cc94e5ed4e" Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.338752 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npwbj" event={"ID":"acf90a1a-02eb-43e2-9533-ca348b502c35","Type":"ContainerDied","Data":"c0599642c8d98050e1f7e47d03328b62c53a4b4fd04a8ac2ddf65e7ae4f3f5e8"} Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.371415 5094 scope.go:117] "RemoveContainer" containerID="9a86fc11356d6af5a7cedfc80ec43da9122b830459a8c0070ff2d5aebd9d1717" Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.390980 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-npwbj"] Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.408028 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-npwbj"] Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.411066 5094 scope.go:117] "RemoveContainer" containerID="2c24953b240ddf964b6379fe0d758314d8b25c77572be0dea4c9689428fe0745" Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.443279 5094 scope.go:117] "RemoveContainer" containerID="7961c8adb678afe9de644fbcd5dcc9c1fe579a81300a9f219bfc31cc94e5ed4e" Feb 20 07:48:25 crc kubenswrapper[5094]: E0220 07:48:25.444858 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7961c8adb678afe9de644fbcd5dcc9c1fe579a81300a9f219bfc31cc94e5ed4e\": container with ID starting with 7961c8adb678afe9de644fbcd5dcc9c1fe579a81300a9f219bfc31cc94e5ed4e not found: ID does not exist" containerID="7961c8adb678afe9de644fbcd5dcc9c1fe579a81300a9f219bfc31cc94e5ed4e" Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.444911 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7961c8adb678afe9de644fbcd5dcc9c1fe579a81300a9f219bfc31cc94e5ed4e"} err="failed to get container status \"7961c8adb678afe9de644fbcd5dcc9c1fe579a81300a9f219bfc31cc94e5ed4e\": rpc error: code = NotFound desc = could not find container \"7961c8adb678afe9de644fbcd5dcc9c1fe579a81300a9f219bfc31cc94e5ed4e\": container with ID starting with 7961c8adb678afe9de644fbcd5dcc9c1fe579a81300a9f219bfc31cc94e5ed4e not found: ID does not exist" Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.445030 5094 scope.go:117] "RemoveContainer" containerID="9a86fc11356d6af5a7cedfc80ec43da9122b830459a8c0070ff2d5aebd9d1717" Feb 20 07:48:25 crc kubenswrapper[5094]: E0220 07:48:25.445490 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a86fc11356d6af5a7cedfc80ec43da9122b830459a8c0070ff2d5aebd9d1717\": container with ID starting with 9a86fc11356d6af5a7cedfc80ec43da9122b830459a8c0070ff2d5aebd9d1717 not found: ID does not exist" containerID="9a86fc11356d6af5a7cedfc80ec43da9122b830459a8c0070ff2d5aebd9d1717" Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.445542 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a86fc11356d6af5a7cedfc80ec43da9122b830459a8c0070ff2d5aebd9d1717"} err="failed to get container status \"9a86fc11356d6af5a7cedfc80ec43da9122b830459a8c0070ff2d5aebd9d1717\": rpc error: code = NotFound desc = could not find container \"9a86fc11356d6af5a7cedfc80ec43da9122b830459a8c0070ff2d5aebd9d1717\": container with ID starting with 9a86fc11356d6af5a7cedfc80ec43da9122b830459a8c0070ff2d5aebd9d1717 not found: ID does not exist" Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.445576 5094 scope.go:117] "RemoveContainer" containerID="2c24953b240ddf964b6379fe0d758314d8b25c77572be0dea4c9689428fe0745" Feb 20 07:48:25 crc kubenswrapper[5094]: E0220 
07:48:25.446230 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c24953b240ddf964b6379fe0d758314d8b25c77572be0dea4c9689428fe0745\": container with ID starting with 2c24953b240ddf964b6379fe0d758314d8b25c77572be0dea4c9689428fe0745 not found: ID does not exist" containerID="2c24953b240ddf964b6379fe0d758314d8b25c77572be0dea4c9689428fe0745" Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.446267 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c24953b240ddf964b6379fe0d758314d8b25c77572be0dea4c9689428fe0745"} err="failed to get container status \"2c24953b240ddf964b6379fe0d758314d8b25c77572be0dea4c9689428fe0745\": rpc error: code = NotFound desc = could not find container \"2c24953b240ddf964b6379fe0d758314d8b25c77572be0dea4c9689428fe0745\": container with ID starting with 2c24953b240ddf964b6379fe0d758314d8b25c77572be0dea4c9689428fe0745 not found: ID does not exist" Feb 20 07:48:25 crc kubenswrapper[5094]: I0220 07:48:25.858306 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acf90a1a-02eb-43e2-9533-ca348b502c35" path="/var/lib/kubelet/pods/acf90a1a-02eb-43e2-9533-ca348b502c35/volumes" Feb 20 07:48:28 crc kubenswrapper[5094]: I0220 07:48:28.840974 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:48:28 crc kubenswrapper[5094]: E0220 07:48:28.844218 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:48:43 crc kubenswrapper[5094]: I0220 07:48:43.841328 
5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:48:43 crc kubenswrapper[5094]: E0220 07:48:43.842616 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:48:56 crc kubenswrapper[5094]: I0220 07:48:56.840620 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:48:56 crc kubenswrapper[5094]: E0220 07:48:56.841974 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:49:09 crc kubenswrapper[5094]: I0220 07:49:09.840507 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:49:09 crc kubenswrapper[5094]: E0220 07:49:09.841810 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:49:21 crc kubenswrapper[5094]: I0220 
07:49:21.841410 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:49:21 crc kubenswrapper[5094]: E0220 07:49:21.843950 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:49:35 crc kubenswrapper[5094]: I0220 07:49:35.882966 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:49:35 crc kubenswrapper[5094]: E0220 07:49:35.885940 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:49:48 crc kubenswrapper[5094]: I0220 07:49:48.840812 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:49:48 crc kubenswrapper[5094]: E0220 07:49:48.842111 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:50:03 crc 
kubenswrapper[5094]: I0220 07:50:03.841138 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:50:03 crc kubenswrapper[5094]: E0220 07:50:03.842846 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:50:18 crc kubenswrapper[5094]: I0220 07:50:18.840474 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:50:18 crc kubenswrapper[5094]: E0220 07:50:18.841432 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:50:32 crc kubenswrapper[5094]: I0220 07:50:32.841180 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:50:32 crc kubenswrapper[5094]: E0220 07:50:32.843206 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 
20 07:50:46 crc kubenswrapper[5094]: I0220 07:50:46.840194 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:50:47 crc kubenswrapper[5094]: I0220 07:50:47.864161 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"49c679c1e094a953959d220d84ae5c3290c008d2ae0e9a9a08ce980339bbcafe"} Feb 20 07:51:00 crc kubenswrapper[5094]: I0220 07:51:00.924941 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nqmbf"] Feb 20 07:51:00 crc kubenswrapper[5094]: E0220 07:51:00.926762 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acf90a1a-02eb-43e2-9533-ca348b502c35" containerName="registry-server" Feb 20 07:51:00 crc kubenswrapper[5094]: I0220 07:51:00.926792 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="acf90a1a-02eb-43e2-9533-ca348b502c35" containerName="registry-server" Feb 20 07:51:00 crc kubenswrapper[5094]: E0220 07:51:00.926812 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acf90a1a-02eb-43e2-9533-ca348b502c35" containerName="extract-utilities" Feb 20 07:51:00 crc kubenswrapper[5094]: I0220 07:51:00.926829 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="acf90a1a-02eb-43e2-9533-ca348b502c35" containerName="extract-utilities" Feb 20 07:51:00 crc kubenswrapper[5094]: E0220 07:51:00.926869 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acf90a1a-02eb-43e2-9533-ca348b502c35" containerName="extract-content" Feb 20 07:51:00 crc kubenswrapper[5094]: I0220 07:51:00.926887 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="acf90a1a-02eb-43e2-9533-ca348b502c35" containerName="extract-content" Feb 20 07:51:00 crc kubenswrapper[5094]: I0220 07:51:00.927176 5094 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="acf90a1a-02eb-43e2-9533-ca348b502c35" containerName="registry-server" Feb 20 07:51:00 crc kubenswrapper[5094]: I0220 07:51:00.966553 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nqmbf"] Feb 20 07:51:00 crc kubenswrapper[5094]: I0220 07:51:00.966761 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:01 crc kubenswrapper[5094]: I0220 07:51:01.019317 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-catalog-content\") pod \"redhat-marketplace-nqmbf\" (UID: \"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea\") " pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:01 crc kubenswrapper[5094]: I0220 07:51:01.019431 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72wpz\" (UniqueName: \"kubernetes.io/projected/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-kube-api-access-72wpz\") pod \"redhat-marketplace-nqmbf\" (UID: \"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea\") " pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:01 crc kubenswrapper[5094]: I0220 07:51:01.019544 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-utilities\") pod \"redhat-marketplace-nqmbf\" (UID: \"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea\") " pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:01 crc kubenswrapper[5094]: I0220 07:51:01.121328 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-catalog-content\") pod \"redhat-marketplace-nqmbf\" (UID: 
\"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea\") " pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:01 crc kubenswrapper[5094]: I0220 07:51:01.121425 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72wpz\" (UniqueName: \"kubernetes.io/projected/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-kube-api-access-72wpz\") pod \"redhat-marketplace-nqmbf\" (UID: \"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea\") " pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:01 crc kubenswrapper[5094]: I0220 07:51:01.121486 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-utilities\") pod \"redhat-marketplace-nqmbf\" (UID: \"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea\") " pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:01 crc kubenswrapper[5094]: I0220 07:51:01.122115 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-utilities\") pod \"redhat-marketplace-nqmbf\" (UID: \"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea\") " pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:01 crc kubenswrapper[5094]: I0220 07:51:01.122351 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-catalog-content\") pod \"redhat-marketplace-nqmbf\" (UID: \"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea\") " pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:01 crc kubenswrapper[5094]: I0220 07:51:01.452073 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72wpz\" (UniqueName: \"kubernetes.io/projected/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-kube-api-access-72wpz\") pod \"redhat-marketplace-nqmbf\" (UID: \"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea\") " 
pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:01 crc kubenswrapper[5094]: I0220 07:51:01.596320 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:02 crc kubenswrapper[5094]: I0220 07:51:02.095740 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nqmbf"] Feb 20 07:51:03 crc kubenswrapper[5094]: I0220 07:51:03.030183 5094 generic.go:334] "Generic (PLEG): container finished" podID="10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea" containerID="4d03e404f80e8df8f6097f4bd57639027307f3fdb1534d2065714dc427ed3daf" exitCode=0 Feb 20 07:51:03 crc kubenswrapper[5094]: I0220 07:51:03.030466 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nqmbf" event={"ID":"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea","Type":"ContainerDied","Data":"4d03e404f80e8df8f6097f4bd57639027307f3fdb1534d2065714dc427ed3daf"} Feb 20 07:51:03 crc kubenswrapper[5094]: I0220 07:51:03.030814 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nqmbf" event={"ID":"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea","Type":"ContainerStarted","Data":"0984cce57642e3526adbaa0385b41cc36a181fef4aa681377f1c1423aa7645e7"} Feb 20 07:51:05 crc kubenswrapper[5094]: I0220 07:51:05.056596 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nqmbf" event={"ID":"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea","Type":"ContainerStarted","Data":"811576eef7a821dca9af2ed9a0b2b13836a41f67b8667b77e774e371ca39504e"} Feb 20 07:51:06 crc kubenswrapper[5094]: I0220 07:51:06.070020 5094 generic.go:334] "Generic (PLEG): container finished" podID="10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea" containerID="811576eef7a821dca9af2ed9a0b2b13836a41f67b8667b77e774e371ca39504e" exitCode=0 Feb 20 07:51:06 crc kubenswrapper[5094]: I0220 07:51:06.070137 5094 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-nqmbf" event={"ID":"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea","Type":"ContainerDied","Data":"811576eef7a821dca9af2ed9a0b2b13836a41f67b8667b77e774e371ca39504e"} Feb 20 07:51:07 crc kubenswrapper[5094]: I0220 07:51:07.099443 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nqmbf" event={"ID":"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea","Type":"ContainerStarted","Data":"96dde62521256e51aa04b75061f5e0c11d74ceb462326152b67cb1c7bb89609c"} Feb 20 07:51:07 crc kubenswrapper[5094]: I0220 07:51:07.141356 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nqmbf" podStartSLOduration=3.707493975 podStartE2EDuration="7.1413118s" podCreationTimestamp="2026-02-20 07:51:00 +0000 UTC" firstStartedPulling="2026-02-20 07:51:03.03408804 +0000 UTC m=+3877.906714791" lastFinishedPulling="2026-02-20 07:51:06.467905855 +0000 UTC m=+3881.340532616" observedRunningTime="2026-02-20 07:51:07.132513098 +0000 UTC m=+3882.005139899" watchObservedRunningTime="2026-02-20 07:51:07.1413118 +0000 UTC m=+3882.013938551" Feb 20 07:51:11 crc kubenswrapper[5094]: I0220 07:51:11.597392 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:11 crc kubenswrapper[5094]: I0220 07:51:11.598372 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:11 crc kubenswrapper[5094]: I0220 07:51:11.667186 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:12 crc kubenswrapper[5094]: I0220 07:51:12.203222 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:12 crc kubenswrapper[5094]: I0220 07:51:12.292296 5094 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nqmbf"] Feb 20 07:51:14 crc kubenswrapper[5094]: I0220 07:51:14.169139 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nqmbf" podUID="10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea" containerName="registry-server" containerID="cri-o://96dde62521256e51aa04b75061f5e0c11d74ceb462326152b67cb1c7bb89609c" gracePeriod=2 Feb 20 07:51:14 crc kubenswrapper[5094]: I0220 07:51:14.743504 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:14 crc kubenswrapper[5094]: I0220 07:51:14.907796 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-catalog-content\") pod \"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea\" (UID: \"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea\") " Feb 20 07:51:14 crc kubenswrapper[5094]: I0220 07:51:14.907884 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72wpz\" (UniqueName: \"kubernetes.io/projected/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-kube-api-access-72wpz\") pod \"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea\" (UID: \"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea\") " Feb 20 07:51:14 crc kubenswrapper[5094]: I0220 07:51:14.908023 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-utilities\") pod \"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea\" (UID: \"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea\") " Feb 20 07:51:14 crc kubenswrapper[5094]: I0220 07:51:14.909990 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-utilities" (OuterVolumeSpecName: "utilities") pod 
"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea" (UID: "10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:51:14 crc kubenswrapper[5094]: I0220 07:51:14.947939 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea" (UID: "10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.010787 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.010869 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.144785 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-kube-api-access-72wpz" (OuterVolumeSpecName: "kube-api-access-72wpz") pod "10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea" (UID: "10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea"). InnerVolumeSpecName "kube-api-access-72wpz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.195274 5094 generic.go:334] "Generic (PLEG): container finished" podID="10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea" containerID="96dde62521256e51aa04b75061f5e0c11d74ceb462326152b67cb1c7bb89609c" exitCode=0 Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.195346 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nqmbf" event={"ID":"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea","Type":"ContainerDied","Data":"96dde62521256e51aa04b75061f5e0c11d74ceb462326152b67cb1c7bb89609c"} Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.195387 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nqmbf" event={"ID":"10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea","Type":"ContainerDied","Data":"0984cce57642e3526adbaa0385b41cc36a181fef4aa681377f1c1423aa7645e7"} Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.195409 5094 scope.go:117] "RemoveContainer" containerID="96dde62521256e51aa04b75061f5e0c11d74ceb462326152b67cb1c7bb89609c" Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.195792 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nqmbf" Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.215380 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72wpz\" (UniqueName: \"kubernetes.io/projected/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea-kube-api-access-72wpz\") on node \"crc\" DevicePath \"\"" Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.235917 5094 scope.go:117] "RemoveContainer" containerID="811576eef7a821dca9af2ed9a0b2b13836a41f67b8667b77e774e371ca39504e" Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.253798 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nqmbf"] Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.268572 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nqmbf"] Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.359095 5094 scope.go:117] "RemoveContainer" containerID="4d03e404f80e8df8f6097f4bd57639027307f3fdb1534d2065714dc427ed3daf" Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.389044 5094 scope.go:117] "RemoveContainer" containerID="96dde62521256e51aa04b75061f5e0c11d74ceb462326152b67cb1c7bb89609c" Feb 20 07:51:15 crc kubenswrapper[5094]: E0220 07:51:15.389668 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96dde62521256e51aa04b75061f5e0c11d74ceb462326152b67cb1c7bb89609c\": container with ID starting with 96dde62521256e51aa04b75061f5e0c11d74ceb462326152b67cb1c7bb89609c not found: ID does not exist" containerID="96dde62521256e51aa04b75061f5e0c11d74ceb462326152b67cb1c7bb89609c" Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.389786 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96dde62521256e51aa04b75061f5e0c11d74ceb462326152b67cb1c7bb89609c"} err="failed to get container status 
\"96dde62521256e51aa04b75061f5e0c11d74ceb462326152b67cb1c7bb89609c\": rpc error: code = NotFound desc = could not find container \"96dde62521256e51aa04b75061f5e0c11d74ceb462326152b67cb1c7bb89609c\": container with ID starting with 96dde62521256e51aa04b75061f5e0c11d74ceb462326152b67cb1c7bb89609c not found: ID does not exist" Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.389849 5094 scope.go:117] "RemoveContainer" containerID="811576eef7a821dca9af2ed9a0b2b13836a41f67b8667b77e774e371ca39504e" Feb 20 07:51:15 crc kubenswrapper[5094]: E0220 07:51:15.390419 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"811576eef7a821dca9af2ed9a0b2b13836a41f67b8667b77e774e371ca39504e\": container with ID starting with 811576eef7a821dca9af2ed9a0b2b13836a41f67b8667b77e774e371ca39504e not found: ID does not exist" containerID="811576eef7a821dca9af2ed9a0b2b13836a41f67b8667b77e774e371ca39504e" Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.390488 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"811576eef7a821dca9af2ed9a0b2b13836a41f67b8667b77e774e371ca39504e"} err="failed to get container status \"811576eef7a821dca9af2ed9a0b2b13836a41f67b8667b77e774e371ca39504e\": rpc error: code = NotFound desc = could not find container \"811576eef7a821dca9af2ed9a0b2b13836a41f67b8667b77e774e371ca39504e\": container with ID starting with 811576eef7a821dca9af2ed9a0b2b13836a41f67b8667b77e774e371ca39504e not found: ID does not exist" Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.390532 5094 scope.go:117] "RemoveContainer" containerID="4d03e404f80e8df8f6097f4bd57639027307f3fdb1534d2065714dc427ed3daf" Feb 20 07:51:15 crc kubenswrapper[5094]: E0220 07:51:15.391222 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4d03e404f80e8df8f6097f4bd57639027307f3fdb1534d2065714dc427ed3daf\": container with ID starting with 4d03e404f80e8df8f6097f4bd57639027307f3fdb1534d2065714dc427ed3daf not found: ID does not exist" containerID="4d03e404f80e8df8f6097f4bd57639027307f3fdb1534d2065714dc427ed3daf" Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.391274 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d03e404f80e8df8f6097f4bd57639027307f3fdb1534d2065714dc427ed3daf"} err="failed to get container status \"4d03e404f80e8df8f6097f4bd57639027307f3fdb1534d2065714dc427ed3daf\": rpc error: code = NotFound desc = could not find container \"4d03e404f80e8df8f6097f4bd57639027307f3fdb1534d2065714dc427ed3daf\": container with ID starting with 4d03e404f80e8df8f6097f4bd57639027307f3fdb1534d2065714dc427ed3daf not found: ID does not exist" Feb 20 07:51:15 crc kubenswrapper[5094]: I0220 07:51:15.857739 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea" path="/var/lib/kubelet/pods/10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea/volumes" Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.751929 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4wbdf"] Feb 20 07:51:47 crc kubenswrapper[5094]: E0220 07:51:47.753379 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea" containerName="extract-content" Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.753397 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea" containerName="extract-content" Feb 20 07:51:47 crc kubenswrapper[5094]: E0220 07:51:47.753423 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea" containerName="extract-utilities" Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.753432 5094 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea" containerName="extract-utilities" Feb 20 07:51:47 crc kubenswrapper[5094]: E0220 07:51:47.753464 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea" containerName="registry-server" Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.753473 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea" containerName="registry-server" Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.753668 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="10bd2f4d-0c1f-4daf-b52c-bcfcf28251ea" containerName="registry-server" Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.755067 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.777405 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4wbdf"] Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.865827 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xch2z\" (UniqueName: \"kubernetes.io/projected/4a369096-f833-46f9-93c3-f05f985168c2-kube-api-access-xch2z\") pod \"certified-operators-4wbdf\" (UID: \"4a369096-f833-46f9-93c3-f05f985168c2\") " pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.865914 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a369096-f833-46f9-93c3-f05f985168c2-catalog-content\") pod \"certified-operators-4wbdf\" (UID: \"4a369096-f833-46f9-93c3-f05f985168c2\") " pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.865941 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a369096-f833-46f9-93c3-f05f985168c2-utilities\") pod \"certified-operators-4wbdf\" (UID: \"4a369096-f833-46f9-93c3-f05f985168c2\") " pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.967194 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xch2z\" (UniqueName: \"kubernetes.io/projected/4a369096-f833-46f9-93c3-f05f985168c2-kube-api-access-xch2z\") pod \"certified-operators-4wbdf\" (UID: \"4a369096-f833-46f9-93c3-f05f985168c2\") " pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.967293 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a369096-f833-46f9-93c3-f05f985168c2-catalog-content\") pod \"certified-operators-4wbdf\" (UID: \"4a369096-f833-46f9-93c3-f05f985168c2\") " pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.967312 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a369096-f833-46f9-93c3-f05f985168c2-utilities\") pod \"certified-operators-4wbdf\" (UID: \"4a369096-f833-46f9-93c3-f05f985168c2\") " pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.968386 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a369096-f833-46f9-93c3-f05f985168c2-utilities\") pod \"certified-operators-4wbdf\" (UID: \"4a369096-f833-46f9-93c3-f05f985168c2\") " pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.968942 5094 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a369096-f833-46f9-93c3-f05f985168c2-catalog-content\") pod \"certified-operators-4wbdf\" (UID: \"4a369096-f833-46f9-93c3-f05f985168c2\") " pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:51:47 crc kubenswrapper[5094]: I0220 07:51:47.992105 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xch2z\" (UniqueName: \"kubernetes.io/projected/4a369096-f833-46f9-93c3-f05f985168c2-kube-api-access-xch2z\") pod \"certified-operators-4wbdf\" (UID: \"4a369096-f833-46f9-93c3-f05f985168c2\") " pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:51:48 crc kubenswrapper[5094]: I0220 07:51:48.085682 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:51:48 crc kubenswrapper[5094]: I0220 07:51:48.636205 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4wbdf"] Feb 20 07:51:49 crc kubenswrapper[5094]: I0220 07:51:49.529330 5094 generic.go:334] "Generic (PLEG): container finished" podID="4a369096-f833-46f9-93c3-f05f985168c2" containerID="12fc2e66175a94ce2eb8ceb7a1f31e3188bf472dcb51c49a41f1fd905ecc9e2b" exitCode=0 Feb 20 07:51:49 crc kubenswrapper[5094]: I0220 07:51:49.529412 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4wbdf" event={"ID":"4a369096-f833-46f9-93c3-f05f985168c2","Type":"ContainerDied","Data":"12fc2e66175a94ce2eb8ceb7a1f31e3188bf472dcb51c49a41f1fd905ecc9e2b"} Feb 20 07:51:49 crc kubenswrapper[5094]: I0220 07:51:49.529851 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4wbdf" event={"ID":"4a369096-f833-46f9-93c3-f05f985168c2","Type":"ContainerStarted","Data":"72202432c6c5201712870e02191031f3b7dfe04b8cff9bef61b9a586db1fda5c"} Feb 20 07:51:50 crc kubenswrapper[5094]: I0220 
07:51:50.541694 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4wbdf" event={"ID":"4a369096-f833-46f9-93c3-f05f985168c2","Type":"ContainerStarted","Data":"d85ceef357b4056e2444fbbf12081104dafde0672362ef4e97ea55c60685c29c"} Feb 20 07:51:51 crc kubenswrapper[5094]: I0220 07:51:51.554614 5094 generic.go:334] "Generic (PLEG): container finished" podID="4a369096-f833-46f9-93c3-f05f985168c2" containerID="d85ceef357b4056e2444fbbf12081104dafde0672362ef4e97ea55c60685c29c" exitCode=0 Feb 20 07:51:51 crc kubenswrapper[5094]: I0220 07:51:51.554735 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4wbdf" event={"ID":"4a369096-f833-46f9-93c3-f05f985168c2","Type":"ContainerDied","Data":"d85ceef357b4056e2444fbbf12081104dafde0672362ef4e97ea55c60685c29c"} Feb 20 07:51:52 crc kubenswrapper[5094]: I0220 07:51:52.570171 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4wbdf" event={"ID":"4a369096-f833-46f9-93c3-f05f985168c2","Type":"ContainerStarted","Data":"d996f31224706728c31d0388d14ef7d56c00937fb75914e5917b1e5b7304f2d5"} Feb 20 07:51:58 crc kubenswrapper[5094]: I0220 07:51:58.086744 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:51:58 crc kubenswrapper[5094]: I0220 07:51:58.089911 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:51:58 crc kubenswrapper[5094]: I0220 07:51:58.173566 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:51:58 crc kubenswrapper[5094]: I0220 07:51:58.206093 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4wbdf" podStartSLOduration=8.73740356 
podStartE2EDuration="11.206045599s" podCreationTimestamp="2026-02-20 07:51:47 +0000 UTC" firstStartedPulling="2026-02-20 07:51:49.53152548 +0000 UTC m=+3924.404152191" lastFinishedPulling="2026-02-20 07:51:52.000167479 +0000 UTC m=+3926.872794230" observedRunningTime="2026-02-20 07:51:52.60090249 +0000 UTC m=+3927.473529251" watchObservedRunningTime="2026-02-20 07:51:58.206045599 +0000 UTC m=+3933.078672340" Feb 20 07:51:58 crc kubenswrapper[5094]: I0220 07:51:58.663770 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:51:58 crc kubenswrapper[5094]: I0220 07:51:58.711411 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4wbdf"] Feb 20 07:52:00 crc kubenswrapper[5094]: I0220 07:52:00.633887 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4wbdf" podUID="4a369096-f833-46f9-93c3-f05f985168c2" containerName="registry-server" containerID="cri-o://d996f31224706728c31d0388d14ef7d56c00937fb75914e5917b1e5b7304f2d5" gracePeriod=2 Feb 20 07:52:01 crc kubenswrapper[5094]: I0220 07:52:01.653072 5094 generic.go:334] "Generic (PLEG): container finished" podID="4a369096-f833-46f9-93c3-f05f985168c2" containerID="d996f31224706728c31d0388d14ef7d56c00937fb75914e5917b1e5b7304f2d5" exitCode=0 Feb 20 07:52:01 crc kubenswrapper[5094]: I0220 07:52:01.653170 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4wbdf" event={"ID":"4a369096-f833-46f9-93c3-f05f985168c2","Type":"ContainerDied","Data":"d996f31224706728c31d0388d14ef7d56c00937fb75914e5917b1e5b7304f2d5"} Feb 20 07:52:01 crc kubenswrapper[5094]: I0220 07:52:01.711957 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:52:01 crc kubenswrapper[5094]: I0220 07:52:01.828679 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a369096-f833-46f9-93c3-f05f985168c2-utilities\") pod \"4a369096-f833-46f9-93c3-f05f985168c2\" (UID: \"4a369096-f833-46f9-93c3-f05f985168c2\") " Feb 20 07:52:01 crc kubenswrapper[5094]: I0220 07:52:01.828809 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a369096-f833-46f9-93c3-f05f985168c2-catalog-content\") pod \"4a369096-f833-46f9-93c3-f05f985168c2\" (UID: \"4a369096-f833-46f9-93c3-f05f985168c2\") " Feb 20 07:52:01 crc kubenswrapper[5094]: I0220 07:52:01.828868 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xch2z\" (UniqueName: \"kubernetes.io/projected/4a369096-f833-46f9-93c3-f05f985168c2-kube-api-access-xch2z\") pod \"4a369096-f833-46f9-93c3-f05f985168c2\" (UID: \"4a369096-f833-46f9-93c3-f05f985168c2\") " Feb 20 07:52:01 crc kubenswrapper[5094]: I0220 07:52:01.830515 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a369096-f833-46f9-93c3-f05f985168c2-utilities" (OuterVolumeSpecName: "utilities") pod "4a369096-f833-46f9-93c3-f05f985168c2" (UID: "4a369096-f833-46f9-93c3-f05f985168c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:52:01 crc kubenswrapper[5094]: I0220 07:52:01.840085 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a369096-f833-46f9-93c3-f05f985168c2-kube-api-access-xch2z" (OuterVolumeSpecName: "kube-api-access-xch2z") pod "4a369096-f833-46f9-93c3-f05f985168c2" (UID: "4a369096-f833-46f9-93c3-f05f985168c2"). InnerVolumeSpecName "kube-api-access-xch2z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:52:01 crc kubenswrapper[5094]: I0220 07:52:01.907999 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a369096-f833-46f9-93c3-f05f985168c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a369096-f833-46f9-93c3-f05f985168c2" (UID: "4a369096-f833-46f9-93c3-f05f985168c2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:52:01 crc kubenswrapper[5094]: I0220 07:52:01.934865 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a369096-f833-46f9-93c3-f05f985168c2-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 07:52:01 crc kubenswrapper[5094]: I0220 07:52:01.935504 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a369096-f833-46f9-93c3-f05f985168c2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:52:01 crc kubenswrapper[5094]: I0220 07:52:01.935610 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xch2z\" (UniqueName: \"kubernetes.io/projected/4a369096-f833-46f9-93c3-f05f985168c2-kube-api-access-xch2z\") on node \"crc\" DevicePath \"\"" Feb 20 07:52:02 crc kubenswrapper[5094]: I0220 07:52:02.671999 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4wbdf" event={"ID":"4a369096-f833-46f9-93c3-f05f985168c2","Type":"ContainerDied","Data":"72202432c6c5201712870e02191031f3b7dfe04b8cff9bef61b9a586db1fda5c"} Feb 20 07:52:02 crc kubenswrapper[5094]: I0220 07:52:02.672112 5094 scope.go:117] "RemoveContainer" containerID="d996f31224706728c31d0388d14ef7d56c00937fb75914e5917b1e5b7304f2d5" Feb 20 07:52:02 crc kubenswrapper[5094]: I0220 07:52:02.672123 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4wbdf" Feb 20 07:52:02 crc kubenswrapper[5094]: I0220 07:52:02.706905 5094 scope.go:117] "RemoveContainer" containerID="d85ceef357b4056e2444fbbf12081104dafde0672362ef4e97ea55c60685c29c" Feb 20 07:52:02 crc kubenswrapper[5094]: I0220 07:52:02.745384 5094 scope.go:117] "RemoveContainer" containerID="12fc2e66175a94ce2eb8ceb7a1f31e3188bf472dcb51c49a41f1fd905ecc9e2b" Feb 20 07:52:02 crc kubenswrapper[5094]: I0220 07:52:02.748592 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4wbdf"] Feb 20 07:52:02 crc kubenswrapper[5094]: I0220 07:52:02.760275 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4wbdf"] Feb 20 07:52:03 crc kubenswrapper[5094]: I0220 07:52:03.861228 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a369096-f833-46f9-93c3-f05f985168c2" path="/var/lib/kubelet/pods/4a369096-f833-46f9-93c3-f05f985168c2/volumes" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.356288 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bhkm8"] Feb 20 07:53:03 crc kubenswrapper[5094]: E0220 07:53:03.357621 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a369096-f833-46f9-93c3-f05f985168c2" containerName="extract-content" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.357644 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a369096-f833-46f9-93c3-f05f985168c2" containerName="extract-content" Feb 20 07:53:03 crc kubenswrapper[5094]: E0220 07:53:03.357683 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a369096-f833-46f9-93c3-f05f985168c2" containerName="registry-server" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.357695 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a369096-f833-46f9-93c3-f05f985168c2" containerName="registry-server" 
Feb 20 07:53:03 crc kubenswrapper[5094]: E0220 07:53:03.357744 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a369096-f833-46f9-93c3-f05f985168c2" containerName="extract-utilities" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.357759 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a369096-f833-46f9-93c3-f05f985168c2" containerName="extract-utilities" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.358039 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a369096-f833-46f9-93c3-f05f985168c2" containerName="registry-server" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.362959 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bhkm8" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.383320 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bhkm8"] Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.481896 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/966d704a-5474-4b23-b125-63789f45ee54-catalog-content\") pod \"community-operators-bhkm8\" (UID: \"966d704a-5474-4b23-b125-63789f45ee54\") " pod="openshift-marketplace/community-operators-bhkm8" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.482019 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/966d704a-5474-4b23-b125-63789f45ee54-utilities\") pod \"community-operators-bhkm8\" (UID: \"966d704a-5474-4b23-b125-63789f45ee54\") " pod="openshift-marketplace/community-operators-bhkm8" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.482075 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxr9n\" (UniqueName: 
\"kubernetes.io/projected/966d704a-5474-4b23-b125-63789f45ee54-kube-api-access-fxr9n\") pod \"community-operators-bhkm8\" (UID: \"966d704a-5474-4b23-b125-63789f45ee54\") " pod="openshift-marketplace/community-operators-bhkm8" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.583459 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/966d704a-5474-4b23-b125-63789f45ee54-utilities\") pod \"community-operators-bhkm8\" (UID: \"966d704a-5474-4b23-b125-63789f45ee54\") " pod="openshift-marketplace/community-operators-bhkm8" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.583554 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxr9n\" (UniqueName: \"kubernetes.io/projected/966d704a-5474-4b23-b125-63789f45ee54-kube-api-access-fxr9n\") pod \"community-operators-bhkm8\" (UID: \"966d704a-5474-4b23-b125-63789f45ee54\") " pod="openshift-marketplace/community-operators-bhkm8" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.583632 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/966d704a-5474-4b23-b125-63789f45ee54-catalog-content\") pod \"community-operators-bhkm8\" (UID: \"966d704a-5474-4b23-b125-63789f45ee54\") " pod="openshift-marketplace/community-operators-bhkm8" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.584452 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/966d704a-5474-4b23-b125-63789f45ee54-catalog-content\") pod \"community-operators-bhkm8\" (UID: \"966d704a-5474-4b23-b125-63789f45ee54\") " pod="openshift-marketplace/community-operators-bhkm8" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.584781 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/966d704a-5474-4b23-b125-63789f45ee54-utilities\") pod \"community-operators-bhkm8\" (UID: \"966d704a-5474-4b23-b125-63789f45ee54\") " pod="openshift-marketplace/community-operators-bhkm8" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.614852 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxr9n\" (UniqueName: \"kubernetes.io/projected/966d704a-5474-4b23-b125-63789f45ee54-kube-api-access-fxr9n\") pod \"community-operators-bhkm8\" (UID: \"966d704a-5474-4b23-b125-63789f45ee54\") " pod="openshift-marketplace/community-operators-bhkm8" Feb 20 07:53:03 crc kubenswrapper[5094]: I0220 07:53:03.722961 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bhkm8" Feb 20 07:53:04 crc kubenswrapper[5094]: I0220 07:53:04.107345 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:53:04 crc kubenswrapper[5094]: I0220 07:53:04.107977 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:53:04 crc kubenswrapper[5094]: I0220 07:53:04.243470 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bhkm8"] Feb 20 07:53:04 crc kubenswrapper[5094]: I0220 07:53:04.295524 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhkm8" 
event={"ID":"966d704a-5474-4b23-b125-63789f45ee54","Type":"ContainerStarted","Data":"c6a2d57b0006fd94772a9d020943c798981c6241465d4cdcc4eea15f613aac56"} Feb 20 07:53:05 crc kubenswrapper[5094]: I0220 07:53:05.310298 5094 generic.go:334] "Generic (PLEG): container finished" podID="966d704a-5474-4b23-b125-63789f45ee54" containerID="d65564fe78761429178c8f6287f3ce04ee576c655291e31bd89c9884855ac402" exitCode=0 Feb 20 07:53:05 crc kubenswrapper[5094]: I0220 07:53:05.310377 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhkm8" event={"ID":"966d704a-5474-4b23-b125-63789f45ee54","Type":"ContainerDied","Data":"d65564fe78761429178c8f6287f3ce04ee576c655291e31bd89c9884855ac402"} Feb 20 07:53:05 crc kubenswrapper[5094]: I0220 07:53:05.314612 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 07:53:10 crc kubenswrapper[5094]: I0220 07:53:10.361458 5094 generic.go:334] "Generic (PLEG): container finished" podID="966d704a-5474-4b23-b125-63789f45ee54" containerID="5a7511e6676c5ab50f40da25e1815cf9c50290e0aad0403c8709bb94e3b5cb03" exitCode=0 Feb 20 07:53:10 crc kubenswrapper[5094]: I0220 07:53:10.361598 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhkm8" event={"ID":"966d704a-5474-4b23-b125-63789f45ee54","Type":"ContainerDied","Data":"5a7511e6676c5ab50f40da25e1815cf9c50290e0aad0403c8709bb94e3b5cb03"} Feb 20 07:53:11 crc kubenswrapper[5094]: I0220 07:53:11.376182 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhkm8" event={"ID":"966d704a-5474-4b23-b125-63789f45ee54","Type":"ContainerStarted","Data":"d0e403d379777aa95fad81293d03def2b5729155fcd79be2b6e78032a7c9e16f"} Feb 20 07:53:11 crc kubenswrapper[5094]: I0220 07:53:11.407399 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bhkm8" 
podStartSLOduration=2.974705405 podStartE2EDuration="8.407368102s" podCreationTimestamp="2026-02-20 07:53:03 +0000 UTC" firstStartedPulling="2026-02-20 07:53:05.31410997 +0000 UTC m=+4000.186736721" lastFinishedPulling="2026-02-20 07:53:10.746772707 +0000 UTC m=+4005.619399418" observedRunningTime="2026-02-20 07:53:11.399414021 +0000 UTC m=+4006.272040742" watchObservedRunningTime="2026-02-20 07:53:11.407368102 +0000 UTC m=+4006.279994853" Feb 20 07:53:13 crc kubenswrapper[5094]: I0220 07:53:13.723604 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bhkm8" Feb 20 07:53:13 crc kubenswrapper[5094]: I0220 07:53:13.725836 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bhkm8" Feb 20 07:53:13 crc kubenswrapper[5094]: I0220 07:53:13.787640 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bhkm8" Feb 20 07:53:23 crc kubenswrapper[5094]: I0220 07:53:23.779092 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bhkm8" Feb 20 07:53:23 crc kubenswrapper[5094]: I0220 07:53:23.903040 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bhkm8"] Feb 20 07:53:23 crc kubenswrapper[5094]: I0220 07:53:23.981252 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nhpxw"] Feb 20 07:53:23 crc kubenswrapper[5094]: I0220 07:53:23.981975 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nhpxw" podUID="061991e0-0b0a-4e47-9275-e00b323e9fb2" containerName="registry-server" containerID="cri-o://533d4a0d04fa804ad9211edefb6a91600dc963cb9cea7ea16b1aa22fe13ba2dd" gracePeriod=2 Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.470460 5094 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nhpxw" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.497339 5094 generic.go:334] "Generic (PLEG): container finished" podID="061991e0-0b0a-4e47-9275-e00b323e9fb2" containerID="533d4a0d04fa804ad9211edefb6a91600dc963cb9cea7ea16b1aa22fe13ba2dd" exitCode=0 Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.497406 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhpxw" event={"ID":"061991e0-0b0a-4e47-9275-e00b323e9fb2","Type":"ContainerDied","Data":"533d4a0d04fa804ad9211edefb6a91600dc963cb9cea7ea16b1aa22fe13ba2dd"} Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.497415 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nhpxw" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.497728 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nhpxw" event={"ID":"061991e0-0b0a-4e47-9275-e00b323e9fb2","Type":"ContainerDied","Data":"e9e27186df68fe530955f183de9f930dd6ce5b2ea9902cb61d24b627fd7e8b4f"} Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.497764 5094 scope.go:117] "RemoveContainer" containerID="533d4a0d04fa804ad9211edefb6a91600dc963cb9cea7ea16b1aa22fe13ba2dd" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.523523 5094 scope.go:117] "RemoveContainer" containerID="6c1deece1e9db9069d22aa8f5ae753fd3cc50686aa2d45d85821104ae79591b4" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.558237 5094 scope.go:117] "RemoveContainer" containerID="258470f2010631a4f587bce056365561c7d5a4f1c012ad7808c46dac865f443a" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.585575 5094 scope.go:117] "RemoveContainer" containerID="533d4a0d04fa804ad9211edefb6a91600dc963cb9cea7ea16b1aa22fe13ba2dd" Feb 20 07:53:24 crc kubenswrapper[5094]: E0220 
07:53:24.586387 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"533d4a0d04fa804ad9211edefb6a91600dc963cb9cea7ea16b1aa22fe13ba2dd\": container with ID starting with 533d4a0d04fa804ad9211edefb6a91600dc963cb9cea7ea16b1aa22fe13ba2dd not found: ID does not exist" containerID="533d4a0d04fa804ad9211edefb6a91600dc963cb9cea7ea16b1aa22fe13ba2dd" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.586422 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"533d4a0d04fa804ad9211edefb6a91600dc963cb9cea7ea16b1aa22fe13ba2dd"} err="failed to get container status \"533d4a0d04fa804ad9211edefb6a91600dc963cb9cea7ea16b1aa22fe13ba2dd\": rpc error: code = NotFound desc = could not find container \"533d4a0d04fa804ad9211edefb6a91600dc963cb9cea7ea16b1aa22fe13ba2dd\": container with ID starting with 533d4a0d04fa804ad9211edefb6a91600dc963cb9cea7ea16b1aa22fe13ba2dd not found: ID does not exist" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.586447 5094 scope.go:117] "RemoveContainer" containerID="6c1deece1e9db9069d22aa8f5ae753fd3cc50686aa2d45d85821104ae79591b4" Feb 20 07:53:24 crc kubenswrapper[5094]: E0220 07:53:24.587205 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c1deece1e9db9069d22aa8f5ae753fd3cc50686aa2d45d85821104ae79591b4\": container with ID starting with 6c1deece1e9db9069d22aa8f5ae753fd3cc50686aa2d45d85821104ae79591b4 not found: ID does not exist" containerID="6c1deece1e9db9069d22aa8f5ae753fd3cc50686aa2d45d85821104ae79591b4" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.587243 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c1deece1e9db9069d22aa8f5ae753fd3cc50686aa2d45d85821104ae79591b4"} err="failed to get container status \"6c1deece1e9db9069d22aa8f5ae753fd3cc50686aa2d45d85821104ae79591b4\": rpc 
error: code = NotFound desc = could not find container \"6c1deece1e9db9069d22aa8f5ae753fd3cc50686aa2d45d85821104ae79591b4\": container with ID starting with 6c1deece1e9db9069d22aa8f5ae753fd3cc50686aa2d45d85821104ae79591b4 not found: ID does not exist" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.587258 5094 scope.go:117] "RemoveContainer" containerID="258470f2010631a4f587bce056365561c7d5a4f1c012ad7808c46dac865f443a" Feb 20 07:53:24 crc kubenswrapper[5094]: E0220 07:53:24.587772 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"258470f2010631a4f587bce056365561c7d5a4f1c012ad7808c46dac865f443a\": container with ID starting with 258470f2010631a4f587bce056365561c7d5a4f1c012ad7808c46dac865f443a not found: ID does not exist" containerID="258470f2010631a4f587bce056365561c7d5a4f1c012ad7808c46dac865f443a" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.587798 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"258470f2010631a4f587bce056365561c7d5a4f1c012ad7808c46dac865f443a"} err="failed to get container status \"258470f2010631a4f587bce056365561c7d5a4f1c012ad7808c46dac865f443a\": rpc error: code = NotFound desc = could not find container \"258470f2010631a4f587bce056365561c7d5a4f1c012ad7808c46dac865f443a\": container with ID starting with 258470f2010631a4f587bce056365561c7d5a4f1c012ad7808c46dac865f443a not found: ID does not exist" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.599577 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/061991e0-0b0a-4e47-9275-e00b323e9fb2-utilities\") pod \"061991e0-0b0a-4e47-9275-e00b323e9fb2\" (UID: \"061991e0-0b0a-4e47-9275-e00b323e9fb2\") " Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.599662 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-bmx6j\" (UniqueName: \"kubernetes.io/projected/061991e0-0b0a-4e47-9275-e00b323e9fb2-kube-api-access-bmx6j\") pod \"061991e0-0b0a-4e47-9275-e00b323e9fb2\" (UID: \"061991e0-0b0a-4e47-9275-e00b323e9fb2\") " Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.599693 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/061991e0-0b0a-4e47-9275-e00b323e9fb2-catalog-content\") pod \"061991e0-0b0a-4e47-9275-e00b323e9fb2\" (UID: \"061991e0-0b0a-4e47-9275-e00b323e9fb2\") " Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.600162 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/061991e0-0b0a-4e47-9275-e00b323e9fb2-utilities" (OuterVolumeSpecName: "utilities") pod "061991e0-0b0a-4e47-9275-e00b323e9fb2" (UID: "061991e0-0b0a-4e47-9275-e00b323e9fb2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.607213 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/061991e0-0b0a-4e47-9275-e00b323e9fb2-kube-api-access-bmx6j" (OuterVolumeSpecName: "kube-api-access-bmx6j") pod "061991e0-0b0a-4e47-9275-e00b323e9fb2" (UID: "061991e0-0b0a-4e47-9275-e00b323e9fb2"). InnerVolumeSpecName "kube-api-access-bmx6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.657279 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/061991e0-0b0a-4e47-9275-e00b323e9fb2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "061991e0-0b0a-4e47-9275-e00b323e9fb2" (UID: "061991e0-0b0a-4e47-9275-e00b323e9fb2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.701385 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/061991e0-0b0a-4e47-9275-e00b323e9fb2-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.701427 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmx6j\" (UniqueName: \"kubernetes.io/projected/061991e0-0b0a-4e47-9275-e00b323e9fb2-kube-api-access-bmx6j\") on node \"crc\" DevicePath \"\"" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.701438 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/061991e0-0b0a-4e47-9275-e00b323e9fb2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.850799 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nhpxw"] Feb 20 07:53:24 crc kubenswrapper[5094]: I0220 07:53:24.878725 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nhpxw"] Feb 20 07:53:25 crc kubenswrapper[5094]: I0220 07:53:25.849079 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="061991e0-0b0a-4e47-9275-e00b323e9fb2" path="/var/lib/kubelet/pods/061991e0-0b0a-4e47-9275-e00b323e9fb2/volumes" Feb 20 07:53:34 crc kubenswrapper[5094]: I0220 07:53:34.106752 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:53:34 crc kubenswrapper[5094]: I0220 07:53:34.107688 5094 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:54:04 crc kubenswrapper[5094]: I0220 07:54:04.107134 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:54:04 crc kubenswrapper[5094]: I0220 07:54:04.108179 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:54:04 crc kubenswrapper[5094]: I0220 07:54:04.108252 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 07:54:04 crc kubenswrapper[5094]: I0220 07:54:04.109283 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"49c679c1e094a953959d220d84ae5c3290c008d2ae0e9a9a08ce980339bbcafe"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 07:54:04 crc kubenswrapper[5094]: I0220 07:54:04.109350 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" 
containerID="cri-o://49c679c1e094a953959d220d84ae5c3290c008d2ae0e9a9a08ce980339bbcafe" gracePeriod=600 Feb 20 07:54:04 crc kubenswrapper[5094]: I0220 07:54:04.925471 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="49c679c1e094a953959d220d84ae5c3290c008d2ae0e9a9a08ce980339bbcafe" exitCode=0 Feb 20 07:54:04 crc kubenswrapper[5094]: I0220 07:54:04.925589 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"49c679c1e094a953959d220d84ae5c3290c008d2ae0e9a9a08ce980339bbcafe"} Feb 20 07:54:04 crc kubenswrapper[5094]: I0220 07:54:04.926135 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3"} Feb 20 07:54:04 crc kubenswrapper[5094]: I0220 07:54:04.926183 5094 scope.go:117] "RemoveContainer" containerID="5324ed8cea34533fd02551658fea318bc7a53cb8733f0b1035d8839a41a00949" Feb 20 07:56:04 crc kubenswrapper[5094]: I0220 07:56:04.106826 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:56:04 crc kubenswrapper[5094]: I0220 07:56:04.108072 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:56:34 crc kubenswrapper[5094]: 
I0220 07:56:34.107516 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:56:34 crc kubenswrapper[5094]: I0220 07:56:34.108597 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:57:04 crc kubenswrapper[5094]: I0220 07:57:04.107355 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 07:57:04 crc kubenswrapper[5094]: I0220 07:57:04.108585 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 07:57:04 crc kubenswrapper[5094]: I0220 07:57:04.108675 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 07:57:04 crc kubenswrapper[5094]: I0220 07:57:04.110116 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3"} 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 07:57:04 crc kubenswrapper[5094]: I0220 07:57:04.110292 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" gracePeriod=600 Feb 20 07:57:04 crc kubenswrapper[5094]: E0220 07:57:04.242818 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:57:04 crc kubenswrapper[5094]: I0220 07:57:04.872584 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3"} Feb 20 07:57:04 crc kubenswrapper[5094]: I0220 07:57:04.873222 5094 scope.go:117] "RemoveContainer" containerID="49c679c1e094a953959d220d84ae5c3290c008d2ae0e9a9a08ce980339bbcafe" Feb 20 07:57:04 crc kubenswrapper[5094]: I0220 07:57:04.872511 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" exitCode=0 Feb 20 07:57:04 crc kubenswrapper[5094]: I0220 07:57:04.874214 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 
20 07:57:04 crc kubenswrapper[5094]: E0220 07:57:04.874541 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:57:16 crc kubenswrapper[5094]: I0220 07:57:16.839647 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 20 07:57:16 crc kubenswrapper[5094]: E0220 07:57:16.840992 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:57:29 crc kubenswrapper[5094]: I0220 07:57:29.840507 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 20 07:57:29 crc kubenswrapper[5094]: E0220 07:57:29.841828 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:57:44 crc kubenswrapper[5094]: I0220 07:57:44.840329 5094 scope.go:117] "RemoveContainer" 
containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 20 07:57:44 crc kubenswrapper[5094]: E0220 07:57:44.841675 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:57:57 crc kubenswrapper[5094]: I0220 07:57:57.840113 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 20 07:57:57 crc kubenswrapper[5094]: E0220 07:57:57.842004 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:58:12 crc kubenswrapper[5094]: I0220 07:58:12.841277 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 20 07:58:12 crc kubenswrapper[5094]: E0220 07:58:12.842312 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:58:23 crc kubenswrapper[5094]: I0220 07:58:23.841219 5094 scope.go:117] 
"RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 20 07:58:23 crc kubenswrapper[5094]: E0220 07:58:23.842687 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:58:34 crc kubenswrapper[5094]: I0220 07:58:34.839970 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 20 07:58:34 crc kubenswrapper[5094]: E0220 07:58:34.840792 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.227808 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2pznd"] Feb 20 07:58:39 crc kubenswrapper[5094]: E0220 07:58:39.228970 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="061991e0-0b0a-4e47-9275-e00b323e9fb2" containerName="extract-utilities" Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.228987 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="061991e0-0b0a-4e47-9275-e00b323e9fb2" containerName="extract-utilities" Feb 20 07:58:39 crc kubenswrapper[5094]: E0220 07:58:39.228999 5094 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="061991e0-0b0a-4e47-9275-e00b323e9fb2" containerName="registry-server" Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.229005 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="061991e0-0b0a-4e47-9275-e00b323e9fb2" containerName="registry-server" Feb 20 07:58:39 crc kubenswrapper[5094]: E0220 07:58:39.229019 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="061991e0-0b0a-4e47-9275-e00b323e9fb2" containerName="extract-content" Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.229026 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="061991e0-0b0a-4e47-9275-e00b323e9fb2" containerName="extract-content" Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.229181 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="061991e0-0b0a-4e47-9275-e00b323e9fb2" containerName="registry-server" Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.230288 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2pznd" Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.289511 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2pznd"] Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.349305 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-catalog-content\") pod \"redhat-operators-2pznd\" (UID: \"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e\") " pod="openshift-marketplace/redhat-operators-2pznd" Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.349414 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hxrp\" (UniqueName: \"kubernetes.io/projected/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-kube-api-access-4hxrp\") pod \"redhat-operators-2pznd\" (UID: 
\"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e\") " pod="openshift-marketplace/redhat-operators-2pznd" Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.349727 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-utilities\") pod \"redhat-operators-2pznd\" (UID: \"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e\") " pod="openshift-marketplace/redhat-operators-2pznd" Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.451748 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hxrp\" (UniqueName: \"kubernetes.io/projected/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-kube-api-access-4hxrp\") pod \"redhat-operators-2pznd\" (UID: \"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e\") " pod="openshift-marketplace/redhat-operators-2pznd" Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.451904 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-utilities\") pod \"redhat-operators-2pznd\" (UID: \"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e\") " pod="openshift-marketplace/redhat-operators-2pznd" Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.451973 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-catalog-content\") pod \"redhat-operators-2pznd\" (UID: \"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e\") " pod="openshift-marketplace/redhat-operators-2pznd" Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.452484 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-utilities\") pod \"redhat-operators-2pznd\" (UID: \"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e\") " 
pod="openshift-marketplace/redhat-operators-2pznd" Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.452553 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-catalog-content\") pod \"redhat-operators-2pznd\" (UID: \"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e\") " pod="openshift-marketplace/redhat-operators-2pznd" Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.477316 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hxrp\" (UniqueName: \"kubernetes.io/projected/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-kube-api-access-4hxrp\") pod \"redhat-operators-2pznd\" (UID: \"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e\") " pod="openshift-marketplace/redhat-operators-2pznd" Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.550039 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2pznd" Feb 20 07:58:39 crc kubenswrapper[5094]: I0220 07:58:39.998342 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2pznd"] Feb 20 07:58:40 crc kubenswrapper[5094]: I0220 07:58:40.028234 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pznd" event={"ID":"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e","Type":"ContainerStarted","Data":"16f2d279e9786bec6efd08bd1ca91aa319b918a099a927f356625ed09bf4f4dd"} Feb 20 07:58:41 crc kubenswrapper[5094]: I0220 07:58:41.038650 5094 generic.go:334] "Generic (PLEG): container finished" podID="a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" containerID="912712af09fbe2023dacdfbc58c58b77b5b02d464783c137acbcdacf597a91f5" exitCode=0 Feb 20 07:58:41 crc kubenswrapper[5094]: I0220 07:58:41.039020 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pznd" 
event={"ID":"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e","Type":"ContainerDied","Data":"912712af09fbe2023dacdfbc58c58b77b5b02d464783c137acbcdacf597a91f5"} Feb 20 07:58:41 crc kubenswrapper[5094]: I0220 07:58:41.041559 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 07:58:42 crc kubenswrapper[5094]: I0220 07:58:42.051603 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pznd" event={"ID":"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e","Type":"ContainerStarted","Data":"5635d74746c33fb77085e391ba1f7e2baf7f1505db9afc7521e44e6048577c71"} Feb 20 07:58:43 crc kubenswrapper[5094]: I0220 07:58:43.065122 5094 generic.go:334] "Generic (PLEG): container finished" podID="a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" containerID="5635d74746c33fb77085e391ba1f7e2baf7f1505db9afc7521e44e6048577c71" exitCode=0 Feb 20 07:58:43 crc kubenswrapper[5094]: I0220 07:58:43.065179 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pznd" event={"ID":"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e","Type":"ContainerDied","Data":"5635d74746c33fb77085e391ba1f7e2baf7f1505db9afc7521e44e6048577c71"} Feb 20 07:58:44 crc kubenswrapper[5094]: I0220 07:58:44.078457 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pznd" event={"ID":"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e","Type":"ContainerStarted","Data":"aab55074e5221dd1c8ddf427d82131e85131ebf60fd7fdd873cd344040c97578"} Feb 20 07:58:44 crc kubenswrapper[5094]: I0220 07:58:44.122104 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2pznd" podStartSLOduration=2.617357367 podStartE2EDuration="5.122073985s" podCreationTimestamp="2026-02-20 07:58:39 +0000 UTC" firstStartedPulling="2026-02-20 07:58:41.041261598 +0000 UTC m=+4335.913888319" lastFinishedPulling="2026-02-20 07:58:43.545978216 +0000 UTC m=+4338.418604937" 
observedRunningTime="2026-02-20 07:58:44.114749688 +0000 UTC m=+4338.987376429" watchObservedRunningTime="2026-02-20 07:58:44.122073985 +0000 UTC m=+4338.994700706" Feb 20 07:58:46 crc kubenswrapper[5094]: I0220 07:58:46.840860 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 20 07:58:46 crc kubenswrapper[5094]: E0220 07:58:46.842068 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:58:49 crc kubenswrapper[5094]: I0220 07:58:49.551077 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2pznd" Feb 20 07:58:49 crc kubenswrapper[5094]: I0220 07:58:49.551195 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2pznd" Feb 20 07:58:50 crc kubenswrapper[5094]: I0220 07:58:50.711693 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2pznd" podUID="a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" containerName="registry-server" probeResult="failure" output=< Feb 20 07:58:50 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 07:58:50 crc kubenswrapper[5094]: > Feb 20 07:58:57 crc kubenswrapper[5094]: I0220 07:58:57.840966 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 20 07:58:57 crc kubenswrapper[5094]: E0220 07:58:57.841979 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:58:59 crc kubenswrapper[5094]: I0220 07:58:59.617714 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2pznd" Feb 20 07:58:59 crc kubenswrapper[5094]: I0220 07:58:59.686619 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2pznd" Feb 20 07:58:59 crc kubenswrapper[5094]: I0220 07:58:59.866491 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2pznd"] Feb 20 07:59:01 crc kubenswrapper[5094]: I0220 07:59:01.235197 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2pznd" podUID="a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" containerName="registry-server" containerID="cri-o://aab55074e5221dd1c8ddf427d82131e85131ebf60fd7fdd873cd344040c97578" gracePeriod=2 Feb 20 07:59:01 crc kubenswrapper[5094]: I0220 07:59:01.715380 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2pznd" Feb 20 07:59:01 crc kubenswrapper[5094]: I0220 07:59:01.897495 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hxrp\" (UniqueName: \"kubernetes.io/projected/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-kube-api-access-4hxrp\") pod \"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e\" (UID: \"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e\") " Feb 20 07:59:01 crc kubenswrapper[5094]: I0220 07:59:01.897587 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-utilities\") pod \"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e\" (UID: \"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e\") " Feb 20 07:59:01 crc kubenswrapper[5094]: I0220 07:59:01.897852 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-catalog-content\") pod \"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e\" (UID: \"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e\") " Feb 20 07:59:01 crc kubenswrapper[5094]: I0220 07:59:01.900087 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-utilities" (OuterVolumeSpecName: "utilities") pod "a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" (UID: "a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:59:01 crc kubenswrapper[5094]: I0220 07:59:01.909970 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-kube-api-access-4hxrp" (OuterVolumeSpecName: "kube-api-access-4hxrp") pod "a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" (UID: "a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e"). InnerVolumeSpecName "kube-api-access-4hxrp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.001501 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hxrp\" (UniqueName: \"kubernetes.io/projected/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-kube-api-access-4hxrp\") on node \"crc\" DevicePath \"\"" Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.001559 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.086624 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" (UID: "a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.103806 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.247541 5094 generic.go:334] "Generic (PLEG): container finished" podID="a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" containerID="aab55074e5221dd1c8ddf427d82131e85131ebf60fd7fdd873cd344040c97578" exitCode=0 Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.247619 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2pznd" event={"ID":"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e","Type":"ContainerDied","Data":"aab55074e5221dd1c8ddf427d82131e85131ebf60fd7fdd873cd344040c97578"} Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.247667 5094 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-2pznd" event={"ID":"a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e","Type":"ContainerDied","Data":"16f2d279e9786bec6efd08bd1ca91aa319b918a099a927f356625ed09bf4f4dd"} Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.247700 5094 scope.go:117] "RemoveContainer" containerID="aab55074e5221dd1c8ddf427d82131e85131ebf60fd7fdd873cd344040c97578" Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.247968 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2pznd" Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.286024 5094 scope.go:117] "RemoveContainer" containerID="5635d74746c33fb77085e391ba1f7e2baf7f1505db9afc7521e44e6048577c71" Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.330209 5094 scope.go:117] "RemoveContainer" containerID="912712af09fbe2023dacdfbc58c58b77b5b02d464783c137acbcdacf597a91f5" Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.376308 5094 scope.go:117] "RemoveContainer" containerID="aab55074e5221dd1c8ddf427d82131e85131ebf60fd7fdd873cd344040c97578" Feb 20 07:59:02 crc kubenswrapper[5094]: E0220 07:59:02.377038 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aab55074e5221dd1c8ddf427d82131e85131ebf60fd7fdd873cd344040c97578\": container with ID starting with aab55074e5221dd1c8ddf427d82131e85131ebf60fd7fdd873cd344040c97578 not found: ID does not exist" containerID="aab55074e5221dd1c8ddf427d82131e85131ebf60fd7fdd873cd344040c97578" Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.377096 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aab55074e5221dd1c8ddf427d82131e85131ebf60fd7fdd873cd344040c97578"} err="failed to get container status \"aab55074e5221dd1c8ddf427d82131e85131ebf60fd7fdd873cd344040c97578\": rpc error: code = NotFound desc = could not find container 
\"aab55074e5221dd1c8ddf427d82131e85131ebf60fd7fdd873cd344040c97578\": container with ID starting with aab55074e5221dd1c8ddf427d82131e85131ebf60fd7fdd873cd344040c97578 not found: ID does not exist" Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.377133 5094 scope.go:117] "RemoveContainer" containerID="5635d74746c33fb77085e391ba1f7e2baf7f1505db9afc7521e44e6048577c71" Feb 20 07:59:02 crc kubenswrapper[5094]: E0220 07:59:02.377621 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5635d74746c33fb77085e391ba1f7e2baf7f1505db9afc7521e44e6048577c71\": container with ID starting with 5635d74746c33fb77085e391ba1f7e2baf7f1505db9afc7521e44e6048577c71 not found: ID does not exist" containerID="5635d74746c33fb77085e391ba1f7e2baf7f1505db9afc7521e44e6048577c71" Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.377734 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5635d74746c33fb77085e391ba1f7e2baf7f1505db9afc7521e44e6048577c71"} err="failed to get container status \"5635d74746c33fb77085e391ba1f7e2baf7f1505db9afc7521e44e6048577c71\": rpc error: code = NotFound desc = could not find container \"5635d74746c33fb77085e391ba1f7e2baf7f1505db9afc7521e44e6048577c71\": container with ID starting with 5635d74746c33fb77085e391ba1f7e2baf7f1505db9afc7521e44e6048577c71 not found: ID does not exist" Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.377793 5094 scope.go:117] "RemoveContainer" containerID="912712af09fbe2023dacdfbc58c58b77b5b02d464783c137acbcdacf597a91f5" Feb 20 07:59:02 crc kubenswrapper[5094]: E0220 07:59:02.378152 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"912712af09fbe2023dacdfbc58c58b77b5b02d464783c137acbcdacf597a91f5\": container with ID starting with 912712af09fbe2023dacdfbc58c58b77b5b02d464783c137acbcdacf597a91f5 not found: ID does not exist" 
containerID="912712af09fbe2023dacdfbc58c58b77b5b02d464783c137acbcdacf597a91f5" Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.378185 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"912712af09fbe2023dacdfbc58c58b77b5b02d464783c137acbcdacf597a91f5"} err="failed to get container status \"912712af09fbe2023dacdfbc58c58b77b5b02d464783c137acbcdacf597a91f5\": rpc error: code = NotFound desc = could not find container \"912712af09fbe2023dacdfbc58c58b77b5b02d464783c137acbcdacf597a91f5\": container with ID starting with 912712af09fbe2023dacdfbc58c58b77b5b02d464783c137acbcdacf597a91f5 not found: ID does not exist" Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.382223 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2pznd"] Feb 20 07:59:02 crc kubenswrapper[5094]: I0220 07:59:02.391951 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2pznd"] Feb 20 07:59:03 crc kubenswrapper[5094]: I0220 07:59:03.853825 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" path="/var/lib/kubelet/pods/a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e/volumes" Feb 20 07:59:08 crc kubenswrapper[5094]: I0220 07:59:08.840690 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 20 07:59:08 crc kubenswrapper[5094]: E0220 07:59:08.841467 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:59:23 crc kubenswrapper[5094]: I0220 07:59:23.840777 
5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 20 07:59:23 crc kubenswrapper[5094]: E0220 07:59:23.841991 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:59:36 crc kubenswrapper[5094]: I0220 07:59:36.840978 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 20 07:59:36 crc kubenswrapper[5094]: E0220 07:59:36.842476 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 07:59:48 crc kubenswrapper[5094]: I0220 07:59:48.841157 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 20 07:59:48 crc kubenswrapper[5094]: E0220 07:59:48.842760 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 
08:00:00.221175 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td"] Feb 20 08:00:00 crc kubenswrapper[5094]: E0220 08:00:00.222379 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" containerName="extract-utilities" Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.222441 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" containerName="extract-utilities" Feb 20 08:00:00 crc kubenswrapper[5094]: E0220 08:00:00.222493 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" containerName="registry-server" Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.222503 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" containerName="registry-server" Feb 20 08:00:00 crc kubenswrapper[5094]: E0220 08:00:00.222529 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" containerName="extract-content" Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.222537 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" containerName="extract-content" Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.223042 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7ca8c12-25da-428f-a1f0-e1e5dd6fca5e" containerName="registry-server" Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.223858 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td" Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.229629 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.239375 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td"] Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.243694 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.349753 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a036c1c3-0425-4a2e-a42d-2abfcdc49620-config-volume\") pod \"collect-profiles-29526240-s96td\" (UID: \"a036c1c3-0425-4a2e-a42d-2abfcdc49620\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td" Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.350410 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a036c1c3-0425-4a2e-a42d-2abfcdc49620-secret-volume\") pod \"collect-profiles-29526240-s96td\" (UID: \"a036c1c3-0425-4a2e-a42d-2abfcdc49620\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td" Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.350541 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdrm6\" (UniqueName: \"kubernetes.io/projected/a036c1c3-0425-4a2e-a42d-2abfcdc49620-kube-api-access-gdrm6\") pod \"collect-profiles-29526240-s96td\" (UID: \"a036c1c3-0425-4a2e-a42d-2abfcdc49620\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td" Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.452411 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a036c1c3-0425-4a2e-a42d-2abfcdc49620-secret-volume\") pod \"collect-profiles-29526240-s96td\" (UID: \"a036c1c3-0425-4a2e-a42d-2abfcdc49620\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td" Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.452487 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdrm6\" (UniqueName: \"kubernetes.io/projected/a036c1c3-0425-4a2e-a42d-2abfcdc49620-kube-api-access-gdrm6\") pod \"collect-profiles-29526240-s96td\" (UID: \"a036c1c3-0425-4a2e-a42d-2abfcdc49620\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td" Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.452604 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a036c1c3-0425-4a2e-a42d-2abfcdc49620-config-volume\") pod \"collect-profiles-29526240-s96td\" (UID: \"a036c1c3-0425-4a2e-a42d-2abfcdc49620\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td" Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.453947 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a036c1c3-0425-4a2e-a42d-2abfcdc49620-config-volume\") pod \"collect-profiles-29526240-s96td\" (UID: \"a036c1c3-0425-4a2e-a42d-2abfcdc49620\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td" Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.467219 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a036c1c3-0425-4a2e-a42d-2abfcdc49620-secret-volume\") pod \"collect-profiles-29526240-s96td\" (UID: \"a036c1c3-0425-4a2e-a42d-2abfcdc49620\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td" Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.472747 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdrm6\" (UniqueName: \"kubernetes.io/projected/a036c1c3-0425-4a2e-a42d-2abfcdc49620-kube-api-access-gdrm6\") pod \"collect-profiles-29526240-s96td\" (UID: \"a036c1c3-0425-4a2e-a42d-2abfcdc49620\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td" Feb 20 08:00:00 crc kubenswrapper[5094]: I0220 08:00:00.547393 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td" Feb 20 08:00:01 crc kubenswrapper[5094]: I0220 08:00:01.074483 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td"] Feb 20 08:00:01 crc kubenswrapper[5094]: I0220 08:00:01.868253 5094 generic.go:334] "Generic (PLEG): container finished" podID="a036c1c3-0425-4a2e-a42d-2abfcdc49620" containerID="a7f01ab3dfebce16c461640e15ab5cb83ed76e8a8bf4b49d9de590c4cb6aacd4" exitCode=0 Feb 20 08:00:01 crc kubenswrapper[5094]: I0220 08:00:01.868357 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td" event={"ID":"a036c1c3-0425-4a2e-a42d-2abfcdc49620","Type":"ContainerDied","Data":"a7f01ab3dfebce16c461640e15ab5cb83ed76e8a8bf4b49d9de590c4cb6aacd4"} Feb 20 08:00:01 crc kubenswrapper[5094]: I0220 08:00:01.868942 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td" 
event={"ID":"a036c1c3-0425-4a2e-a42d-2abfcdc49620","Type":"ContainerStarted","Data":"92793db5d7f411266cc1b360d23e2052740b7de803bbc4e041ccb1c325e0851a"} Feb 20 08:00:03 crc kubenswrapper[5094]: I0220 08:00:03.318167 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td" Feb 20 08:00:03 crc kubenswrapper[5094]: I0220 08:00:03.407191 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a036c1c3-0425-4a2e-a42d-2abfcdc49620-secret-volume\") pod \"a036c1c3-0425-4a2e-a42d-2abfcdc49620\" (UID: \"a036c1c3-0425-4a2e-a42d-2abfcdc49620\") " Feb 20 08:00:03 crc kubenswrapper[5094]: I0220 08:00:03.407405 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a036c1c3-0425-4a2e-a42d-2abfcdc49620-config-volume\") pod \"a036c1c3-0425-4a2e-a42d-2abfcdc49620\" (UID: \"a036c1c3-0425-4a2e-a42d-2abfcdc49620\") " Feb 20 08:00:03 crc kubenswrapper[5094]: I0220 08:00:03.407474 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdrm6\" (UniqueName: \"kubernetes.io/projected/a036c1c3-0425-4a2e-a42d-2abfcdc49620-kube-api-access-gdrm6\") pod \"a036c1c3-0425-4a2e-a42d-2abfcdc49620\" (UID: \"a036c1c3-0425-4a2e-a42d-2abfcdc49620\") " Feb 20 08:00:03 crc kubenswrapper[5094]: I0220 08:00:03.409233 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a036c1c3-0425-4a2e-a42d-2abfcdc49620-config-volume" (OuterVolumeSpecName: "config-volume") pod "a036c1c3-0425-4a2e-a42d-2abfcdc49620" (UID: "a036c1c3-0425-4a2e-a42d-2abfcdc49620"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:00:03 crc kubenswrapper[5094]: I0220 08:00:03.417056 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a036c1c3-0425-4a2e-a42d-2abfcdc49620-kube-api-access-gdrm6" (OuterVolumeSpecName: "kube-api-access-gdrm6") pod "a036c1c3-0425-4a2e-a42d-2abfcdc49620" (UID: "a036c1c3-0425-4a2e-a42d-2abfcdc49620"). InnerVolumeSpecName "kube-api-access-gdrm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:00:03 crc kubenswrapper[5094]: I0220 08:00:03.417927 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a036c1c3-0425-4a2e-a42d-2abfcdc49620-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a036c1c3-0425-4a2e-a42d-2abfcdc49620" (UID: "a036c1c3-0425-4a2e-a42d-2abfcdc49620"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:00:03 crc kubenswrapper[5094]: I0220 08:00:03.510386 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a036c1c3-0425-4a2e-a42d-2abfcdc49620-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 08:00:03 crc kubenswrapper[5094]: I0220 08:00:03.510463 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdrm6\" (UniqueName: \"kubernetes.io/projected/a036c1c3-0425-4a2e-a42d-2abfcdc49620-kube-api-access-gdrm6\") on node \"crc\" DevicePath \"\"" Feb 20 08:00:03 crc kubenswrapper[5094]: I0220 08:00:03.510490 5094 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a036c1c3-0425-4a2e-a42d-2abfcdc49620-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 08:00:03 crc kubenswrapper[5094]: I0220 08:00:03.840584 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 20 08:00:03 crc kubenswrapper[5094]: E0220 
08:00:03.841795 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:00:03 crc kubenswrapper[5094]: I0220 08:00:03.889086 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td" event={"ID":"a036c1c3-0425-4a2e-a42d-2abfcdc49620","Type":"ContainerDied","Data":"92793db5d7f411266cc1b360d23e2052740b7de803bbc4e041ccb1c325e0851a"} Feb 20 08:00:03 crc kubenswrapper[5094]: I0220 08:00:03.889195 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92793db5d7f411266cc1b360d23e2052740b7de803bbc4e041ccb1c325e0851a" Feb 20 08:00:03 crc kubenswrapper[5094]: I0220 08:00:03.889204 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td" Feb 20 08:00:04 crc kubenswrapper[5094]: I0220 08:00:04.431407 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs"] Feb 20 08:00:04 crc kubenswrapper[5094]: I0220 08:00:04.438887 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526195-mm4vs"] Feb 20 08:00:05 crc kubenswrapper[5094]: I0220 08:00:05.864822 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74085586-b345-46e6-9367-d3b5243312a4" path="/var/lib/kubelet/pods/74085586-b345-46e6-9367-d3b5243312a4/volumes" Feb 20 08:00:14 crc kubenswrapper[5094]: I0220 08:00:14.841933 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 20 08:00:14 crc kubenswrapper[5094]: E0220 08:00:14.843269 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:00:28 crc kubenswrapper[5094]: I0220 08:00:28.840956 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 20 08:00:28 crc kubenswrapper[5094]: E0220 08:00:28.842245 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:00:35 crc kubenswrapper[5094]: I0220 08:00:35.712025 5094 scope.go:117] "RemoveContainer" containerID="c476764b7efa3f5032f50b9bb9a9da3e47b6999aab2c3a50d607379d62e94686" Feb 20 08:00:43 crc kubenswrapper[5094]: I0220 08:00:43.841329 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 20 08:00:43 crc kubenswrapper[5094]: E0220 08:00:43.842753 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:00:55 crc kubenswrapper[5094]: I0220 08:00:55.855438 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 20 08:00:55 crc kubenswrapper[5094]: E0220 08:00:55.856944 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:01:07 crc kubenswrapper[5094]: I0220 08:01:07.848793 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 20 08:01:07 crc kubenswrapper[5094]: E0220 08:01:07.849856 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:01:22 crc kubenswrapper[5094]: I0220 08:01:22.840837 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 20 08:01:22 crc kubenswrapper[5094]: E0220 08:01:22.842371 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:01:36 crc kubenswrapper[5094]: I0220 08:01:36.844091 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 20 08:01:36 crc kubenswrapper[5094]: E0220 08:01:36.845553 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:01:51 crc kubenswrapper[5094]: I0220 08:01:51.842378 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 20 08:01:51 crc kubenswrapper[5094]: E0220 08:01:51.843624 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:02:04 crc kubenswrapper[5094]: I0220 08:02:04.840430 5094 scope.go:117] "RemoveContainer" containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 20 08:02:06 crc kubenswrapper[5094]: I0220 08:02:06.175013 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"3143f70134d8a8406c6c494913fe42a5323bf1a21b5d094b1b86262c618b9759"} Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.017586 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jcqhm"] Feb 20 08:02:07 crc kubenswrapper[5094]: E0220 08:02:07.019263 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a036c1c3-0425-4a2e-a42d-2abfcdc49620" containerName="collect-profiles" Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.019297 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a036c1c3-0425-4a2e-a42d-2abfcdc49620" containerName="collect-profiles" Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.021199 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a036c1c3-0425-4a2e-a42d-2abfcdc49620" containerName="collect-profiles" Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.025982 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jcqhm" Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.039187 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jcqhm"] Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.141386 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-utilities\") pod \"certified-operators-jcqhm\" (UID: \"f83e717d-9073-4ecb-8aa5-f35d5fd35a84\") " pod="openshift-marketplace/certified-operators-jcqhm" Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.141547 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4w9d\" (UniqueName: \"kubernetes.io/projected/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-kube-api-access-m4w9d\") pod \"certified-operators-jcqhm\" (UID: \"f83e717d-9073-4ecb-8aa5-f35d5fd35a84\") " pod="openshift-marketplace/certified-operators-jcqhm" Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.141775 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-catalog-content\") pod \"certified-operators-jcqhm\" (UID: \"f83e717d-9073-4ecb-8aa5-f35d5fd35a84\") " pod="openshift-marketplace/certified-operators-jcqhm" Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.243269 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4w9d\" (UniqueName: \"kubernetes.io/projected/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-kube-api-access-m4w9d\") pod \"certified-operators-jcqhm\" (UID: \"f83e717d-9073-4ecb-8aa5-f35d5fd35a84\") " pod="openshift-marketplace/certified-operators-jcqhm" Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.243380 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-catalog-content\") pod \"certified-operators-jcqhm\" (UID: \"f83e717d-9073-4ecb-8aa5-f35d5fd35a84\") " pod="openshift-marketplace/certified-operators-jcqhm" Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.243415 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-utilities\") pod \"certified-operators-jcqhm\" (UID: \"f83e717d-9073-4ecb-8aa5-f35d5fd35a84\") " pod="openshift-marketplace/certified-operators-jcqhm" Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.244012 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-utilities\") pod \"certified-operators-jcqhm\" (UID: \"f83e717d-9073-4ecb-8aa5-f35d5fd35a84\") " pod="openshift-marketplace/certified-operators-jcqhm" Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.244251 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-catalog-content\") pod \"certified-operators-jcqhm\" (UID: \"f83e717d-9073-4ecb-8aa5-f35d5fd35a84\") " pod="openshift-marketplace/certified-operators-jcqhm" Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.267133 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4w9d\" (UniqueName: \"kubernetes.io/projected/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-kube-api-access-m4w9d\") pod \"certified-operators-jcqhm\" (UID: \"f83e717d-9073-4ecb-8aa5-f35d5fd35a84\") " pod="openshift-marketplace/certified-operators-jcqhm" Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.385309 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jcqhm" Feb 20 08:02:07 crc kubenswrapper[5094]: I0220 08:02:07.849043 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jcqhm"] Feb 20 08:02:08 crc kubenswrapper[5094]: I0220 08:02:08.194278 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcqhm" event={"ID":"f83e717d-9073-4ecb-8aa5-f35d5fd35a84","Type":"ContainerStarted","Data":"6e07502c565034b70d7e1855695af0e1c2b2ffe84d0bc5fe019ad94a17c59ced"} Feb 20 08:02:09 crc kubenswrapper[5094]: I0220 08:02:09.209365 5094 generic.go:334] "Generic (PLEG): container finished" podID="f83e717d-9073-4ecb-8aa5-f35d5fd35a84" containerID="16ee1893805d7837ff22d51e6959a2a190d238d7646340ed7c902aec8b92e75e" exitCode=0 Feb 20 08:02:09 crc kubenswrapper[5094]: I0220 08:02:09.209443 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcqhm" event={"ID":"f83e717d-9073-4ecb-8aa5-f35d5fd35a84","Type":"ContainerDied","Data":"16ee1893805d7837ff22d51e6959a2a190d238d7646340ed7c902aec8b92e75e"} Feb 20 08:02:10 crc kubenswrapper[5094]: I0220 08:02:10.225770 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcqhm" event={"ID":"f83e717d-9073-4ecb-8aa5-f35d5fd35a84","Type":"ContainerStarted","Data":"0b54eb92842ab8707757b9ef33330d6ade06cc1159dcf0b42f9dac90b063266c"} Feb 20 08:02:11 crc kubenswrapper[5094]: I0220 08:02:11.242693 5094 generic.go:334] "Generic (PLEG): container finished" podID="f83e717d-9073-4ecb-8aa5-f35d5fd35a84" containerID="0b54eb92842ab8707757b9ef33330d6ade06cc1159dcf0b42f9dac90b063266c" exitCode=0 Feb 20 08:02:11 crc kubenswrapper[5094]: I0220 08:02:11.242970 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcqhm" 
event={"ID":"f83e717d-9073-4ecb-8aa5-f35d5fd35a84","Type":"ContainerDied","Data":"0b54eb92842ab8707757b9ef33330d6ade06cc1159dcf0b42f9dac90b063266c"} Feb 20 08:02:12 crc kubenswrapper[5094]: I0220 08:02:12.256579 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcqhm" event={"ID":"f83e717d-9073-4ecb-8aa5-f35d5fd35a84","Type":"ContainerStarted","Data":"783be4d3bd6dce8d541b1375204eb35de74673ef9933fe9fd4a53daf34c521f0"} Feb 20 08:02:12 crc kubenswrapper[5094]: I0220 08:02:12.298802 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jcqhm" podStartSLOduration=3.8258306490000002 podStartE2EDuration="6.298773814s" podCreationTimestamp="2026-02-20 08:02:06 +0000 UTC" firstStartedPulling="2026-02-20 08:02:09.213078202 +0000 UTC m=+4544.085704913" lastFinishedPulling="2026-02-20 08:02:11.686021327 +0000 UTC m=+4546.558648078" observedRunningTime="2026-02-20 08:02:12.292003552 +0000 UTC m=+4547.164630263" watchObservedRunningTime="2026-02-20 08:02:12.298773814 +0000 UTC m=+4547.171400565" Feb 20 08:02:17 crc kubenswrapper[5094]: I0220 08:02:17.386175 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jcqhm" Feb 20 08:02:17 crc kubenswrapper[5094]: I0220 08:02:17.387452 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jcqhm" Feb 20 08:02:17 crc kubenswrapper[5094]: I0220 08:02:17.447385 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jcqhm" Feb 20 08:02:18 crc kubenswrapper[5094]: I0220 08:02:18.374632 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jcqhm" Feb 20 08:02:18 crc kubenswrapper[5094]: I0220 08:02:18.446451 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-jcqhm"] Feb 20 08:02:20 crc kubenswrapper[5094]: I0220 08:02:20.327662 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jcqhm" podUID="f83e717d-9073-4ecb-8aa5-f35d5fd35a84" containerName="registry-server" containerID="cri-o://783be4d3bd6dce8d541b1375204eb35de74673ef9933fe9fd4a53daf34c521f0" gracePeriod=2 Feb 20 08:02:20 crc kubenswrapper[5094]: I0220 08:02:20.847603 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jcqhm" Feb 20 08:02:20 crc kubenswrapper[5094]: I0220 08:02:20.930225 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4w9d\" (UniqueName: \"kubernetes.io/projected/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-kube-api-access-m4w9d\") pod \"f83e717d-9073-4ecb-8aa5-f35d5fd35a84\" (UID: \"f83e717d-9073-4ecb-8aa5-f35d5fd35a84\") " Feb 20 08:02:20 crc kubenswrapper[5094]: I0220 08:02:20.930496 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-catalog-content\") pod \"f83e717d-9073-4ecb-8aa5-f35d5fd35a84\" (UID: \"f83e717d-9073-4ecb-8aa5-f35d5fd35a84\") " Feb 20 08:02:20 crc kubenswrapper[5094]: I0220 08:02:20.930674 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-utilities\") pod \"f83e717d-9073-4ecb-8aa5-f35d5fd35a84\" (UID: \"f83e717d-9073-4ecb-8aa5-f35d5fd35a84\") " Feb 20 08:02:20 crc kubenswrapper[5094]: I0220 08:02:20.932346 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-utilities" (OuterVolumeSpecName: "utilities") pod "f83e717d-9073-4ecb-8aa5-f35d5fd35a84" (UID: 
"f83e717d-9073-4ecb-8aa5-f35d5fd35a84"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:02:20 crc kubenswrapper[5094]: I0220 08:02:20.945976 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-kube-api-access-m4w9d" (OuterVolumeSpecName: "kube-api-access-m4w9d") pod "f83e717d-9073-4ecb-8aa5-f35d5fd35a84" (UID: "f83e717d-9073-4ecb-8aa5-f35d5fd35a84"). InnerVolumeSpecName "kube-api-access-m4w9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.032905 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4w9d\" (UniqueName: \"kubernetes.io/projected/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-kube-api-access-m4w9d\") on node \"crc\" DevicePath \"\"" Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.032992 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.066396 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f83e717d-9073-4ecb-8aa5-f35d5fd35a84" (UID: "f83e717d-9073-4ecb-8aa5-f35d5fd35a84"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.134893 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f83e717d-9073-4ecb-8aa5-f35d5fd35a84-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.340575 5094 generic.go:334] "Generic (PLEG): container finished" podID="f83e717d-9073-4ecb-8aa5-f35d5fd35a84" containerID="783be4d3bd6dce8d541b1375204eb35de74673ef9933fe9fd4a53daf34c521f0" exitCode=0 Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.340729 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jcqhm" Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.340738 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcqhm" event={"ID":"f83e717d-9073-4ecb-8aa5-f35d5fd35a84","Type":"ContainerDied","Data":"783be4d3bd6dce8d541b1375204eb35de74673ef9933fe9fd4a53daf34c521f0"} Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.341386 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jcqhm" event={"ID":"f83e717d-9073-4ecb-8aa5-f35d5fd35a84","Type":"ContainerDied","Data":"6e07502c565034b70d7e1855695af0e1c2b2ffe84d0bc5fe019ad94a17c59ced"} Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.341460 5094 scope.go:117] "RemoveContainer" containerID="783be4d3bd6dce8d541b1375204eb35de74673ef9933fe9fd4a53daf34c521f0" Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.374840 5094 scope.go:117] "RemoveContainer" containerID="0b54eb92842ab8707757b9ef33330d6ade06cc1159dcf0b42f9dac90b063266c" Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.410741 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jcqhm"] Feb 20 08:02:21 crc kubenswrapper[5094]: 
I0220 08:02:21.444355 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jcqhm"] Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.453775 5094 scope.go:117] "RemoveContainer" containerID="16ee1893805d7837ff22d51e6959a2a190d238d7646340ed7c902aec8b92e75e" Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.485027 5094 scope.go:117] "RemoveContainer" containerID="783be4d3bd6dce8d541b1375204eb35de74673ef9933fe9fd4a53daf34c521f0" Feb 20 08:02:21 crc kubenswrapper[5094]: E0220 08:02:21.485663 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"783be4d3bd6dce8d541b1375204eb35de74673ef9933fe9fd4a53daf34c521f0\": container with ID starting with 783be4d3bd6dce8d541b1375204eb35de74673ef9933fe9fd4a53daf34c521f0 not found: ID does not exist" containerID="783be4d3bd6dce8d541b1375204eb35de74673ef9933fe9fd4a53daf34c521f0" Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.485747 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"783be4d3bd6dce8d541b1375204eb35de74673ef9933fe9fd4a53daf34c521f0"} err="failed to get container status \"783be4d3bd6dce8d541b1375204eb35de74673ef9933fe9fd4a53daf34c521f0\": rpc error: code = NotFound desc = could not find container \"783be4d3bd6dce8d541b1375204eb35de74673ef9933fe9fd4a53daf34c521f0\": container with ID starting with 783be4d3bd6dce8d541b1375204eb35de74673ef9933fe9fd4a53daf34c521f0 not found: ID does not exist" Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.485792 5094 scope.go:117] "RemoveContainer" containerID="0b54eb92842ab8707757b9ef33330d6ade06cc1159dcf0b42f9dac90b063266c" Feb 20 08:02:21 crc kubenswrapper[5094]: E0220 08:02:21.486651 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b54eb92842ab8707757b9ef33330d6ade06cc1159dcf0b42f9dac90b063266c\": container 
with ID starting with 0b54eb92842ab8707757b9ef33330d6ade06cc1159dcf0b42f9dac90b063266c not found: ID does not exist" containerID="0b54eb92842ab8707757b9ef33330d6ade06cc1159dcf0b42f9dac90b063266c" Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.486688 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b54eb92842ab8707757b9ef33330d6ade06cc1159dcf0b42f9dac90b063266c"} err="failed to get container status \"0b54eb92842ab8707757b9ef33330d6ade06cc1159dcf0b42f9dac90b063266c\": rpc error: code = NotFound desc = could not find container \"0b54eb92842ab8707757b9ef33330d6ade06cc1159dcf0b42f9dac90b063266c\": container with ID starting with 0b54eb92842ab8707757b9ef33330d6ade06cc1159dcf0b42f9dac90b063266c not found: ID does not exist" Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.486741 5094 scope.go:117] "RemoveContainer" containerID="16ee1893805d7837ff22d51e6959a2a190d238d7646340ed7c902aec8b92e75e" Feb 20 08:02:21 crc kubenswrapper[5094]: E0220 08:02:21.487140 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16ee1893805d7837ff22d51e6959a2a190d238d7646340ed7c902aec8b92e75e\": container with ID starting with 16ee1893805d7837ff22d51e6959a2a190d238d7646340ed7c902aec8b92e75e not found: ID does not exist" containerID="16ee1893805d7837ff22d51e6959a2a190d238d7646340ed7c902aec8b92e75e" Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.487190 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16ee1893805d7837ff22d51e6959a2a190d238d7646340ed7c902aec8b92e75e"} err="failed to get container status \"16ee1893805d7837ff22d51e6959a2a190d238d7646340ed7c902aec8b92e75e\": rpc error: code = NotFound desc = could not find container \"16ee1893805d7837ff22d51e6959a2a190d238d7646340ed7c902aec8b92e75e\": container with ID starting with 16ee1893805d7837ff22d51e6959a2a190d238d7646340ed7c902aec8b92e75e not 
found: ID does not exist" Feb 20 08:02:21 crc kubenswrapper[5094]: I0220 08:02:21.856356 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f83e717d-9073-4ecb-8aa5-f35d5fd35a84" path="/var/lib/kubelet/pods/f83e717d-9073-4ecb-8aa5-f35d5fd35a84/volumes" Feb 20 08:04:34 crc kubenswrapper[5094]: I0220 08:04:34.106865 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:04:34 crc kubenswrapper[5094]: I0220 08:04:34.107927 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:05:04 crc kubenswrapper[5094]: I0220 08:05:04.107598 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:05:04 crc kubenswrapper[5094]: I0220 08:05:04.108675 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.605179 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9vn8k"] Feb 20 08:05:18 crc kubenswrapper[5094]: E0220 
08:05:18.606679 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f83e717d-9073-4ecb-8aa5-f35d5fd35a84" containerName="registry-server" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.606735 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f83e717d-9073-4ecb-8aa5-f35d5fd35a84" containerName="registry-server" Feb 20 08:05:18 crc kubenswrapper[5094]: E0220 08:05:18.606772 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f83e717d-9073-4ecb-8aa5-f35d5fd35a84" containerName="extract-utilities" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.606787 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f83e717d-9073-4ecb-8aa5-f35d5fd35a84" containerName="extract-utilities" Feb 20 08:05:18 crc kubenswrapper[5094]: E0220 08:05:18.606834 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f83e717d-9073-4ecb-8aa5-f35d5fd35a84" containerName="extract-content" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.606848 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f83e717d-9073-4ecb-8aa5-f35d5fd35a84" containerName="extract-content" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.607107 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f83e717d-9073-4ecb-8aa5-f35d5fd35a84" containerName="registry-server" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.609052 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9vn8k" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.639883 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9vn8k"] Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.758767 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkl9z\" (UniqueName: \"kubernetes.io/projected/bb372958-7c69-465a-b777-030494eb246a-kube-api-access-xkl9z\") pod \"community-operators-9vn8k\" (UID: \"bb372958-7c69-465a-b777-030494eb246a\") " pod="openshift-marketplace/community-operators-9vn8k" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.758861 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb372958-7c69-465a-b777-030494eb246a-catalog-content\") pod \"community-operators-9vn8k\" (UID: \"bb372958-7c69-465a-b777-030494eb246a\") " pod="openshift-marketplace/community-operators-9vn8k" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.758908 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb372958-7c69-465a-b777-030494eb246a-utilities\") pod \"community-operators-9vn8k\" (UID: \"bb372958-7c69-465a-b777-030494eb246a\") " pod="openshift-marketplace/community-operators-9vn8k" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.860961 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkl9z\" (UniqueName: \"kubernetes.io/projected/bb372958-7c69-465a-b777-030494eb246a-kube-api-access-xkl9z\") pod \"community-operators-9vn8k\" (UID: \"bb372958-7c69-465a-b777-030494eb246a\") " pod="openshift-marketplace/community-operators-9vn8k" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.861220 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb372958-7c69-465a-b777-030494eb246a-catalog-content\") pod \"community-operators-9vn8k\" (UID: \"bb372958-7c69-465a-b777-030494eb246a\") " pod="openshift-marketplace/community-operators-9vn8k" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.861282 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb372958-7c69-465a-b777-030494eb246a-utilities\") pod \"community-operators-9vn8k\" (UID: \"bb372958-7c69-465a-b777-030494eb246a\") " pod="openshift-marketplace/community-operators-9vn8k" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.861890 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb372958-7c69-465a-b777-030494eb246a-utilities\") pod \"community-operators-9vn8k\" (UID: \"bb372958-7c69-465a-b777-030494eb246a\") " pod="openshift-marketplace/community-operators-9vn8k" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.861890 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb372958-7c69-465a-b777-030494eb246a-catalog-content\") pod \"community-operators-9vn8k\" (UID: \"bb372958-7c69-465a-b777-030494eb246a\") " pod="openshift-marketplace/community-operators-9vn8k" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.894089 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkl9z\" (UniqueName: \"kubernetes.io/projected/bb372958-7c69-465a-b777-030494eb246a-kube-api-access-xkl9z\") pod \"community-operators-9vn8k\" (UID: \"bb372958-7c69-465a-b777-030494eb246a\") " pod="openshift-marketplace/community-operators-9vn8k" Feb 20 08:05:18 crc kubenswrapper[5094]: I0220 08:05:18.932016 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9vn8k" Feb 20 08:05:19 crc kubenswrapper[5094]: I0220 08:05:19.237634 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9vn8k"] Feb 20 08:05:19 crc kubenswrapper[5094]: I0220 08:05:19.595292 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kcglh"] Feb 20 08:05:19 crc kubenswrapper[5094]: I0220 08:05:19.597087 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kcglh" Feb 20 08:05:19 crc kubenswrapper[5094]: I0220 08:05:19.619303 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kcglh"] Feb 20 08:05:19 crc kubenswrapper[5094]: I0220 08:05:19.676868 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-utilities\") pod \"redhat-marketplace-kcglh\" (UID: \"48cd9c77-9518-4f8d-aae6-01f8cc109bbd\") " pod="openshift-marketplace/redhat-marketplace-kcglh" Feb 20 08:05:19 crc kubenswrapper[5094]: I0220 08:05:19.676957 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-catalog-content\") pod \"redhat-marketplace-kcglh\" (UID: \"48cd9c77-9518-4f8d-aae6-01f8cc109bbd\") " pod="openshift-marketplace/redhat-marketplace-kcglh" Feb 20 08:05:19 crc kubenswrapper[5094]: I0220 08:05:19.676998 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtpww\" (UniqueName: \"kubernetes.io/projected/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-kube-api-access-gtpww\") pod \"redhat-marketplace-kcglh\" (UID: \"48cd9c77-9518-4f8d-aae6-01f8cc109bbd\") " 
pod="openshift-marketplace/redhat-marketplace-kcglh" Feb 20 08:05:19 crc kubenswrapper[5094]: I0220 08:05:19.778968 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-utilities\") pod \"redhat-marketplace-kcglh\" (UID: \"48cd9c77-9518-4f8d-aae6-01f8cc109bbd\") " pod="openshift-marketplace/redhat-marketplace-kcglh" Feb 20 08:05:19 crc kubenswrapper[5094]: I0220 08:05:19.779059 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-catalog-content\") pod \"redhat-marketplace-kcglh\" (UID: \"48cd9c77-9518-4f8d-aae6-01f8cc109bbd\") " pod="openshift-marketplace/redhat-marketplace-kcglh" Feb 20 08:05:19 crc kubenswrapper[5094]: I0220 08:05:19.779098 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtpww\" (UniqueName: \"kubernetes.io/projected/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-kube-api-access-gtpww\") pod \"redhat-marketplace-kcglh\" (UID: \"48cd9c77-9518-4f8d-aae6-01f8cc109bbd\") " pod="openshift-marketplace/redhat-marketplace-kcglh" Feb 20 08:05:19 crc kubenswrapper[5094]: I0220 08:05:19.779671 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-utilities\") pod \"redhat-marketplace-kcglh\" (UID: \"48cd9c77-9518-4f8d-aae6-01f8cc109bbd\") " pod="openshift-marketplace/redhat-marketplace-kcglh" Feb 20 08:05:19 crc kubenswrapper[5094]: I0220 08:05:19.779903 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-catalog-content\") pod \"redhat-marketplace-kcglh\" (UID: \"48cd9c77-9518-4f8d-aae6-01f8cc109bbd\") " pod="openshift-marketplace/redhat-marketplace-kcglh" 
Feb 20 08:05:19 crc kubenswrapper[5094]: I0220 08:05:19.803222 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtpww\" (UniqueName: \"kubernetes.io/projected/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-kube-api-access-gtpww\") pod \"redhat-marketplace-kcglh\" (UID: \"48cd9c77-9518-4f8d-aae6-01f8cc109bbd\") " pod="openshift-marketplace/redhat-marketplace-kcglh" Feb 20 08:05:19 crc kubenswrapper[5094]: I0220 08:05:19.927578 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kcglh" Feb 20 08:05:20 crc kubenswrapper[5094]: I0220 08:05:20.078948 5094 generic.go:334] "Generic (PLEG): container finished" podID="bb372958-7c69-465a-b777-030494eb246a" containerID="6ab9cf4c1a2baf580d75234cf709a4e40b91236d935360bf5b74d8752da9972d" exitCode=0 Feb 20 08:05:20 crc kubenswrapper[5094]: I0220 08:05:20.079074 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vn8k" event={"ID":"bb372958-7c69-465a-b777-030494eb246a","Type":"ContainerDied","Data":"6ab9cf4c1a2baf580d75234cf709a4e40b91236d935360bf5b74d8752da9972d"} Feb 20 08:05:20 crc kubenswrapper[5094]: I0220 08:05:20.079466 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vn8k" event={"ID":"bb372958-7c69-465a-b777-030494eb246a","Type":"ContainerStarted","Data":"8430f9b319dadd2c5b9591487ab581b43a2994a86d12a939297ba853574c3fec"} Feb 20 08:05:20 crc kubenswrapper[5094]: I0220 08:05:20.082107 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 08:05:20 crc kubenswrapper[5094]: I0220 08:05:20.170910 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kcglh"] Feb 20 08:05:21 crc kubenswrapper[5094]: I0220 08:05:21.087481 5094 generic.go:334] "Generic (PLEG): container finished" podID="bb372958-7c69-465a-b777-030494eb246a" 
containerID="d77fadfe78db435b2b3d653940751a2de236dcd2a907d33c80b2992b4c327f82" exitCode=0 Feb 20 08:05:21 crc kubenswrapper[5094]: I0220 08:05:21.087580 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vn8k" event={"ID":"bb372958-7c69-465a-b777-030494eb246a","Type":"ContainerDied","Data":"d77fadfe78db435b2b3d653940751a2de236dcd2a907d33c80b2992b4c327f82"} Feb 20 08:05:21 crc kubenswrapper[5094]: I0220 08:05:21.089733 5094 generic.go:334] "Generic (PLEG): container finished" podID="48cd9c77-9518-4f8d-aae6-01f8cc109bbd" containerID="39940c6fc64d5148269f51845fceb3dc2cd3b5c97786ee2f0e3d03e6fab57c87" exitCode=0 Feb 20 08:05:21 crc kubenswrapper[5094]: I0220 08:05:21.089792 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kcglh" event={"ID":"48cd9c77-9518-4f8d-aae6-01f8cc109bbd","Type":"ContainerDied","Data":"39940c6fc64d5148269f51845fceb3dc2cd3b5c97786ee2f0e3d03e6fab57c87"} Feb 20 08:05:21 crc kubenswrapper[5094]: I0220 08:05:21.089830 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kcglh" event={"ID":"48cd9c77-9518-4f8d-aae6-01f8cc109bbd","Type":"ContainerStarted","Data":"26632a487f68eadc734723e4639ec5fdf159933d393e3a4066d8d387504c2ce7"} Feb 20 08:05:22 crc kubenswrapper[5094]: I0220 08:05:22.101810 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vn8k" event={"ID":"bb372958-7c69-465a-b777-030494eb246a","Type":"ContainerStarted","Data":"c2b6325efe24dafb298dfe2d3c8a64310b52d0692d4c59b44309636582f6a2dc"} Feb 20 08:05:22 crc kubenswrapper[5094]: I0220 08:05:22.122517 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9vn8k" podStartSLOduration=2.69536786 podStartE2EDuration="4.122490642s" podCreationTimestamp="2026-02-20 08:05:18 +0000 UTC" firstStartedPulling="2026-02-20 08:05:20.081519721 +0000 
UTC m=+4734.954146442" lastFinishedPulling="2026-02-20 08:05:21.508642503 +0000 UTC m=+4736.381269224" observedRunningTime="2026-02-20 08:05:22.12029461 +0000 UTC m=+4736.992921331" watchObservedRunningTime="2026-02-20 08:05:22.122490642 +0000 UTC m=+4736.995117353" Feb 20 08:05:23 crc kubenswrapper[5094]: I0220 08:05:23.112290 5094 generic.go:334] "Generic (PLEG): container finished" podID="48cd9c77-9518-4f8d-aae6-01f8cc109bbd" containerID="ac45ad5d642dccb5f31fa01005a4c82e37caefb785d934d09d4bda31f8e409f4" exitCode=0 Feb 20 08:05:23 crc kubenswrapper[5094]: I0220 08:05:23.112362 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kcglh" event={"ID":"48cd9c77-9518-4f8d-aae6-01f8cc109bbd","Type":"ContainerDied","Data":"ac45ad5d642dccb5f31fa01005a4c82e37caefb785d934d09d4bda31f8e409f4"} Feb 20 08:05:24 crc kubenswrapper[5094]: I0220 08:05:24.125568 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kcglh" event={"ID":"48cd9c77-9518-4f8d-aae6-01f8cc109bbd","Type":"ContainerStarted","Data":"3ac4718c43efbb097b56b4b12b6d0e314180adc5c8f34b06db43bf5faf77ef89"} Feb 20 08:05:24 crc kubenswrapper[5094]: I0220 08:05:24.155643 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kcglh" podStartSLOduration=2.779211085 podStartE2EDuration="5.155618426s" podCreationTimestamp="2026-02-20 08:05:19 +0000 UTC" firstStartedPulling="2026-02-20 08:05:21.091479504 +0000 UTC m=+4735.964106215" lastFinishedPulling="2026-02-20 08:05:23.467886845 +0000 UTC m=+4738.340513556" observedRunningTime="2026-02-20 08:05:24.151553459 +0000 UTC m=+4739.024180170" watchObservedRunningTime="2026-02-20 08:05:24.155618426 +0000 UTC m=+4739.028245137" Feb 20 08:05:28 crc kubenswrapper[5094]: I0220 08:05:28.932340 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9vn8k" Feb 20 
08:05:28 crc kubenswrapper[5094]: I0220 08:05:28.933846 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9vn8k" Feb 20 08:05:29 crc kubenswrapper[5094]: I0220 08:05:29.101305 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9vn8k" Feb 20 08:05:29 crc kubenswrapper[5094]: I0220 08:05:29.242216 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9vn8k" Feb 20 08:05:29 crc kubenswrapper[5094]: I0220 08:05:29.345618 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9vn8k"] Feb 20 08:05:29 crc kubenswrapper[5094]: I0220 08:05:29.928458 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kcglh" Feb 20 08:05:29 crc kubenswrapper[5094]: I0220 08:05:29.928522 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kcglh" Feb 20 08:05:29 crc kubenswrapper[5094]: I0220 08:05:29.969215 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kcglh" Feb 20 08:05:30 crc kubenswrapper[5094]: I0220 08:05:30.241185 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kcglh" Feb 20 08:05:31 crc kubenswrapper[5094]: I0220 08:05:31.197052 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9vn8k" podUID="bb372958-7c69-465a-b777-030494eb246a" containerName="registry-server" containerID="cri-o://c2b6325efe24dafb298dfe2d3c8a64310b52d0692d4c59b44309636582f6a2dc" gracePeriod=2 Feb 20 08:05:31 crc kubenswrapper[5094]: I0220 08:05:31.637031 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9vn8k" Feb 20 08:05:31 crc kubenswrapper[5094]: I0220 08:05:31.727587 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkl9z\" (UniqueName: \"kubernetes.io/projected/bb372958-7c69-465a-b777-030494eb246a-kube-api-access-xkl9z\") pod \"bb372958-7c69-465a-b777-030494eb246a\" (UID: \"bb372958-7c69-465a-b777-030494eb246a\") " Feb 20 08:05:31 crc kubenswrapper[5094]: I0220 08:05:31.727659 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb372958-7c69-465a-b777-030494eb246a-utilities\") pod \"bb372958-7c69-465a-b777-030494eb246a\" (UID: \"bb372958-7c69-465a-b777-030494eb246a\") " Feb 20 08:05:31 crc kubenswrapper[5094]: I0220 08:05:31.727727 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb372958-7c69-465a-b777-030494eb246a-catalog-content\") pod \"bb372958-7c69-465a-b777-030494eb246a\" (UID: \"bb372958-7c69-465a-b777-030494eb246a\") " Feb 20 08:05:31 crc kubenswrapper[5094]: I0220 08:05:31.728792 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb372958-7c69-465a-b777-030494eb246a-utilities" (OuterVolumeSpecName: "utilities") pod "bb372958-7c69-465a-b777-030494eb246a" (UID: "bb372958-7c69-465a-b777-030494eb246a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:05:31 crc kubenswrapper[5094]: I0220 08:05:31.734733 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb372958-7c69-465a-b777-030494eb246a-kube-api-access-xkl9z" (OuterVolumeSpecName: "kube-api-access-xkl9z") pod "bb372958-7c69-465a-b777-030494eb246a" (UID: "bb372958-7c69-465a-b777-030494eb246a"). InnerVolumeSpecName "kube-api-access-xkl9z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:05:31 crc kubenswrapper[5094]: I0220 08:05:31.743568 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kcglh"] Feb 20 08:05:31 crc kubenswrapper[5094]: I0220 08:05:31.786139 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb372958-7c69-465a-b777-030494eb246a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb372958-7c69-465a-b777-030494eb246a" (UID: "bb372958-7c69-465a-b777-030494eb246a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:05:31 crc kubenswrapper[5094]: I0220 08:05:31.829460 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkl9z\" (UniqueName: \"kubernetes.io/projected/bb372958-7c69-465a-b777-030494eb246a-kube-api-access-xkl9z\") on node \"crc\" DevicePath \"\"" Feb 20 08:05:31 crc kubenswrapper[5094]: I0220 08:05:31.829491 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb372958-7c69-465a-b777-030494eb246a-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:05:31 crc kubenswrapper[5094]: I0220 08:05:31.829501 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb372958-7c69-465a-b777-030494eb246a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.212750 5094 generic.go:334] "Generic (PLEG): container finished" podID="bb372958-7c69-465a-b777-030494eb246a" containerID="c2b6325efe24dafb298dfe2d3c8a64310b52d0692d4c59b44309636582f6a2dc" exitCode=0 Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.213353 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kcglh" podUID="48cd9c77-9518-4f8d-aae6-01f8cc109bbd" 
containerName="registry-server" containerID="cri-o://3ac4718c43efbb097b56b4b12b6d0e314180adc5c8f34b06db43bf5faf77ef89" gracePeriod=2 Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.214639 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9vn8k" Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.217042 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vn8k" event={"ID":"bb372958-7c69-465a-b777-030494eb246a","Type":"ContainerDied","Data":"c2b6325efe24dafb298dfe2d3c8a64310b52d0692d4c59b44309636582f6a2dc"} Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.217218 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9vn8k" event={"ID":"bb372958-7c69-465a-b777-030494eb246a","Type":"ContainerDied","Data":"8430f9b319dadd2c5b9591487ab581b43a2994a86d12a939297ba853574c3fec"} Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.217348 5094 scope.go:117] "RemoveContainer" containerID="c2b6325efe24dafb298dfe2d3c8a64310b52d0692d4c59b44309636582f6a2dc" Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.255556 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9vn8k"] Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.263624 5094 scope.go:117] "RemoveContainer" containerID="d77fadfe78db435b2b3d653940751a2de236dcd2a907d33c80b2992b4c327f82" Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.265337 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9vn8k"] Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.306255 5094 scope.go:117] "RemoveContainer" containerID="6ab9cf4c1a2baf580d75234cf709a4e40b91236d935360bf5b74d8752da9972d" Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.435831 5094 scope.go:117] "RemoveContainer" 
containerID="c2b6325efe24dafb298dfe2d3c8a64310b52d0692d4c59b44309636582f6a2dc" Feb 20 08:05:32 crc kubenswrapper[5094]: E0220 08:05:32.436316 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2b6325efe24dafb298dfe2d3c8a64310b52d0692d4c59b44309636582f6a2dc\": container with ID starting with c2b6325efe24dafb298dfe2d3c8a64310b52d0692d4c59b44309636582f6a2dc not found: ID does not exist" containerID="c2b6325efe24dafb298dfe2d3c8a64310b52d0692d4c59b44309636582f6a2dc" Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.436352 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2b6325efe24dafb298dfe2d3c8a64310b52d0692d4c59b44309636582f6a2dc"} err="failed to get container status \"c2b6325efe24dafb298dfe2d3c8a64310b52d0692d4c59b44309636582f6a2dc\": rpc error: code = NotFound desc = could not find container \"c2b6325efe24dafb298dfe2d3c8a64310b52d0692d4c59b44309636582f6a2dc\": container with ID starting with c2b6325efe24dafb298dfe2d3c8a64310b52d0692d4c59b44309636582f6a2dc not found: ID does not exist" Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.436376 5094 scope.go:117] "RemoveContainer" containerID="d77fadfe78db435b2b3d653940751a2de236dcd2a907d33c80b2992b4c327f82" Feb 20 08:05:32 crc kubenswrapper[5094]: E0220 08:05:32.436762 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d77fadfe78db435b2b3d653940751a2de236dcd2a907d33c80b2992b4c327f82\": container with ID starting with d77fadfe78db435b2b3d653940751a2de236dcd2a907d33c80b2992b4c327f82 not found: ID does not exist" containerID="d77fadfe78db435b2b3d653940751a2de236dcd2a907d33c80b2992b4c327f82" Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.436837 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d77fadfe78db435b2b3d653940751a2de236dcd2a907d33c80b2992b4c327f82"} err="failed to get container status \"d77fadfe78db435b2b3d653940751a2de236dcd2a907d33c80b2992b4c327f82\": rpc error: code = NotFound desc = could not find container \"d77fadfe78db435b2b3d653940751a2de236dcd2a907d33c80b2992b4c327f82\": container with ID starting with d77fadfe78db435b2b3d653940751a2de236dcd2a907d33c80b2992b4c327f82 not found: ID does not exist" Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.436879 5094 scope.go:117] "RemoveContainer" containerID="6ab9cf4c1a2baf580d75234cf709a4e40b91236d935360bf5b74d8752da9972d" Feb 20 08:05:32 crc kubenswrapper[5094]: E0220 08:05:32.437214 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ab9cf4c1a2baf580d75234cf709a4e40b91236d935360bf5b74d8752da9972d\": container with ID starting with 6ab9cf4c1a2baf580d75234cf709a4e40b91236d935360bf5b74d8752da9972d not found: ID does not exist" containerID="6ab9cf4c1a2baf580d75234cf709a4e40b91236d935360bf5b74d8752da9972d" Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.437238 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ab9cf4c1a2baf580d75234cf709a4e40b91236d935360bf5b74d8752da9972d"} err="failed to get container status \"6ab9cf4c1a2baf580d75234cf709a4e40b91236d935360bf5b74d8752da9972d\": rpc error: code = NotFound desc = could not find container \"6ab9cf4c1a2baf580d75234cf709a4e40b91236d935360bf5b74d8752da9972d\": container with ID starting with 6ab9cf4c1a2baf580d75234cf709a4e40b91236d935360bf5b74d8752da9972d not found: ID does not exist" Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.692883 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kcglh" Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.846538 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtpww\" (UniqueName: \"kubernetes.io/projected/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-kube-api-access-gtpww\") pod \"48cd9c77-9518-4f8d-aae6-01f8cc109bbd\" (UID: \"48cd9c77-9518-4f8d-aae6-01f8cc109bbd\") " Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.846764 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-catalog-content\") pod \"48cd9c77-9518-4f8d-aae6-01f8cc109bbd\" (UID: \"48cd9c77-9518-4f8d-aae6-01f8cc109bbd\") " Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.846861 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-utilities\") pod \"48cd9c77-9518-4f8d-aae6-01f8cc109bbd\" (UID: \"48cd9c77-9518-4f8d-aae6-01f8cc109bbd\") " Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.847940 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-utilities" (OuterVolumeSpecName: "utilities") pod "48cd9c77-9518-4f8d-aae6-01f8cc109bbd" (UID: "48cd9c77-9518-4f8d-aae6-01f8cc109bbd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.852309 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-kube-api-access-gtpww" (OuterVolumeSpecName: "kube-api-access-gtpww") pod "48cd9c77-9518-4f8d-aae6-01f8cc109bbd" (UID: "48cd9c77-9518-4f8d-aae6-01f8cc109bbd"). InnerVolumeSpecName "kube-api-access-gtpww". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.897493 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48cd9c77-9518-4f8d-aae6-01f8cc109bbd" (UID: "48cd9c77-9518-4f8d-aae6-01f8cc109bbd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.949318 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.949365 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:05:32 crc kubenswrapper[5094]: I0220 08:05:32.949377 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtpww\" (UniqueName: \"kubernetes.io/projected/48cd9c77-9518-4f8d-aae6-01f8cc109bbd-kube-api-access-gtpww\") on node \"crc\" DevicePath \"\"" Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.224970 5094 generic.go:334] "Generic (PLEG): container finished" podID="48cd9c77-9518-4f8d-aae6-01f8cc109bbd" containerID="3ac4718c43efbb097b56b4b12b6d0e314180adc5c8f34b06db43bf5faf77ef89" exitCode=0 Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.225057 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kcglh" event={"ID":"48cd9c77-9518-4f8d-aae6-01f8cc109bbd","Type":"ContainerDied","Data":"3ac4718c43efbb097b56b4b12b6d0e314180adc5c8f34b06db43bf5faf77ef89"} Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.225137 5094 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kcglh" Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.225475 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kcglh" event={"ID":"48cd9c77-9518-4f8d-aae6-01f8cc109bbd","Type":"ContainerDied","Data":"26632a487f68eadc734723e4639ec5fdf159933d393e3a4066d8d387504c2ce7"} Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.225510 5094 scope.go:117] "RemoveContainer" containerID="3ac4718c43efbb097b56b4b12b6d0e314180adc5c8f34b06db43bf5faf77ef89" Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.254764 5094 scope.go:117] "RemoveContainer" containerID="ac45ad5d642dccb5f31fa01005a4c82e37caefb785d934d09d4bda31f8e409f4" Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.281568 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kcglh"] Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.288852 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kcglh"] Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.298388 5094 scope.go:117] "RemoveContainer" containerID="39940c6fc64d5148269f51845fceb3dc2cd3b5c97786ee2f0e3d03e6fab57c87" Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.317128 5094 scope.go:117] "RemoveContainer" containerID="3ac4718c43efbb097b56b4b12b6d0e314180adc5c8f34b06db43bf5faf77ef89" Feb 20 08:05:33 crc kubenswrapper[5094]: E0220 08:05:33.317688 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ac4718c43efbb097b56b4b12b6d0e314180adc5c8f34b06db43bf5faf77ef89\": container with ID starting with 3ac4718c43efbb097b56b4b12b6d0e314180adc5c8f34b06db43bf5faf77ef89 not found: ID does not exist" containerID="3ac4718c43efbb097b56b4b12b6d0e314180adc5c8f34b06db43bf5faf77ef89" Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.317907 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ac4718c43efbb097b56b4b12b6d0e314180adc5c8f34b06db43bf5faf77ef89"} err="failed to get container status \"3ac4718c43efbb097b56b4b12b6d0e314180adc5c8f34b06db43bf5faf77ef89\": rpc error: code = NotFound desc = could not find container \"3ac4718c43efbb097b56b4b12b6d0e314180adc5c8f34b06db43bf5faf77ef89\": container with ID starting with 3ac4718c43efbb097b56b4b12b6d0e314180adc5c8f34b06db43bf5faf77ef89 not found: ID does not exist" Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.317935 5094 scope.go:117] "RemoveContainer" containerID="ac45ad5d642dccb5f31fa01005a4c82e37caefb785d934d09d4bda31f8e409f4" Feb 20 08:05:33 crc kubenswrapper[5094]: E0220 08:05:33.318255 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac45ad5d642dccb5f31fa01005a4c82e37caefb785d934d09d4bda31f8e409f4\": container with ID starting with ac45ad5d642dccb5f31fa01005a4c82e37caefb785d934d09d4bda31f8e409f4 not found: ID does not exist" containerID="ac45ad5d642dccb5f31fa01005a4c82e37caefb785d934d09d4bda31f8e409f4" Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.318292 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac45ad5d642dccb5f31fa01005a4c82e37caefb785d934d09d4bda31f8e409f4"} err="failed to get container status \"ac45ad5d642dccb5f31fa01005a4c82e37caefb785d934d09d4bda31f8e409f4\": rpc error: code = NotFound desc = could not find container \"ac45ad5d642dccb5f31fa01005a4c82e37caefb785d934d09d4bda31f8e409f4\": container with ID starting with ac45ad5d642dccb5f31fa01005a4c82e37caefb785d934d09d4bda31f8e409f4 not found: ID does not exist" Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.318313 5094 scope.go:117] "RemoveContainer" containerID="39940c6fc64d5148269f51845fceb3dc2cd3b5c97786ee2f0e3d03e6fab57c87" Feb 20 08:05:33 crc kubenswrapper[5094]: E0220 
08:05:33.318661 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39940c6fc64d5148269f51845fceb3dc2cd3b5c97786ee2f0e3d03e6fab57c87\": container with ID starting with 39940c6fc64d5148269f51845fceb3dc2cd3b5c97786ee2f0e3d03e6fab57c87 not found: ID does not exist" containerID="39940c6fc64d5148269f51845fceb3dc2cd3b5c97786ee2f0e3d03e6fab57c87" Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.318727 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39940c6fc64d5148269f51845fceb3dc2cd3b5c97786ee2f0e3d03e6fab57c87"} err="failed to get container status \"39940c6fc64d5148269f51845fceb3dc2cd3b5c97786ee2f0e3d03e6fab57c87\": rpc error: code = NotFound desc = could not find container \"39940c6fc64d5148269f51845fceb3dc2cd3b5c97786ee2f0e3d03e6fab57c87\": container with ID starting with 39940c6fc64d5148269f51845fceb3dc2cd3b5c97786ee2f0e3d03e6fab57c87 not found: ID does not exist" Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.850765 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48cd9c77-9518-4f8d-aae6-01f8cc109bbd" path="/var/lib/kubelet/pods/48cd9c77-9518-4f8d-aae6-01f8cc109bbd/volumes" Feb 20 08:05:33 crc kubenswrapper[5094]: I0220 08:05:33.852468 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb372958-7c69-465a-b777-030494eb246a" path="/var/lib/kubelet/pods/bb372958-7c69-465a-b777-030494eb246a/volumes" Feb 20 08:05:34 crc kubenswrapper[5094]: I0220 08:05:34.107389 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:05:34 crc kubenswrapper[5094]: I0220 08:05:34.107500 5094 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:05:34 crc kubenswrapper[5094]: I0220 08:05:34.107580 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 08:05:34 crc kubenswrapper[5094]: I0220 08:05:34.108646 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3143f70134d8a8406c6c494913fe42a5323bf1a21b5d094b1b86262c618b9759"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 08:05:34 crc kubenswrapper[5094]: I0220 08:05:34.108776 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://3143f70134d8a8406c6c494913fe42a5323bf1a21b5d094b1b86262c618b9759" gracePeriod=600 Feb 20 08:05:34 crc kubenswrapper[5094]: I0220 08:05:34.241001 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="3143f70134d8a8406c6c494913fe42a5323bf1a21b5d094b1b86262c618b9759" exitCode=0 Feb 20 08:05:34 crc kubenswrapper[5094]: I0220 08:05:34.241114 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"3143f70134d8a8406c6c494913fe42a5323bf1a21b5d094b1b86262c618b9759"} Feb 20 08:05:34 crc kubenswrapper[5094]: I0220 08:05:34.241176 5094 scope.go:117] "RemoveContainer" 
containerID="a4c5648e7039b505ffb31956fcd82fb9c747f89daa0d4e0c047d16ed3fd223c3" Feb 20 08:05:35 crc kubenswrapper[5094]: I0220 08:05:35.255650 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"} Feb 20 08:07:34 crc kubenswrapper[5094]: I0220 08:07:34.107145 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:07:34 crc kubenswrapper[5094]: I0220 08:07:34.107886 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:08:04 crc kubenswrapper[5094]: I0220 08:08:04.107499 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:08:04 crc kubenswrapper[5094]: I0220 08:08:04.108333 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:08:34 crc kubenswrapper[5094]: I0220 08:08:34.107321 5094 patch_prober.go:28] 
interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 08:08:34 crc kubenswrapper[5094]: I0220 08:08:34.108019 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 08:08:34 crc kubenswrapper[5094]: I0220 08:08:34.108084 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq"
Feb 20 08:08:34 crc kubenswrapper[5094]: I0220 08:08:34.108910 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 20 08:08:34 crc kubenswrapper[5094]: I0220 08:08:34.109009 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37" gracePeriod=600
Feb 20 08:08:34 crc kubenswrapper[5094]: E0220 08:08:34.246669 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:08:34 crc kubenswrapper[5094]: I0220 08:08:34.964624 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37" exitCode=0
Feb 20 08:08:34 crc kubenswrapper[5094]: I0220 08:08:34.964680 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"}
Feb 20 08:08:34 crc kubenswrapper[5094]: I0220 08:08:34.964748 5094 scope.go:117] "RemoveContainer" containerID="3143f70134d8a8406c6c494913fe42a5323bf1a21b5d094b1b86262c618b9759"
Feb 20 08:08:34 crc kubenswrapper[5094]: I0220 08:08:34.965327 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:08:34 crc kubenswrapper[5094]: E0220 08:08:34.965695 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.658071 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4wfxd"]
Feb 20 08:08:40 crc kubenswrapper[5094]: E0220 08:08:40.658841 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb372958-7c69-465a-b777-030494eb246a" containerName="extract-content"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.658860 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb372958-7c69-465a-b777-030494eb246a" containerName="extract-content"
Feb 20 08:08:40 crc kubenswrapper[5094]: E0220 08:08:40.658878 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb372958-7c69-465a-b777-030494eb246a" containerName="registry-server"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.658887 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb372958-7c69-465a-b777-030494eb246a" containerName="registry-server"
Feb 20 08:08:40 crc kubenswrapper[5094]: E0220 08:08:40.658905 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb372958-7c69-465a-b777-030494eb246a" containerName="extract-utilities"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.658914 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb372958-7c69-465a-b777-030494eb246a" containerName="extract-utilities"
Feb 20 08:08:40 crc kubenswrapper[5094]: E0220 08:08:40.658937 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48cd9c77-9518-4f8d-aae6-01f8cc109bbd" containerName="extract-utilities"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.658944 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="48cd9c77-9518-4f8d-aae6-01f8cc109bbd" containerName="extract-utilities"
Feb 20 08:08:40 crc kubenswrapper[5094]: E0220 08:08:40.658959 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48cd9c77-9518-4f8d-aae6-01f8cc109bbd" containerName="registry-server"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.658967 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="48cd9c77-9518-4f8d-aae6-01f8cc109bbd" containerName="registry-server"
Feb 20 08:08:40 crc kubenswrapper[5094]: E0220 08:08:40.658979 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48cd9c77-9518-4f8d-aae6-01f8cc109bbd" containerName="extract-content"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.658987 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="48cd9c77-9518-4f8d-aae6-01f8cc109bbd" containerName="extract-content"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.659177 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb372958-7c69-465a-b777-030494eb246a" containerName="registry-server"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.659196 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="48cd9c77-9518-4f8d-aae6-01f8cc109bbd" containerName="registry-server"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.660464 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4wfxd"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.664641 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4wfxd"]
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.729276 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7113bba-5e76-4cec-86ae-8b8f25962f3b-catalog-content\") pod \"redhat-operators-4wfxd\" (UID: \"d7113bba-5e76-4cec-86ae-8b8f25962f3b\") " pod="openshift-marketplace/redhat-operators-4wfxd"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.729344 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dww2x\" (UniqueName: \"kubernetes.io/projected/d7113bba-5e76-4cec-86ae-8b8f25962f3b-kube-api-access-dww2x\") pod \"redhat-operators-4wfxd\" (UID: \"d7113bba-5e76-4cec-86ae-8b8f25962f3b\") " pod="openshift-marketplace/redhat-operators-4wfxd"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.729405 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7113bba-5e76-4cec-86ae-8b8f25962f3b-utilities\") pod \"redhat-operators-4wfxd\" (UID: \"d7113bba-5e76-4cec-86ae-8b8f25962f3b\") " pod="openshift-marketplace/redhat-operators-4wfxd"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.830494 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7113bba-5e76-4cec-86ae-8b8f25962f3b-catalog-content\") pod \"redhat-operators-4wfxd\" (UID: \"d7113bba-5e76-4cec-86ae-8b8f25962f3b\") " pod="openshift-marketplace/redhat-operators-4wfxd"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.830581 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dww2x\" (UniqueName: \"kubernetes.io/projected/d7113bba-5e76-4cec-86ae-8b8f25962f3b-kube-api-access-dww2x\") pod \"redhat-operators-4wfxd\" (UID: \"d7113bba-5e76-4cec-86ae-8b8f25962f3b\") " pod="openshift-marketplace/redhat-operators-4wfxd"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.830648 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7113bba-5e76-4cec-86ae-8b8f25962f3b-utilities\") pod \"redhat-operators-4wfxd\" (UID: \"d7113bba-5e76-4cec-86ae-8b8f25962f3b\") " pod="openshift-marketplace/redhat-operators-4wfxd"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.831128 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7113bba-5e76-4cec-86ae-8b8f25962f3b-catalog-content\") pod \"redhat-operators-4wfxd\" (UID: \"d7113bba-5e76-4cec-86ae-8b8f25962f3b\") " pod="openshift-marketplace/redhat-operators-4wfxd"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.831201 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7113bba-5e76-4cec-86ae-8b8f25962f3b-utilities\") pod \"redhat-operators-4wfxd\" (UID: \"d7113bba-5e76-4cec-86ae-8b8f25962f3b\") " pod="openshift-marketplace/redhat-operators-4wfxd"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.851649 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dww2x\" (UniqueName: \"kubernetes.io/projected/d7113bba-5e76-4cec-86ae-8b8f25962f3b-kube-api-access-dww2x\") pod \"redhat-operators-4wfxd\" (UID: \"d7113bba-5e76-4cec-86ae-8b8f25962f3b\") " pod="openshift-marketplace/redhat-operators-4wfxd"
Feb 20 08:08:40 crc kubenswrapper[5094]: I0220 08:08:40.995430 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4wfxd"
Feb 20 08:08:41 crc kubenswrapper[5094]: I0220 08:08:41.403127 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4wfxd"]
Feb 20 08:08:42 crc kubenswrapper[5094]: I0220 08:08:42.017146 5094 generic.go:334] "Generic (PLEG): container finished" podID="d7113bba-5e76-4cec-86ae-8b8f25962f3b" containerID="ff0214f2a987535c275156e8107b886fcb2dde4a06f589fb68a02f6924af7145" exitCode=0
Feb 20 08:08:42 crc kubenswrapper[5094]: I0220 08:08:42.017241 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4wfxd" event={"ID":"d7113bba-5e76-4cec-86ae-8b8f25962f3b","Type":"ContainerDied","Data":"ff0214f2a987535c275156e8107b886fcb2dde4a06f589fb68a02f6924af7145"}
Feb 20 08:08:42 crc kubenswrapper[5094]: I0220 08:08:42.017462 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4wfxd" event={"ID":"d7113bba-5e76-4cec-86ae-8b8f25962f3b","Type":"ContainerStarted","Data":"ba3b2230d07fa163b3aed23f4cdc1e6ce956e10bfc139d6a0576b073cc675fdf"}
Feb 20 08:08:43 crc kubenswrapper[5094]: I0220 08:08:43.026314 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4wfxd" event={"ID":"d7113bba-5e76-4cec-86ae-8b8f25962f3b","Type":"ContainerStarted","Data":"2975c7d76bb965432618b15285614568496609a55569ca90adeefbf8ba05ea8c"}
Feb 20 08:08:44 crc kubenswrapper[5094]: I0220 08:08:44.037764 5094 generic.go:334] "Generic (PLEG): container finished" podID="d7113bba-5e76-4cec-86ae-8b8f25962f3b" containerID="2975c7d76bb965432618b15285614568496609a55569ca90adeefbf8ba05ea8c" exitCode=0
Feb 20 08:08:44 crc kubenswrapper[5094]: I0220 08:08:44.037869 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4wfxd" event={"ID":"d7113bba-5e76-4cec-86ae-8b8f25962f3b","Type":"ContainerDied","Data":"2975c7d76bb965432618b15285614568496609a55569ca90adeefbf8ba05ea8c"}
Feb 20 08:08:45 crc kubenswrapper[5094]: I0220 08:08:45.049632 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4wfxd" event={"ID":"d7113bba-5e76-4cec-86ae-8b8f25962f3b","Type":"ContainerStarted","Data":"cb1f0e86fa2d34c2c4dbdd909edb855c5e32410b104d7b14b39d10bbdc8eb2c2"}
Feb 20 08:08:47 crc kubenswrapper[5094]: I0220 08:08:47.841087 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:08:47 crc kubenswrapper[5094]: E0220 08:08:47.841641 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:08:50 crc kubenswrapper[5094]: I0220 08:08:50.996204 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4wfxd"
Feb 20 08:08:50 crc kubenswrapper[5094]: I0220 08:08:50.997826 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4wfxd"
Feb 20 08:08:52 crc kubenswrapper[5094]: I0220 08:08:52.041085 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4wfxd" podUID="d7113bba-5e76-4cec-86ae-8b8f25962f3b" containerName="registry-server" probeResult="failure" output=<
Feb 20 08:08:52 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s
Feb 20 08:08:52 crc kubenswrapper[5094]: >
Feb 20 08:09:01 crc kubenswrapper[5094]: I0220 08:09:01.068024 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4wfxd"
Feb 20 08:09:01 crc kubenswrapper[5094]: I0220 08:09:01.101259 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4wfxd" podStartSLOduration=18.420452936 podStartE2EDuration="21.101217469s" podCreationTimestamp="2026-02-20 08:08:40 +0000 UTC" firstStartedPulling="2026-02-20 08:08:42.018824699 +0000 UTC m=+4936.891451410" lastFinishedPulling="2026-02-20 08:08:44.699589202 +0000 UTC m=+4939.572215943" observedRunningTime="2026-02-20 08:08:45.071679719 +0000 UTC m=+4939.944306450" watchObservedRunningTime="2026-02-20 08:09:01.101217469 +0000 UTC m=+4955.973844180"
Feb 20 08:09:01 crc kubenswrapper[5094]: I0220 08:09:01.127481 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4wfxd"
Feb 20 08:09:01 crc kubenswrapper[5094]: I0220 08:09:01.303737 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4wfxd"]
Feb 20 08:09:02 crc kubenswrapper[5094]: I0220 08:09:02.168124 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4wfxd" podUID="d7113bba-5e76-4cec-86ae-8b8f25962f3b" containerName="registry-server" containerID="cri-o://cb1f0e86fa2d34c2c4dbdd909edb855c5e32410b104d7b14b39d10bbdc8eb2c2" gracePeriod=2
Feb 20 08:09:02 crc kubenswrapper[5094]: I0220 08:09:02.532881 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4wfxd"
Feb 20 08:09:02 crc kubenswrapper[5094]: I0220 08:09:02.653199 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dww2x\" (UniqueName: \"kubernetes.io/projected/d7113bba-5e76-4cec-86ae-8b8f25962f3b-kube-api-access-dww2x\") pod \"d7113bba-5e76-4cec-86ae-8b8f25962f3b\" (UID: \"d7113bba-5e76-4cec-86ae-8b8f25962f3b\") "
Feb 20 08:09:02 crc kubenswrapper[5094]: I0220 08:09:02.653281 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7113bba-5e76-4cec-86ae-8b8f25962f3b-catalog-content\") pod \"d7113bba-5e76-4cec-86ae-8b8f25962f3b\" (UID: \"d7113bba-5e76-4cec-86ae-8b8f25962f3b\") "
Feb 20 08:09:02 crc kubenswrapper[5094]: I0220 08:09:02.653347 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7113bba-5e76-4cec-86ae-8b8f25962f3b-utilities\") pod \"d7113bba-5e76-4cec-86ae-8b8f25962f3b\" (UID: \"d7113bba-5e76-4cec-86ae-8b8f25962f3b\") "
Feb 20 08:09:02 crc kubenswrapper[5094]: I0220 08:09:02.654435 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7113bba-5e76-4cec-86ae-8b8f25962f3b-utilities" (OuterVolumeSpecName: "utilities") pod "d7113bba-5e76-4cec-86ae-8b8f25962f3b" (UID: "d7113bba-5e76-4cec-86ae-8b8f25962f3b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 08:09:02 crc kubenswrapper[5094]: I0220 08:09:02.659291 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7113bba-5e76-4cec-86ae-8b8f25962f3b-kube-api-access-dww2x" (OuterVolumeSpecName: "kube-api-access-dww2x") pod "d7113bba-5e76-4cec-86ae-8b8f25962f3b" (UID: "d7113bba-5e76-4cec-86ae-8b8f25962f3b"). InnerVolumeSpecName "kube-api-access-dww2x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:09:02 crc kubenswrapper[5094]: I0220 08:09:02.756974 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dww2x\" (UniqueName: \"kubernetes.io/projected/d7113bba-5e76-4cec-86ae-8b8f25962f3b-kube-api-access-dww2x\") on node \"crc\" DevicePath \"\""
Feb 20 08:09:02 crc kubenswrapper[5094]: I0220 08:09:02.757018 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7113bba-5e76-4cec-86ae-8b8f25962f3b-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 08:09:02 crc kubenswrapper[5094]: I0220 08:09:02.798316 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7113bba-5e76-4cec-86ae-8b8f25962f3b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7113bba-5e76-4cec-86ae-8b8f25962f3b" (UID: "d7113bba-5e76-4cec-86ae-8b8f25962f3b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 08:09:02 crc kubenswrapper[5094]: I0220 08:09:02.840497 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:09:02 crc kubenswrapper[5094]: E0220 08:09:02.840914 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:09:02 crc kubenswrapper[5094]: I0220 08:09:02.858392 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7113bba-5e76-4cec-86ae-8b8f25962f3b-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.188371 5094 generic.go:334] "Generic (PLEG): container finished" podID="d7113bba-5e76-4cec-86ae-8b8f25962f3b" containerID="cb1f0e86fa2d34c2c4dbdd909edb855c5e32410b104d7b14b39d10bbdc8eb2c2" exitCode=0
Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.188440 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4wfxd" event={"ID":"d7113bba-5e76-4cec-86ae-8b8f25962f3b","Type":"ContainerDied","Data":"cb1f0e86fa2d34c2c4dbdd909edb855c5e32410b104d7b14b39d10bbdc8eb2c2"}
Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.188483 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4wfxd" event={"ID":"d7113bba-5e76-4cec-86ae-8b8f25962f3b","Type":"ContainerDied","Data":"ba3b2230d07fa163b3aed23f4cdc1e6ce956e10bfc139d6a0576b073cc675fdf"}
Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.188516 5094 scope.go:117] "RemoveContainer" containerID="cb1f0e86fa2d34c2c4dbdd909edb855c5e32410b104d7b14b39d10bbdc8eb2c2"
Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.188521 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4wfxd"
Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.217058 5094 scope.go:117] "RemoveContainer" containerID="2975c7d76bb965432618b15285614568496609a55569ca90adeefbf8ba05ea8c"
Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.235240 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4wfxd"]
Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.241433 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4wfxd"]
Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.361617 5094 scope.go:117] "RemoveContainer" containerID="ff0214f2a987535c275156e8107b886fcb2dde4a06f589fb68a02f6924af7145"
Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.385389 5094 scope.go:117] "RemoveContainer" containerID="cb1f0e86fa2d34c2c4dbdd909edb855c5e32410b104d7b14b39d10bbdc8eb2c2"
Feb 20 08:09:03 crc kubenswrapper[5094]: E0220 08:09:03.385926 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb1f0e86fa2d34c2c4dbdd909edb855c5e32410b104d7b14b39d10bbdc8eb2c2\": container with ID starting with cb1f0e86fa2d34c2c4dbdd909edb855c5e32410b104d7b14b39d10bbdc8eb2c2 not found: ID does not exist" containerID="cb1f0e86fa2d34c2c4dbdd909edb855c5e32410b104d7b14b39d10bbdc8eb2c2"
Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.385961 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb1f0e86fa2d34c2c4dbdd909edb855c5e32410b104d7b14b39d10bbdc8eb2c2"} err="failed to get container status \"cb1f0e86fa2d34c2c4dbdd909edb855c5e32410b104d7b14b39d10bbdc8eb2c2\": rpc error: code = NotFound desc = could not find container \"cb1f0e86fa2d34c2c4dbdd909edb855c5e32410b104d7b14b39d10bbdc8eb2c2\": container with ID starting with cb1f0e86fa2d34c2c4dbdd909edb855c5e32410b104d7b14b39d10bbdc8eb2c2 not found: ID does not exist"
Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.385983 5094 scope.go:117] "RemoveContainer" containerID="2975c7d76bb965432618b15285614568496609a55569ca90adeefbf8ba05ea8c"
Feb 20 08:09:03 crc kubenswrapper[5094]: E0220 08:09:03.386377 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2975c7d76bb965432618b15285614568496609a55569ca90adeefbf8ba05ea8c\": container with ID starting with 2975c7d76bb965432618b15285614568496609a55569ca90adeefbf8ba05ea8c not found: ID does not exist" containerID="2975c7d76bb965432618b15285614568496609a55569ca90adeefbf8ba05ea8c"
Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.386428 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2975c7d76bb965432618b15285614568496609a55569ca90adeefbf8ba05ea8c"} err="failed to get container status \"2975c7d76bb965432618b15285614568496609a55569ca90adeefbf8ba05ea8c\": rpc error: code = NotFound desc = could not find container \"2975c7d76bb965432618b15285614568496609a55569ca90adeefbf8ba05ea8c\": container with ID starting with 2975c7d76bb965432618b15285614568496609a55569ca90adeefbf8ba05ea8c not found: ID does not exist"
Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.386462 5094 scope.go:117] "RemoveContainer" containerID="ff0214f2a987535c275156e8107b886fcb2dde4a06f589fb68a02f6924af7145"
Feb 20 08:09:03 crc kubenswrapper[5094]: E0220 08:09:03.386809 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff0214f2a987535c275156e8107b886fcb2dde4a06f589fb68a02f6924af7145\": container with ID starting with ff0214f2a987535c275156e8107b886fcb2dde4a06f589fb68a02f6924af7145 not found: ID does not exist" containerID="ff0214f2a987535c275156e8107b886fcb2dde4a06f589fb68a02f6924af7145"
Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.386843 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff0214f2a987535c275156e8107b886fcb2dde4a06f589fb68a02f6924af7145"} err="failed to get container status \"ff0214f2a987535c275156e8107b886fcb2dde4a06f589fb68a02f6924af7145\": rpc error: code = NotFound desc = could not find container \"ff0214f2a987535c275156e8107b886fcb2dde4a06f589fb68a02f6924af7145\": container with ID starting with ff0214f2a987535c275156e8107b886fcb2dde4a06f589fb68a02f6924af7145 not found: ID does not exist"
Feb 20 08:09:03 crc kubenswrapper[5094]: I0220 08:09:03.848660 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7113bba-5e76-4cec-86ae-8b8f25962f3b" path="/var/lib/kubelet/pods/d7113bba-5e76-4cec-86ae-8b8f25962f3b/volumes"
Feb 20 08:09:13 crc kubenswrapper[5094]: I0220 08:09:13.840312 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:09:13 crc kubenswrapper[5094]: E0220 08:09:13.841207 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:09:27 crc kubenswrapper[5094]: I0220 08:09:27.841373 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:09:27 crc kubenswrapper[5094]: E0220 08:09:27.842356 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:09:41 crc kubenswrapper[5094]: I0220 08:09:41.841654 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:09:41 crc kubenswrapper[5094]: E0220 08:09:41.842982 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:09:52 crc kubenswrapper[5094]: I0220 08:09:52.840479 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:09:52 crc kubenswrapper[5094]: E0220 08:09:52.842261 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:10:07 crc kubenswrapper[5094]: I0220 08:10:07.840313 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:10:07 crc kubenswrapper[5094]: E0220 08:10:07.842519 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:10:22 crc kubenswrapper[5094]: I0220 08:10:22.842316 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:10:22 crc kubenswrapper[5094]: E0220 08:10:22.843666 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:10:37 crc kubenswrapper[5094]: I0220 08:10:37.840796 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:10:37 crc kubenswrapper[5094]: E0220 08:10:37.841877 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:10:49 crc kubenswrapper[5094]: I0220 08:10:49.840389 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:10:49 crc kubenswrapper[5094]: E0220 08:10:49.841870 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:11:01 crc kubenswrapper[5094]: I0220 08:11:01.840856 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:11:01 crc kubenswrapper[5094]: E0220 08:11:01.841676 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:11:14 crc kubenswrapper[5094]: I0220 08:11:14.840873 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:11:14 crc kubenswrapper[5094]: E0220 08:11:14.842003 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:11:25 crc kubenswrapper[5094]: I0220 08:11:25.844419 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:11:25 crc kubenswrapper[5094]: E0220 08:11:25.845846 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:11:38 crc kubenswrapper[5094]: I0220 08:11:38.841629 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:11:38 crc kubenswrapper[5094]: E0220 08:11:38.842686 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:11:52 crc kubenswrapper[5094]: I0220 08:11:52.840696 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:11:52 crc kubenswrapper[5094]: E0220 08:11:52.841308 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:12:06 crc kubenswrapper[5094]: I0220 08:12:06.840666 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:12:06 crc kubenswrapper[5094]: E0220 08:12:06.842035 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:12:17 crc kubenswrapper[5094]: I0220 08:12:17.840657 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:12:17 crc kubenswrapper[5094]: E0220 08:12:17.841652 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:12:29 crc kubenswrapper[5094]: I0220 08:12:29.840079 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:12:29 crc kubenswrapper[5094]: E0220 08:12:29.841692 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:12:41 crc kubenswrapper[5094]: I0220 08:12:41.839797 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:12:41 crc kubenswrapper[5094]: E0220 08:12:41.840374 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:12:54 crc kubenswrapper[5094]: I0220 08:12:54.840628 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:12:54 crc kubenswrapper[5094]: E0220 08:12:54.841365 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:13:05 crc kubenswrapper[5094]: I0220 08:13:05.848157 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:13:05 crc kubenswrapper[5094]: E0220 08:13:05.849066 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:13:16 crc kubenswrapper[5094]: I0220 08:13:16.841038 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:13:16 crc kubenswrapper[5094]: E0220 08:13:16.841855 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:13:27 crc kubenswrapper[5094]: I0220 08:13:27.841090 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:13:27 crc kubenswrapper[5094]: E0220 08:13:27.842443 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:13:41 crc kubenswrapper[5094]: I0220 08:13:41.840911 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:13:42 crc kubenswrapper[5094]: I0220 08:13:42.416503 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"97148134be230a037117359d1f168e80e119726d339f329bd4df40629ef0a2ca"}
Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.158222 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr"]
Feb 20 08:15:00 crc kubenswrapper[5094]: E0220 08:15:00.159541 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7113bba-5e76-4cec-86ae-8b8f25962f3b"
containerName="registry-server" Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.159569 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7113bba-5e76-4cec-86ae-8b8f25962f3b" containerName="registry-server" Feb 20 08:15:00 crc kubenswrapper[5094]: E0220 08:15:00.159597 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7113bba-5e76-4cec-86ae-8b8f25962f3b" containerName="extract-content" Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.159611 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7113bba-5e76-4cec-86ae-8b8f25962f3b" containerName="extract-content" Feb 20 08:15:00 crc kubenswrapper[5094]: E0220 08:15:00.159628 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7113bba-5e76-4cec-86ae-8b8f25962f3b" containerName="extract-utilities" Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.159645 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7113bba-5e76-4cec-86ae-8b8f25962f3b" containerName="extract-utilities" Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.161479 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7113bba-5e76-4cec-86ae-8b8f25962f3b" containerName="registry-server" Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.163174 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr" Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.167546 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.168006 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.195118 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr"] Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.338971 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b1b88d4-fc9b-465d-907e-7abf6c46c919-config-volume\") pod \"collect-profiles-29526255-vpwzr\" (UID: \"0b1b88d4-fc9b-465d-907e-7abf6c46c919\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr" Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.339023 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b1b88d4-fc9b-465d-907e-7abf6c46c919-secret-volume\") pod \"collect-profiles-29526255-vpwzr\" (UID: \"0b1b88d4-fc9b-465d-907e-7abf6c46c919\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr" Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.339130 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj6hx\" (UniqueName: \"kubernetes.io/projected/0b1b88d4-fc9b-465d-907e-7abf6c46c919-kube-api-access-hj6hx\") pod \"collect-profiles-29526255-vpwzr\" (UID: \"0b1b88d4-fc9b-465d-907e-7abf6c46c919\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr" Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.440041 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b1b88d4-fc9b-465d-907e-7abf6c46c919-config-volume\") pod \"collect-profiles-29526255-vpwzr\" (UID: \"0b1b88d4-fc9b-465d-907e-7abf6c46c919\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr" Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.440103 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b1b88d4-fc9b-465d-907e-7abf6c46c919-secret-volume\") pod \"collect-profiles-29526255-vpwzr\" (UID: \"0b1b88d4-fc9b-465d-907e-7abf6c46c919\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr" Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.440185 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj6hx\" (UniqueName: \"kubernetes.io/projected/0b1b88d4-fc9b-465d-907e-7abf6c46c919-kube-api-access-hj6hx\") pod \"collect-profiles-29526255-vpwzr\" (UID: \"0b1b88d4-fc9b-465d-907e-7abf6c46c919\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr" Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.441221 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b1b88d4-fc9b-465d-907e-7abf6c46c919-config-volume\") pod \"collect-profiles-29526255-vpwzr\" (UID: \"0b1b88d4-fc9b-465d-907e-7abf6c46c919\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr" Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.455072 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/0b1b88d4-fc9b-465d-907e-7abf6c46c919-secret-volume\") pod \"collect-profiles-29526255-vpwzr\" (UID: \"0b1b88d4-fc9b-465d-907e-7abf6c46c919\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr" Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.459325 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj6hx\" (UniqueName: \"kubernetes.io/projected/0b1b88d4-fc9b-465d-907e-7abf6c46c919-kube-api-access-hj6hx\") pod \"collect-profiles-29526255-vpwzr\" (UID: \"0b1b88d4-fc9b-465d-907e-7abf6c46c919\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr" Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.497204 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr" Feb 20 08:15:00 crc kubenswrapper[5094]: I0220 08:15:00.889715 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr"] Feb 20 08:15:01 crc kubenswrapper[5094]: I0220 08:15:01.047739 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr" event={"ID":"0b1b88d4-fc9b-465d-907e-7abf6c46c919","Type":"ContainerStarted","Data":"df01effbb937965fa2da8671ac9d455b13bf8b90f5ca8dcd5de48223dc6408d9"} Feb 20 08:15:01 crc kubenswrapper[5094]: I0220 08:15:01.047800 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr" event={"ID":"0b1b88d4-fc9b-465d-907e-7abf6c46c919","Type":"ContainerStarted","Data":"e6dce8367cd274fa2f1bf6b1f243c57fa20df7ec4137f484c2b7028850b4e915"} Feb 20 08:15:01 crc kubenswrapper[5094]: I0220 08:15:01.066035 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr" 
podStartSLOduration=1.065964497 podStartE2EDuration="1.065964497s" podCreationTimestamp="2026-02-20 08:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:15:01.062649327 +0000 UTC m=+5315.935276058" watchObservedRunningTime="2026-02-20 08:15:01.065964497 +0000 UTC m=+5315.938591218" Feb 20 08:15:02 crc kubenswrapper[5094]: I0220 08:15:02.055959 5094 generic.go:334] "Generic (PLEG): container finished" podID="0b1b88d4-fc9b-465d-907e-7abf6c46c919" containerID="df01effbb937965fa2da8671ac9d455b13bf8b90f5ca8dcd5de48223dc6408d9" exitCode=0 Feb 20 08:15:02 crc kubenswrapper[5094]: I0220 08:15:02.056073 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr" event={"ID":"0b1b88d4-fc9b-465d-907e-7abf6c46c919","Type":"ContainerDied","Data":"df01effbb937965fa2da8671ac9d455b13bf8b90f5ca8dcd5de48223dc6408d9"} Feb 20 08:15:03 crc kubenswrapper[5094]: I0220 08:15:03.358779 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr" Feb 20 08:15:03 crc kubenswrapper[5094]: I0220 08:15:03.389694 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hj6hx\" (UniqueName: \"kubernetes.io/projected/0b1b88d4-fc9b-465d-907e-7abf6c46c919-kube-api-access-hj6hx\") pod \"0b1b88d4-fc9b-465d-907e-7abf6c46c919\" (UID: \"0b1b88d4-fc9b-465d-907e-7abf6c46c919\") " Feb 20 08:15:03 crc kubenswrapper[5094]: I0220 08:15:03.389807 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b1b88d4-fc9b-465d-907e-7abf6c46c919-config-volume\") pod \"0b1b88d4-fc9b-465d-907e-7abf6c46c919\" (UID: \"0b1b88d4-fc9b-465d-907e-7abf6c46c919\") " Feb 20 08:15:03 crc kubenswrapper[5094]: I0220 08:15:03.389872 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b1b88d4-fc9b-465d-907e-7abf6c46c919-secret-volume\") pod \"0b1b88d4-fc9b-465d-907e-7abf6c46c919\" (UID: \"0b1b88d4-fc9b-465d-907e-7abf6c46c919\") " Feb 20 08:15:03 crc kubenswrapper[5094]: I0220 08:15:03.394283 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b1b88d4-fc9b-465d-907e-7abf6c46c919-config-volume" (OuterVolumeSpecName: "config-volume") pod "0b1b88d4-fc9b-465d-907e-7abf6c46c919" (UID: "0b1b88d4-fc9b-465d-907e-7abf6c46c919"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:15:03 crc kubenswrapper[5094]: I0220 08:15:03.399005 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b1b88d4-fc9b-465d-907e-7abf6c46c919-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0b1b88d4-fc9b-465d-907e-7abf6c46c919" (UID: "0b1b88d4-fc9b-465d-907e-7abf6c46c919"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:15:03 crc kubenswrapper[5094]: I0220 08:15:03.399080 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b1b88d4-fc9b-465d-907e-7abf6c46c919-kube-api-access-hj6hx" (OuterVolumeSpecName: "kube-api-access-hj6hx") pod "0b1b88d4-fc9b-465d-907e-7abf6c46c919" (UID: "0b1b88d4-fc9b-465d-907e-7abf6c46c919"). InnerVolumeSpecName "kube-api-access-hj6hx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:15:03 crc kubenswrapper[5094]: I0220 08:15:03.491496 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b1b88d4-fc9b-465d-907e-7abf6c46c919-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 08:15:03 crc kubenswrapper[5094]: I0220 08:15:03.491533 5094 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0b1b88d4-fc9b-465d-907e-7abf6c46c919-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 08:15:03 crc kubenswrapper[5094]: I0220 08:15:03.491543 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hj6hx\" (UniqueName: \"kubernetes.io/projected/0b1b88d4-fc9b-465d-907e-7abf6c46c919-kube-api-access-hj6hx\") on node \"crc\" DevicePath \"\"" Feb 20 08:15:04 crc kubenswrapper[5094]: I0220 08:15:04.073345 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr" event={"ID":"0b1b88d4-fc9b-465d-907e-7abf6c46c919","Type":"ContainerDied","Data":"e6dce8367cd274fa2f1bf6b1f243c57fa20df7ec4137f484c2b7028850b4e915"} Feb 20 08:15:04 crc kubenswrapper[5094]: I0220 08:15:04.073393 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6dce8367cd274fa2f1bf6b1f243c57fa20df7ec4137f484c2b7028850b4e915" Feb 20 08:15:04 crc kubenswrapper[5094]: I0220 08:15:04.073409 5094 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr" Feb 20 08:15:04 crc kubenswrapper[5094]: I0220 08:15:04.437276 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw"] Feb 20 08:15:04 crc kubenswrapper[5094]: I0220 08:15:04.441799 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526210-hwdfw"] Feb 20 08:15:05 crc kubenswrapper[5094]: I0220 08:15:05.853933 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d" path="/var/lib/kubelet/pods/e67c1ec2-44ec-4c4d-a3bf-8bc8ecc4460d/volumes" Feb 20 08:15:36 crc kubenswrapper[5094]: I0220 08:15:36.049262 5094 scope.go:117] "RemoveContainer" containerID="07ea8e807e5436859467c750ef51269844eba966788ab09e687b71868fdd8b31" Feb 20 08:16:04 crc kubenswrapper[5094]: I0220 08:16:04.107405 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:16:04 crc kubenswrapper[5094]: I0220 08:16:04.108088 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:16:12 crc kubenswrapper[5094]: I0220 08:16:12.067457 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-szndh"] Feb 20 08:16:12 crc kubenswrapper[5094]: E0220 08:16:12.068413 5094 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0b1b88d4-fc9b-465d-907e-7abf6c46c919" containerName="collect-profiles" Feb 20 08:16:12 crc kubenswrapper[5094]: I0220 08:16:12.068432 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b1b88d4-fc9b-465d-907e-7abf6c46c919" containerName="collect-profiles" Feb 20 08:16:12 crc kubenswrapper[5094]: I0220 08:16:12.068640 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b1b88d4-fc9b-465d-907e-7abf6c46c919" containerName="collect-profiles" Feb 20 08:16:12 crc kubenswrapper[5094]: I0220 08:16:12.069881 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-szndh" Feb 20 08:16:12 crc kubenswrapper[5094]: I0220 08:16:12.085092 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-szndh"] Feb 20 08:16:12 crc kubenswrapper[5094]: I0220 08:16:12.265102 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7mcm\" (UniqueName: \"kubernetes.io/projected/49693624-23e2-4579-a736-a6148ac00de5-kube-api-access-r7mcm\") pod \"community-operators-szndh\" (UID: \"49693624-23e2-4579-a736-a6148ac00de5\") " pod="openshift-marketplace/community-operators-szndh" Feb 20 08:16:12 crc kubenswrapper[5094]: I0220 08:16:12.265493 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49693624-23e2-4579-a736-a6148ac00de5-catalog-content\") pod \"community-operators-szndh\" (UID: \"49693624-23e2-4579-a736-a6148ac00de5\") " pod="openshift-marketplace/community-operators-szndh" Feb 20 08:16:12 crc kubenswrapper[5094]: I0220 08:16:12.265638 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49693624-23e2-4579-a736-a6148ac00de5-utilities\") pod \"community-operators-szndh\" (UID: 
\"49693624-23e2-4579-a736-a6148ac00de5\") " pod="openshift-marketplace/community-operators-szndh" Feb 20 08:16:12 crc kubenswrapper[5094]: I0220 08:16:12.366774 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7mcm\" (UniqueName: \"kubernetes.io/projected/49693624-23e2-4579-a736-a6148ac00de5-kube-api-access-r7mcm\") pod \"community-operators-szndh\" (UID: \"49693624-23e2-4579-a736-a6148ac00de5\") " pod="openshift-marketplace/community-operators-szndh" Feb 20 08:16:12 crc kubenswrapper[5094]: I0220 08:16:12.366842 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49693624-23e2-4579-a736-a6148ac00de5-catalog-content\") pod \"community-operators-szndh\" (UID: \"49693624-23e2-4579-a736-a6148ac00de5\") " pod="openshift-marketplace/community-operators-szndh" Feb 20 08:16:12 crc kubenswrapper[5094]: I0220 08:16:12.366865 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49693624-23e2-4579-a736-a6148ac00de5-utilities\") pod \"community-operators-szndh\" (UID: \"49693624-23e2-4579-a736-a6148ac00de5\") " pod="openshift-marketplace/community-operators-szndh" Feb 20 08:16:12 crc kubenswrapper[5094]: I0220 08:16:12.367282 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49693624-23e2-4579-a736-a6148ac00de5-utilities\") pod \"community-operators-szndh\" (UID: \"49693624-23e2-4579-a736-a6148ac00de5\") " pod="openshift-marketplace/community-operators-szndh" Feb 20 08:16:12 crc kubenswrapper[5094]: I0220 08:16:12.367384 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49693624-23e2-4579-a736-a6148ac00de5-catalog-content\") pod \"community-operators-szndh\" (UID: \"49693624-23e2-4579-a736-a6148ac00de5\") 
" pod="openshift-marketplace/community-operators-szndh" Feb 20 08:16:12 crc kubenswrapper[5094]: I0220 08:16:12.390633 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7mcm\" (UniqueName: \"kubernetes.io/projected/49693624-23e2-4579-a736-a6148ac00de5-kube-api-access-r7mcm\") pod \"community-operators-szndh\" (UID: \"49693624-23e2-4579-a736-a6148ac00de5\") " pod="openshift-marketplace/community-operators-szndh" Feb 20 08:16:12 crc kubenswrapper[5094]: I0220 08:16:12.688073 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-szndh" Feb 20 08:16:13 crc kubenswrapper[5094]: I0220 08:16:13.143474 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-szndh"] Feb 20 08:16:13 crc kubenswrapper[5094]: I0220 08:16:13.604434 5094 generic.go:334] "Generic (PLEG): container finished" podID="49693624-23e2-4579-a736-a6148ac00de5" containerID="5dda47894fb4b237543c3e5a0a710cb4e499545a2916d431ecc2e488db613d89" exitCode=0 Feb 20 08:16:13 crc kubenswrapper[5094]: I0220 08:16:13.604664 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szndh" event={"ID":"49693624-23e2-4579-a736-a6148ac00de5","Type":"ContainerDied","Data":"5dda47894fb4b237543c3e5a0a710cb4e499545a2916d431ecc2e488db613d89"} Feb 20 08:16:13 crc kubenswrapper[5094]: I0220 08:16:13.604690 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szndh" event={"ID":"49693624-23e2-4579-a736-a6148ac00de5","Type":"ContainerStarted","Data":"77800da7e8ad36fe3403f66f675a7ce8b8ae989d7a8a45320926ea3f87f9b807"} Feb 20 08:16:13 crc kubenswrapper[5094]: I0220 08:16:13.607427 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 08:16:14 crc kubenswrapper[5094]: I0220 08:16:14.620606 5094 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-szndh" event={"ID":"49693624-23e2-4579-a736-a6148ac00de5","Type":"ContainerStarted","Data":"9a98987ba55a83e82377e685996648b387e22f999290e5a3fb53a68cd8615111"} Feb 20 08:16:15 crc kubenswrapper[5094]: I0220 08:16:15.633376 5094 generic.go:334] "Generic (PLEG): container finished" podID="49693624-23e2-4579-a736-a6148ac00de5" containerID="9a98987ba55a83e82377e685996648b387e22f999290e5a3fb53a68cd8615111" exitCode=0 Feb 20 08:16:15 crc kubenswrapper[5094]: I0220 08:16:15.633446 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szndh" event={"ID":"49693624-23e2-4579-a736-a6148ac00de5","Type":"ContainerDied","Data":"9a98987ba55a83e82377e685996648b387e22f999290e5a3fb53a68cd8615111"} Feb 20 08:16:16 crc kubenswrapper[5094]: I0220 08:16:16.642524 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szndh" event={"ID":"49693624-23e2-4579-a736-a6148ac00de5","Type":"ContainerStarted","Data":"a332eb9c1cd6b287b06b452672f79b317ec070e1819028f5f277db0a8d2ddaa3"} Feb 20 08:16:16 crc kubenswrapper[5094]: I0220 08:16:16.669173 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-szndh" podStartSLOduration=2.217313269 podStartE2EDuration="4.66915261s" podCreationTimestamp="2026-02-20 08:16:12 +0000 UTC" firstStartedPulling="2026-02-20 08:16:13.607243041 +0000 UTC m=+5388.479869752" lastFinishedPulling="2026-02-20 08:16:16.059082382 +0000 UTC m=+5390.931709093" observedRunningTime="2026-02-20 08:16:16.662174583 +0000 UTC m=+5391.534801294" watchObservedRunningTime="2026-02-20 08:16:16.66915261 +0000 UTC m=+5391.541779321" Feb 20 08:16:19 crc kubenswrapper[5094]: I0220 08:16:19.490488 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2792j"] Feb 20 08:16:19 crc kubenswrapper[5094]: I0220 08:16:19.492778 5094 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2792j" Feb 20 08:16:19 crc kubenswrapper[5094]: I0220 08:16:19.508806 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2792j"] Feb 20 08:16:19 crc kubenswrapper[5094]: I0220 08:16:19.686021 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2jxb\" (UniqueName: \"kubernetes.io/projected/6b9a2484-7e04-40b5-aae9-895f1a450ad6-kube-api-access-x2jxb\") pod \"redhat-marketplace-2792j\" (UID: \"6b9a2484-7e04-40b5-aae9-895f1a450ad6\") " pod="openshift-marketplace/redhat-marketplace-2792j" Feb 20 08:16:19 crc kubenswrapper[5094]: I0220 08:16:19.686507 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2484-7e04-40b5-aae9-895f1a450ad6-catalog-content\") pod \"redhat-marketplace-2792j\" (UID: \"6b9a2484-7e04-40b5-aae9-895f1a450ad6\") " pod="openshift-marketplace/redhat-marketplace-2792j" Feb 20 08:16:19 crc kubenswrapper[5094]: I0220 08:16:19.686645 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2484-7e04-40b5-aae9-895f1a450ad6-utilities\") pod \"redhat-marketplace-2792j\" (UID: \"6b9a2484-7e04-40b5-aae9-895f1a450ad6\") " pod="openshift-marketplace/redhat-marketplace-2792j" Feb 20 08:16:19 crc kubenswrapper[5094]: I0220 08:16:19.788086 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2484-7e04-40b5-aae9-895f1a450ad6-utilities\") pod \"redhat-marketplace-2792j\" (UID: \"6b9a2484-7e04-40b5-aae9-895f1a450ad6\") " pod="openshift-marketplace/redhat-marketplace-2792j" Feb 20 08:16:19 crc kubenswrapper[5094]: I0220 08:16:19.788190 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2jxb\" (UniqueName: \"kubernetes.io/projected/6b9a2484-7e04-40b5-aae9-895f1a450ad6-kube-api-access-x2jxb\") pod \"redhat-marketplace-2792j\" (UID: \"6b9a2484-7e04-40b5-aae9-895f1a450ad6\") " pod="openshift-marketplace/redhat-marketplace-2792j"
Feb 20 08:16:19 crc kubenswrapper[5094]: I0220 08:16:19.788309 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2484-7e04-40b5-aae9-895f1a450ad6-catalog-content\") pod \"redhat-marketplace-2792j\" (UID: \"6b9a2484-7e04-40b5-aae9-895f1a450ad6\") " pod="openshift-marketplace/redhat-marketplace-2792j"
Feb 20 08:16:19 crc kubenswrapper[5094]: I0220 08:16:19.788631 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2484-7e04-40b5-aae9-895f1a450ad6-utilities\") pod \"redhat-marketplace-2792j\" (UID: \"6b9a2484-7e04-40b5-aae9-895f1a450ad6\") " pod="openshift-marketplace/redhat-marketplace-2792j"
Feb 20 08:16:19 crc kubenswrapper[5094]: I0220 08:16:19.788679 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2484-7e04-40b5-aae9-895f1a450ad6-catalog-content\") pod \"redhat-marketplace-2792j\" (UID: \"6b9a2484-7e04-40b5-aae9-895f1a450ad6\") " pod="openshift-marketplace/redhat-marketplace-2792j"
Feb 20 08:16:19 crc kubenswrapper[5094]: I0220 08:16:19.824174 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2jxb\" (UniqueName: \"kubernetes.io/projected/6b9a2484-7e04-40b5-aae9-895f1a450ad6-kube-api-access-x2jxb\") pod \"redhat-marketplace-2792j\" (UID: \"6b9a2484-7e04-40b5-aae9-895f1a450ad6\") " pod="openshift-marketplace/redhat-marketplace-2792j"
Feb 20 08:16:20 crc kubenswrapper[5094]: I0220 08:16:20.123648 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2792j"
Feb 20 08:16:20 crc kubenswrapper[5094]: I0220 08:16:20.397193 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2792j"]
Feb 20 08:16:20 crc kubenswrapper[5094]: W0220 08:16:20.399360 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b9a2484_7e04_40b5_aae9_895f1a450ad6.slice/crio-c44365c565049838dd503b609110293ae986c0140dec2e02acbc4c804bc33128 WatchSource:0}: Error finding container c44365c565049838dd503b609110293ae986c0140dec2e02acbc4c804bc33128: Status 404 returned error can't find the container with id c44365c565049838dd503b609110293ae986c0140dec2e02acbc4c804bc33128
Feb 20 08:16:20 crc kubenswrapper[5094]: I0220 08:16:20.669399 5094 generic.go:334] "Generic (PLEG): container finished" podID="6b9a2484-7e04-40b5-aae9-895f1a450ad6" containerID="279190da08b55a4cc036cbd0439b757ac2fbf4be0789e671813e3757e0274ed5" exitCode=0
Feb 20 08:16:20 crc kubenswrapper[5094]: I0220 08:16:20.669504 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2792j" event={"ID":"6b9a2484-7e04-40b5-aae9-895f1a450ad6","Type":"ContainerDied","Data":"279190da08b55a4cc036cbd0439b757ac2fbf4be0789e671813e3757e0274ed5"}
Feb 20 08:16:20 crc kubenswrapper[5094]: I0220 08:16:20.669778 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2792j" event={"ID":"6b9a2484-7e04-40b5-aae9-895f1a450ad6","Type":"ContainerStarted","Data":"c44365c565049838dd503b609110293ae986c0140dec2e02acbc4c804bc33128"}
Feb 20 08:16:21 crc kubenswrapper[5094]: I0220 08:16:21.680046 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2792j" event={"ID":"6b9a2484-7e04-40b5-aae9-895f1a450ad6","Type":"ContainerStarted","Data":"ed5402904916dc6afe4f8848a945c7421e6f0b70582de7500a05cdb6fc590e41"}
Feb 20 08:16:22 crc kubenswrapper[5094]: I0220 08:16:22.688163 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-szndh"
Feb 20 08:16:22 crc kubenswrapper[5094]: I0220 08:16:22.688422 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-szndh"
Feb 20 08:16:22 crc kubenswrapper[5094]: I0220 08:16:22.690972 5094 generic.go:334] "Generic (PLEG): container finished" podID="6b9a2484-7e04-40b5-aae9-895f1a450ad6" containerID="ed5402904916dc6afe4f8848a945c7421e6f0b70582de7500a05cdb6fc590e41" exitCode=0
Feb 20 08:16:22 crc kubenswrapper[5094]: I0220 08:16:22.691024 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2792j" event={"ID":"6b9a2484-7e04-40b5-aae9-895f1a450ad6","Type":"ContainerDied","Data":"ed5402904916dc6afe4f8848a945c7421e6f0b70582de7500a05cdb6fc590e41"}
Feb 20 08:16:22 crc kubenswrapper[5094]: I0220 08:16:22.772843 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-szndh"
Feb 20 08:16:23 crc kubenswrapper[5094]: I0220 08:16:23.702653 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2792j" event={"ID":"6b9a2484-7e04-40b5-aae9-895f1a450ad6","Type":"ContainerStarted","Data":"09d4e876a4b0323d6acc2f616735e6cf048edc258f42476d7cbf3371b7b69737"}
Feb 20 08:16:23 crc kubenswrapper[5094]: I0220 08:16:23.728191 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2792j" podStartSLOduration=2.3216876859999998 podStartE2EDuration="4.728166227s" podCreationTimestamp="2026-02-20 08:16:19 +0000 UTC" firstStartedPulling="2026-02-20 08:16:20.672063378 +0000 UTC m=+5395.544690089" lastFinishedPulling="2026-02-20 08:16:23.078541909 +0000 UTC m=+5397.951168630" observedRunningTime="2026-02-20 08:16:23.722936122 +0000 UTC m=+5398.595562873" watchObservedRunningTime="2026-02-20 08:16:23.728166227 +0000 UTC m=+5398.600792948"
Feb 20 08:16:23 crc kubenswrapper[5094]: I0220 08:16:23.766555 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-szndh"
Feb 20 08:16:25 crc kubenswrapper[5094]: I0220 08:16:25.039943 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-szndh"]
Feb 20 08:16:26 crc kubenswrapper[5094]: I0220 08:16:26.732485 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-szndh" podUID="49693624-23e2-4579-a736-a6148ac00de5" containerName="registry-server" containerID="cri-o://a332eb9c1cd6b287b06b452672f79b317ec070e1819028f5f277db0a8d2ddaa3" gracePeriod=2
Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.188779 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-szndh"
Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.328401 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7mcm\" (UniqueName: \"kubernetes.io/projected/49693624-23e2-4579-a736-a6148ac00de5-kube-api-access-r7mcm\") pod \"49693624-23e2-4579-a736-a6148ac00de5\" (UID: \"49693624-23e2-4579-a736-a6148ac00de5\") "
Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.328751 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49693624-23e2-4579-a736-a6148ac00de5-catalog-content\") pod \"49693624-23e2-4579-a736-a6148ac00de5\" (UID: \"49693624-23e2-4579-a736-a6148ac00de5\") "
Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.328951 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49693624-23e2-4579-a736-a6148ac00de5-utilities\") pod \"49693624-23e2-4579-a736-a6148ac00de5\" (UID: \"49693624-23e2-4579-a736-a6148ac00de5\") "
Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.329685 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49693624-23e2-4579-a736-a6148ac00de5-utilities" (OuterVolumeSpecName: "utilities") pod "49693624-23e2-4579-a736-a6148ac00de5" (UID: "49693624-23e2-4579-a736-a6148ac00de5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.337961 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49693624-23e2-4579-a736-a6148ac00de5-kube-api-access-r7mcm" (OuterVolumeSpecName: "kube-api-access-r7mcm") pod "49693624-23e2-4579-a736-a6148ac00de5" (UID: "49693624-23e2-4579-a736-a6148ac00de5"). InnerVolumeSpecName "kube-api-access-r7mcm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.386010 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49693624-23e2-4579-a736-a6148ac00de5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49693624-23e2-4579-a736-a6148ac00de5" (UID: "49693624-23e2-4579-a736-a6148ac00de5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.430968 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49693624-23e2-4579-a736-a6148ac00de5-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.431008 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7mcm\" (UniqueName: \"kubernetes.io/projected/49693624-23e2-4579-a736-a6148ac00de5-kube-api-access-r7mcm\") on node \"crc\" DevicePath \"\""
Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.431020 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49693624-23e2-4579-a736-a6148ac00de5-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.750042 5094 generic.go:334] "Generic (PLEG): container finished" podID="49693624-23e2-4579-a736-a6148ac00de5" containerID="a332eb9c1cd6b287b06b452672f79b317ec070e1819028f5f277db0a8d2ddaa3" exitCode=0
Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.750122 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szndh" event={"ID":"49693624-23e2-4579-a736-a6148ac00de5","Type":"ContainerDied","Data":"a332eb9c1cd6b287b06b452672f79b317ec070e1819028f5f277db0a8d2ddaa3"}
Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.750149 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-szndh"
Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.750173 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szndh" event={"ID":"49693624-23e2-4579-a736-a6148ac00de5","Type":"ContainerDied","Data":"77800da7e8ad36fe3403f66f675a7ce8b8ae989d7a8a45320926ea3f87f9b807"}
Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.750211 5094 scope.go:117] "RemoveContainer" containerID="a332eb9c1cd6b287b06b452672f79b317ec070e1819028f5f277db0a8d2ddaa3"
Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.789315 5094 scope.go:117] "RemoveContainer" containerID="9a98987ba55a83e82377e685996648b387e22f999290e5a3fb53a68cd8615111"
Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.795216 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-szndh"]
Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.800528 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-szndh"]
Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.813277 5094 scope.go:117] "RemoveContainer" containerID="5dda47894fb4b237543c3e5a0a710cb4e499545a2916d431ecc2e488db613d89"
Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.833230 5094 scope.go:117] "RemoveContainer" containerID="a332eb9c1cd6b287b06b452672f79b317ec070e1819028f5f277db0a8d2ddaa3"
Feb 20 08:16:27 crc kubenswrapper[5094]: E0220 08:16:27.833723 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a332eb9c1cd6b287b06b452672f79b317ec070e1819028f5f277db0a8d2ddaa3\": container with ID starting with a332eb9c1cd6b287b06b452672f79b317ec070e1819028f5f277db0a8d2ddaa3 not found: ID does not exist" containerID="a332eb9c1cd6b287b06b452672f79b317ec070e1819028f5f277db0a8d2ddaa3"
Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.833773 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a332eb9c1cd6b287b06b452672f79b317ec070e1819028f5f277db0a8d2ddaa3"} err="failed to get container status \"a332eb9c1cd6b287b06b452672f79b317ec070e1819028f5f277db0a8d2ddaa3\": rpc error: code = NotFound desc = could not find container \"a332eb9c1cd6b287b06b452672f79b317ec070e1819028f5f277db0a8d2ddaa3\": container with ID starting with a332eb9c1cd6b287b06b452672f79b317ec070e1819028f5f277db0a8d2ddaa3 not found: ID does not exist"
Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.833821 5094 scope.go:117] "RemoveContainer" containerID="9a98987ba55a83e82377e685996648b387e22f999290e5a3fb53a68cd8615111"
Feb 20 08:16:27 crc kubenswrapper[5094]: E0220 08:16:27.834203 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a98987ba55a83e82377e685996648b387e22f999290e5a3fb53a68cd8615111\": container with ID starting with 9a98987ba55a83e82377e685996648b387e22f999290e5a3fb53a68cd8615111 not found: ID does not exist" containerID="9a98987ba55a83e82377e685996648b387e22f999290e5a3fb53a68cd8615111"
Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.834241 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a98987ba55a83e82377e685996648b387e22f999290e5a3fb53a68cd8615111"} err="failed to get container status \"9a98987ba55a83e82377e685996648b387e22f999290e5a3fb53a68cd8615111\": rpc error: code = NotFound desc = could not find container \"9a98987ba55a83e82377e685996648b387e22f999290e5a3fb53a68cd8615111\": container with ID starting with 9a98987ba55a83e82377e685996648b387e22f999290e5a3fb53a68cd8615111 not found: ID does not exist"
Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.834266 5094 scope.go:117] "RemoveContainer" containerID="5dda47894fb4b237543c3e5a0a710cb4e499545a2916d431ecc2e488db613d89"
Feb 20 08:16:27 crc kubenswrapper[5094]: E0220 08:16:27.834590 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dda47894fb4b237543c3e5a0a710cb4e499545a2916d431ecc2e488db613d89\": container with ID starting with 5dda47894fb4b237543c3e5a0a710cb4e499545a2916d431ecc2e488db613d89 not found: ID does not exist" containerID="5dda47894fb4b237543c3e5a0a710cb4e499545a2916d431ecc2e488db613d89"
Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.834625 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dda47894fb4b237543c3e5a0a710cb4e499545a2916d431ecc2e488db613d89"} err="failed to get container status \"5dda47894fb4b237543c3e5a0a710cb4e499545a2916d431ecc2e488db613d89\": rpc error: code = NotFound desc = could not find container \"5dda47894fb4b237543c3e5a0a710cb4e499545a2916d431ecc2e488db613d89\": container with ID starting with 5dda47894fb4b237543c3e5a0a710cb4e499545a2916d431ecc2e488db613d89 not found: ID does not exist"
Feb 20 08:16:27 crc kubenswrapper[5094]: I0220 08:16:27.849583 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49693624-23e2-4579-a736-a6148ac00de5" path="/var/lib/kubelet/pods/49693624-23e2-4579-a736-a6148ac00de5/volumes"
Feb 20 08:16:30 crc kubenswrapper[5094]: I0220 08:16:30.124937 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2792j"
Feb 20 08:16:30 crc kubenswrapper[5094]: I0220 08:16:30.125065 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2792j"
Feb 20 08:16:30 crc kubenswrapper[5094]: I0220 08:16:30.166550 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2792j"
Feb 20 08:16:30 crc kubenswrapper[5094]: I0220 08:16:30.835474 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2792j"
Feb 20 08:16:31 crc kubenswrapper[5094]: I0220 08:16:31.043743 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2792j"]
Feb 20 08:16:32 crc kubenswrapper[5094]: I0220 08:16:32.792858 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2792j" podUID="6b9a2484-7e04-40b5-aae9-895f1a450ad6" containerName="registry-server" containerID="cri-o://09d4e876a4b0323d6acc2f616735e6cf048edc258f42476d7cbf3371b7b69737" gracePeriod=2
Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.248819 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2792j"
Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.426529 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2jxb\" (UniqueName: \"kubernetes.io/projected/6b9a2484-7e04-40b5-aae9-895f1a450ad6-kube-api-access-x2jxb\") pod \"6b9a2484-7e04-40b5-aae9-895f1a450ad6\" (UID: \"6b9a2484-7e04-40b5-aae9-895f1a450ad6\") "
Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.426613 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2484-7e04-40b5-aae9-895f1a450ad6-utilities\") pod \"6b9a2484-7e04-40b5-aae9-895f1a450ad6\" (UID: \"6b9a2484-7e04-40b5-aae9-895f1a450ad6\") "
Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.426651 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2484-7e04-40b5-aae9-895f1a450ad6-catalog-content\") pod \"6b9a2484-7e04-40b5-aae9-895f1a450ad6\" (UID: \"6b9a2484-7e04-40b5-aae9-895f1a450ad6\") "
Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.427798 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b9a2484-7e04-40b5-aae9-895f1a450ad6-utilities" (OuterVolumeSpecName: "utilities") pod "6b9a2484-7e04-40b5-aae9-895f1a450ad6" (UID: "6b9a2484-7e04-40b5-aae9-895f1a450ad6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.433318 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b9a2484-7e04-40b5-aae9-895f1a450ad6-kube-api-access-x2jxb" (OuterVolumeSpecName: "kube-api-access-x2jxb") pod "6b9a2484-7e04-40b5-aae9-895f1a450ad6" (UID: "6b9a2484-7e04-40b5-aae9-895f1a450ad6"). InnerVolumeSpecName "kube-api-access-x2jxb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.457831 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b9a2484-7e04-40b5-aae9-895f1a450ad6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b9a2484-7e04-40b5-aae9-895f1a450ad6" (UID: "6b9a2484-7e04-40b5-aae9-895f1a450ad6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.528787 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2jxb\" (UniqueName: \"kubernetes.io/projected/6b9a2484-7e04-40b5-aae9-895f1a450ad6-kube-api-access-x2jxb\") on node \"crc\" DevicePath \"\""
Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.528830 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2484-7e04-40b5-aae9-895f1a450ad6-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.528840 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b9a2484-7e04-40b5-aae9-895f1a450ad6-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.804605 5094 generic.go:334] "Generic (PLEG): container finished" podID="6b9a2484-7e04-40b5-aae9-895f1a450ad6" containerID="09d4e876a4b0323d6acc2f616735e6cf048edc258f42476d7cbf3371b7b69737" exitCode=0
Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.804639 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2792j"
Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.804655 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2792j" event={"ID":"6b9a2484-7e04-40b5-aae9-895f1a450ad6","Type":"ContainerDied","Data":"09d4e876a4b0323d6acc2f616735e6cf048edc258f42476d7cbf3371b7b69737"}
Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.805120 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2792j" event={"ID":"6b9a2484-7e04-40b5-aae9-895f1a450ad6","Type":"ContainerDied","Data":"c44365c565049838dd503b609110293ae986c0140dec2e02acbc4c804bc33128"}
Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.805142 5094 scope.go:117] "RemoveContainer" containerID="09d4e876a4b0323d6acc2f616735e6cf048edc258f42476d7cbf3371b7b69737"
Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.825607 5094 scope.go:117] "RemoveContainer" containerID="ed5402904916dc6afe4f8848a945c7421e6f0b70582de7500a05cdb6fc590e41"
Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.868834 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2792j"]
Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.868891 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2792j"]
Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.872831 5094 scope.go:117] "RemoveContainer" containerID="279190da08b55a4cc036cbd0439b757ac2fbf4be0789e671813e3757e0274ed5"
Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.890218 5094 scope.go:117] "RemoveContainer" containerID="09d4e876a4b0323d6acc2f616735e6cf048edc258f42476d7cbf3371b7b69737"
Feb 20 08:16:33 crc kubenswrapper[5094]: E0220 08:16:33.890862 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09d4e876a4b0323d6acc2f616735e6cf048edc258f42476d7cbf3371b7b69737\": container with ID starting with 09d4e876a4b0323d6acc2f616735e6cf048edc258f42476d7cbf3371b7b69737 not found: ID does not exist" containerID="09d4e876a4b0323d6acc2f616735e6cf048edc258f42476d7cbf3371b7b69737"
Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.890908 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09d4e876a4b0323d6acc2f616735e6cf048edc258f42476d7cbf3371b7b69737"} err="failed to get container status \"09d4e876a4b0323d6acc2f616735e6cf048edc258f42476d7cbf3371b7b69737\": rpc error: code = NotFound desc = could not find container \"09d4e876a4b0323d6acc2f616735e6cf048edc258f42476d7cbf3371b7b69737\": container with ID starting with 09d4e876a4b0323d6acc2f616735e6cf048edc258f42476d7cbf3371b7b69737 not found: ID does not exist"
Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.890936 5094 scope.go:117] "RemoveContainer" containerID="ed5402904916dc6afe4f8848a945c7421e6f0b70582de7500a05cdb6fc590e41"
Feb 20 08:16:33 crc kubenswrapper[5094]: E0220 08:16:33.891360 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed5402904916dc6afe4f8848a945c7421e6f0b70582de7500a05cdb6fc590e41\": container with ID starting with ed5402904916dc6afe4f8848a945c7421e6f0b70582de7500a05cdb6fc590e41 not found: ID does not exist" containerID="ed5402904916dc6afe4f8848a945c7421e6f0b70582de7500a05cdb6fc590e41"
Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.891469 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed5402904916dc6afe4f8848a945c7421e6f0b70582de7500a05cdb6fc590e41"} err="failed to get container status \"ed5402904916dc6afe4f8848a945c7421e6f0b70582de7500a05cdb6fc590e41\": rpc error: code = NotFound desc = could not find container \"ed5402904916dc6afe4f8848a945c7421e6f0b70582de7500a05cdb6fc590e41\": container with ID starting with ed5402904916dc6afe4f8848a945c7421e6f0b70582de7500a05cdb6fc590e41 not found: ID does not exist"
Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.891568 5094 scope.go:117] "RemoveContainer" containerID="279190da08b55a4cc036cbd0439b757ac2fbf4be0789e671813e3757e0274ed5"
Feb 20 08:16:33 crc kubenswrapper[5094]: E0220 08:16:33.892087 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"279190da08b55a4cc036cbd0439b757ac2fbf4be0789e671813e3757e0274ed5\": container with ID starting with 279190da08b55a4cc036cbd0439b757ac2fbf4be0789e671813e3757e0274ed5 not found: ID does not exist" containerID="279190da08b55a4cc036cbd0439b757ac2fbf4be0789e671813e3757e0274ed5"
Feb 20 08:16:33 crc kubenswrapper[5094]: I0220 08:16:33.892114 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"279190da08b55a4cc036cbd0439b757ac2fbf4be0789e671813e3757e0274ed5"} err="failed to get container status \"279190da08b55a4cc036cbd0439b757ac2fbf4be0789e671813e3757e0274ed5\": rpc error: code = NotFound desc = could not find container \"279190da08b55a4cc036cbd0439b757ac2fbf4be0789e671813e3757e0274ed5\": container with ID starting with 279190da08b55a4cc036cbd0439b757ac2fbf4be0789e671813e3757e0274ed5 not found: ID does not exist"
Feb 20 08:16:34 crc kubenswrapper[5094]: I0220 08:16:34.107216 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 08:16:34 crc kubenswrapper[5094]: I0220 08:16:34.107636 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 08:16:35 crc kubenswrapper[5094]: I0220 08:16:35.858264 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b9a2484-7e04-40b5-aae9-895f1a450ad6" path="/var/lib/kubelet/pods/6b9a2484-7e04-40b5-aae9-895f1a450ad6/volumes"
Feb 20 08:17:04 crc kubenswrapper[5094]: I0220 08:17:04.106849 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 08:17:04 crc kubenswrapper[5094]: I0220 08:17:04.107518 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 08:17:04 crc kubenswrapper[5094]: I0220 08:17:04.107691 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq"
Feb 20 08:17:04 crc kubenswrapper[5094]: I0220 08:17:04.108389 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"97148134be230a037117359d1f168e80e119726d339f329bd4df40629ef0a2ca"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 20 08:17:04 crc kubenswrapper[5094]: I0220 08:17:04.108457 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://97148134be230a037117359d1f168e80e119726d339f329bd4df40629ef0a2ca" gracePeriod=600
Feb 20 08:17:05 crc kubenswrapper[5094]: I0220 08:17:05.063696 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="97148134be230a037117359d1f168e80e119726d339f329bd4df40629ef0a2ca" exitCode=0
Feb 20 08:17:05 crc kubenswrapper[5094]: I0220 08:17:05.063799 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"97148134be230a037117359d1f168e80e119726d339f329bd4df40629ef0a2ca"}
Feb 20 08:17:05 crc kubenswrapper[5094]: I0220 08:17:05.064274 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda"}
Feb 20 08:17:05 crc kubenswrapper[5094]: I0220 08:17:05.064300 5094 scope.go:117] "RemoveContainer" containerID="ad5e17030872a06d07762c5b75de41275371eaf6dafd06c037aa302c6b413f37"
Feb 20 08:17:29 crc kubenswrapper[5094]: I0220 08:17:29.838477 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jxdj4"]
Feb 20 08:17:29 crc kubenswrapper[5094]: E0220 08:17:29.839427 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49693624-23e2-4579-a736-a6148ac00de5" containerName="registry-server"
Feb 20 08:17:29 crc kubenswrapper[5094]: I0220 08:17:29.839443 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="49693624-23e2-4579-a736-a6148ac00de5" containerName="registry-server"
Feb 20 08:17:29 crc kubenswrapper[5094]: E0220 08:17:29.839466 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b9a2484-7e04-40b5-aae9-895f1a450ad6" containerName="extract-utilities"
Feb 20 08:17:29 crc kubenswrapper[5094]: I0220 08:17:29.839475 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9a2484-7e04-40b5-aae9-895f1a450ad6" containerName="extract-utilities"
Feb 20 08:17:29 crc kubenswrapper[5094]: E0220 08:17:29.839486 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b9a2484-7e04-40b5-aae9-895f1a450ad6" containerName="extract-content"
Feb 20 08:17:29 crc kubenswrapper[5094]: I0220 08:17:29.839496 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9a2484-7e04-40b5-aae9-895f1a450ad6" containerName="extract-content"
Feb 20 08:17:29 crc kubenswrapper[5094]: E0220 08:17:29.839511 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49693624-23e2-4579-a736-a6148ac00de5" containerName="extract-utilities"
Feb 20 08:17:29 crc kubenswrapper[5094]: I0220 08:17:29.839520 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="49693624-23e2-4579-a736-a6148ac00de5" containerName="extract-utilities"
Feb 20 08:17:29 crc kubenswrapper[5094]: E0220 08:17:29.839541 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49693624-23e2-4579-a736-a6148ac00de5" containerName="extract-content"
Feb 20 08:17:29 crc kubenswrapper[5094]: I0220 08:17:29.839550 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="49693624-23e2-4579-a736-a6148ac00de5" containerName="extract-content"
Feb 20 08:17:29 crc kubenswrapper[5094]: E0220 08:17:29.839567 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b9a2484-7e04-40b5-aae9-895f1a450ad6" containerName="registry-server"
Feb 20 08:17:29 crc kubenswrapper[5094]: I0220 08:17:29.839575 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9a2484-7e04-40b5-aae9-895f1a450ad6" containerName="registry-server"
Feb 20 08:17:29 crc kubenswrapper[5094]: I0220 08:17:29.839776 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b9a2484-7e04-40b5-aae9-895f1a450ad6" containerName="registry-server"
Feb 20 08:17:29 crc kubenswrapper[5094]: I0220 08:17:29.839798 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="49693624-23e2-4579-a736-a6148ac00de5" containerName="registry-server"
Feb 20 08:17:29 crc kubenswrapper[5094]: I0220 08:17:29.841054 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jxdj4"
Feb 20 08:17:29 crc kubenswrapper[5094]: I0220 08:17:29.879923 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jxdj4"]
Feb 20 08:17:30 crc kubenswrapper[5094]: I0220 08:17:30.026746 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64f8c64c-9964-45bc-a6f5-b588e04962e1-catalog-content\") pod \"certified-operators-jxdj4\" (UID: \"64f8c64c-9964-45bc-a6f5-b588e04962e1\") " pod="openshift-marketplace/certified-operators-jxdj4"
Feb 20 08:17:30 crc kubenswrapper[5094]: I0220 08:17:30.026818 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfsrb\" (UniqueName: \"kubernetes.io/projected/64f8c64c-9964-45bc-a6f5-b588e04962e1-kube-api-access-xfsrb\") pod \"certified-operators-jxdj4\" (UID: \"64f8c64c-9964-45bc-a6f5-b588e04962e1\") " pod="openshift-marketplace/certified-operators-jxdj4"
Feb 20 08:17:30 crc kubenswrapper[5094]: I0220 08:17:30.026891 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64f8c64c-9964-45bc-a6f5-b588e04962e1-utilities\") pod \"certified-operators-jxdj4\" (UID: \"64f8c64c-9964-45bc-a6f5-b588e04962e1\") " pod="openshift-marketplace/certified-operators-jxdj4"
Feb 20 08:17:30 crc kubenswrapper[5094]: I0220 08:17:30.127879 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64f8c64c-9964-45bc-a6f5-b588e04962e1-catalog-content\") pod \"certified-operators-jxdj4\" (UID: \"64f8c64c-9964-45bc-a6f5-b588e04962e1\") " pod="openshift-marketplace/certified-operators-jxdj4"
Feb 20 08:17:30 crc kubenswrapper[5094]: I0220 08:17:30.127924 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfsrb\" (UniqueName: \"kubernetes.io/projected/64f8c64c-9964-45bc-a6f5-b588e04962e1-kube-api-access-xfsrb\") pod \"certified-operators-jxdj4\" (UID: \"64f8c64c-9964-45bc-a6f5-b588e04962e1\") " pod="openshift-marketplace/certified-operators-jxdj4"
Feb 20 08:17:30 crc kubenswrapper[5094]: I0220 08:17:30.127961 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64f8c64c-9964-45bc-a6f5-b588e04962e1-utilities\") pod \"certified-operators-jxdj4\" (UID: \"64f8c64c-9964-45bc-a6f5-b588e04962e1\") " pod="openshift-marketplace/certified-operators-jxdj4"
Feb 20 08:17:30 crc kubenswrapper[5094]: I0220 08:17:30.128493 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64f8c64c-9964-45bc-a6f5-b588e04962e1-utilities\") pod \"certified-operators-jxdj4\" (UID: \"64f8c64c-9964-45bc-a6f5-b588e04962e1\") " pod="openshift-marketplace/certified-operators-jxdj4"
Feb 20 08:17:30 crc kubenswrapper[5094]: I0220 08:17:30.128575 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64f8c64c-9964-45bc-a6f5-b588e04962e1-catalog-content\") pod \"certified-operators-jxdj4\" (UID: \"64f8c64c-9964-45bc-a6f5-b588e04962e1\") " pod="openshift-marketplace/certified-operators-jxdj4"
Feb 20 08:17:30 crc kubenswrapper[5094]: I0220 08:17:30.146730 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfsrb\" (UniqueName: \"kubernetes.io/projected/64f8c64c-9964-45bc-a6f5-b588e04962e1-kube-api-access-xfsrb\") pod \"certified-operators-jxdj4\" (UID: \"64f8c64c-9964-45bc-a6f5-b588e04962e1\") " pod="openshift-marketplace/certified-operators-jxdj4"
Feb 20 08:17:30 crc kubenswrapper[5094]: I0220 08:17:30.192599 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jxdj4"
Feb 20 08:17:30 crc kubenswrapper[5094]: I0220 08:17:30.480212 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jxdj4"]
Feb 20 08:17:31 crc kubenswrapper[5094]: I0220 08:17:31.319915 5094 generic.go:334] "Generic (PLEG): container finished" podID="64f8c64c-9964-45bc-a6f5-b588e04962e1" containerID="ae2104091bc469a43dc0c9748f7fbc6439f754633daaafdf4ea0867dbc748771" exitCode=0
Feb 20 08:17:31 crc kubenswrapper[5094]: I0220 08:17:31.319955 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxdj4" event={"ID":"64f8c64c-9964-45bc-a6f5-b588e04962e1","Type":"ContainerDied","Data":"ae2104091bc469a43dc0c9748f7fbc6439f754633daaafdf4ea0867dbc748771"}
Feb 20 08:17:31 crc kubenswrapper[5094]: I0220 08:17:31.319979 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxdj4" event={"ID":"64f8c64c-9964-45bc-a6f5-b588e04962e1","Type":"ContainerStarted","Data":"487dc11d31f6ce82e7ae2d892e6eff95fc315f0bc93c3d1894f98a07ce7a3066"}
Feb 20 08:17:32 crc kubenswrapper[5094]: I0220 08:17:32.328675 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxdj4" event={"ID":"64f8c64c-9964-45bc-a6f5-b588e04962e1","Type":"ContainerStarted","Data":"d702aa8f9bd4af18ff756018b88b1c6acbb6b891749b37833177f16791ffd076"}
Feb 20 08:17:33 crc kubenswrapper[5094]: I0220 08:17:33.342248 5094 generic.go:334] "Generic (PLEG): container finished" podID="64f8c64c-9964-45bc-a6f5-b588e04962e1" containerID="d702aa8f9bd4af18ff756018b88b1c6acbb6b891749b37833177f16791ffd076" exitCode=0
Feb 20 08:17:33 crc kubenswrapper[5094]: I0220 08:17:33.342320 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxdj4" event={"ID":"64f8c64c-9964-45bc-a6f5-b588e04962e1","Type":"ContainerDied","Data":"d702aa8f9bd4af18ff756018b88b1c6acbb6b891749b37833177f16791ffd076"}
Feb 20 08:17:34 crc kubenswrapper[5094]: I0220 08:17:34.354338 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxdj4" event={"ID":"64f8c64c-9964-45bc-a6f5-b588e04962e1","Type":"ContainerStarted","Data":"8fa1b2b785229a8ca47d1d3cbbad1359e5364208bd831802e973c24ee5e70bbb"}
Feb 20 08:17:34 crc kubenswrapper[5094]: I0220 08:17:34.382744 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jxdj4" podStartSLOduration=2.958839541 podStartE2EDuration="5.38268603s" podCreationTimestamp="2026-02-20 08:17:29 +0000 UTC" firstStartedPulling="2026-02-20 08:17:31.321361204 +0000 UTC m=+5466.193987915" lastFinishedPulling="2026-02-20 08:17:33.745207653 +0000 UTC m=+5468.617834404" observedRunningTime="2026-02-20 08:17:34.37561888 +0000 UTC m=+5469.248245631" watchObservedRunningTime="2026-02-20 08:17:34.38268603 +0000 UTC m=+5469.255312781"
Feb 20 08:17:40 crc kubenswrapper[5094]: I0220 08:17:40.193531 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jxdj4"
Feb 20 08:17:40 crc kubenswrapper[5094]: I0220 08:17:40.194497 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jxdj4"
Feb 20 08:17:40 crc kubenswrapper[5094]: I0220 08:17:40.272551 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jxdj4"
Feb 20 08:17:40 crc kubenswrapper[5094]: I0220 08:17:40.493658
5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jxdj4" Feb 20 08:17:40 crc kubenswrapper[5094]: I0220 08:17:40.549153 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jxdj4"] Feb 20 08:17:42 crc kubenswrapper[5094]: I0220 08:17:42.421317 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jxdj4" podUID="64f8c64c-9964-45bc-a6f5-b588e04962e1" containerName="registry-server" containerID="cri-o://8fa1b2b785229a8ca47d1d3cbbad1359e5364208bd831802e973c24ee5e70bbb" gracePeriod=2 Feb 20 08:17:42 crc kubenswrapper[5094]: E0220 08:17:42.537352 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64f8c64c_9964_45bc_a6f5_b588e04962e1.slice/crio-8fa1b2b785229a8ca47d1d3cbbad1359e5364208bd831802e973c24ee5e70bbb.scope\": RecentStats: unable to find data in memory cache]" Feb 20 08:17:42 crc kubenswrapper[5094]: I0220 08:17:42.905803 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jxdj4" Feb 20 08:17:42 crc kubenswrapper[5094]: I0220 08:17:42.930283 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfsrb\" (UniqueName: \"kubernetes.io/projected/64f8c64c-9964-45bc-a6f5-b588e04962e1-kube-api-access-xfsrb\") pod \"64f8c64c-9964-45bc-a6f5-b588e04962e1\" (UID: \"64f8c64c-9964-45bc-a6f5-b588e04962e1\") " Feb 20 08:17:42 crc kubenswrapper[5094]: I0220 08:17:42.930370 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64f8c64c-9964-45bc-a6f5-b588e04962e1-catalog-content\") pod \"64f8c64c-9964-45bc-a6f5-b588e04962e1\" (UID: \"64f8c64c-9964-45bc-a6f5-b588e04962e1\") " Feb 20 08:17:42 crc kubenswrapper[5094]: I0220 08:17:42.930406 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64f8c64c-9964-45bc-a6f5-b588e04962e1-utilities\") pod \"64f8c64c-9964-45bc-a6f5-b588e04962e1\" (UID: \"64f8c64c-9964-45bc-a6f5-b588e04962e1\") " Feb 20 08:17:42 crc kubenswrapper[5094]: I0220 08:17:42.932286 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64f8c64c-9964-45bc-a6f5-b588e04962e1-utilities" (OuterVolumeSpecName: "utilities") pod "64f8c64c-9964-45bc-a6f5-b588e04962e1" (UID: "64f8c64c-9964-45bc-a6f5-b588e04962e1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:17:42 crc kubenswrapper[5094]: I0220 08:17:42.942408 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64f8c64c-9964-45bc-a6f5-b588e04962e1-kube-api-access-xfsrb" (OuterVolumeSpecName: "kube-api-access-xfsrb") pod "64f8c64c-9964-45bc-a6f5-b588e04962e1" (UID: "64f8c64c-9964-45bc-a6f5-b588e04962e1"). InnerVolumeSpecName "kube-api-access-xfsrb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.031812 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/64f8c64c-9964-45bc-a6f5-b588e04962e1-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.031864 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfsrb\" (UniqueName: \"kubernetes.io/projected/64f8c64c-9964-45bc-a6f5-b588e04962e1-kube-api-access-xfsrb\") on node \"crc\" DevicePath \"\"" Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.438787 5094 generic.go:334] "Generic (PLEG): container finished" podID="64f8c64c-9964-45bc-a6f5-b588e04962e1" containerID="8fa1b2b785229a8ca47d1d3cbbad1359e5364208bd831802e973c24ee5e70bbb" exitCode=0 Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.438829 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxdj4" event={"ID":"64f8c64c-9964-45bc-a6f5-b588e04962e1","Type":"ContainerDied","Data":"8fa1b2b785229a8ca47d1d3cbbad1359e5364208bd831802e973c24ee5e70bbb"} Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.438811 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jxdj4" Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.438871 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jxdj4" event={"ID":"64f8c64c-9964-45bc-a6f5-b588e04962e1","Type":"ContainerDied","Data":"487dc11d31f6ce82e7ae2d892e6eff95fc315f0bc93c3d1894f98a07ce7a3066"} Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.438891 5094 scope.go:117] "RemoveContainer" containerID="8fa1b2b785229a8ca47d1d3cbbad1359e5364208bd831802e973c24ee5e70bbb" Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.480539 5094 scope.go:117] "RemoveContainer" containerID="d702aa8f9bd4af18ff756018b88b1c6acbb6b891749b37833177f16791ffd076" Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.499308 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64f8c64c-9964-45bc-a6f5-b588e04962e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "64f8c64c-9964-45bc-a6f5-b588e04962e1" (UID: "64f8c64c-9964-45bc-a6f5-b588e04962e1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.508459 5094 scope.go:117] "RemoveContainer" containerID="ae2104091bc469a43dc0c9748f7fbc6439f754633daaafdf4ea0867dbc748771" Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.539641 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/64f8c64c-9964-45bc-a6f5-b588e04962e1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.553298 5094 scope.go:117] "RemoveContainer" containerID="8fa1b2b785229a8ca47d1d3cbbad1359e5364208bd831802e973c24ee5e70bbb" Feb 20 08:17:43 crc kubenswrapper[5094]: E0220 08:17:43.553978 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fa1b2b785229a8ca47d1d3cbbad1359e5364208bd831802e973c24ee5e70bbb\": container with ID starting with 8fa1b2b785229a8ca47d1d3cbbad1359e5364208bd831802e973c24ee5e70bbb not found: ID does not exist" containerID="8fa1b2b785229a8ca47d1d3cbbad1359e5364208bd831802e973c24ee5e70bbb" Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.554021 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fa1b2b785229a8ca47d1d3cbbad1359e5364208bd831802e973c24ee5e70bbb"} err="failed to get container status \"8fa1b2b785229a8ca47d1d3cbbad1359e5364208bd831802e973c24ee5e70bbb\": rpc error: code = NotFound desc = could not find container \"8fa1b2b785229a8ca47d1d3cbbad1359e5364208bd831802e973c24ee5e70bbb\": container with ID starting with 8fa1b2b785229a8ca47d1d3cbbad1359e5364208bd831802e973c24ee5e70bbb not found: ID does not exist" Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.554048 5094 scope.go:117] "RemoveContainer" containerID="d702aa8f9bd4af18ff756018b88b1c6acbb6b891749b37833177f16791ffd076" Feb 20 08:17:43 crc kubenswrapper[5094]: E0220 08:17:43.554647 5094 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d702aa8f9bd4af18ff756018b88b1c6acbb6b891749b37833177f16791ffd076\": container with ID starting with d702aa8f9bd4af18ff756018b88b1c6acbb6b891749b37833177f16791ffd076 not found: ID does not exist" containerID="d702aa8f9bd4af18ff756018b88b1c6acbb6b891749b37833177f16791ffd076" Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.554686 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d702aa8f9bd4af18ff756018b88b1c6acbb6b891749b37833177f16791ffd076"} err="failed to get container status \"d702aa8f9bd4af18ff756018b88b1c6acbb6b891749b37833177f16791ffd076\": rpc error: code = NotFound desc = could not find container \"d702aa8f9bd4af18ff756018b88b1c6acbb6b891749b37833177f16791ffd076\": container with ID starting with d702aa8f9bd4af18ff756018b88b1c6acbb6b891749b37833177f16791ffd076 not found: ID does not exist" Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.554736 5094 scope.go:117] "RemoveContainer" containerID="ae2104091bc469a43dc0c9748f7fbc6439f754633daaafdf4ea0867dbc748771" Feb 20 08:17:43 crc kubenswrapper[5094]: E0220 08:17:43.555084 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae2104091bc469a43dc0c9748f7fbc6439f754633daaafdf4ea0867dbc748771\": container with ID starting with ae2104091bc469a43dc0c9748f7fbc6439f754633daaafdf4ea0867dbc748771 not found: ID does not exist" containerID="ae2104091bc469a43dc0c9748f7fbc6439f754633daaafdf4ea0867dbc748771" Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.555126 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae2104091bc469a43dc0c9748f7fbc6439f754633daaafdf4ea0867dbc748771"} err="failed to get container status \"ae2104091bc469a43dc0c9748f7fbc6439f754633daaafdf4ea0867dbc748771\": rpc error: code = NotFound desc = could 
not find container \"ae2104091bc469a43dc0c9748f7fbc6439f754633daaafdf4ea0867dbc748771\": container with ID starting with ae2104091bc469a43dc0c9748f7fbc6439f754633daaafdf4ea0867dbc748771 not found: ID does not exist" Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.789067 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jxdj4"] Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.793558 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jxdj4"] Feb 20 08:17:43 crc kubenswrapper[5094]: I0220 08:17:43.855153 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64f8c64c-9964-45bc-a6f5-b588e04962e1" path="/var/lib/kubelet/pods/64f8c64c-9964-45bc-a6f5-b588e04962e1/volumes" Feb 20 08:19:04 crc kubenswrapper[5094]: I0220 08:19:04.106760 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:19:04 crc kubenswrapper[5094]: I0220 08:19:04.107378 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.527106 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b5nqx"] Feb 20 08:19:11 crc kubenswrapper[5094]: E0220 08:19:11.528256 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64f8c64c-9964-45bc-a6f5-b588e04962e1" containerName="registry-server" Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.528270 5094 
state_mem.go:107] "Deleted CPUSet assignment" podUID="64f8c64c-9964-45bc-a6f5-b588e04962e1" containerName="registry-server" Feb 20 08:19:11 crc kubenswrapper[5094]: E0220 08:19:11.528345 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64f8c64c-9964-45bc-a6f5-b588e04962e1" containerName="extract-utilities" Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.528356 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="64f8c64c-9964-45bc-a6f5-b588e04962e1" containerName="extract-utilities" Feb 20 08:19:11 crc kubenswrapper[5094]: E0220 08:19:11.528367 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64f8c64c-9964-45bc-a6f5-b588e04962e1" containerName="extract-content" Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.528375 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="64f8c64c-9964-45bc-a6f5-b588e04962e1" containerName="extract-content" Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.528514 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="64f8c64c-9964-45bc-a6f5-b588e04962e1" containerName="registry-server" Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.529650 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b5nqx" Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.550683 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b5nqx"] Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.704971 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4b6568c-9fe2-4353-835a-d363b4e64f9b-catalog-content\") pod \"redhat-operators-b5nqx\" (UID: \"a4b6568c-9fe2-4353-835a-d363b4e64f9b\") " pod="openshift-marketplace/redhat-operators-b5nqx" Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.705226 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt7h9\" (UniqueName: \"kubernetes.io/projected/a4b6568c-9fe2-4353-835a-d363b4e64f9b-kube-api-access-jt7h9\") pod \"redhat-operators-b5nqx\" (UID: \"a4b6568c-9fe2-4353-835a-d363b4e64f9b\") " pod="openshift-marketplace/redhat-operators-b5nqx" Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.705360 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4b6568c-9fe2-4353-835a-d363b4e64f9b-utilities\") pod \"redhat-operators-b5nqx\" (UID: \"a4b6568c-9fe2-4353-835a-d363b4e64f9b\") " pod="openshift-marketplace/redhat-operators-b5nqx" Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.807142 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4b6568c-9fe2-4353-835a-d363b4e64f9b-catalog-content\") pod \"redhat-operators-b5nqx\" (UID: \"a4b6568c-9fe2-4353-835a-d363b4e64f9b\") " pod="openshift-marketplace/redhat-operators-b5nqx" Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.807221 5094 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-jt7h9\" (UniqueName: \"kubernetes.io/projected/a4b6568c-9fe2-4353-835a-d363b4e64f9b-kube-api-access-jt7h9\") pod \"redhat-operators-b5nqx\" (UID: \"a4b6568c-9fe2-4353-835a-d363b4e64f9b\") " pod="openshift-marketplace/redhat-operators-b5nqx" Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.807251 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4b6568c-9fe2-4353-835a-d363b4e64f9b-utilities\") pod \"redhat-operators-b5nqx\" (UID: \"a4b6568c-9fe2-4353-835a-d363b4e64f9b\") " pod="openshift-marketplace/redhat-operators-b5nqx" Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.807874 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4b6568c-9fe2-4353-835a-d363b4e64f9b-catalog-content\") pod \"redhat-operators-b5nqx\" (UID: \"a4b6568c-9fe2-4353-835a-d363b4e64f9b\") " pod="openshift-marketplace/redhat-operators-b5nqx" Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.807917 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4b6568c-9fe2-4353-835a-d363b4e64f9b-utilities\") pod \"redhat-operators-b5nqx\" (UID: \"a4b6568c-9fe2-4353-835a-d363b4e64f9b\") " pod="openshift-marketplace/redhat-operators-b5nqx" Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.834508 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt7h9\" (UniqueName: \"kubernetes.io/projected/a4b6568c-9fe2-4353-835a-d363b4e64f9b-kube-api-access-jt7h9\") pod \"redhat-operators-b5nqx\" (UID: \"a4b6568c-9fe2-4353-835a-d363b4e64f9b\") " pod="openshift-marketplace/redhat-operators-b5nqx" Feb 20 08:19:11 crc kubenswrapper[5094]: I0220 08:19:11.852775 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b5nqx" Feb 20 08:19:12 crc kubenswrapper[5094]: I0220 08:19:12.348555 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b5nqx"] Feb 20 08:19:13 crc kubenswrapper[5094]: I0220 08:19:13.253501 5094 generic.go:334] "Generic (PLEG): container finished" podID="a4b6568c-9fe2-4353-835a-d363b4e64f9b" containerID="ee643764c6a48859bea4db11f8b036d1720a13356831d97e22237ad24ea567ad" exitCode=0 Feb 20 08:19:13 crc kubenswrapper[5094]: I0220 08:19:13.253540 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b5nqx" event={"ID":"a4b6568c-9fe2-4353-835a-d363b4e64f9b","Type":"ContainerDied","Data":"ee643764c6a48859bea4db11f8b036d1720a13356831d97e22237ad24ea567ad"} Feb 20 08:19:13 crc kubenswrapper[5094]: I0220 08:19:13.253564 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b5nqx" event={"ID":"a4b6568c-9fe2-4353-835a-d363b4e64f9b","Type":"ContainerStarted","Data":"92e6948396454f04f599a6213d5638ad59001d7ab5a53aa303251ddda9de1c5a"} Feb 20 08:19:14 crc kubenswrapper[5094]: I0220 08:19:14.267841 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b5nqx" event={"ID":"a4b6568c-9fe2-4353-835a-d363b4e64f9b","Type":"ContainerStarted","Data":"260e477dc84fa1fadafa46b30ccf4318d963a69512f32ad6e262b43c0e44157c"} Feb 20 08:19:15 crc kubenswrapper[5094]: I0220 08:19:15.279029 5094 generic.go:334] "Generic (PLEG): container finished" podID="a4b6568c-9fe2-4353-835a-d363b4e64f9b" containerID="260e477dc84fa1fadafa46b30ccf4318d963a69512f32ad6e262b43c0e44157c" exitCode=0 Feb 20 08:19:15 crc kubenswrapper[5094]: I0220 08:19:15.279118 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b5nqx" 
event={"ID":"a4b6568c-9fe2-4353-835a-d363b4e64f9b","Type":"ContainerDied","Data":"260e477dc84fa1fadafa46b30ccf4318d963a69512f32ad6e262b43c0e44157c"} Feb 20 08:19:16 crc kubenswrapper[5094]: I0220 08:19:16.292160 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b5nqx" event={"ID":"a4b6568c-9fe2-4353-835a-d363b4e64f9b","Type":"ContainerStarted","Data":"abc2d8411d9dbf7d76d25270021a13fa38e322bbf3b85ae52ffb695dcb09da4e"} Feb 20 08:19:16 crc kubenswrapper[5094]: I0220 08:19:16.327903 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b5nqx" podStartSLOduration=2.814928297 podStartE2EDuration="5.32788174s" podCreationTimestamp="2026-02-20 08:19:11 +0000 UTC" firstStartedPulling="2026-02-20 08:19:13.25523572 +0000 UTC m=+5568.127862431" lastFinishedPulling="2026-02-20 08:19:15.768189133 +0000 UTC m=+5570.640815874" observedRunningTime="2026-02-20 08:19:16.32622393 +0000 UTC m=+5571.198850641" watchObservedRunningTime="2026-02-20 08:19:16.32788174 +0000 UTC m=+5571.200508531" Feb 20 08:19:21 crc kubenswrapper[5094]: I0220 08:19:21.853777 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b5nqx" Feb 20 08:19:21 crc kubenswrapper[5094]: I0220 08:19:21.854302 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b5nqx" Feb 20 08:19:22 crc kubenswrapper[5094]: I0220 08:19:22.914311 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b5nqx" podUID="a4b6568c-9fe2-4353-835a-d363b4e64f9b" containerName="registry-server" probeResult="failure" output=< Feb 20 08:19:22 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 08:19:22 crc kubenswrapper[5094]: > Feb 20 08:19:31 crc kubenswrapper[5094]: I0220 08:19:31.905267 5094 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b5nqx" Feb 20 08:19:31 crc kubenswrapper[5094]: I0220 08:19:31.953606 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b5nqx" Feb 20 08:19:32 crc kubenswrapper[5094]: I0220 08:19:32.141561 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b5nqx"] Feb 20 08:19:33 crc kubenswrapper[5094]: I0220 08:19:33.446515 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b5nqx" podUID="a4b6568c-9fe2-4353-835a-d363b4e64f9b" containerName="registry-server" containerID="cri-o://abc2d8411d9dbf7d76d25270021a13fa38e322bbf3b85ae52ffb695dcb09da4e" gracePeriod=2 Feb 20 08:19:33 crc kubenswrapper[5094]: I0220 08:19:33.818421 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b5nqx" Feb 20 08:19:33 crc kubenswrapper[5094]: I0220 08:19:33.938069 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4b6568c-9fe2-4353-835a-d363b4e64f9b-utilities\") pod \"a4b6568c-9fe2-4353-835a-d363b4e64f9b\" (UID: \"a4b6568c-9fe2-4353-835a-d363b4e64f9b\") " Feb 20 08:19:33 crc kubenswrapper[5094]: I0220 08:19:33.938357 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4b6568c-9fe2-4353-835a-d363b4e64f9b-catalog-content\") pod \"a4b6568c-9fe2-4353-835a-d363b4e64f9b\" (UID: \"a4b6568c-9fe2-4353-835a-d363b4e64f9b\") " Feb 20 08:19:33 crc kubenswrapper[5094]: I0220 08:19:33.938430 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt7h9\" (UniqueName: \"kubernetes.io/projected/a4b6568c-9fe2-4353-835a-d363b4e64f9b-kube-api-access-jt7h9\") pod 
\"a4b6568c-9fe2-4353-835a-d363b4e64f9b\" (UID: \"a4b6568c-9fe2-4353-835a-d363b4e64f9b\") " Feb 20 08:19:33 crc kubenswrapper[5094]: I0220 08:19:33.939199 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4b6568c-9fe2-4353-835a-d363b4e64f9b-utilities" (OuterVolumeSpecName: "utilities") pod "a4b6568c-9fe2-4353-835a-d363b4e64f9b" (UID: "a4b6568c-9fe2-4353-835a-d363b4e64f9b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:19:33 crc kubenswrapper[5094]: I0220 08:19:33.955103 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4b6568c-9fe2-4353-835a-d363b4e64f9b-kube-api-access-jt7h9" (OuterVolumeSpecName: "kube-api-access-jt7h9") pod "a4b6568c-9fe2-4353-835a-d363b4e64f9b" (UID: "a4b6568c-9fe2-4353-835a-d363b4e64f9b"). InnerVolumeSpecName "kube-api-access-jt7h9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.039415 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4b6568c-9fe2-4353-835a-d363b4e64f9b-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.039452 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt7h9\" (UniqueName: \"kubernetes.io/projected/a4b6568c-9fe2-4353-835a-d363b4e64f9b-kube-api-access-jt7h9\") on node \"crc\" DevicePath \"\"" Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.054933 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4b6568c-9fe2-4353-835a-d363b4e64f9b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4b6568c-9fe2-4353-835a-d363b4e64f9b" (UID: "a4b6568c-9fe2-4353-835a-d363b4e64f9b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.106324 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.106592 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.140411 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4b6568c-9fe2-4353-835a-d363b4e64f9b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.454992 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b5nqx" Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.455004 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b5nqx" event={"ID":"a4b6568c-9fe2-4353-835a-d363b4e64f9b","Type":"ContainerDied","Data":"abc2d8411d9dbf7d76d25270021a13fa38e322bbf3b85ae52ffb695dcb09da4e"} Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.455128 5094 scope.go:117] "RemoveContainer" containerID="abc2d8411d9dbf7d76d25270021a13fa38e322bbf3b85ae52ffb695dcb09da4e" Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.456996 5094 generic.go:334] "Generic (PLEG): container finished" podID="a4b6568c-9fe2-4353-835a-d363b4e64f9b" containerID="abc2d8411d9dbf7d76d25270021a13fa38e322bbf3b85ae52ffb695dcb09da4e" exitCode=0 Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.457050 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b5nqx" event={"ID":"a4b6568c-9fe2-4353-835a-d363b4e64f9b","Type":"ContainerDied","Data":"92e6948396454f04f599a6213d5638ad59001d7ab5a53aa303251ddda9de1c5a"} Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.483082 5094 scope.go:117] "RemoveContainer" containerID="260e477dc84fa1fadafa46b30ccf4318d963a69512f32ad6e262b43c0e44157c" Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.497764 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b5nqx"] Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.497843 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b5nqx"] Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.504144 5094 scope.go:117] "RemoveContainer" containerID="ee643764c6a48859bea4db11f8b036d1720a13356831d97e22237ad24ea567ad" Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.523875 5094 scope.go:117] "RemoveContainer" 
containerID="abc2d8411d9dbf7d76d25270021a13fa38e322bbf3b85ae52ffb695dcb09da4e" Feb 20 08:19:34 crc kubenswrapper[5094]: E0220 08:19:34.524260 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abc2d8411d9dbf7d76d25270021a13fa38e322bbf3b85ae52ffb695dcb09da4e\": container with ID starting with abc2d8411d9dbf7d76d25270021a13fa38e322bbf3b85ae52ffb695dcb09da4e not found: ID does not exist" containerID="abc2d8411d9dbf7d76d25270021a13fa38e322bbf3b85ae52ffb695dcb09da4e" Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.524307 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abc2d8411d9dbf7d76d25270021a13fa38e322bbf3b85ae52ffb695dcb09da4e"} err="failed to get container status \"abc2d8411d9dbf7d76d25270021a13fa38e322bbf3b85ae52ffb695dcb09da4e\": rpc error: code = NotFound desc = could not find container \"abc2d8411d9dbf7d76d25270021a13fa38e322bbf3b85ae52ffb695dcb09da4e\": container with ID starting with abc2d8411d9dbf7d76d25270021a13fa38e322bbf3b85ae52ffb695dcb09da4e not found: ID does not exist" Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.524336 5094 scope.go:117] "RemoveContainer" containerID="260e477dc84fa1fadafa46b30ccf4318d963a69512f32ad6e262b43c0e44157c" Feb 20 08:19:34 crc kubenswrapper[5094]: E0220 08:19:34.524626 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"260e477dc84fa1fadafa46b30ccf4318d963a69512f32ad6e262b43c0e44157c\": container with ID starting with 260e477dc84fa1fadafa46b30ccf4318d963a69512f32ad6e262b43c0e44157c not found: ID does not exist" containerID="260e477dc84fa1fadafa46b30ccf4318d963a69512f32ad6e262b43c0e44157c" Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.524650 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"260e477dc84fa1fadafa46b30ccf4318d963a69512f32ad6e262b43c0e44157c"} err="failed to get container status \"260e477dc84fa1fadafa46b30ccf4318d963a69512f32ad6e262b43c0e44157c\": rpc error: code = NotFound desc = could not find container \"260e477dc84fa1fadafa46b30ccf4318d963a69512f32ad6e262b43c0e44157c\": container with ID starting with 260e477dc84fa1fadafa46b30ccf4318d963a69512f32ad6e262b43c0e44157c not found: ID does not exist" Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.524662 5094 scope.go:117] "RemoveContainer" containerID="ee643764c6a48859bea4db11f8b036d1720a13356831d97e22237ad24ea567ad" Feb 20 08:19:34 crc kubenswrapper[5094]: E0220 08:19:34.525120 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee643764c6a48859bea4db11f8b036d1720a13356831d97e22237ad24ea567ad\": container with ID starting with ee643764c6a48859bea4db11f8b036d1720a13356831d97e22237ad24ea567ad not found: ID does not exist" containerID="ee643764c6a48859bea4db11f8b036d1720a13356831d97e22237ad24ea567ad" Feb 20 08:19:34 crc kubenswrapper[5094]: I0220 08:19:34.525144 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee643764c6a48859bea4db11f8b036d1720a13356831d97e22237ad24ea567ad"} err="failed to get container status \"ee643764c6a48859bea4db11f8b036d1720a13356831d97e22237ad24ea567ad\": rpc error: code = NotFound desc = could not find container \"ee643764c6a48859bea4db11f8b036d1720a13356831d97e22237ad24ea567ad\": container with ID starting with ee643764c6a48859bea4db11f8b036d1720a13356831d97e22237ad24ea567ad not found: ID does not exist" Feb 20 08:19:35 crc kubenswrapper[5094]: I0220 08:19:35.858335 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4b6568c-9fe2-4353-835a-d363b4e64f9b" path="/var/lib/kubelet/pods/a4b6568c-9fe2-4353-835a-d363b4e64f9b/volumes" Feb 20 08:20:04 crc kubenswrapper[5094]: I0220 
08:20:04.106669 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:20:04 crc kubenswrapper[5094]: I0220 08:20:04.107157 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:20:04 crc kubenswrapper[5094]: I0220 08:20:04.107192 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 08:20:04 crc kubenswrapper[5094]: I0220 08:20:04.107755 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 08:20:04 crc kubenswrapper[5094]: I0220 08:20:04.107803 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" gracePeriod=600 Feb 20 08:20:04 crc kubenswrapper[5094]: E0220 08:20:04.245164 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:20:04 crc kubenswrapper[5094]: I0220 08:20:04.733458 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" exitCode=0 Feb 20 08:20:04 crc kubenswrapper[5094]: I0220 08:20:04.733502 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda"} Feb 20 08:20:04 crc kubenswrapper[5094]: I0220 08:20:04.733533 5094 scope.go:117] "RemoveContainer" containerID="97148134be230a037117359d1f168e80e119726d339f329bd4df40629ef0a2ca" Feb 20 08:20:04 crc kubenswrapper[5094]: I0220 08:20:04.734040 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:20:04 crc kubenswrapper[5094]: E0220 08:20:04.734360 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:20:19 crc kubenswrapper[5094]: I0220 08:20:19.840978 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:20:19 crc kubenswrapper[5094]: E0220 08:20:19.842296 5094 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:20:31 crc kubenswrapper[5094]: I0220 08:20:31.840876 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:20:31 crc kubenswrapper[5094]: E0220 08:20:31.841920 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:20:44 crc kubenswrapper[5094]: I0220 08:20:44.840481 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:20:44 crc kubenswrapper[5094]: E0220 08:20:44.841539 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:20:56 crc kubenswrapper[5094]: I0220 08:20:56.840294 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:20:56 crc kubenswrapper[5094]: E0220 08:20:56.842057 5094 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:21:08 crc kubenswrapper[5094]: I0220 08:21:08.839796 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:21:08 crc kubenswrapper[5094]: E0220 08:21:08.840626 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:21:20 crc kubenswrapper[5094]: I0220 08:21:20.840550 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:21:20 crc kubenswrapper[5094]: E0220 08:21:20.841738 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:21:35 crc kubenswrapper[5094]: I0220 08:21:35.845555 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:21:35 crc kubenswrapper[5094]: E0220 08:21:35.846568 5094 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:21:50 crc kubenswrapper[5094]: I0220 08:21:50.841226 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:21:50 crc kubenswrapper[5094]: E0220 08:21:50.842352 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.014820 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-48gmd"] Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.022790 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-48gmd"] Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.174926 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-ksv5l"] Feb 20 08:22:01 crc kubenswrapper[5094]: E0220 08:22:01.175398 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4b6568c-9fe2-4353-835a-d363b4e64f9b" containerName="extract-utilities" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.175424 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4b6568c-9fe2-4353-835a-d363b4e64f9b" containerName="extract-utilities" Feb 20 08:22:01 crc 
kubenswrapper[5094]: E0220 08:22:01.175453 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4b6568c-9fe2-4353-835a-d363b4e64f9b" containerName="registry-server" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.175463 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4b6568c-9fe2-4353-835a-d363b4e64f9b" containerName="registry-server" Feb 20 08:22:01 crc kubenswrapper[5094]: E0220 08:22:01.175478 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4b6568c-9fe2-4353-835a-d363b4e64f9b" containerName="extract-content" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.175489 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4b6568c-9fe2-4353-835a-d363b4e64f9b" containerName="extract-content" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.175666 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4b6568c-9fe2-4353-835a-d363b4e64f9b" containerName="registry-server" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.176596 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ksv5l" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.178611 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.178930 5094 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-c5nt4" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.179073 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.180160 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.190133 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ksv5l"] Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.269697 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-crc-storage\") pod \"crc-storage-crc-ksv5l\" (UID: \"7a2bc18e-e8d4-445b-b8aa-34659fab0d46\") " pod="crc-storage/crc-storage-crc-ksv5l" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.269761 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5pzg\" (UniqueName: \"kubernetes.io/projected/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-kube-api-access-k5pzg\") pod \"crc-storage-crc-ksv5l\" (UID: \"7a2bc18e-e8d4-445b-b8aa-34659fab0d46\") " pod="crc-storage/crc-storage-crc-ksv5l" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.269784 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-node-mnt\") pod \"crc-storage-crc-ksv5l\" (UID: 
\"7a2bc18e-e8d4-445b-b8aa-34659fab0d46\") " pod="crc-storage/crc-storage-crc-ksv5l" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.371384 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-crc-storage\") pod \"crc-storage-crc-ksv5l\" (UID: \"7a2bc18e-e8d4-445b-b8aa-34659fab0d46\") " pod="crc-storage/crc-storage-crc-ksv5l" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.371429 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5pzg\" (UniqueName: \"kubernetes.io/projected/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-kube-api-access-k5pzg\") pod \"crc-storage-crc-ksv5l\" (UID: \"7a2bc18e-e8d4-445b-b8aa-34659fab0d46\") " pod="crc-storage/crc-storage-crc-ksv5l" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.371453 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-node-mnt\") pod \"crc-storage-crc-ksv5l\" (UID: \"7a2bc18e-e8d4-445b-b8aa-34659fab0d46\") " pod="crc-storage/crc-storage-crc-ksv5l" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.371769 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-node-mnt\") pod \"crc-storage-crc-ksv5l\" (UID: \"7a2bc18e-e8d4-445b-b8aa-34659fab0d46\") " pod="crc-storage/crc-storage-crc-ksv5l" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.373258 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-crc-storage\") pod \"crc-storage-crc-ksv5l\" (UID: \"7a2bc18e-e8d4-445b-b8aa-34659fab0d46\") " pod="crc-storage/crc-storage-crc-ksv5l" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.391987 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5pzg\" (UniqueName: \"kubernetes.io/projected/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-kube-api-access-k5pzg\") pod \"crc-storage-crc-ksv5l\" (UID: \"7a2bc18e-e8d4-445b-b8aa-34659fab0d46\") " pod="crc-storage/crc-storage-crc-ksv5l" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.527739 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ksv5l" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.842402 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:22:01 crc kubenswrapper[5094]: E0220 08:22:01.842806 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:22:01 crc kubenswrapper[5094]: I0220 08:22:01.853193 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d8b4842-acdc-4e60-9de5-b7b6dde61b62" path="/var/lib/kubelet/pods/9d8b4842-acdc-4e60-9de5-b7b6dde61b62/volumes" Feb 20 08:22:02 crc kubenswrapper[5094]: I0220 08:22:02.094117 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ksv5l"] Feb 20 08:22:02 crc kubenswrapper[5094]: I0220 08:22:02.105173 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 08:22:02 crc kubenswrapper[5094]: I0220 08:22:02.789345 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ksv5l" 
event={"ID":"7a2bc18e-e8d4-445b-b8aa-34659fab0d46","Type":"ContainerStarted","Data":"25626209a2314dcb52916690377aafb551b2c0e2509d3cbf08a38364344d06cd"} Feb 20 08:22:03 crc kubenswrapper[5094]: I0220 08:22:03.798580 5094 generic.go:334] "Generic (PLEG): container finished" podID="7a2bc18e-e8d4-445b-b8aa-34659fab0d46" containerID="dc0e175dcf3ab875f0111e29b9804a9472d9627cddd8835f9c61529f34f1c8d3" exitCode=0 Feb 20 08:22:03 crc kubenswrapper[5094]: I0220 08:22:03.798648 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ksv5l" event={"ID":"7a2bc18e-e8d4-445b-b8aa-34659fab0d46","Type":"ContainerDied","Data":"dc0e175dcf3ab875f0111e29b9804a9472d9627cddd8835f9c61529f34f1c8d3"} Feb 20 08:22:05 crc kubenswrapper[5094]: I0220 08:22:05.239697 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ksv5l" Feb 20 08:22:05 crc kubenswrapper[5094]: I0220 08:22:05.336042 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5pzg\" (UniqueName: \"kubernetes.io/projected/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-kube-api-access-k5pzg\") pod \"7a2bc18e-e8d4-445b-b8aa-34659fab0d46\" (UID: \"7a2bc18e-e8d4-445b-b8aa-34659fab0d46\") " Feb 20 08:22:05 crc kubenswrapper[5094]: I0220 08:22:05.336120 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-crc-storage\") pod \"7a2bc18e-e8d4-445b-b8aa-34659fab0d46\" (UID: \"7a2bc18e-e8d4-445b-b8aa-34659fab0d46\") " Feb 20 08:22:05 crc kubenswrapper[5094]: I0220 08:22:05.336193 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-node-mnt\") pod \"7a2bc18e-e8d4-445b-b8aa-34659fab0d46\" (UID: \"7a2bc18e-e8d4-445b-b8aa-34659fab0d46\") " Feb 20 08:22:05 crc kubenswrapper[5094]: 
I0220 08:22:05.336597 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "7a2bc18e-e8d4-445b-b8aa-34659fab0d46" (UID: "7a2bc18e-e8d4-445b-b8aa-34659fab0d46"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 08:22:05 crc kubenswrapper[5094]: I0220 08:22:05.342082 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-kube-api-access-k5pzg" (OuterVolumeSpecName: "kube-api-access-k5pzg") pod "7a2bc18e-e8d4-445b-b8aa-34659fab0d46" (UID: "7a2bc18e-e8d4-445b-b8aa-34659fab0d46"). InnerVolumeSpecName "kube-api-access-k5pzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:22:05 crc kubenswrapper[5094]: I0220 08:22:05.366617 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "7a2bc18e-e8d4-445b-b8aa-34659fab0d46" (UID: "7a2bc18e-e8d4-445b-b8aa-34659fab0d46"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:22:05 crc kubenswrapper[5094]: I0220 08:22:05.438507 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5pzg\" (UniqueName: \"kubernetes.io/projected/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-kube-api-access-k5pzg\") on node \"crc\" DevicePath \"\"" Feb 20 08:22:05 crc kubenswrapper[5094]: I0220 08:22:05.438568 5094 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-crc-storage\") on node \"crc\" DevicePath \"\"" Feb 20 08:22:05 crc kubenswrapper[5094]: I0220 08:22:05.438585 5094 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/7a2bc18e-e8d4-445b-b8aa-34659fab0d46-node-mnt\") on node \"crc\" DevicePath \"\"" Feb 20 08:22:05 crc kubenswrapper[5094]: I0220 08:22:05.818579 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ksv5l" event={"ID":"7a2bc18e-e8d4-445b-b8aa-34659fab0d46","Type":"ContainerDied","Data":"25626209a2314dcb52916690377aafb551b2c0e2509d3cbf08a38364344d06cd"} Feb 20 08:22:05 crc kubenswrapper[5094]: I0220 08:22:05.818643 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25626209a2314dcb52916690377aafb551b2c0e2509d3cbf08a38364344d06cd" Feb 20 08:22:05 crc kubenswrapper[5094]: I0220 08:22:05.818766 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ksv5l" Feb 20 08:22:07 crc kubenswrapper[5094]: I0220 08:22:07.820314 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-ksv5l"] Feb 20 08:22:07 crc kubenswrapper[5094]: I0220 08:22:07.830739 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-ksv5l"] Feb 20 08:22:07 crc kubenswrapper[5094]: I0220 08:22:07.852450 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a2bc18e-e8d4-445b-b8aa-34659fab0d46" path="/var/lib/kubelet/pods/7a2bc18e-e8d4-445b-b8aa-34659fab0d46/volumes" Feb 20 08:22:07 crc kubenswrapper[5094]: I0220 08:22:07.973773 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-flmdt"] Feb 20 08:22:07 crc kubenswrapper[5094]: E0220 08:22:07.974444 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a2bc18e-e8d4-445b-b8aa-34659fab0d46" containerName="storage" Feb 20 08:22:07 crc kubenswrapper[5094]: I0220 08:22:07.974486 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a2bc18e-e8d4-445b-b8aa-34659fab0d46" containerName="storage" Feb 20 08:22:07 crc kubenswrapper[5094]: I0220 08:22:07.974768 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a2bc18e-e8d4-445b-b8aa-34659fab0d46" containerName="storage" Feb 20 08:22:07 crc kubenswrapper[5094]: I0220 08:22:07.975913 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-flmdt" Feb 20 08:22:07 crc kubenswrapper[5094]: I0220 08:22:07.979383 5094 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-c5nt4" Feb 20 08:22:07 crc kubenswrapper[5094]: I0220 08:22:07.979694 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Feb 20 08:22:07 crc kubenswrapper[5094]: I0220 08:22:07.980004 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Feb 20 08:22:07 crc kubenswrapper[5094]: I0220 08:22:07.980960 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Feb 20 08:22:07 crc kubenswrapper[5094]: I0220 08:22:07.983434 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb7dx\" (UniqueName: \"kubernetes.io/projected/aae9458b-5e2d-4930-b32f-ec957b766175-kube-api-access-cb7dx\") pod \"crc-storage-crc-flmdt\" (UID: \"aae9458b-5e2d-4930-b32f-ec957b766175\") " pod="crc-storage/crc-storage-crc-flmdt" Feb 20 08:22:07 crc kubenswrapper[5094]: I0220 08:22:07.983594 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/aae9458b-5e2d-4930-b32f-ec957b766175-node-mnt\") pod \"crc-storage-crc-flmdt\" (UID: \"aae9458b-5e2d-4930-b32f-ec957b766175\") " pod="crc-storage/crc-storage-crc-flmdt" Feb 20 08:22:07 crc kubenswrapper[5094]: I0220 08:22:07.983748 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/aae9458b-5e2d-4930-b32f-ec957b766175-crc-storage\") pod \"crc-storage-crc-flmdt\" (UID: \"aae9458b-5e2d-4930-b32f-ec957b766175\") " pod="crc-storage/crc-storage-crc-flmdt" Feb 20 08:22:07 crc kubenswrapper[5094]: I0220 08:22:07.986440 5094 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-flmdt"] Feb 20 08:22:08 crc kubenswrapper[5094]: I0220 08:22:08.084378 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/aae9458b-5e2d-4930-b32f-ec957b766175-node-mnt\") pod \"crc-storage-crc-flmdt\" (UID: \"aae9458b-5e2d-4930-b32f-ec957b766175\") " pod="crc-storage/crc-storage-crc-flmdt" Feb 20 08:22:08 crc kubenswrapper[5094]: I0220 08:22:08.084478 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/aae9458b-5e2d-4930-b32f-ec957b766175-crc-storage\") pod \"crc-storage-crc-flmdt\" (UID: \"aae9458b-5e2d-4930-b32f-ec957b766175\") " pod="crc-storage/crc-storage-crc-flmdt" Feb 20 08:22:08 crc kubenswrapper[5094]: I0220 08:22:08.084553 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb7dx\" (UniqueName: \"kubernetes.io/projected/aae9458b-5e2d-4930-b32f-ec957b766175-kube-api-access-cb7dx\") pod \"crc-storage-crc-flmdt\" (UID: \"aae9458b-5e2d-4930-b32f-ec957b766175\") " pod="crc-storage/crc-storage-crc-flmdt" Feb 20 08:22:08 crc kubenswrapper[5094]: I0220 08:22:08.085033 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/aae9458b-5e2d-4930-b32f-ec957b766175-node-mnt\") pod \"crc-storage-crc-flmdt\" (UID: \"aae9458b-5e2d-4930-b32f-ec957b766175\") " pod="crc-storage/crc-storage-crc-flmdt" Feb 20 08:22:08 crc kubenswrapper[5094]: I0220 08:22:08.085948 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/aae9458b-5e2d-4930-b32f-ec957b766175-crc-storage\") pod \"crc-storage-crc-flmdt\" (UID: \"aae9458b-5e2d-4930-b32f-ec957b766175\") " pod="crc-storage/crc-storage-crc-flmdt" Feb 20 08:22:08 crc kubenswrapper[5094]: I0220 
08:22:08.109618 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb7dx\" (UniqueName: \"kubernetes.io/projected/aae9458b-5e2d-4930-b32f-ec957b766175-kube-api-access-cb7dx\") pod \"crc-storage-crc-flmdt\" (UID: \"aae9458b-5e2d-4930-b32f-ec957b766175\") " pod="crc-storage/crc-storage-crc-flmdt" Feb 20 08:22:08 crc kubenswrapper[5094]: I0220 08:22:08.300072 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-flmdt" Feb 20 08:22:08 crc kubenswrapper[5094]: I0220 08:22:08.815288 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-flmdt"] Feb 20 08:22:08 crc kubenswrapper[5094]: I0220 08:22:08.989264 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-flmdt" event={"ID":"aae9458b-5e2d-4930-b32f-ec957b766175","Type":"ContainerStarted","Data":"7303d3d4bf14a883133338e56cfe34c7a95e01e6132c8b6d39989cf5a695dc2e"} Feb 20 08:22:09 crc kubenswrapper[5094]: I0220 08:22:09.999404 5094 generic.go:334] "Generic (PLEG): container finished" podID="aae9458b-5e2d-4930-b32f-ec957b766175" containerID="d4b83d13fb6f1fea22d75b149f467c7b4670de433e70c72a73a2640f82e7276c" exitCode=0 Feb 20 08:22:09 crc kubenswrapper[5094]: I0220 08:22:09.999513 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-flmdt" event={"ID":"aae9458b-5e2d-4930-b32f-ec957b766175","Type":"ContainerDied","Data":"d4b83d13fb6f1fea22d75b149f467c7b4670de433e70c72a73a2640f82e7276c"} Feb 20 08:22:11 crc kubenswrapper[5094]: I0220 08:22:11.434100 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-flmdt" Feb 20 08:22:11 crc kubenswrapper[5094]: I0220 08:22:11.545891 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/aae9458b-5e2d-4930-b32f-ec957b766175-crc-storage\") pod \"aae9458b-5e2d-4930-b32f-ec957b766175\" (UID: \"aae9458b-5e2d-4930-b32f-ec957b766175\") " Feb 20 08:22:11 crc kubenswrapper[5094]: I0220 08:22:11.546069 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb7dx\" (UniqueName: \"kubernetes.io/projected/aae9458b-5e2d-4930-b32f-ec957b766175-kube-api-access-cb7dx\") pod \"aae9458b-5e2d-4930-b32f-ec957b766175\" (UID: \"aae9458b-5e2d-4930-b32f-ec957b766175\") " Feb 20 08:22:11 crc kubenswrapper[5094]: I0220 08:22:11.546147 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/aae9458b-5e2d-4930-b32f-ec957b766175-node-mnt\") pod \"aae9458b-5e2d-4930-b32f-ec957b766175\" (UID: \"aae9458b-5e2d-4930-b32f-ec957b766175\") " Feb 20 08:22:11 crc kubenswrapper[5094]: I0220 08:22:11.546463 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/aae9458b-5e2d-4930-b32f-ec957b766175-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "aae9458b-5e2d-4930-b32f-ec957b766175" (UID: "aae9458b-5e2d-4930-b32f-ec957b766175"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 08:22:11 crc kubenswrapper[5094]: I0220 08:22:11.552137 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aae9458b-5e2d-4930-b32f-ec957b766175-kube-api-access-cb7dx" (OuterVolumeSpecName: "kube-api-access-cb7dx") pod "aae9458b-5e2d-4930-b32f-ec957b766175" (UID: "aae9458b-5e2d-4930-b32f-ec957b766175"). InnerVolumeSpecName "kube-api-access-cb7dx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:22:11 crc kubenswrapper[5094]: I0220 08:22:11.580879 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aae9458b-5e2d-4930-b32f-ec957b766175-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "aae9458b-5e2d-4930-b32f-ec957b766175" (UID: "aae9458b-5e2d-4930-b32f-ec957b766175"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:22:11 crc kubenswrapper[5094]: I0220 08:22:11.648306 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb7dx\" (UniqueName: \"kubernetes.io/projected/aae9458b-5e2d-4930-b32f-ec957b766175-kube-api-access-cb7dx\") on node \"crc\" DevicePath \"\"" Feb 20 08:22:11 crc kubenswrapper[5094]: I0220 08:22:11.648355 5094 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/aae9458b-5e2d-4930-b32f-ec957b766175-node-mnt\") on node \"crc\" DevicePath \"\"" Feb 20 08:22:11 crc kubenswrapper[5094]: I0220 08:22:11.648368 5094 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/aae9458b-5e2d-4930-b32f-ec957b766175-crc-storage\") on node \"crc\" DevicePath \"\"" Feb 20 08:22:12 crc kubenswrapper[5094]: I0220 08:22:12.023985 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-flmdt" event={"ID":"aae9458b-5e2d-4930-b32f-ec957b766175","Type":"ContainerDied","Data":"7303d3d4bf14a883133338e56cfe34c7a95e01e6132c8b6d39989cf5a695dc2e"} Feb 20 08:22:12 crc kubenswrapper[5094]: I0220 08:22:12.024026 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7303d3d4bf14a883133338e56cfe34c7a95e01e6132c8b6d39989cf5a695dc2e" Feb 20 08:22:12 crc kubenswrapper[5094]: I0220 08:22:12.024088 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-flmdt" Feb 20 08:22:15 crc kubenswrapper[5094]: I0220 08:22:15.847912 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:22:15 crc kubenswrapper[5094]: E0220 08:22:15.848751 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:22:26 crc kubenswrapper[5094]: I0220 08:22:26.839947 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:22:26 crc kubenswrapper[5094]: E0220 08:22:26.840767 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:22:36 crc kubenswrapper[5094]: I0220 08:22:36.257954 5094 scope.go:117] "RemoveContainer" containerID="07fcab491ccca10a02c6e686a0115bd8c0916121144d5fd12b7356bb88847cbf" Feb 20 08:22:38 crc kubenswrapper[5094]: I0220 08:22:38.840931 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:22:38 crc kubenswrapper[5094]: E0220 08:22:38.841579 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:22:52 crc kubenswrapper[5094]: I0220 08:22:52.841001 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:22:52 crc kubenswrapper[5094]: E0220 08:22:52.843396 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:23:06 crc kubenswrapper[5094]: I0220 08:23:06.840618 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:23:06 crc kubenswrapper[5094]: E0220 08:23:06.841742 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:23:20 crc kubenswrapper[5094]: I0220 08:23:20.840279 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:23:20 crc kubenswrapper[5094]: E0220 08:23:20.841534 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:23:35 crc kubenswrapper[5094]: I0220 08:23:35.848593 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:23:35 crc kubenswrapper[5094]: E0220 08:23:35.849628 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:23:50 crc kubenswrapper[5094]: I0220 08:23:50.840526 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:23:50 crc kubenswrapper[5094]: E0220 08:23:50.841621 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:24:03 crc kubenswrapper[5094]: I0220 08:24:03.839738 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:24:03 crc kubenswrapper[5094]: E0220 08:24:03.840569 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:24:14 crc kubenswrapper[5094]: I0220 08:24:14.840863 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:24:14 crc kubenswrapper[5094]: E0220 08:24:14.841758 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.278239 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf6488d7-47ktg"] Feb 20 08:24:20 crc kubenswrapper[5094]: E0220 08:24:20.279049 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aae9458b-5e2d-4930-b32f-ec957b766175" containerName="storage" Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.279063 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="aae9458b-5e2d-4930-b32f-ec957b766175" containerName="storage" Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.279191 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="aae9458b-5e2d-4930-b32f-ec957b766175" containerName="storage" Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.279883 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf6488d7-47ktg" Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.284066 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.284599 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.284732 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.284855 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-jb4sz" Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.284975 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.324050 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf6488d7-47ktg"] Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.376721 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-config\") pod \"dnsmasq-dns-bbf6488d7-47ktg\" (UID: \"3b479d95-6e62-47d9-9a4f-ae0db08b69f0\") " pod="openstack/dnsmasq-dns-bbf6488d7-47ktg" Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.376811 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-dns-svc\") pod \"dnsmasq-dns-bbf6488d7-47ktg\" (UID: \"3b479d95-6e62-47d9-9a4f-ae0db08b69f0\") " pod="openstack/dnsmasq-dns-bbf6488d7-47ktg" Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.376865 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-c7bkt\" (UniqueName: \"kubernetes.io/projected/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-kube-api-access-c7bkt\") pod \"dnsmasq-dns-bbf6488d7-47ktg\" (UID: \"3b479d95-6e62-47d9-9a4f-ae0db08b69f0\") " pod="openstack/dnsmasq-dns-bbf6488d7-47ktg" Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.477556 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-config\") pod \"dnsmasq-dns-bbf6488d7-47ktg\" (UID: \"3b479d95-6e62-47d9-9a4f-ae0db08b69f0\") " pod="openstack/dnsmasq-dns-bbf6488d7-47ktg" Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.477622 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-dns-svc\") pod \"dnsmasq-dns-bbf6488d7-47ktg\" (UID: \"3b479d95-6e62-47d9-9a4f-ae0db08b69f0\") " pod="openstack/dnsmasq-dns-bbf6488d7-47ktg" Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.477650 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7bkt\" (UniqueName: \"kubernetes.io/projected/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-kube-api-access-c7bkt\") pod \"dnsmasq-dns-bbf6488d7-47ktg\" (UID: \"3b479d95-6e62-47d9-9a4f-ae0db08b69f0\") " pod="openstack/dnsmasq-dns-bbf6488d7-47ktg" Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.478919 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-config\") pod \"dnsmasq-dns-bbf6488d7-47ktg\" (UID: \"3b479d95-6e62-47d9-9a4f-ae0db08b69f0\") " pod="openstack/dnsmasq-dns-bbf6488d7-47ktg" Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.479466 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-dns-svc\") 
pod \"dnsmasq-dns-bbf6488d7-47ktg\" (UID: \"3b479d95-6e62-47d9-9a4f-ae0db08b69f0\") " pod="openstack/dnsmasq-dns-bbf6488d7-47ktg" Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.507210 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7bkt\" (UniqueName: \"kubernetes.io/projected/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-kube-api-access-c7bkt\") pod \"dnsmasq-dns-bbf6488d7-47ktg\" (UID: \"3b479d95-6e62-47d9-9a4f-ae0db08b69f0\") " pod="openstack/dnsmasq-dns-bbf6488d7-47ktg" Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.595045 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf6488d7-47ktg" Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.644914 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76466d7cdc-nc2ng"] Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.646309 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng" Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.661602 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76466d7cdc-nc2ng"] Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.689095 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-config\") pod \"dnsmasq-dns-76466d7cdc-nc2ng\" (UID: \"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9\") " pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng" Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.689212 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-dns-svc\") pod \"dnsmasq-dns-76466d7cdc-nc2ng\" (UID: \"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9\") " 
pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng" Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.689272 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pqx5\" (UniqueName: \"kubernetes.io/projected/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-kube-api-access-9pqx5\") pod \"dnsmasq-dns-76466d7cdc-nc2ng\" (UID: \"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9\") " pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng" Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.790490 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-config\") pod \"dnsmasq-dns-76466d7cdc-nc2ng\" (UID: \"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9\") " pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng" Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.790562 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-dns-svc\") pod \"dnsmasq-dns-76466d7cdc-nc2ng\" (UID: \"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9\") " pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng" Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.791996 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-dns-svc\") pod \"dnsmasq-dns-76466d7cdc-nc2ng\" (UID: \"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9\") " pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng" Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.792099 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pqx5\" (UniqueName: \"kubernetes.io/projected/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-kube-api-access-9pqx5\") pod \"dnsmasq-dns-76466d7cdc-nc2ng\" (UID: \"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9\") " pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng" Feb 20 
08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.793173 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-config\") pod \"dnsmasq-dns-76466d7cdc-nc2ng\" (UID: \"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9\") " pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng" Feb 20 08:24:20 crc kubenswrapper[5094]: I0220 08:24:20.820025 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pqx5\" (UniqueName: \"kubernetes.io/projected/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-kube-api-access-9pqx5\") pod \"dnsmasq-dns-76466d7cdc-nc2ng\" (UID: \"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9\") " pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.046603 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.165746 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf6488d7-47ktg"] Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.464442 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.468570 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.471264 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.471323 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.471276 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-2c4db" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.472757 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.472921 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.477556 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76466d7cdc-nc2ng"] Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.504178 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.616295 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.616345 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d6404f29-e503-4f82-a2ce-e147c18677a7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " 
pod="openstack/rabbitmq-server-0" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.616373 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.616404 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d6404f29-e503-4f82-a2ce-e147c18677a7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.616428 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.616452 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.616475 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d6404f29-e503-4f82-a2ce-e147c18677a7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " 
pod="openstack/rabbitmq-server-0" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.616517 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8tnp\" (UniqueName: \"kubernetes.io/projected/d6404f29-e503-4f82-a2ce-e147c18677a7-kube-api-access-n8tnp\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.616532 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d6404f29-e503-4f82-a2ce-e147c18677a7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.718493 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.718559 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d6404f29-e503-4f82-a2ce-e147c18677a7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.718586 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.718623 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d6404f29-e503-4f82-a2ce-e147c18677a7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.718653 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.718682 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.718726 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d6404f29-e503-4f82-a2ce-e147c18677a7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.718775 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8tnp\" (UniqueName: \"kubernetes.io/projected/d6404f29-e503-4f82-a2ce-e147c18677a7-kube-api-access-n8tnp\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.718791 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/d6404f29-e503-4f82-a2ce-e147c18677a7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.719501 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.720068 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.721736 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d6404f29-e503-4f82-a2ce-e147c18677a7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.721817 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d6404f29-e503-4f82-a2ce-e147c18677a7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.722517 5094 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.722562 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cc749f9eaf145df5c444fbae24383d1bdaa4331dffc2f7c6f1445ba7dce2304b/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.730172 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d6404f29-e503-4f82-a2ce-e147c18677a7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.731581 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.737103 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8tnp\" (UniqueName: \"kubernetes.io/projected/d6404f29-e503-4f82-a2ce-e147c18677a7-kube-api-access-n8tnp\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.737141 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d6404f29-e503-4f82-a2ce-e147c18677a7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " 
pod="openstack/rabbitmq-server-0" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.751668 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\") pod \"rabbitmq-server-0\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") " pod="openstack/rabbitmq-server-0" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.803516 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.831740 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.833497 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.836096 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.836124 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.836342 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.836450 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-rs7rx" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.836538 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 20 08:24:21 crc kubenswrapper[5094]: I0220 08:24:21.861453 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 08:24:22 crc 
kubenswrapper[5094]: I0220 08:24:22.023997 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/09194bc6-429f-46a1-8dac-8b84385c9e10-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.024052 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.024079 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/09194bc6-429f-46a1-8dac-8b84385c9e10-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.024130 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qh8s\" (UniqueName: \"kubernetes.io/projected/09194bc6-429f-46a1-8dac-8b84385c9e10-kube-api-access-4qh8s\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.024145 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/09194bc6-429f-46a1-8dac-8b84385c9e10-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 
20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.024170 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.024231 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-219358f9-7520-4512-8729-274ec1ad54bd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-219358f9-7520-4512-8729-274ec1ad54bd\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.024285 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/09194bc6-429f-46a1-8dac-8b84385c9e10-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.024301 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.125755 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-219358f9-7520-4512-8729-274ec1ad54bd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-219358f9-7520-4512-8729-274ec1ad54bd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.125846 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/09194bc6-429f-46a1-8dac-8b84385c9e10-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.125874 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.125899 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/09194bc6-429f-46a1-8dac-8b84385c9e10-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.125921 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.125949 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/09194bc6-429f-46a1-8dac-8b84385c9e10-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc 
kubenswrapper[5094]: I0220 08:24:22.125998 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qh8s\" (UniqueName: \"kubernetes.io/projected/09194bc6-429f-46a1-8dac-8b84385c9e10-kube-api-access-4qh8s\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.126020 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/09194bc6-429f-46a1-8dac-8b84385c9e10-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.126052 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.127149 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/09194bc6-429f-46a1-8dac-8b84385c9e10-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.127391 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/09194bc6-429f-46a1-8dac-8b84385c9e10-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.128448 5094 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.129692 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.130368 5094 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.130399 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-219358f9-7520-4512-8729-274ec1ad54bd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-219358f9-7520-4512-8729-274ec1ad54bd\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c0ed1e37b492902e6baae6f722347d61d8d5759c03a6d0fd94fd84b63621ed84/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.132055 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/09194bc6-429f-46a1-8dac-8b84385c9e10-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.132360 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/09194bc6-429f-46a1-8dac-8b84385c9e10-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.134360 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.141632 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qh8s\" (UniqueName: \"kubernetes.io/projected/09194bc6-429f-46a1-8dac-8b84385c9e10-kube-api-access-4qh8s\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.158826 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-219358f9-7520-4512-8729-274ec1ad54bd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-219358f9-7520-4512-8729-274ec1ad54bd\") pod \"rabbitmq-cell1-server-0\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.188034 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf6488d7-47ktg" event={"ID":"3b479d95-6e62-47d9-9a4f-ae0db08b69f0","Type":"ContainerStarted","Data":"b8edfc60f02b3a08dfffd5b78149935d5da9920c6ad89968f0a47ed48c3497b0"} Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.189886 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng" event={"ID":"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9","Type":"ContainerStarted","Data":"e19bae85da8585d71a99f124e607e047f6c285504074d9b173d9c3014d6e6d83"} 
Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.223691 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.288485 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 08:24:22 crc kubenswrapper[5094]: W0220 08:24:22.296192 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6404f29_e503_4f82_a2ce_e147c18677a7.slice/crio-4b1fc1669c5a53997e1b5722c230d8dba80a44d7f6b4091a5bd28b4938f7b018 WatchSource:0}: Error finding container 4b1fc1669c5a53997e1b5722c230d8dba80a44d7f6b4091a5bd28b4938f7b018: Status 404 returned error can't find the container with id 4b1fc1669c5a53997e1b5722c230d8dba80a44d7f6b4091a5bd28b4938f7b018 Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.314795 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.316283 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.323769 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-cddr5" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.324047 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.325368 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.325668 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.327010 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.351086 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.431343 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/542d99bc-6049-42dc-9036-8a795552e896-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.431445 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/542d99bc-6049-42dc-9036-8a795552e896-kolla-config\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.431583 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ldvqr\" (UniqueName: \"kubernetes.io/projected/542d99bc-6049-42dc-9036-8a795552e896-kube-api-access-ldvqr\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.431671 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/542d99bc-6049-42dc-9036-8a795552e896-config-data-default\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.431776 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/542d99bc-6049-42dc-9036-8a795552e896-config-data-generated\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.431834 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/542d99bc-6049-42dc-9036-8a795552e896-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.431967 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/542d99bc-6049-42dc-9036-8a795552e896-operator-scripts\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.432084 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-d2a44e57-3591-4378-a5ef-cac1c042d904\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d2a44e57-3591-4378-a5ef-cac1c042d904\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.533864 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d2a44e57-3591-4378-a5ef-cac1c042d904\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d2a44e57-3591-4378-a5ef-cac1c042d904\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.533930 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/542d99bc-6049-42dc-9036-8a795552e896-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.533955 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/542d99bc-6049-42dc-9036-8a795552e896-kolla-config\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.533985 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldvqr\" (UniqueName: \"kubernetes.io/projected/542d99bc-6049-42dc-9036-8a795552e896-kube-api-access-ldvqr\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.534008 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/542d99bc-6049-42dc-9036-8a795552e896-config-data-default\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.534043 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/542d99bc-6049-42dc-9036-8a795552e896-config-data-generated\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.534068 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/542d99bc-6049-42dc-9036-8a795552e896-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.534108 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/542d99bc-6049-42dc-9036-8a795552e896-operator-scripts\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.535596 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/542d99bc-6049-42dc-9036-8a795552e896-operator-scripts\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.536119 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/542d99bc-6049-42dc-9036-8a795552e896-kolla-config\") pod \"openstack-galera-0\" (UID: 
\"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.536144 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/542d99bc-6049-42dc-9036-8a795552e896-config-data-generated\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.536587 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/542d99bc-6049-42dc-9036-8a795552e896-config-data-default\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.538178 5094 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.538207 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d2a44e57-3591-4378-a5ef-cac1c042d904\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d2a44e57-3591-4378-a5ef-cac1c042d904\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0ba4264431e89fd45e3b88a4a6661a636cbc0affb8e1a965d3661fe6696cb9c8/globalmount\"" pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.539940 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/542d99bc-6049-42dc-9036-8a795552e896-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.540418 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/542d99bc-6049-42dc-9036-8a795552e896-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.566757 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldvqr\" (UniqueName: \"kubernetes.io/projected/542d99bc-6049-42dc-9036-8a795552e896-kube-api-access-ldvqr\") pod \"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.579126 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d2a44e57-3591-4378-a5ef-cac1c042d904\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d2a44e57-3591-4378-a5ef-cac1c042d904\") pod 
\"openstack-galera-0\" (UID: \"542d99bc-6049-42dc-9036-8a795552e896\") " pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.653227 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.668854 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.939820 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.945944 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.950145 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-mgp6g" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.950925 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 20 08:24:22 crc kubenswrapper[5094]: I0220 08:24:22.958046 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 20 08:24:23 crc kubenswrapper[5094]: I0220 08:24:23.042150 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5074d037-240e-4685-8c3b-3dd7b963beb0-config-data\") pod \"memcached-0\" (UID: \"5074d037-240e-4685-8c3b-3dd7b963beb0\") " pod="openstack/memcached-0" Feb 20 08:24:23 crc kubenswrapper[5094]: I0220 08:24:23.043151 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djqkm\" (UniqueName: \"kubernetes.io/projected/5074d037-240e-4685-8c3b-3dd7b963beb0-kube-api-access-djqkm\") pod \"memcached-0\" (UID: \"5074d037-240e-4685-8c3b-3dd7b963beb0\") " 
pod="openstack/memcached-0" Feb 20 08:24:23 crc kubenswrapper[5094]: I0220 08:24:23.043270 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5074d037-240e-4685-8c3b-3dd7b963beb0-kolla-config\") pod \"memcached-0\" (UID: \"5074d037-240e-4685-8c3b-3dd7b963beb0\") " pod="openstack/memcached-0" Feb 20 08:24:23 crc kubenswrapper[5094]: I0220 08:24:23.069715 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 20 08:24:23 crc kubenswrapper[5094]: I0220 08:24:23.144523 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5074d037-240e-4685-8c3b-3dd7b963beb0-config-data\") pod \"memcached-0\" (UID: \"5074d037-240e-4685-8c3b-3dd7b963beb0\") " pod="openstack/memcached-0" Feb 20 08:24:23 crc kubenswrapper[5094]: I0220 08:24:23.144608 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djqkm\" (UniqueName: \"kubernetes.io/projected/5074d037-240e-4685-8c3b-3dd7b963beb0-kube-api-access-djqkm\") pod \"memcached-0\" (UID: \"5074d037-240e-4685-8c3b-3dd7b963beb0\") " pod="openstack/memcached-0" Feb 20 08:24:23 crc kubenswrapper[5094]: I0220 08:24:23.144649 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5074d037-240e-4685-8c3b-3dd7b963beb0-kolla-config\") pod \"memcached-0\" (UID: \"5074d037-240e-4685-8c3b-3dd7b963beb0\") " pod="openstack/memcached-0" Feb 20 08:24:23 crc kubenswrapper[5094]: I0220 08:24:23.145843 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5074d037-240e-4685-8c3b-3dd7b963beb0-config-data\") pod \"memcached-0\" (UID: \"5074d037-240e-4685-8c3b-3dd7b963beb0\") " pod="openstack/memcached-0" Feb 20 08:24:23 crc 
kubenswrapper[5094]: I0220 08:24:23.146613 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5074d037-240e-4685-8c3b-3dd7b963beb0-kolla-config\") pod \"memcached-0\" (UID: \"5074d037-240e-4685-8c3b-3dd7b963beb0\") " pod="openstack/memcached-0" Feb 20 08:24:23 crc kubenswrapper[5094]: I0220 08:24:23.163590 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djqkm\" (UniqueName: \"kubernetes.io/projected/5074d037-240e-4685-8c3b-3dd7b963beb0-kube-api-access-djqkm\") pod \"memcached-0\" (UID: \"5074d037-240e-4685-8c3b-3dd7b963beb0\") " pod="openstack/memcached-0" Feb 20 08:24:23 crc kubenswrapper[5094]: I0220 08:24:23.215748 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d6404f29-e503-4f82-a2ce-e147c18677a7","Type":"ContainerStarted","Data":"4b1fc1669c5a53997e1b5722c230d8dba80a44d7f6b4091a5bd28b4938f7b018"} Feb 20 08:24:23 crc kubenswrapper[5094]: I0220 08:24:23.221361 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"542d99bc-6049-42dc-9036-8a795552e896","Type":"ContainerStarted","Data":"dcdb3b2cee1e238efa34b6ce00f898e563246cfe9a36783212511b2b820cb19f"} Feb 20 08:24:23 crc kubenswrapper[5094]: I0220 08:24:23.223352 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"09194bc6-429f-46a1-8dac-8b84385c9e10","Type":"ContainerStarted","Data":"9e766a894fc47307f318a9f38ad47bbac110c5ff58c56284d0fae3825eea954c"} Feb 20 08:24:23 crc kubenswrapper[5094]: I0220 08:24:23.279450 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 20 08:24:23 crc kubenswrapper[5094]: I0220 08:24:23.713020 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.106199 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.109641 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.112479 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.112721 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-c2zzx" Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.113808 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.113858 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.114142 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.234033 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5074d037-240e-4685-8c3b-3dd7b963beb0","Type":"ContainerStarted","Data":"c1f694fa5035758e7ea5e4a26c7fadd55dd76c0890987ed3fac18d3998a072f1"} Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.271010 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98dd23d5-7a26-4a06-a35a-e818b8feba3c-operator-scripts\") pod 
\"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0" Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.271109 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/98dd23d5-7a26-4a06-a35a-e818b8feba3c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0" Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.271167 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98dd23d5-7a26-4a06-a35a-e818b8feba3c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0" Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.271234 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/98dd23d5-7a26-4a06-a35a-e818b8feba3c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0" Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.271301 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/98dd23d5-7a26-4a06-a35a-e818b8feba3c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0" Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.271350 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/98dd23d5-7a26-4a06-a35a-e818b8feba3c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0" Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.271441 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh64k\" (UniqueName: \"kubernetes.io/projected/98dd23d5-7a26-4a06-a35a-e818b8feba3c-kube-api-access-rh64k\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0" Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.271594 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8efa47f2-4de9-40d5-87ac-b721e70aafe9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8efa47f2-4de9-40d5-87ac-b721e70aafe9\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0" Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.373462 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/98dd23d5-7a26-4a06-a35a-e818b8feba3c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0" Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.373527 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/98dd23d5-7a26-4a06-a35a-e818b8feba3c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0" Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.373565 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/98dd23d5-7a26-4a06-a35a-e818b8feba3c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0" Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.373611 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh64k\" (UniqueName: \"kubernetes.io/projected/98dd23d5-7a26-4a06-a35a-e818b8feba3c-kube-api-access-rh64k\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0" Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.373718 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8efa47f2-4de9-40d5-87ac-b721e70aafe9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8efa47f2-4de9-40d5-87ac-b721e70aafe9\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0" Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.373746 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98dd23d5-7a26-4a06-a35a-e818b8feba3c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0" Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.373772 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/98dd23d5-7a26-4a06-a35a-e818b8feba3c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0" Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.373798 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/98dd23d5-7a26-4a06-a35a-e818b8feba3c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0" Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.373905 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/98dd23d5-7a26-4a06-a35a-e818b8feba3c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0" Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.374610 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/98dd23d5-7a26-4a06-a35a-e818b8feba3c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0" Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.375191 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98dd23d5-7a26-4a06-a35a-e818b8feba3c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0" Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.375229 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/98dd23d5-7a26-4a06-a35a-e818b8feba3c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0" Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.377470 5094 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.377506 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8efa47f2-4de9-40d5-87ac-b721e70aafe9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8efa47f2-4de9-40d5-87ac-b721e70aafe9\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e3a1c21aa2f31023ffd9d8ce062194bd602191be618555955b2d9607f3d4eb2e/globalmount\"" pod="openstack/openstack-cell1-galera-0" Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.380018 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98dd23d5-7a26-4a06-a35a-e818b8feba3c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0" Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.393528 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/98dd23d5-7a26-4a06-a35a-e818b8feba3c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0" Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.393993 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh64k\" (UniqueName: \"kubernetes.io/projected/98dd23d5-7a26-4a06-a35a-e818b8feba3c-kube-api-access-rh64k\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0" Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.404206 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8efa47f2-4de9-40d5-87ac-b721e70aafe9\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8efa47f2-4de9-40d5-87ac-b721e70aafe9\") pod \"openstack-cell1-galera-0\" (UID: \"98dd23d5-7a26-4a06-a35a-e818b8feba3c\") " pod="openstack/openstack-cell1-galera-0" Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.444156 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 20 08:24:24 crc kubenswrapper[5094]: I0220 08:24:24.856295 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 20 08:24:24 crc kubenswrapper[5094]: W0220 08:24:24.874881 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98dd23d5_7a26_4a06_a35a_e818b8feba3c.slice/crio-7fd2e7d13354b8cdd517191e3dfc6033af95d1eee2076eb99236435f65dd717e WatchSource:0}: Error finding container 7fd2e7d13354b8cdd517191e3dfc6033af95d1eee2076eb99236435f65dd717e: Status 404 returned error can't find the container with id 7fd2e7d13354b8cdd517191e3dfc6033af95d1eee2076eb99236435f65dd717e Feb 20 08:24:25 crc kubenswrapper[5094]: I0220 08:24:25.244010 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"98dd23d5-7a26-4a06-a35a-e818b8feba3c","Type":"ContainerStarted","Data":"7fd2e7d13354b8cdd517191e3dfc6033af95d1eee2076eb99236435f65dd717e"} Feb 20 08:24:28 crc kubenswrapper[5094]: I0220 08:24:28.841933 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:24:28 crc kubenswrapper[5094]: E0220 08:24:28.842115 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:24:43 crc kubenswrapper[5094]: I0220 08:24:43.840523 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:24:43 crc kubenswrapper[5094]: E0220 08:24:43.841343 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:24:44 crc kubenswrapper[5094]: I0220 08:24:44.537318 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"98dd23d5-7a26-4a06-a35a-e818b8feba3c","Type":"ContainerStarted","Data":"a6d1279d9d85f16e430a938ac1dc6735a73ff059dd1b7fd319df3fe9ec5c1713"} Feb 20 08:24:44 crc kubenswrapper[5094]: I0220 08:24:44.539440 5094 generic.go:334] "Generic (PLEG): container finished" podID="9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9" containerID="24e6eb261ea32e536bfab420ed66babedb6b616c3c0c6563b11146f4eaf4e89e" exitCode=0 Feb 20 08:24:44 crc kubenswrapper[5094]: I0220 08:24:44.539528 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng" event={"ID":"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9","Type":"ContainerDied","Data":"24e6eb261ea32e536bfab420ed66babedb6b616c3c0c6563b11146f4eaf4e89e"} Feb 20 08:24:44 crc kubenswrapper[5094]: I0220 08:24:44.541198 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5074d037-240e-4685-8c3b-3dd7b963beb0","Type":"ContainerStarted","Data":"0f1c0f233201ef672176a0ccaab58c929e5fb1e955a41efd3b7b5b3a438bb757"} Feb 20 08:24:44 crc 
kubenswrapper[5094]: I0220 08:24:44.541411 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 20 08:24:44 crc kubenswrapper[5094]: I0220 08:24:44.543586 5094 generic.go:334] "Generic (PLEG): container finished" podID="3b479d95-6e62-47d9-9a4f-ae0db08b69f0" containerID="e9f7e3bd4afae9d78543f393ecec74c64af345ea3fff97276a145955cecf9c5c" exitCode=0 Feb 20 08:24:44 crc kubenswrapper[5094]: I0220 08:24:44.543639 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf6488d7-47ktg" event={"ID":"3b479d95-6e62-47d9-9a4f-ae0db08b69f0","Type":"ContainerDied","Data":"e9f7e3bd4afae9d78543f393ecec74c64af345ea3fff97276a145955cecf9c5c"} Feb 20 08:24:44 crc kubenswrapper[5094]: I0220 08:24:44.546269 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"542d99bc-6049-42dc-9036-8a795552e896","Type":"ContainerStarted","Data":"33e61e60c5f24519ad97451c04f3fc6b600a1332b53418633fa0a10d49fe57bb"} Feb 20 08:24:44 crc kubenswrapper[5094]: I0220 08:24:44.639923 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.616130379 podStartE2EDuration="22.639897089s" podCreationTimestamp="2026-02-20 08:24:22 +0000 UTC" firstStartedPulling="2026-02-20 08:24:23.729249946 +0000 UTC m=+5878.601876657" lastFinishedPulling="2026-02-20 08:24:43.753016666 +0000 UTC m=+5898.625643367" observedRunningTime="2026-02-20 08:24:44.618189794 +0000 UTC m=+5899.490816525" watchObservedRunningTime="2026-02-20 08:24:44.639897089 +0000 UTC m=+5899.512523800" Feb 20 08:24:44 crc kubenswrapper[5094]: E0220 08:24:44.767441 5094 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Feb 20 08:24:44 crc kubenswrapper[5094]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/3b479d95-6e62-47d9-9a4f-ae0db08b69f0/volume-subpaths/dns-svc/dnsmasq-dns/1` to 
`etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 20 08:24:44 crc kubenswrapper[5094]: > podSandboxID="b8edfc60f02b3a08dfffd5b78149935d5da9920c6ad89968f0a47ed48c3497b0" Feb 20 08:24:44 crc kubenswrapper[5094]: E0220 08:24:44.767622 5094 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 20 08:24:44 crc kubenswrapper[5094]: container &Container{Name:dnsmasq-dns,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-server:f0473f3e18dd17d7021c02e991298923,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8chc6h5bh56fh546hb7hc8h67h5bchffh577h697h5b5h5bdh59bhf6hf4h558hb5h578h595h5cchfbh644h59ch7fh654h547h587h5cbh5d5h8fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-c7bkt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-bbf6488d7-47ktg_openstack(3b479d95-6e62-47d9-9a4f-ae0db08b69f0): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/3b479d95-6e62-47d9-9a4f-ae0db08b69f0/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Feb 20 08:24:44 crc kubenswrapper[5094]: > logger="UnhandledError" Feb 20 08:24:44 crc kubenswrapper[5094]: E0220 08:24:44.768817 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/3b479d95-6e62-47d9-9a4f-ae0db08b69f0/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-bbf6488d7-47ktg" podUID="3b479d95-6e62-47d9-9a4f-ae0db08b69f0" Feb 20 08:24:45 crc kubenswrapper[5094]: I0220 08:24:45.553658 5094 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"09194bc6-429f-46a1-8dac-8b84385c9e10","Type":"ContainerStarted","Data":"969469a9c4ed7927decd268446b641c0bc47204103fa4da8c53604820787ff47"} Feb 20 08:24:45 crc kubenswrapper[5094]: I0220 08:24:45.556559 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng" event={"ID":"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9","Type":"ContainerStarted","Data":"8a09f7482da0f94c8e0f012e0e1e52314f35f8500de127da7d88164462490023"} Feb 20 08:24:45 crc kubenswrapper[5094]: I0220 08:24:45.557104 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng" Feb 20 08:24:45 crc kubenswrapper[5094]: I0220 08:24:45.559898 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d6404f29-e503-4f82-a2ce-e147c18677a7","Type":"ContainerStarted","Data":"6d76e2a1b5f6be82c63ad209d475f9abed3c7fa6699825eafe32cb1a6b659d1b"} Feb 20 08:24:45 crc kubenswrapper[5094]: I0220 08:24:45.660478 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng" podStartSLOduration=3.477640103 podStartE2EDuration="25.660455596s" podCreationTimestamp="2026-02-20 08:24:20 +0000 UTC" firstStartedPulling="2026-02-20 08:24:21.525840672 +0000 UTC m=+5876.398467383" lastFinishedPulling="2026-02-20 08:24:43.708656165 +0000 UTC m=+5898.581282876" observedRunningTime="2026-02-20 08:24:45.630597926 +0000 UTC m=+5900.503224627" watchObservedRunningTime="2026-02-20 08:24:45.660455596 +0000 UTC m=+5900.533082317" Feb 20 08:24:46 crc kubenswrapper[5094]: I0220 08:24:46.567538 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf6488d7-47ktg" event={"ID":"3b479d95-6e62-47d9-9a4f-ae0db08b69f0","Type":"ContainerStarted","Data":"ccb0c715c4c5f62bbb1a26097315e58e086c6a4dc9a1f3f25e16ec5c7afb5d9e"} Feb 20 08:24:46 crc 
kubenswrapper[5094]: I0220 08:24:46.568832 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bbf6488d7-47ktg" Feb 20 08:24:46 crc kubenswrapper[5094]: I0220 08:24:46.595012 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bbf6488d7-47ktg" podStartSLOduration=4.061823541 podStartE2EDuration="26.594989529s" podCreationTimestamp="2026-02-20 08:24:20 +0000 UTC" firstStartedPulling="2026-02-20 08:24:21.175521138 +0000 UTC m=+5876.048147849" lastFinishedPulling="2026-02-20 08:24:43.708687126 +0000 UTC m=+5898.581313837" observedRunningTime="2026-02-20 08:24:46.591344861 +0000 UTC m=+5901.463971602" watchObservedRunningTime="2026-02-20 08:24:46.594989529 +0000 UTC m=+5901.467616250" Feb 20 08:24:47 crc kubenswrapper[5094]: I0220 08:24:47.578081 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"98dd23d5-7a26-4a06-a35a-e818b8feba3c","Type":"ContainerDied","Data":"a6d1279d9d85f16e430a938ac1dc6735a73ff059dd1b7fd319df3fe9ec5c1713"} Feb 20 08:24:47 crc kubenswrapper[5094]: I0220 08:24:47.577963 5094 generic.go:334] "Generic (PLEG): container finished" podID="98dd23d5-7a26-4a06-a35a-e818b8feba3c" containerID="a6d1279d9d85f16e430a938ac1dc6735a73ff059dd1b7fd319df3fe9ec5c1713" exitCode=0 Feb 20 08:24:47 crc kubenswrapper[5094]: I0220 08:24:47.580507 5094 generic.go:334] "Generic (PLEG): container finished" podID="542d99bc-6049-42dc-9036-8a795552e896" containerID="33e61e60c5f24519ad97451c04f3fc6b600a1332b53418633fa0a10d49fe57bb" exitCode=0 Feb 20 08:24:47 crc kubenswrapper[5094]: I0220 08:24:47.580597 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"542d99bc-6049-42dc-9036-8a795552e896","Type":"ContainerDied","Data":"33e61e60c5f24519ad97451c04f3fc6b600a1332b53418633fa0a10d49fe57bb"} Feb 20 08:24:48 crc kubenswrapper[5094]: I0220 08:24:48.591044 5094 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"542d99bc-6049-42dc-9036-8a795552e896","Type":"ContainerStarted","Data":"a05b6ba9743e2aa1a2a2831e18ff866d108d1cd723ee36b2b5263f8cab5bda9b"} Feb 20 08:24:48 crc kubenswrapper[5094]: I0220 08:24:48.593029 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"98dd23d5-7a26-4a06-a35a-e818b8feba3c","Type":"ContainerStarted","Data":"04cae0d528ed5bf01cb8000560f61cd697745bcdb665cbb5524d94ef584ed691"} Feb 20 08:24:48 crc kubenswrapper[5094]: I0220 08:24:48.616953 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=6.993498583 podStartE2EDuration="27.616934844s" podCreationTimestamp="2026-02-20 08:24:21 +0000 UTC" firstStartedPulling="2026-02-20 08:24:23.090528402 +0000 UTC m=+5877.963155113" lastFinishedPulling="2026-02-20 08:24:43.713964663 +0000 UTC m=+5898.586591374" observedRunningTime="2026-02-20 08:24:48.610996381 +0000 UTC m=+5903.483623172" watchObservedRunningTime="2026-02-20 08:24:48.616934844 +0000 UTC m=+5903.489561555" Feb 20 08:24:48 crc kubenswrapper[5094]: I0220 08:24:48.641096 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=6.719832842 podStartE2EDuration="25.641073346s" podCreationTimestamp="2026-02-20 08:24:23 +0000 UTC" firstStartedPulling="2026-02-20 08:24:24.876408319 +0000 UTC m=+5879.749035030" lastFinishedPulling="2026-02-20 08:24:43.797648823 +0000 UTC m=+5898.670275534" observedRunningTime="2026-02-20 08:24:48.634067097 +0000 UTC m=+5903.506693868" watchObservedRunningTime="2026-02-20 08:24:48.641073346 +0000 UTC m=+5903.513700057" Feb 20 08:24:50 crc kubenswrapper[5094]: I0220 08:24:50.602177 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bbf6488d7-47ktg" Feb 20 08:24:51 crc kubenswrapper[5094]: I0220 
08:24:51.047878 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng" Feb 20 08:24:51 crc kubenswrapper[5094]: I0220 08:24:51.102445 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf6488d7-47ktg"] Feb 20 08:24:51 crc kubenswrapper[5094]: I0220 08:24:51.613591 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bbf6488d7-47ktg" podUID="3b479d95-6e62-47d9-9a4f-ae0db08b69f0" containerName="dnsmasq-dns" containerID="cri-o://ccb0c715c4c5f62bbb1a26097315e58e086c6a4dc9a1f3f25e16ec5c7afb5d9e" gracePeriod=10 Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.053018 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf6488d7-47ktg" Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.203263 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7bkt\" (UniqueName: \"kubernetes.io/projected/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-kube-api-access-c7bkt\") pod \"3b479d95-6e62-47d9-9a4f-ae0db08b69f0\" (UID: \"3b479d95-6e62-47d9-9a4f-ae0db08b69f0\") " Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.203316 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-config\") pod \"3b479d95-6e62-47d9-9a4f-ae0db08b69f0\" (UID: \"3b479d95-6e62-47d9-9a4f-ae0db08b69f0\") " Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.203361 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-dns-svc\") pod \"3b479d95-6e62-47d9-9a4f-ae0db08b69f0\" (UID: \"3b479d95-6e62-47d9-9a4f-ae0db08b69f0\") " Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.208256 5094 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-kube-api-access-c7bkt" (OuterVolumeSpecName: "kube-api-access-c7bkt") pod "3b479d95-6e62-47d9-9a4f-ae0db08b69f0" (UID: "3b479d95-6e62-47d9-9a4f-ae0db08b69f0"). InnerVolumeSpecName "kube-api-access-c7bkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.238512 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-config" (OuterVolumeSpecName: "config") pod "3b479d95-6e62-47d9-9a4f-ae0db08b69f0" (UID: "3b479d95-6e62-47d9-9a4f-ae0db08b69f0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.244536 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3b479d95-6e62-47d9-9a4f-ae0db08b69f0" (UID: "3b479d95-6e62-47d9-9a4f-ae0db08b69f0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.304558 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7bkt\" (UniqueName: \"kubernetes.io/projected/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-kube-api-access-c7bkt\") on node \"crc\" DevicePath \"\"" Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.304595 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-config\") on node \"crc\" DevicePath \"\"" Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.304604 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b479d95-6e62-47d9-9a4f-ae0db08b69f0-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.632549 5094 generic.go:334] "Generic (PLEG): container finished" podID="3b479d95-6e62-47d9-9a4f-ae0db08b69f0" containerID="ccb0c715c4c5f62bbb1a26097315e58e086c6a4dc9a1f3f25e16ec5c7afb5d9e" exitCode=0 Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.632595 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf6488d7-47ktg" event={"ID":"3b479d95-6e62-47d9-9a4f-ae0db08b69f0","Type":"ContainerDied","Data":"ccb0c715c4c5f62bbb1a26097315e58e086c6a4dc9a1f3f25e16ec5c7afb5d9e"} Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.632622 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf6488d7-47ktg" event={"ID":"3b479d95-6e62-47d9-9a4f-ae0db08b69f0","Type":"ContainerDied","Data":"b8edfc60f02b3a08dfffd5b78149935d5da9920c6ad89968f0a47ed48c3497b0"} Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.632618 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf6488d7-47ktg" Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.632637 5094 scope.go:117] "RemoveContainer" containerID="ccb0c715c4c5f62bbb1a26097315e58e086c6a4dc9a1f3f25e16ec5c7afb5d9e" Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.654464 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.654500 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.668017 5094 scope.go:117] "RemoveContainer" containerID="e9f7e3bd4afae9d78543f393ecec74c64af345ea3fff97276a145955cecf9c5c" Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.668532 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf6488d7-47ktg"] Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.676402 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf6488d7-47ktg"] Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.694096 5094 scope.go:117] "RemoveContainer" containerID="ccb0c715c4c5f62bbb1a26097315e58e086c6a4dc9a1f3f25e16ec5c7afb5d9e" Feb 20 08:24:52 crc kubenswrapper[5094]: E0220 08:24:52.694530 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccb0c715c4c5f62bbb1a26097315e58e086c6a4dc9a1f3f25e16ec5c7afb5d9e\": container with ID starting with ccb0c715c4c5f62bbb1a26097315e58e086c6a4dc9a1f3f25e16ec5c7afb5d9e not found: ID does not exist" containerID="ccb0c715c4c5f62bbb1a26097315e58e086c6a4dc9a1f3f25e16ec5c7afb5d9e" Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.694593 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccb0c715c4c5f62bbb1a26097315e58e086c6a4dc9a1f3f25e16ec5c7afb5d9e"} err="failed to get container 
status \"ccb0c715c4c5f62bbb1a26097315e58e086c6a4dc9a1f3f25e16ec5c7afb5d9e\": rpc error: code = NotFound desc = could not find container \"ccb0c715c4c5f62bbb1a26097315e58e086c6a4dc9a1f3f25e16ec5c7afb5d9e\": container with ID starting with ccb0c715c4c5f62bbb1a26097315e58e086c6a4dc9a1f3f25e16ec5c7afb5d9e not found: ID does not exist" Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.694626 5094 scope.go:117] "RemoveContainer" containerID="e9f7e3bd4afae9d78543f393ecec74c64af345ea3fff97276a145955cecf9c5c" Feb 20 08:24:52 crc kubenswrapper[5094]: E0220 08:24:52.695262 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9f7e3bd4afae9d78543f393ecec74c64af345ea3fff97276a145955cecf9c5c\": container with ID starting with e9f7e3bd4afae9d78543f393ecec74c64af345ea3fff97276a145955cecf9c5c not found: ID does not exist" containerID="e9f7e3bd4afae9d78543f393ecec74c64af345ea3fff97276a145955cecf9c5c" Feb 20 08:24:52 crc kubenswrapper[5094]: I0220 08:24:52.695304 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9f7e3bd4afae9d78543f393ecec74c64af345ea3fff97276a145955cecf9c5c"} err="failed to get container status \"e9f7e3bd4afae9d78543f393ecec74c64af345ea3fff97276a145955cecf9c5c\": rpc error: code = NotFound desc = could not find container \"e9f7e3bd4afae9d78543f393ecec74c64af345ea3fff97276a145955cecf9c5c\": container with ID starting with e9f7e3bd4afae9d78543f393ecec74c64af345ea3fff97276a145955cecf9c5c not found: ID does not exist" Feb 20 08:24:53 crc kubenswrapper[5094]: I0220 08:24:53.280478 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 20 08:24:53 crc kubenswrapper[5094]: I0220 08:24:53.857288 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b479d95-6e62-47d9-9a4f-ae0db08b69f0" path="/var/lib/kubelet/pods/3b479d95-6e62-47d9-9a4f-ae0db08b69f0/volumes" Feb 20 
08:24:54 crc kubenswrapper[5094]: I0220 08:24:54.444457 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 20 08:24:54 crc kubenswrapper[5094]: I0220 08:24:54.444987 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 20 08:24:54 crc kubenswrapper[5094]: I0220 08:24:54.526672 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 20 08:24:54 crc kubenswrapper[5094]: I0220 08:24:54.721348 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 20 08:24:55 crc kubenswrapper[5094]: I0220 08:24:55.197427 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 20 08:24:55 crc kubenswrapper[5094]: I0220 08:24:55.320292 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 20 08:24:56 crc kubenswrapper[5094]: I0220 08:24:56.840141 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:24:56 crc kubenswrapper[5094]: E0220 08:24:56.840698 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:25:01 crc kubenswrapper[5094]: I0220 08:25:01.329388 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-m2rs9"] Feb 20 08:25:01 crc kubenswrapper[5094]: E0220 08:25:01.332506 5094 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="3b479d95-6e62-47d9-9a4f-ae0db08b69f0" containerName="init" Feb 20 08:25:01 crc kubenswrapper[5094]: I0220 08:25:01.332607 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b479d95-6e62-47d9-9a4f-ae0db08b69f0" containerName="init" Feb 20 08:25:01 crc kubenswrapper[5094]: E0220 08:25:01.332668 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b479d95-6e62-47d9-9a4f-ae0db08b69f0" containerName="dnsmasq-dns" Feb 20 08:25:01 crc kubenswrapper[5094]: I0220 08:25:01.332785 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b479d95-6e62-47d9-9a4f-ae0db08b69f0" containerName="dnsmasq-dns" Feb 20 08:25:01 crc kubenswrapper[5094]: I0220 08:25:01.332983 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b479d95-6e62-47d9-9a4f-ae0db08b69f0" containerName="dnsmasq-dns" Feb 20 08:25:01 crc kubenswrapper[5094]: I0220 08:25:01.333521 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-m2rs9" Feb 20 08:25:01 crc kubenswrapper[5094]: I0220 08:25:01.339166 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 20 08:25:01 crc kubenswrapper[5094]: I0220 08:25:01.343620 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-m2rs9"] Feb 20 08:25:01 crc kubenswrapper[5094]: I0220 08:25:01.449256 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcnbj\" (UniqueName: \"kubernetes.io/projected/6bfb5bd8-1bf4-4080-908d-71dac3301aca-kube-api-access-rcnbj\") pod \"root-account-create-update-m2rs9\" (UID: \"6bfb5bd8-1bf4-4080-908d-71dac3301aca\") " pod="openstack/root-account-create-update-m2rs9" Feb 20 08:25:01 crc kubenswrapper[5094]: I0220 08:25:01.449388 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/6bfb5bd8-1bf4-4080-908d-71dac3301aca-operator-scripts\") pod \"root-account-create-update-m2rs9\" (UID: \"6bfb5bd8-1bf4-4080-908d-71dac3301aca\") " pod="openstack/root-account-create-update-m2rs9" Feb 20 08:25:01 crc kubenswrapper[5094]: I0220 08:25:01.551369 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bfb5bd8-1bf4-4080-908d-71dac3301aca-operator-scripts\") pod \"root-account-create-update-m2rs9\" (UID: \"6bfb5bd8-1bf4-4080-908d-71dac3301aca\") " pod="openstack/root-account-create-update-m2rs9" Feb 20 08:25:01 crc kubenswrapper[5094]: I0220 08:25:01.551526 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcnbj\" (UniqueName: \"kubernetes.io/projected/6bfb5bd8-1bf4-4080-908d-71dac3301aca-kube-api-access-rcnbj\") pod \"root-account-create-update-m2rs9\" (UID: \"6bfb5bd8-1bf4-4080-908d-71dac3301aca\") " pod="openstack/root-account-create-update-m2rs9" Feb 20 08:25:01 crc kubenswrapper[5094]: I0220 08:25:01.552755 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bfb5bd8-1bf4-4080-908d-71dac3301aca-operator-scripts\") pod \"root-account-create-update-m2rs9\" (UID: \"6bfb5bd8-1bf4-4080-908d-71dac3301aca\") " pod="openstack/root-account-create-update-m2rs9" Feb 20 08:25:01 crc kubenswrapper[5094]: I0220 08:25:01.572343 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcnbj\" (UniqueName: \"kubernetes.io/projected/6bfb5bd8-1bf4-4080-908d-71dac3301aca-kube-api-access-rcnbj\") pod \"root-account-create-update-m2rs9\" (UID: \"6bfb5bd8-1bf4-4080-908d-71dac3301aca\") " pod="openstack/root-account-create-update-m2rs9" Feb 20 08:25:01 crc kubenswrapper[5094]: I0220 08:25:01.654887 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-m2rs9" Feb 20 08:25:01 crc kubenswrapper[5094]: I0220 08:25:01.899540 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-m2rs9"] Feb 20 08:25:01 crc kubenswrapper[5094]: W0220 08:25:01.901201 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bfb5bd8_1bf4_4080_908d_71dac3301aca.slice/crio-bdee6ec37699f4bfe82a7731cdde32d50b1a444a033cb9052df957ac0304b266 WatchSource:0}: Error finding container bdee6ec37699f4bfe82a7731cdde32d50b1a444a033cb9052df957ac0304b266: Status 404 returned error can't find the container with id bdee6ec37699f4bfe82a7731cdde32d50b1a444a033cb9052df957ac0304b266 Feb 20 08:25:02 crc kubenswrapper[5094]: I0220 08:25:02.719325 5094 generic.go:334] "Generic (PLEG): container finished" podID="6bfb5bd8-1bf4-4080-908d-71dac3301aca" containerID="5fe6cd402f52794a3175518b1d65628a3975facf339971987772005f254a31df" exitCode=0 Feb 20 08:25:02 crc kubenswrapper[5094]: I0220 08:25:02.719385 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-m2rs9" event={"ID":"6bfb5bd8-1bf4-4080-908d-71dac3301aca","Type":"ContainerDied","Data":"5fe6cd402f52794a3175518b1d65628a3975facf339971987772005f254a31df"} Feb 20 08:25:02 crc kubenswrapper[5094]: I0220 08:25:02.719447 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-m2rs9" event={"ID":"6bfb5bd8-1bf4-4080-908d-71dac3301aca","Type":"ContainerStarted","Data":"bdee6ec37699f4bfe82a7731cdde32d50b1a444a033cb9052df957ac0304b266"} Feb 20 08:25:04 crc kubenswrapper[5094]: I0220 08:25:04.124628 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-m2rs9" Feb 20 08:25:04 crc kubenswrapper[5094]: I0220 08:25:04.208635 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcnbj\" (UniqueName: \"kubernetes.io/projected/6bfb5bd8-1bf4-4080-908d-71dac3301aca-kube-api-access-rcnbj\") pod \"6bfb5bd8-1bf4-4080-908d-71dac3301aca\" (UID: \"6bfb5bd8-1bf4-4080-908d-71dac3301aca\") " Feb 20 08:25:04 crc kubenswrapper[5094]: I0220 08:25:04.208744 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bfb5bd8-1bf4-4080-908d-71dac3301aca-operator-scripts\") pod \"6bfb5bd8-1bf4-4080-908d-71dac3301aca\" (UID: \"6bfb5bd8-1bf4-4080-908d-71dac3301aca\") " Feb 20 08:25:04 crc kubenswrapper[5094]: I0220 08:25:04.209612 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bfb5bd8-1bf4-4080-908d-71dac3301aca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6bfb5bd8-1bf4-4080-908d-71dac3301aca" (UID: "6bfb5bd8-1bf4-4080-908d-71dac3301aca"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:25:04 crc kubenswrapper[5094]: I0220 08:25:04.217359 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bfb5bd8-1bf4-4080-908d-71dac3301aca-kube-api-access-rcnbj" (OuterVolumeSpecName: "kube-api-access-rcnbj") pod "6bfb5bd8-1bf4-4080-908d-71dac3301aca" (UID: "6bfb5bd8-1bf4-4080-908d-71dac3301aca"). InnerVolumeSpecName "kube-api-access-rcnbj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:25:04 crc kubenswrapper[5094]: I0220 08:25:04.310891 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcnbj\" (UniqueName: \"kubernetes.io/projected/6bfb5bd8-1bf4-4080-908d-71dac3301aca-kube-api-access-rcnbj\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:04 crc kubenswrapper[5094]: I0220 08:25:04.311281 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bfb5bd8-1bf4-4080-908d-71dac3301aca-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:04 crc kubenswrapper[5094]: I0220 08:25:04.740379 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-m2rs9" event={"ID":"6bfb5bd8-1bf4-4080-908d-71dac3301aca","Type":"ContainerDied","Data":"bdee6ec37699f4bfe82a7731cdde32d50b1a444a033cb9052df957ac0304b266"} Feb 20 08:25:04 crc kubenswrapper[5094]: I0220 08:25:04.740437 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdee6ec37699f4bfe82a7731cdde32d50b1a444a033cb9052df957ac0304b266" Feb 20 08:25:04 crc kubenswrapper[5094]: I0220 08:25:04.740935 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-m2rs9" Feb 20 08:25:07 crc kubenswrapper[5094]: I0220 08:25:07.841074 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:25:08 crc kubenswrapper[5094]: I0220 08:25:08.087794 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-m2rs9"] Feb 20 08:25:08 crc kubenswrapper[5094]: I0220 08:25:08.093368 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-m2rs9"] Feb 20 08:25:08 crc kubenswrapper[5094]: I0220 08:25:08.774784 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"3240a62ab36af910255d2e2cb0810a1e622f0e56dc51a3b3a6a36dc8406c37ea"} Feb 20 08:25:09 crc kubenswrapper[5094]: I0220 08:25:09.851204 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bfb5bd8-1bf4-4080-908d-71dac3301aca" path="/var/lib/kubelet/pods/6bfb5bd8-1bf4-4080-908d-71dac3301aca/volumes" Feb 20 08:25:13 crc kubenswrapper[5094]: I0220 08:25:13.113330 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-bwdbk"] Feb 20 08:25:13 crc kubenswrapper[5094]: E0220 08:25:13.114725 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bfb5bd8-1bf4-4080-908d-71dac3301aca" containerName="mariadb-account-create-update" Feb 20 08:25:13 crc kubenswrapper[5094]: I0220 08:25:13.114749 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bfb5bd8-1bf4-4080-908d-71dac3301aca" containerName="mariadb-account-create-update" Feb 20 08:25:13 crc kubenswrapper[5094]: I0220 08:25:13.115082 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bfb5bd8-1bf4-4080-908d-71dac3301aca" containerName="mariadb-account-create-update" Feb 20 08:25:13 
crc kubenswrapper[5094]: I0220 08:25:13.116213 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bwdbk" Feb 20 08:25:13 crc kubenswrapper[5094]: I0220 08:25:13.118940 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 20 08:25:13 crc kubenswrapper[5094]: I0220 08:25:13.121616 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bwdbk"] Feb 20 08:25:13 crc kubenswrapper[5094]: I0220 08:25:13.181054 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cll4n\" (UniqueName: \"kubernetes.io/projected/5e2665ce-2c09-43f9-8245-ed36e682e1e0-kube-api-access-cll4n\") pod \"root-account-create-update-bwdbk\" (UID: \"5e2665ce-2c09-43f9-8245-ed36e682e1e0\") " pod="openstack/root-account-create-update-bwdbk" Feb 20 08:25:13 crc kubenswrapper[5094]: I0220 08:25:13.181125 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e2665ce-2c09-43f9-8245-ed36e682e1e0-operator-scripts\") pod \"root-account-create-update-bwdbk\" (UID: \"5e2665ce-2c09-43f9-8245-ed36e682e1e0\") " pod="openstack/root-account-create-update-bwdbk" Feb 20 08:25:13 crc kubenswrapper[5094]: I0220 08:25:13.282745 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cll4n\" (UniqueName: \"kubernetes.io/projected/5e2665ce-2c09-43f9-8245-ed36e682e1e0-kube-api-access-cll4n\") pod \"root-account-create-update-bwdbk\" (UID: \"5e2665ce-2c09-43f9-8245-ed36e682e1e0\") " pod="openstack/root-account-create-update-bwdbk" Feb 20 08:25:13 crc kubenswrapper[5094]: I0220 08:25:13.282865 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/5e2665ce-2c09-43f9-8245-ed36e682e1e0-operator-scripts\") pod \"root-account-create-update-bwdbk\" (UID: \"5e2665ce-2c09-43f9-8245-ed36e682e1e0\") " pod="openstack/root-account-create-update-bwdbk" Feb 20 08:25:13 crc kubenswrapper[5094]: I0220 08:25:13.284223 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e2665ce-2c09-43f9-8245-ed36e682e1e0-operator-scripts\") pod \"root-account-create-update-bwdbk\" (UID: \"5e2665ce-2c09-43f9-8245-ed36e682e1e0\") " pod="openstack/root-account-create-update-bwdbk" Feb 20 08:25:13 crc kubenswrapper[5094]: I0220 08:25:13.312369 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cll4n\" (UniqueName: \"kubernetes.io/projected/5e2665ce-2c09-43f9-8245-ed36e682e1e0-kube-api-access-cll4n\") pod \"root-account-create-update-bwdbk\" (UID: \"5e2665ce-2c09-43f9-8245-ed36e682e1e0\") " pod="openstack/root-account-create-update-bwdbk" Feb 20 08:25:13 crc kubenswrapper[5094]: I0220 08:25:13.454329 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bwdbk" Feb 20 08:25:13 crc kubenswrapper[5094]: I0220 08:25:13.892111 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bwdbk"] Feb 20 08:25:13 crc kubenswrapper[5094]: W0220 08:25:13.899066 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e2665ce_2c09_43f9_8245_ed36e682e1e0.slice/crio-9d532f2bcf1d75bcf507fd6166ca0077f192de90fca1e0670b857de898687b24 WatchSource:0}: Error finding container 9d532f2bcf1d75bcf507fd6166ca0077f192de90fca1e0670b857de898687b24: Status 404 returned error can't find the container with id 9d532f2bcf1d75bcf507fd6166ca0077f192de90fca1e0670b857de898687b24 Feb 20 08:25:14 crc kubenswrapper[5094]: I0220 08:25:14.827468 5094 generic.go:334] "Generic (PLEG): container finished" podID="5e2665ce-2c09-43f9-8245-ed36e682e1e0" containerID="3a10c0a7a48b4e7a28b7f39fc5231d6ac90168dce54dc2c58472a7fe7bfce49e" exitCode=0 Feb 20 08:25:14 crc kubenswrapper[5094]: I0220 08:25:14.827836 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bwdbk" event={"ID":"5e2665ce-2c09-43f9-8245-ed36e682e1e0","Type":"ContainerDied","Data":"3a10c0a7a48b4e7a28b7f39fc5231d6ac90168dce54dc2c58472a7fe7bfce49e"} Feb 20 08:25:14 crc kubenswrapper[5094]: I0220 08:25:14.827869 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bwdbk" event={"ID":"5e2665ce-2c09-43f9-8245-ed36e682e1e0","Type":"ContainerStarted","Data":"9d532f2bcf1d75bcf507fd6166ca0077f192de90fca1e0670b857de898687b24"} Feb 20 08:25:16 crc kubenswrapper[5094]: I0220 08:25:16.222948 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bwdbk" Feb 20 08:25:16 crc kubenswrapper[5094]: I0220 08:25:16.329249 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cll4n\" (UniqueName: \"kubernetes.io/projected/5e2665ce-2c09-43f9-8245-ed36e682e1e0-kube-api-access-cll4n\") pod \"5e2665ce-2c09-43f9-8245-ed36e682e1e0\" (UID: \"5e2665ce-2c09-43f9-8245-ed36e682e1e0\") " Feb 20 08:25:16 crc kubenswrapper[5094]: I0220 08:25:16.329478 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e2665ce-2c09-43f9-8245-ed36e682e1e0-operator-scripts\") pod \"5e2665ce-2c09-43f9-8245-ed36e682e1e0\" (UID: \"5e2665ce-2c09-43f9-8245-ed36e682e1e0\") " Feb 20 08:25:16 crc kubenswrapper[5094]: I0220 08:25:16.331335 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e2665ce-2c09-43f9-8245-ed36e682e1e0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5e2665ce-2c09-43f9-8245-ed36e682e1e0" (UID: "5e2665ce-2c09-43f9-8245-ed36e682e1e0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:25:16 crc kubenswrapper[5094]: I0220 08:25:16.340964 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e2665ce-2c09-43f9-8245-ed36e682e1e0-kube-api-access-cll4n" (OuterVolumeSpecName: "kube-api-access-cll4n") pod "5e2665ce-2c09-43f9-8245-ed36e682e1e0" (UID: "5e2665ce-2c09-43f9-8245-ed36e682e1e0"). InnerVolumeSpecName "kube-api-access-cll4n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:25:16 crc kubenswrapper[5094]: I0220 08:25:16.432260 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5e2665ce-2c09-43f9-8245-ed36e682e1e0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:16 crc kubenswrapper[5094]: I0220 08:25:16.432312 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cll4n\" (UniqueName: \"kubernetes.io/projected/5e2665ce-2c09-43f9-8245-ed36e682e1e0-kube-api-access-cll4n\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:16 crc kubenswrapper[5094]: I0220 08:25:16.847287 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bwdbk" event={"ID":"5e2665ce-2c09-43f9-8245-ed36e682e1e0","Type":"ContainerDied","Data":"9d532f2bcf1d75bcf507fd6166ca0077f192de90fca1e0670b857de898687b24"} Feb 20 08:25:16 crc kubenswrapper[5094]: I0220 08:25:16.847635 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d532f2bcf1d75bcf507fd6166ca0077f192de90fca1e0670b857de898687b24" Feb 20 08:25:16 crc kubenswrapper[5094]: I0220 08:25:16.847354 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bwdbk" Feb 20 08:25:17 crc kubenswrapper[5094]: I0220 08:25:17.854873 5094 generic.go:334] "Generic (PLEG): container finished" podID="d6404f29-e503-4f82-a2ce-e147c18677a7" containerID="6d76e2a1b5f6be82c63ad209d475f9abed3c7fa6699825eafe32cb1a6b659d1b" exitCode=0 Feb 20 08:25:17 crc kubenswrapper[5094]: I0220 08:25:17.855825 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d6404f29-e503-4f82-a2ce-e147c18677a7","Type":"ContainerDied","Data":"6d76e2a1b5f6be82c63ad209d475f9abed3c7fa6699825eafe32cb1a6b659d1b"} Feb 20 08:25:17 crc kubenswrapper[5094]: I0220 08:25:17.857234 5094 generic.go:334] "Generic (PLEG): container finished" podID="09194bc6-429f-46a1-8dac-8b84385c9e10" containerID="969469a9c4ed7927decd268446b641c0bc47204103fa4da8c53604820787ff47" exitCode=0 Feb 20 08:25:17 crc kubenswrapper[5094]: I0220 08:25:17.857263 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"09194bc6-429f-46a1-8dac-8b84385c9e10","Type":"ContainerDied","Data":"969469a9c4ed7927decd268446b641c0bc47204103fa4da8c53604820787ff47"} Feb 20 08:25:18 crc kubenswrapper[5094]: I0220 08:25:18.865416 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d6404f29-e503-4f82-a2ce-e147c18677a7","Type":"ContainerStarted","Data":"6e9951d34d067e6c11ee76a135e218095884e7a51c2dd26ce69a65711a7b7b62"} Feb 20 08:25:18 crc kubenswrapper[5094]: I0220 08:25:18.865904 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 20 08:25:18 crc kubenswrapper[5094]: I0220 08:25:18.867196 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"09194bc6-429f-46a1-8dac-8b84385c9e10","Type":"ContainerStarted","Data":"b3b442baad93ec413f9539526f057ff7813024d9fd4c545cfc22a77adbfd3b74"} Feb 20 08:25:18 crc 
kubenswrapper[5094]: I0220 08:25:18.867394 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:18 crc kubenswrapper[5094]: I0220 08:25:18.893389 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.457040871 podStartE2EDuration="58.89336666s" podCreationTimestamp="2026-02-20 08:24:20 +0000 UTC" firstStartedPulling="2026-02-20 08:24:22.320765245 +0000 UTC m=+5877.193391946" lastFinishedPulling="2026-02-20 08:24:43.757091024 +0000 UTC m=+5898.629717735" observedRunningTime="2026-02-20 08:25:18.886659878 +0000 UTC m=+5933.759286589" watchObservedRunningTime="2026-02-20 08:25:18.89336666 +0000 UTC m=+5933.765993371" Feb 20 08:25:18 crc kubenswrapper[5094]: I0220 08:25:18.912532 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.906894197 podStartE2EDuration="58.912515592s" podCreationTimestamp="2026-02-20 08:24:20 +0000 UTC" firstStartedPulling="2026-02-20 08:24:22.703060211 +0000 UTC m=+5877.575686922" lastFinishedPulling="2026-02-20 08:24:43.708681606 +0000 UTC m=+5898.581308317" observedRunningTime="2026-02-20 08:25:18.908540586 +0000 UTC m=+5933.781167297" watchObservedRunningTime="2026-02-20 08:25:18.912515592 +0000 UTC m=+5933.785142303" Feb 20 08:25:31 crc kubenswrapper[5094]: I0220 08:25:31.806017 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 20 08:25:32 crc kubenswrapper[5094]: I0220 08:25:32.228158 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:39 crc kubenswrapper[5094]: I0220 08:25:39.683409 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c98fc4f89-fwwbb"] Feb 20 08:25:39 crc kubenswrapper[5094]: E0220 08:25:39.684399 5094 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="5e2665ce-2c09-43f9-8245-ed36e682e1e0" containerName="mariadb-account-create-update" Feb 20 08:25:39 crc kubenswrapper[5094]: I0220 08:25:39.684418 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e2665ce-2c09-43f9-8245-ed36e682e1e0" containerName="mariadb-account-create-update" Feb 20 08:25:39 crc kubenswrapper[5094]: I0220 08:25:39.684614 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e2665ce-2c09-43f9-8245-ed36e682e1e0" containerName="mariadb-account-create-update" Feb 20 08:25:39 crc kubenswrapper[5094]: I0220 08:25:39.685680 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" Feb 20 08:25:39 crc kubenswrapper[5094]: I0220 08:25:39.695892 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c98fc4f89-fwwbb"] Feb 20 08:25:39 crc kubenswrapper[5094]: I0220 08:25:39.841330 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f4c545b-01fc-4e08-994c-7d24a10a963e-dns-svc\") pod \"dnsmasq-dns-c98fc4f89-fwwbb\" (UID: \"6f4c545b-01fc-4e08-994c-7d24a10a963e\") " pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" Feb 20 08:25:39 crc kubenswrapper[5094]: I0220 08:25:39.841641 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f4c545b-01fc-4e08-994c-7d24a10a963e-config\") pod \"dnsmasq-dns-c98fc4f89-fwwbb\" (UID: \"6f4c545b-01fc-4e08-994c-7d24a10a963e\") " pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" Feb 20 08:25:39 crc kubenswrapper[5094]: I0220 08:25:39.841854 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9shd2\" (UniqueName: \"kubernetes.io/projected/6f4c545b-01fc-4e08-994c-7d24a10a963e-kube-api-access-9shd2\") pod 
\"dnsmasq-dns-c98fc4f89-fwwbb\" (UID: \"6f4c545b-01fc-4e08-994c-7d24a10a963e\") " pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" Feb 20 08:25:39 crc kubenswrapper[5094]: I0220 08:25:39.943995 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f4c545b-01fc-4e08-994c-7d24a10a963e-config\") pod \"dnsmasq-dns-c98fc4f89-fwwbb\" (UID: \"6f4c545b-01fc-4e08-994c-7d24a10a963e\") " pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" Feb 20 08:25:39 crc kubenswrapper[5094]: I0220 08:25:39.944142 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9shd2\" (UniqueName: \"kubernetes.io/projected/6f4c545b-01fc-4e08-994c-7d24a10a963e-kube-api-access-9shd2\") pod \"dnsmasq-dns-c98fc4f89-fwwbb\" (UID: \"6f4c545b-01fc-4e08-994c-7d24a10a963e\") " pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" Feb 20 08:25:39 crc kubenswrapper[5094]: I0220 08:25:39.944300 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f4c545b-01fc-4e08-994c-7d24a10a963e-dns-svc\") pod \"dnsmasq-dns-c98fc4f89-fwwbb\" (UID: \"6f4c545b-01fc-4e08-994c-7d24a10a963e\") " pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" Feb 20 08:25:39 crc kubenswrapper[5094]: I0220 08:25:39.945082 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f4c545b-01fc-4e08-994c-7d24a10a963e-config\") pod \"dnsmasq-dns-c98fc4f89-fwwbb\" (UID: \"6f4c545b-01fc-4e08-994c-7d24a10a963e\") " pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" Feb 20 08:25:39 crc kubenswrapper[5094]: I0220 08:25:39.945736 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f4c545b-01fc-4e08-994c-7d24a10a963e-dns-svc\") pod \"dnsmasq-dns-c98fc4f89-fwwbb\" (UID: \"6f4c545b-01fc-4e08-994c-7d24a10a963e\") " pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" 
Feb 20 08:25:39 crc kubenswrapper[5094]: I0220 08:25:39.973698 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9shd2\" (UniqueName: \"kubernetes.io/projected/6f4c545b-01fc-4e08-994c-7d24a10a963e-kube-api-access-9shd2\") pod \"dnsmasq-dns-c98fc4f89-fwwbb\" (UID: \"6f4c545b-01fc-4e08-994c-7d24a10a963e\") " pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" Feb 20 08:25:40 crc kubenswrapper[5094]: I0220 08:25:40.006131 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" Feb 20 08:25:40 crc kubenswrapper[5094]: W0220 08:25:40.582788 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f4c545b_01fc_4e08_994c_7d24a10a963e.slice/crio-980a56a2f88743ff53bcdb60a139ba0c40f893ee3421dbdc2711a580ef52fcca WatchSource:0}: Error finding container 980a56a2f88743ff53bcdb60a139ba0c40f893ee3421dbdc2711a580ef52fcca: Status 404 returned error can't find the container with id 980a56a2f88743ff53bcdb60a139ba0c40f893ee3421dbdc2711a580ef52fcca Feb 20 08:25:40 crc kubenswrapper[5094]: I0220 08:25:40.589379 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c98fc4f89-fwwbb"] Feb 20 08:25:40 crc kubenswrapper[5094]: I0220 08:25:40.714160 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 08:25:41 crc kubenswrapper[5094]: I0220 08:25:41.075265 5094 generic.go:334] "Generic (PLEG): container finished" podID="6f4c545b-01fc-4e08-994c-7d24a10a963e" containerID="6dc775b2c5f76a93f5e36b63c468418d60dce69b4e0ea3323ccd0a9f35c92e84" exitCode=0 Feb 20 08:25:41 crc kubenswrapper[5094]: I0220 08:25:41.075313 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" event={"ID":"6f4c545b-01fc-4e08-994c-7d24a10a963e","Type":"ContainerDied","Data":"6dc775b2c5f76a93f5e36b63c468418d60dce69b4e0ea3323ccd0a9f35c92e84"} 
Feb 20 08:25:41 crc kubenswrapper[5094]: I0220 08:25:41.075357 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" event={"ID":"6f4c545b-01fc-4e08-994c-7d24a10a963e","Type":"ContainerStarted","Data":"980a56a2f88743ff53bcdb60a139ba0c40f893ee3421dbdc2711a580ef52fcca"}
Feb 20 08:25:41 crc kubenswrapper[5094]: I0220 08:25:41.620293 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 20 08:25:42 crc kubenswrapper[5094]: I0220 08:25:42.099068 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" event={"ID":"6f4c545b-01fc-4e08-994c-7d24a10a963e","Type":"ContainerStarted","Data":"a7aa04cfead057bdf297d9b737afb89cfa0c54a45dbf6dfc5c84ca574ec804e3"}
Feb 20 08:25:42 crc kubenswrapper[5094]: I0220 08:25:42.099789 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb"
Feb 20 08:25:42 crc kubenswrapper[5094]: I0220 08:25:42.119845 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" podStartSLOduration=3.119827301 podStartE2EDuration="3.119827301s" podCreationTimestamp="2026-02-20 08:25:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:25:42.118672273 +0000 UTC m=+5956.991298984" watchObservedRunningTime="2026-02-20 08:25:42.119827301 +0000 UTC m=+5956.992454012"
Feb 20 08:25:42 crc kubenswrapper[5094]: I0220 08:25:42.781224 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="d6404f29-e503-4f82-a2ce-e147c18677a7" containerName="rabbitmq" containerID="cri-o://6e9951d34d067e6c11ee76a135e218095884e7a51c2dd26ce69a65711a7b7b62" gracePeriod=604798
Feb 20 08:25:43 crc kubenswrapper[5094]: I0220 08:25:43.580978 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="09194bc6-429f-46a1-8dac-8b84385c9e10" containerName="rabbitmq" containerID="cri-o://b3b442baad93ec413f9539526f057ff7813024d9fd4c545cfc22a77adbfd3b74" gracePeriod=604799
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.193955 5094 generic.go:334] "Generic (PLEG): container finished" podID="d6404f29-e503-4f82-a2ce-e147c18677a7" containerID="6e9951d34d067e6c11ee76a135e218095884e7a51c2dd26ce69a65711a7b7b62" exitCode=0
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.194252 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d6404f29-e503-4f82-a2ce-e147c18677a7","Type":"ContainerDied","Data":"6e9951d34d067e6c11ee76a135e218095884e7a51c2dd26ce69a65711a7b7b62"}
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.360747 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.540394 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d6404f29-e503-4f82-a2ce-e147c18677a7-plugins-conf\") pod \"d6404f29-e503-4f82-a2ce-e147c18677a7\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") "
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.540466 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d6404f29-e503-4f82-a2ce-e147c18677a7-pod-info\") pod \"d6404f29-e503-4f82-a2ce-e147c18677a7\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") "
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.540501 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-confd\") pod \"d6404f29-e503-4f82-a2ce-e147c18677a7\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") "
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.540570 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-plugins\") pod \"d6404f29-e503-4f82-a2ce-e147c18677a7\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") "
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.540588 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-erlang-cookie\") pod \"d6404f29-e503-4f82-a2ce-e147c18677a7\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") "
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.540785 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\") pod \"d6404f29-e503-4f82-a2ce-e147c18677a7\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") "
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.540809 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d6404f29-e503-4f82-a2ce-e147c18677a7-server-conf\") pod \"d6404f29-e503-4f82-a2ce-e147c18677a7\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") "
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.540850 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8tnp\" (UniqueName: \"kubernetes.io/projected/d6404f29-e503-4f82-a2ce-e147c18677a7-kube-api-access-n8tnp\") pod \"d6404f29-e503-4f82-a2ce-e147c18677a7\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") "
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.540873 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d6404f29-e503-4f82-a2ce-e147c18677a7-erlang-cookie-secret\") pod \"d6404f29-e503-4f82-a2ce-e147c18677a7\" (UID: \"d6404f29-e503-4f82-a2ce-e147c18677a7\") "
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.541391 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d6404f29-e503-4f82-a2ce-e147c18677a7" (UID: "d6404f29-e503-4f82-a2ce-e147c18677a7"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.541594 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6404f29-e503-4f82-a2ce-e147c18677a7-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d6404f29-e503-4f82-a2ce-e147c18677a7" (UID: "d6404f29-e503-4f82-a2ce-e147c18677a7"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.542155 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d6404f29-e503-4f82-a2ce-e147c18677a7" (UID: "d6404f29-e503-4f82-a2ce-e147c18677a7"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.546605 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6404f29-e503-4f82-a2ce-e147c18677a7-kube-api-access-n8tnp" (OuterVolumeSpecName: "kube-api-access-n8tnp") pod "d6404f29-e503-4f82-a2ce-e147c18677a7" (UID: "d6404f29-e503-4f82-a2ce-e147c18677a7"). InnerVolumeSpecName "kube-api-access-n8tnp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.548009 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d6404f29-e503-4f82-a2ce-e147c18677a7-pod-info" (OuterVolumeSpecName: "pod-info") pod "d6404f29-e503-4f82-a2ce-e147c18677a7" (UID: "d6404f29-e503-4f82-a2ce-e147c18677a7"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.553520 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6404f29-e503-4f82-a2ce-e147c18677a7-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d6404f29-e503-4f82-a2ce-e147c18677a7" (UID: "d6404f29-e503-4f82-a2ce-e147c18677a7"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.554107 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1" (OuterVolumeSpecName: "persistence") pod "d6404f29-e503-4f82-a2ce-e147c18677a7" (UID: "d6404f29-e503-4f82-a2ce-e147c18677a7"). InnerVolumeSpecName "pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.560262 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6404f29-e503-4f82-a2ce-e147c18677a7-server-conf" (OuterVolumeSpecName: "server-conf") pod "d6404f29-e503-4f82-a2ce-e147c18677a7" (UID: "d6404f29-e503-4f82-a2ce-e147c18677a7"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.623341 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d6404f29-e503-4f82-a2ce-e147c18677a7" (UID: "d6404f29-e503-4f82-a2ce-e147c18677a7"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.643442 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\") on node \"crc\" "
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.644016 5094 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d6404f29-e503-4f82-a2ce-e147c18677a7-server-conf\") on node \"crc\" DevicePath \"\""
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.644115 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8tnp\" (UniqueName: \"kubernetes.io/projected/d6404f29-e503-4f82-a2ce-e147c18677a7-kube-api-access-n8tnp\") on node \"crc\" DevicePath \"\""
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.644138 5094 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d6404f29-e503-4f82-a2ce-e147c18677a7-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.644188 5094 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d6404f29-e503-4f82-a2ce-e147c18677a7-plugins-conf\") on node \"crc\" DevicePath \"\""
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.644207 5094 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d6404f29-e503-4f82-a2ce-e147c18677a7-pod-info\") on node \"crc\" DevicePath \"\""
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.644222 5094 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.644276 5094 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.644296 5094 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d6404f29-e503-4f82-a2ce-e147c18677a7-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.662776 5094 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.663053 5094 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1") on node "crc"
Feb 20 08:25:49 crc kubenswrapper[5094]: I0220 08:25:49.745311 5094 reconciler_common.go:293] "Volume detached for volume \"pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\") on node \"crc\" DevicePath \"\""
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.008010 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.064016 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76466d7cdc-nc2ng"]
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.064276 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng" podUID="9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9" containerName="dnsmasq-dns" containerID="cri-o://8a09f7482da0f94c8e0f012e0e1e52314f35f8500de127da7d88164462490023" gracePeriod=10
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.209053 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d6404f29-e503-4f82-a2ce-e147c18677a7","Type":"ContainerDied","Data":"4b1fc1669c5a53997e1b5722c230d8dba80a44d7f6b4091a5bd28b4938f7b018"}
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.209105 5094 scope.go:117] "RemoveContainer" containerID="6e9951d34d067e6c11ee76a135e218095884e7a51c2dd26ce69a65711a7b7b62"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.209147 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.212787 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.215137 5094 generic.go:334] "Generic (PLEG): container finished" podID="09194bc6-429f-46a1-8dac-8b84385c9e10" containerID="b3b442baad93ec413f9539526f057ff7813024d9fd4c545cfc22a77adbfd3b74" exitCode=0
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.215226 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"09194bc6-429f-46a1-8dac-8b84385c9e10","Type":"ContainerDied","Data":"b3b442baad93ec413f9539526f057ff7813024d9fd4c545cfc22a77adbfd3b74"}
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.215258 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"09194bc6-429f-46a1-8dac-8b84385c9e10","Type":"ContainerDied","Data":"9e766a894fc47307f318a9f38ad47bbac110c5ff58c56284d0fae3825eea954c"}
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.219240 5094 generic.go:334] "Generic (PLEG): container finished" podID="9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9" containerID="8a09f7482da0f94c8e0f012e0e1e52314f35f8500de127da7d88164462490023" exitCode=0
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.219279 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng" event={"ID":"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9","Type":"ContainerDied","Data":"8a09f7482da0f94c8e0f012e0e1e52314f35f8500de127da7d88164462490023"}
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.236282 5094 scope.go:117] "RemoveContainer" containerID="6d76e2a1b5f6be82c63ad209d475f9abed3c7fa6699825eafe32cb1a6b659d1b"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.263716 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.270281 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.291521 5094 scope.go:117] "RemoveContainer" containerID="b3b442baad93ec413f9539526f057ff7813024d9fd4c545cfc22a77adbfd3b74"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.303351 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 20 08:25:50 crc kubenswrapper[5094]: E0220 08:25:50.303637 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09194bc6-429f-46a1-8dac-8b84385c9e10" containerName="rabbitmq"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.303649 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="09194bc6-429f-46a1-8dac-8b84385c9e10" containerName="rabbitmq"
Feb 20 08:25:50 crc kubenswrapper[5094]: E0220 08:25:50.303666 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6404f29-e503-4f82-a2ce-e147c18677a7" containerName="setup-container"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.303672 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6404f29-e503-4f82-a2ce-e147c18677a7" containerName="setup-container"
Feb 20 08:25:50 crc kubenswrapper[5094]: E0220 08:25:50.303689 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6404f29-e503-4f82-a2ce-e147c18677a7" containerName="rabbitmq"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.303697 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6404f29-e503-4f82-a2ce-e147c18677a7" containerName="rabbitmq"
Feb 20 08:25:50 crc kubenswrapper[5094]: E0220 08:25:50.303738 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09194bc6-429f-46a1-8dac-8b84385c9e10" containerName="setup-container"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.303747 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="09194bc6-429f-46a1-8dac-8b84385c9e10" containerName="setup-container"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.303928 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="09194bc6-429f-46a1-8dac-8b84385c9e10" containerName="rabbitmq"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.303941 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6404f29-e503-4f82-a2ce-e147c18677a7" containerName="rabbitmq"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.304833 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.307059 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.313038 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.313247 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.313418 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-2c4db"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.313467 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.333641 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.334858 5094 scope.go:117] "RemoveContainer" containerID="969469a9c4ed7927decd268446b641c0bc47204103fa4da8c53604820787ff47"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.367178 5094 scope.go:117] "RemoveContainer" containerID="b3b442baad93ec413f9539526f057ff7813024d9fd4c545cfc22a77adbfd3b74"
Feb 20 08:25:50 crc kubenswrapper[5094]: E0220 08:25:50.368429 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3b442baad93ec413f9539526f057ff7813024d9fd4c545cfc22a77adbfd3b74\": container with ID starting with b3b442baad93ec413f9539526f057ff7813024d9fd4c545cfc22a77adbfd3b74 not found: ID does not exist" containerID="b3b442baad93ec413f9539526f057ff7813024d9fd4c545cfc22a77adbfd3b74"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.368461 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3b442baad93ec413f9539526f057ff7813024d9fd4c545cfc22a77adbfd3b74"} err="failed to get container status \"b3b442baad93ec413f9539526f057ff7813024d9fd4c545cfc22a77adbfd3b74\": rpc error: code = NotFound desc = could not find container \"b3b442baad93ec413f9539526f057ff7813024d9fd4c545cfc22a77adbfd3b74\": container with ID starting with b3b442baad93ec413f9539526f057ff7813024d9fd4c545cfc22a77adbfd3b74 not found: ID does not exist"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.368484 5094 scope.go:117] "RemoveContainer" containerID="969469a9c4ed7927decd268446b641c0bc47204103fa4da8c53604820787ff47"
Feb 20 08:25:50 crc kubenswrapper[5094]: E0220 08:25:50.368733 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"969469a9c4ed7927decd268446b641c0bc47204103fa4da8c53604820787ff47\": container with ID starting with 969469a9c4ed7927decd268446b641c0bc47204103fa4da8c53604820787ff47 not found: ID does not exist" containerID="969469a9c4ed7927decd268446b641c0bc47204103fa4da8c53604820787ff47"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.368757 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"969469a9c4ed7927decd268446b641c0bc47204103fa4da8c53604820787ff47"} err="failed to get container status \"969469a9c4ed7927decd268446b641c0bc47204103fa4da8c53604820787ff47\": rpc error: code = NotFound desc = could not find container \"969469a9c4ed7927decd268446b641c0bc47204103fa4da8c53604820787ff47\": container with ID starting with 969469a9c4ed7927decd268446b641c0bc47204103fa4da8c53604820787ff47 not found: ID does not exist"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.373869 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-219358f9-7520-4512-8729-274ec1ad54bd\") pod \"09194bc6-429f-46a1-8dac-8b84385c9e10\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") "
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.373932 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qh8s\" (UniqueName: \"kubernetes.io/projected/09194bc6-429f-46a1-8dac-8b84385c9e10-kube-api-access-4qh8s\") pod \"09194bc6-429f-46a1-8dac-8b84385c9e10\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") "
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.373989 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-confd\") pod \"09194bc6-429f-46a1-8dac-8b84385c9e10\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") "
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.374026 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/09194bc6-429f-46a1-8dac-8b84385c9e10-erlang-cookie-secret\") pod \"09194bc6-429f-46a1-8dac-8b84385c9e10\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") "
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.374063 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-erlang-cookie\") pod \"09194bc6-429f-46a1-8dac-8b84385c9e10\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") "
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.374126 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/09194bc6-429f-46a1-8dac-8b84385c9e10-server-conf\") pod \"09194bc6-429f-46a1-8dac-8b84385c9e10\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") "
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.374264 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/09194bc6-429f-46a1-8dac-8b84385c9e10-pod-info\") pod \"09194bc6-429f-46a1-8dac-8b84385c9e10\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") "
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.374294 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-plugins\") pod \"09194bc6-429f-46a1-8dac-8b84385c9e10\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") "
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.374314 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/09194bc6-429f-46a1-8dac-8b84385c9e10-plugins-conf\") pod \"09194bc6-429f-46a1-8dac-8b84385c9e10\" (UID: \"09194bc6-429f-46a1-8dac-8b84385c9e10\") "
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.374434 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.374463 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.374494 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.374532 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfn2p\" (UniqueName: \"kubernetes.io/projected/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-kube-api-access-nfn2p\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.374570 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.374647 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.374662 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.374678 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.374713 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.378169 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "09194bc6-429f-46a1-8dac-8b84385c9e10" (UID: "09194bc6-429f-46a1-8dac-8b84385c9e10"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.378181 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09194bc6-429f-46a1-8dac-8b84385c9e10-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "09194bc6-429f-46a1-8dac-8b84385c9e10" (UID: "09194bc6-429f-46a1-8dac-8b84385c9e10"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.378802 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "09194bc6-429f-46a1-8dac-8b84385c9e10" (UID: "09194bc6-429f-46a1-8dac-8b84385c9e10"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.379524 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/09194bc6-429f-46a1-8dac-8b84385c9e10-pod-info" (OuterVolumeSpecName: "pod-info") pod "09194bc6-429f-46a1-8dac-8b84385c9e10" (UID: "09194bc6-429f-46a1-8dac-8b84385c9e10"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.383310 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09194bc6-429f-46a1-8dac-8b84385c9e10-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "09194bc6-429f-46a1-8dac-8b84385c9e10" (UID: "09194bc6-429f-46a1-8dac-8b84385c9e10"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.390976 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-219358f9-7520-4512-8729-274ec1ad54bd" (OuterVolumeSpecName: "persistence") pod "09194bc6-429f-46a1-8dac-8b84385c9e10" (UID: "09194bc6-429f-46a1-8dac-8b84385c9e10"). InnerVolumeSpecName "pvc-219358f9-7520-4512-8729-274ec1ad54bd". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.395083 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09194bc6-429f-46a1-8dac-8b84385c9e10-kube-api-access-4qh8s" (OuterVolumeSpecName: "kube-api-access-4qh8s") pod "09194bc6-429f-46a1-8dac-8b84385c9e10" (UID: "09194bc6-429f-46a1-8dac-8b84385c9e10"). InnerVolumeSpecName "kube-api-access-4qh8s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.399928 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09194bc6-429f-46a1-8dac-8b84385c9e10-server-conf" (OuterVolumeSpecName: "server-conf") pod "09194bc6-429f-46a1-8dac-8b84385c9e10" (UID: "09194bc6-429f-46a1-8dac-8b84385c9e10"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.452966 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "09194bc6-429f-46a1-8dac-8b84385c9e10" (UID: "09194bc6-429f-46a1-8dac-8b84385c9e10"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.462902 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.475527 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pqx5\" (UniqueName: \"kubernetes.io/projected/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-kube-api-access-9pqx5\") pod \"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9\" (UID: \"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9\") "
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.475596 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-dns-svc\") pod \"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9\" (UID: \"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9\") "
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.475668 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-config\") pod \"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9\" (UID: \"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9\") "
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.475990 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476031 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476060 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476090 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfn2p\" (UniqueName: \"kubernetes.io/projected/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-kube-api-access-nfn2p\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476123 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476169 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476185 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0"
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476199 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-pod-info\") pod \"rabbitmq-server-0\"
(UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476238 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476281 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qh8s\" (UniqueName: \"kubernetes.io/projected/09194bc6-429f-46a1-8dac-8b84385c9e10-kube-api-access-4qh8s\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476291 5094 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476313 5094 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/09194bc6-429f-46a1-8dac-8b84385c9e10-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476322 5094 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476331 5094 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/09194bc6-429f-46a1-8dac-8b84385c9e10-server-conf\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476340 5094 reconciler_common.go:293] "Volume detached for volume 
\"pod-info\" (UniqueName: \"kubernetes.io/downward-api/09194bc6-429f-46a1-8dac-8b84385c9e10-pod-info\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476348 5094 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/09194bc6-429f-46a1-8dac-8b84385c9e10-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476357 5094 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/09194bc6-429f-46a1-8dac-8b84385c9e10-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.476377 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-219358f9-7520-4512-8729-274ec1ad54bd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-219358f9-7520-4512-8729-274ec1ad54bd\") on node \"crc\" " Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.479159 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.483690 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-kube-api-access-9pqx5" (OuterVolumeSpecName: "kube-api-access-9pqx5") pod "9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9" (UID: "9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9"). InnerVolumeSpecName "kube-api-access-9pqx5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.483965 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.485459 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.486729 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.487669 5094 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.487694 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cc749f9eaf145df5c444fbae24383d1bdaa4331dffc2f7c6f1445ba7dce2304b/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.488238 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.488764 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.491471 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.505294 5094 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.505466 5094 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-219358f9-7520-4512-8729-274ec1ad54bd" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-219358f9-7520-4512-8729-274ec1ad54bd") on node "crc" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.512771 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfn2p\" (UniqueName: \"kubernetes.io/projected/fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b-kube-api-access-nfn2p\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.529675 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9" (UID: "9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.533463 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-eadf2c68-d03c-49bf-8f9d-78359eb488e1\") pod \"rabbitmq-server-0\" (UID: \"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b\") " pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.541966 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-config" (OuterVolumeSpecName: "config") pod "9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9" (UID: "9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.577199 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-config\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.577233 5094 reconciler_common.go:293] "Volume detached for volume \"pvc-219358f9-7520-4512-8729-274ec1ad54bd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-219358f9-7520-4512-8729-274ec1ad54bd\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.577247 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pqx5\" (UniqueName: \"kubernetes.io/projected/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-kube-api-access-9pqx5\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.577257 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.637601 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 20 08:25:50 crc kubenswrapper[5094]: I0220 08:25:50.910681 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.229755 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng" event={"ID":"9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9","Type":"ContainerDied","Data":"e19bae85da8585d71a99f124e607e047f6c285504074d9b173d9c3014d6e6d83"} Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.230151 5094 scope.go:117] "RemoveContainer" containerID="8a09f7482da0f94c8e0f012e0e1e52314f35f8500de127da7d88164462490023" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.230026 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76466d7cdc-nc2ng" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.238391 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b","Type":"ContainerStarted","Data":"7b80f9c0dd2c624b48bdf59524151b4fa2295a49aa7e051802bad2d412291ce3"} Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.239444 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.260266 5094 scope.go:117] "RemoveContainer" containerID="24e6eb261ea32e536bfab420ed66babedb6b616c3c0c6563b11146f4eaf4e89e" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.265212 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76466d7cdc-nc2ng"] Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.284655 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76466d7cdc-nc2ng"] Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.304379 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.304447 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.322845 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 08:25:51 crc kubenswrapper[5094]: E0220 08:25:51.323377 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9" containerName="dnsmasq-dns" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.323462 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9" containerName="dnsmasq-dns" Feb 20 08:25:51 crc kubenswrapper[5094]: E0220 08:25:51.323538 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9" containerName="init" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.323601 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9" containerName="init" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.323811 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9" 
containerName="dnsmasq-dns" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.324594 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.333147 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.336754 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.337054 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.337215 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.337328 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-rs7rx" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.340473 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.411467 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/392a6bbf-c80d-4142-adb2-b4828517b1c6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.411561 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/392a6bbf-c80d-4142-adb2-b4828517b1c6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.411591 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-219358f9-7520-4512-8729-274ec1ad54bd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-219358f9-7520-4512-8729-274ec1ad54bd\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.411617 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/392a6bbf-c80d-4142-adb2-b4828517b1c6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.411643 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/392a6bbf-c80d-4142-adb2-b4828517b1c6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.411672 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/392a6bbf-c80d-4142-adb2-b4828517b1c6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.411785 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpp2v\" (UniqueName: \"kubernetes.io/projected/392a6bbf-c80d-4142-adb2-b4828517b1c6-kube-api-access-cpp2v\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.411997 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/392a6bbf-c80d-4142-adb2-b4828517b1c6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.412084 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/392a6bbf-c80d-4142-adb2-b4828517b1c6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.512533 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/392a6bbf-c80d-4142-adb2-b4828517b1c6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.512574 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpp2v\" (UniqueName: \"kubernetes.io/projected/392a6bbf-c80d-4142-adb2-b4828517b1c6-kube-api-access-cpp2v\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.512613 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/392a6bbf-c80d-4142-adb2-b4828517b1c6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.512636 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/392a6bbf-c80d-4142-adb2-b4828517b1c6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.512675 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/392a6bbf-c80d-4142-adb2-b4828517b1c6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.512732 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/392a6bbf-c80d-4142-adb2-b4828517b1c6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.512751 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-219358f9-7520-4512-8729-274ec1ad54bd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-219358f9-7520-4512-8729-274ec1ad54bd\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.512773 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/392a6bbf-c80d-4142-adb2-b4828517b1c6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.512787 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/392a6bbf-c80d-4142-adb2-b4828517b1c6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.514863 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/392a6bbf-c80d-4142-adb2-b4828517b1c6-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.515037 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/392a6bbf-c80d-4142-adb2-b4828517b1c6-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.515141 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/392a6bbf-c80d-4142-adb2-b4828517b1c6-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.516042 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/392a6bbf-c80d-4142-adb2-b4828517b1c6-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.517173 5094 
csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.517272 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-219358f9-7520-4512-8729-274ec1ad54bd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-219358f9-7520-4512-8729-274ec1ad54bd\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c0ed1e37b492902e6baae6f722347d61d8d5759c03a6d0fd94fd84b63621ed84/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.517871 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/392a6bbf-c80d-4142-adb2-b4828517b1c6-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.519155 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/392a6bbf-c80d-4142-adb2-b4828517b1c6-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.521791 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/392a6bbf-c80d-4142-adb2-b4828517b1c6-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.532643 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpp2v\" 
(UniqueName: \"kubernetes.io/projected/392a6bbf-c80d-4142-adb2-b4828517b1c6-kube-api-access-cpp2v\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.548332 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-219358f9-7520-4512-8729-274ec1ad54bd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-219358f9-7520-4512-8729-274ec1ad54bd\") pod \"rabbitmq-cell1-server-0\" (UID: \"392a6bbf-c80d-4142-adb2-b4828517b1c6\") " pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.684020 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.865534 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09194bc6-429f-46a1-8dac-8b84385c9e10" path="/var/lib/kubelet/pods/09194bc6-429f-46a1-8dac-8b84385c9e10/volumes" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.866846 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9" path="/var/lib/kubelet/pods/9f0a1ae1-64d4-4e2c-9171-0bdb8a87dfb9/volumes" Feb 20 08:25:51 crc kubenswrapper[5094]: I0220 08:25:51.876532 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6404f29-e503-4f82-a2ce-e147c18677a7" path="/var/lib/kubelet/pods/d6404f29-e503-4f82-a2ce-e147c18677a7/volumes" Feb 20 08:25:52 crc kubenswrapper[5094]: I0220 08:25:52.107095 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 20 08:25:52 crc kubenswrapper[5094]: W0220 08:25:52.146101 5094 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod392a6bbf_c80d_4142_adb2_b4828517b1c6.slice/crio-f3fdcbe9418cd588544798e51723a34620d3427c69aa739a6a00bcbcebfd74e9 WatchSource:0}: Error finding container f3fdcbe9418cd588544798e51723a34620d3427c69aa739a6a00bcbcebfd74e9: Status 404 returned error can't find the container with id f3fdcbe9418cd588544798e51723a34620d3427c69aa739a6a00bcbcebfd74e9 Feb 20 08:25:52 crc kubenswrapper[5094]: I0220 08:25:52.256635 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"392a6bbf-c80d-4142-adb2-b4828517b1c6","Type":"ContainerStarted","Data":"f3fdcbe9418cd588544798e51723a34620d3427c69aa739a6a00bcbcebfd74e9"} Feb 20 08:25:53 crc kubenswrapper[5094]: I0220 08:25:53.268069 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b","Type":"ContainerStarted","Data":"ab3945abed45e569ed16f126fb6c69dd16f3c7414312304803aee733c8faae22"} Feb 20 08:25:54 crc kubenswrapper[5094]: I0220 08:25:54.277642 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"392a6bbf-c80d-4142-adb2-b4828517b1c6","Type":"ContainerStarted","Data":"cb2ce7eb067ea29894907db6542d2260151c1caaba71914d8c805a9a260f43dc"} Feb 20 08:26:24 crc kubenswrapper[5094]: I0220 08:26:24.523204 5094 generic.go:334] "Generic (PLEG): container finished" podID="fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b" containerID="ab3945abed45e569ed16f126fb6c69dd16f3c7414312304803aee733c8faae22" exitCode=0 Feb 20 08:26:24 crc kubenswrapper[5094]: I0220 08:26:24.523289 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b","Type":"ContainerDied","Data":"ab3945abed45e569ed16f126fb6c69dd16f3c7414312304803aee733c8faae22"} Feb 20 08:26:25 crc kubenswrapper[5094]: I0220 08:26:25.531872 5094 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b","Type":"ContainerStarted","Data":"f31fffa9aaf7df7267a9d1b277cc7729ff296a53878bb5b76a0999edb7ddc055"} Feb 20 08:26:25 crc kubenswrapper[5094]: I0220 08:26:25.532334 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 20 08:26:25 crc kubenswrapper[5094]: I0220 08:26:25.538007 5094 generic.go:334] "Generic (PLEG): container finished" podID="392a6bbf-c80d-4142-adb2-b4828517b1c6" containerID="cb2ce7eb067ea29894907db6542d2260151c1caaba71914d8c805a9a260f43dc" exitCode=0 Feb 20 08:26:25 crc kubenswrapper[5094]: I0220 08:26:25.538049 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"392a6bbf-c80d-4142-adb2-b4828517b1c6","Type":"ContainerDied","Data":"cb2ce7eb067ea29894907db6542d2260151c1caaba71914d8c805a9a260f43dc"} Feb 20 08:26:25 crc kubenswrapper[5094]: I0220 08:26:25.560401 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=35.560382695 podStartE2EDuration="35.560382695s" podCreationTimestamp="2026-02-20 08:25:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:26:25.557537806 +0000 UTC m=+6000.430164537" watchObservedRunningTime="2026-02-20 08:26:25.560382695 +0000 UTC m=+6000.433009406" Feb 20 08:26:26 crc kubenswrapper[5094]: I0220 08:26:26.545941 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"392a6bbf-c80d-4142-adb2-b4828517b1c6","Type":"ContainerStarted","Data":"a02e39cab42ca86cca65460f27340ab2f34d0de07eb22c8b5a3463d05dba2a09"} Feb 20 08:26:26 crc kubenswrapper[5094]: I0220 08:26:26.546648 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:26:26 crc 
kubenswrapper[5094]: I0220 08:26:26.567162 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=35.56714297 podStartE2EDuration="35.56714297s" podCreationTimestamp="2026-02-20 08:25:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:26:26.564553529 +0000 UTC m=+6001.437180250" watchObservedRunningTime="2026-02-20 08:26:26.56714297 +0000 UTC m=+6001.439769691" Feb 20 08:26:40 crc kubenswrapper[5094]: I0220 08:26:40.641945 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 20 08:26:41 crc kubenswrapper[5094]: I0220 08:26:41.688009 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 20 08:26:48 crc kubenswrapper[5094]: I0220 08:26:48.678357 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Feb 20 08:26:48 crc kubenswrapper[5094]: I0220 08:26:48.680191 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 20 08:26:48 crc kubenswrapper[5094]: I0220 08:26:48.685072 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-d88p7" Feb 20 08:26:48 crc kubenswrapper[5094]: I0220 08:26:48.690690 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 20 08:26:48 crc kubenswrapper[5094]: I0220 08:26:48.870045 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk227\" (UniqueName: \"kubernetes.io/projected/0a72a034-da88-4003-930b-d4a4843366fa-kube-api-access-jk227\") pod \"mariadb-client\" (UID: \"0a72a034-da88-4003-930b-d4a4843366fa\") " pod="openstack/mariadb-client" Feb 20 08:26:48 crc kubenswrapper[5094]: I0220 08:26:48.972082 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk227\" (UniqueName: \"kubernetes.io/projected/0a72a034-da88-4003-930b-d4a4843366fa-kube-api-access-jk227\") pod \"mariadb-client\" (UID: \"0a72a034-da88-4003-930b-d4a4843366fa\") " pod="openstack/mariadb-client" Feb 20 08:26:48 crc kubenswrapper[5094]: I0220 08:26:48.994349 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk227\" (UniqueName: \"kubernetes.io/projected/0a72a034-da88-4003-930b-d4a4843366fa-kube-api-access-jk227\") pod \"mariadb-client\" (UID: \"0a72a034-da88-4003-930b-d4a4843366fa\") " pod="openstack/mariadb-client" Feb 20 08:26:49 crc kubenswrapper[5094]: I0220 08:26:49.005238 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 20 08:26:49 crc kubenswrapper[5094]: I0220 08:26:49.605749 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 20 08:26:49 crc kubenswrapper[5094]: W0220 08:26:49.616319 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a72a034_da88_4003_930b_d4a4843366fa.slice/crio-153ceebd8012a66fad340f61b2a9cd58fb7be67cb3d3230fe6e8e53eb4b5b054 WatchSource:0}: Error finding container 153ceebd8012a66fad340f61b2a9cd58fb7be67cb3d3230fe6e8e53eb4b5b054: Status 404 returned error can't find the container with id 153ceebd8012a66fad340f61b2a9cd58fb7be67cb3d3230fe6e8e53eb4b5b054 Feb 20 08:26:49 crc kubenswrapper[5094]: I0220 08:26:49.719872 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"0a72a034-da88-4003-930b-d4a4843366fa","Type":"ContainerStarted","Data":"153ceebd8012a66fad340f61b2a9cd58fb7be67cb3d3230fe6e8e53eb4b5b054"} Feb 20 08:26:50 crc kubenswrapper[5094]: I0220 08:26:50.729883 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"0a72a034-da88-4003-930b-d4a4843366fa","Type":"ContainerStarted","Data":"ea871822d43bdfe462e2b400f50aee8de7fad30859014e952fc2e09e25cb686b"} Feb 20 08:26:50 crc kubenswrapper[5094]: I0220 08:26:50.749198 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client" podStartSLOduration=2.155472409 podStartE2EDuration="2.749177597s" podCreationTimestamp="2026-02-20 08:26:48 +0000 UTC" firstStartedPulling="2026-02-20 08:26:49.618261456 +0000 UTC m=+6024.490888167" lastFinishedPulling="2026-02-20 08:26:50.211966634 +0000 UTC m=+6025.084593355" observedRunningTime="2026-02-20 08:26:50.74188907 +0000 UTC m=+6025.614515791" watchObservedRunningTime="2026-02-20 08:26:50.749177597 +0000 UTC m=+6025.621804308" Feb 20 08:27:03 crc 
kubenswrapper[5094]: I0220 08:27:03.750971 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 20 08:27:03 crc kubenswrapper[5094]: I0220 08:27:03.751875 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-client" podUID="0a72a034-da88-4003-930b-d4a4843366fa" containerName="mariadb-client" containerID="cri-o://ea871822d43bdfe462e2b400f50aee8de7fad30859014e952fc2e09e25cb686b" gracePeriod=30 Feb 20 08:27:04 crc kubenswrapper[5094]: I0220 08:27:04.295339 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 20 08:27:04 crc kubenswrapper[5094]: I0220 08:27:04.361138 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk227\" (UniqueName: \"kubernetes.io/projected/0a72a034-da88-4003-930b-d4a4843366fa-kube-api-access-jk227\") pod \"0a72a034-da88-4003-930b-d4a4843366fa\" (UID: \"0a72a034-da88-4003-930b-d4a4843366fa\") " Feb 20 08:27:04 crc kubenswrapper[5094]: I0220 08:27:04.366404 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a72a034-da88-4003-930b-d4a4843366fa-kube-api-access-jk227" (OuterVolumeSpecName: "kube-api-access-jk227") pod "0a72a034-da88-4003-930b-d4a4843366fa" (UID: "0a72a034-da88-4003-930b-d4a4843366fa"). InnerVolumeSpecName "kube-api-access-jk227". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:27:04 crc kubenswrapper[5094]: I0220 08:27:04.463142 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk227\" (UniqueName: \"kubernetes.io/projected/0a72a034-da88-4003-930b-d4a4843366fa-kube-api-access-jk227\") on node \"crc\" DevicePath \"\"" Feb 20 08:27:04 crc kubenswrapper[5094]: I0220 08:27:04.878391 5094 generic.go:334] "Generic (PLEG): container finished" podID="0a72a034-da88-4003-930b-d4a4843366fa" containerID="ea871822d43bdfe462e2b400f50aee8de7fad30859014e952fc2e09e25cb686b" exitCode=143 Feb 20 08:27:04 crc kubenswrapper[5094]: I0220 08:27:04.878448 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"0a72a034-da88-4003-930b-d4a4843366fa","Type":"ContainerDied","Data":"ea871822d43bdfe462e2b400f50aee8de7fad30859014e952fc2e09e25cb686b"} Feb 20 08:27:04 crc kubenswrapper[5094]: I0220 08:27:04.878485 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"0a72a034-da88-4003-930b-d4a4843366fa","Type":"ContainerDied","Data":"153ceebd8012a66fad340f61b2a9cd58fb7be67cb3d3230fe6e8e53eb4b5b054"} Feb 20 08:27:04 crc kubenswrapper[5094]: I0220 08:27:04.878494 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 20 08:27:04 crc kubenswrapper[5094]: I0220 08:27:04.878506 5094 scope.go:117] "RemoveContainer" containerID="ea871822d43bdfe462e2b400f50aee8de7fad30859014e952fc2e09e25cb686b" Feb 20 08:27:04 crc kubenswrapper[5094]: I0220 08:27:04.896172 5094 scope.go:117] "RemoveContainer" containerID="ea871822d43bdfe462e2b400f50aee8de7fad30859014e952fc2e09e25cb686b" Feb 20 08:27:04 crc kubenswrapper[5094]: E0220 08:27:04.896809 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea871822d43bdfe462e2b400f50aee8de7fad30859014e952fc2e09e25cb686b\": container with ID starting with ea871822d43bdfe462e2b400f50aee8de7fad30859014e952fc2e09e25cb686b not found: ID does not exist" containerID="ea871822d43bdfe462e2b400f50aee8de7fad30859014e952fc2e09e25cb686b" Feb 20 08:27:04 crc kubenswrapper[5094]: I0220 08:27:04.896916 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea871822d43bdfe462e2b400f50aee8de7fad30859014e952fc2e09e25cb686b"} err="failed to get container status \"ea871822d43bdfe462e2b400f50aee8de7fad30859014e952fc2e09e25cb686b\": rpc error: code = NotFound desc = could not find container \"ea871822d43bdfe462e2b400f50aee8de7fad30859014e952fc2e09e25cb686b\": container with ID starting with ea871822d43bdfe462e2b400f50aee8de7fad30859014e952fc2e09e25cb686b not found: ID does not exist" Feb 20 08:27:04 crc kubenswrapper[5094]: I0220 08:27:04.952451 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 20 08:27:04 crc kubenswrapper[5094]: I0220 08:27:04.958907 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Feb 20 08:27:05 crc kubenswrapper[5094]: I0220 08:27:05.848856 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a72a034-da88-4003-930b-d4a4843366fa" 
path="/var/lib/kubelet/pods/0a72a034-da88-4003-930b-d4a4843366fa/volumes" Feb 20 08:27:09 crc kubenswrapper[5094]: I0220 08:27:09.890492 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pn85q"] Feb 20 08:27:09 crc kubenswrapper[5094]: E0220 08:27:09.891478 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a72a034-da88-4003-930b-d4a4843366fa" containerName="mariadb-client" Feb 20 08:27:09 crc kubenswrapper[5094]: I0220 08:27:09.891495 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a72a034-da88-4003-930b-d4a4843366fa" containerName="mariadb-client" Feb 20 08:27:09 crc kubenswrapper[5094]: I0220 08:27:09.891721 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a72a034-da88-4003-930b-d4a4843366fa" containerName="mariadb-client" Feb 20 08:27:09 crc kubenswrapper[5094]: I0220 08:27:09.893443 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:09 crc kubenswrapper[5094]: I0220 08:27:09.899267 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pn85q"] Feb 20 08:27:10 crc kubenswrapper[5094]: I0220 08:27:10.052167 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5727179b-3c74-4339-8714-80c77b9ac4c1-catalog-content\") pod \"redhat-marketplace-pn85q\" (UID: \"5727179b-3c74-4339-8714-80c77b9ac4c1\") " pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:10 crc kubenswrapper[5094]: I0220 08:27:10.052236 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5727179b-3c74-4339-8714-80c77b9ac4c1-utilities\") pod \"redhat-marketplace-pn85q\" (UID: \"5727179b-3c74-4339-8714-80c77b9ac4c1\") " 
pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:10 crc kubenswrapper[5094]: I0220 08:27:10.052319 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q75c7\" (UniqueName: \"kubernetes.io/projected/5727179b-3c74-4339-8714-80c77b9ac4c1-kube-api-access-q75c7\") pod \"redhat-marketplace-pn85q\" (UID: \"5727179b-3c74-4339-8714-80c77b9ac4c1\") " pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:10 crc kubenswrapper[5094]: I0220 08:27:10.153943 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q75c7\" (UniqueName: \"kubernetes.io/projected/5727179b-3c74-4339-8714-80c77b9ac4c1-kube-api-access-q75c7\") pod \"redhat-marketplace-pn85q\" (UID: \"5727179b-3c74-4339-8714-80c77b9ac4c1\") " pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:10 crc kubenswrapper[5094]: I0220 08:27:10.154005 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5727179b-3c74-4339-8714-80c77b9ac4c1-catalog-content\") pod \"redhat-marketplace-pn85q\" (UID: \"5727179b-3c74-4339-8714-80c77b9ac4c1\") " pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:10 crc kubenswrapper[5094]: I0220 08:27:10.154037 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5727179b-3c74-4339-8714-80c77b9ac4c1-utilities\") pod \"redhat-marketplace-pn85q\" (UID: \"5727179b-3c74-4339-8714-80c77b9ac4c1\") " pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:10 crc kubenswrapper[5094]: I0220 08:27:10.154505 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5727179b-3c74-4339-8714-80c77b9ac4c1-catalog-content\") pod \"redhat-marketplace-pn85q\" (UID: \"5727179b-3c74-4339-8714-80c77b9ac4c1\") " 
pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:10 crc kubenswrapper[5094]: I0220 08:27:10.154536 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5727179b-3c74-4339-8714-80c77b9ac4c1-utilities\") pod \"redhat-marketplace-pn85q\" (UID: \"5727179b-3c74-4339-8714-80c77b9ac4c1\") " pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:10 crc kubenswrapper[5094]: I0220 08:27:10.174173 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q75c7\" (UniqueName: \"kubernetes.io/projected/5727179b-3c74-4339-8714-80c77b9ac4c1-kube-api-access-q75c7\") pod \"redhat-marketplace-pn85q\" (UID: \"5727179b-3c74-4339-8714-80c77b9ac4c1\") " pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:10 crc kubenswrapper[5094]: I0220 08:27:10.224452 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:10 crc kubenswrapper[5094]: I0220 08:27:10.648384 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pn85q"] Feb 20 08:27:10 crc kubenswrapper[5094]: I0220 08:27:10.931473 5094 generic.go:334] "Generic (PLEG): container finished" podID="5727179b-3c74-4339-8714-80c77b9ac4c1" containerID="f0ee81ec9752dc160db9c095f7837ea36ac32ee4f2d219c3d951a3e54a76f625" exitCode=0 Feb 20 08:27:10 crc kubenswrapper[5094]: I0220 08:27:10.931568 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pn85q" event={"ID":"5727179b-3c74-4339-8714-80c77b9ac4c1","Type":"ContainerDied","Data":"f0ee81ec9752dc160db9c095f7837ea36ac32ee4f2d219c3d951a3e54a76f625"} Feb 20 08:27:10 crc kubenswrapper[5094]: I0220 08:27:10.931850 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pn85q" 
event={"ID":"5727179b-3c74-4339-8714-80c77b9ac4c1","Type":"ContainerStarted","Data":"b576f773f6ac06b5034d0d8b7def4a70037f3dd2770948ebe163885be0ac669d"} Feb 20 08:27:10 crc kubenswrapper[5094]: I0220 08:27:10.935766 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 08:27:11 crc kubenswrapper[5094]: I0220 08:27:11.939434 5094 generic.go:334] "Generic (PLEG): container finished" podID="5727179b-3c74-4339-8714-80c77b9ac4c1" containerID="bb31d551d6e8d601a521138e3a8befce1c055afdd56e3413b1abcf1bf22d60fc" exitCode=0 Feb 20 08:27:11 crc kubenswrapper[5094]: I0220 08:27:11.939537 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pn85q" event={"ID":"5727179b-3c74-4339-8714-80c77b9ac4c1","Type":"ContainerDied","Data":"bb31d551d6e8d601a521138e3a8befce1c055afdd56e3413b1abcf1bf22d60fc"} Feb 20 08:27:12 crc kubenswrapper[5094]: I0220 08:27:12.951459 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pn85q" event={"ID":"5727179b-3c74-4339-8714-80c77b9ac4c1","Type":"ContainerStarted","Data":"e4fb2d59873364aee450a2596febcda23d8b32cd503f88490dc96e9906051dfb"} Feb 20 08:27:12 crc kubenswrapper[5094]: I0220 08:27:12.974243 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pn85q" podStartSLOduration=2.5842042259999998 podStartE2EDuration="3.974218s" podCreationTimestamp="2026-02-20 08:27:09 +0000 UTC" firstStartedPulling="2026-02-20 08:27:10.93553939 +0000 UTC m=+6045.808166101" lastFinishedPulling="2026-02-20 08:27:12.325553154 +0000 UTC m=+6047.198179875" observedRunningTime="2026-02-20 08:27:12.969018024 +0000 UTC m=+6047.841644735" watchObservedRunningTime="2026-02-20 08:27:12.974218 +0000 UTC m=+6047.846844711" Feb 20 08:27:20 crc kubenswrapper[5094]: I0220 08:27:20.225758 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:20 crc kubenswrapper[5094]: I0220 08:27:20.226813 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:20 crc kubenswrapper[5094]: I0220 08:27:20.277339 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:21 crc kubenswrapper[5094]: I0220 08:27:21.046232 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:21 crc kubenswrapper[5094]: I0220 08:27:21.090972 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pn85q"] Feb 20 08:27:23 crc kubenswrapper[5094]: I0220 08:27:23.021308 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pn85q" podUID="5727179b-3c74-4339-8714-80c77b9ac4c1" containerName="registry-server" containerID="cri-o://e4fb2d59873364aee450a2596febcda23d8b32cd503f88490dc96e9906051dfb" gracePeriod=2 Feb 20 08:27:23 crc kubenswrapper[5094]: I0220 08:27:23.534056 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:23 crc kubenswrapper[5094]: I0220 08:27:23.573459 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5727179b-3c74-4339-8714-80c77b9ac4c1-utilities\") pod \"5727179b-3c74-4339-8714-80c77b9ac4c1\" (UID: \"5727179b-3c74-4339-8714-80c77b9ac4c1\") " Feb 20 08:27:23 crc kubenswrapper[5094]: I0220 08:27:23.573566 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q75c7\" (UniqueName: \"kubernetes.io/projected/5727179b-3c74-4339-8714-80c77b9ac4c1-kube-api-access-q75c7\") pod \"5727179b-3c74-4339-8714-80c77b9ac4c1\" (UID: \"5727179b-3c74-4339-8714-80c77b9ac4c1\") " Feb 20 08:27:23 crc kubenswrapper[5094]: I0220 08:27:23.573641 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5727179b-3c74-4339-8714-80c77b9ac4c1-catalog-content\") pod \"5727179b-3c74-4339-8714-80c77b9ac4c1\" (UID: \"5727179b-3c74-4339-8714-80c77b9ac4c1\") " Feb 20 08:27:23 crc kubenswrapper[5094]: I0220 08:27:23.574607 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5727179b-3c74-4339-8714-80c77b9ac4c1-utilities" (OuterVolumeSpecName: "utilities") pod "5727179b-3c74-4339-8714-80c77b9ac4c1" (UID: "5727179b-3c74-4339-8714-80c77b9ac4c1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:27:23 crc kubenswrapper[5094]: I0220 08:27:23.579342 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5727179b-3c74-4339-8714-80c77b9ac4c1-kube-api-access-q75c7" (OuterVolumeSpecName: "kube-api-access-q75c7") pod "5727179b-3c74-4339-8714-80c77b9ac4c1" (UID: "5727179b-3c74-4339-8714-80c77b9ac4c1"). InnerVolumeSpecName "kube-api-access-q75c7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:27:23 crc kubenswrapper[5094]: I0220 08:27:23.598168 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5727179b-3c74-4339-8714-80c77b9ac4c1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5727179b-3c74-4339-8714-80c77b9ac4c1" (UID: "5727179b-3c74-4339-8714-80c77b9ac4c1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:27:23 crc kubenswrapper[5094]: I0220 08:27:23.674390 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q75c7\" (UniqueName: \"kubernetes.io/projected/5727179b-3c74-4339-8714-80c77b9ac4c1-kube-api-access-q75c7\") on node \"crc\" DevicePath \"\"" Feb 20 08:27:23 crc kubenswrapper[5094]: I0220 08:27:23.674440 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5727179b-3c74-4339-8714-80c77b9ac4c1-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:27:23 crc kubenswrapper[5094]: I0220 08:27:23.674452 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5727179b-3c74-4339-8714-80c77b9ac4c1-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:27:23 crc kubenswrapper[5094]: E0220 08:27:23.941257 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5727179b_3c74_4339_8714_80c77b9ac4c1.slice\": RecentStats: unable to find data in memory cache]" Feb 20 08:27:24 crc kubenswrapper[5094]: I0220 08:27:24.030178 5094 generic.go:334] "Generic (PLEG): container finished" podID="5727179b-3c74-4339-8714-80c77b9ac4c1" containerID="e4fb2d59873364aee450a2596febcda23d8b32cd503f88490dc96e9906051dfb" exitCode=0 Feb 20 08:27:24 crc kubenswrapper[5094]: I0220 08:27:24.030226 5094 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pn85q" event={"ID":"5727179b-3c74-4339-8714-80c77b9ac4c1","Type":"ContainerDied","Data":"e4fb2d59873364aee450a2596febcda23d8b32cd503f88490dc96e9906051dfb"} Feb 20 08:27:24 crc kubenswrapper[5094]: I0220 08:27:24.030255 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pn85q" event={"ID":"5727179b-3c74-4339-8714-80c77b9ac4c1","Type":"ContainerDied","Data":"b576f773f6ac06b5034d0d8b7def4a70037f3dd2770948ebe163885be0ac669d"} Feb 20 08:27:24 crc kubenswrapper[5094]: I0220 08:27:24.030271 5094 scope.go:117] "RemoveContainer" containerID="e4fb2d59873364aee450a2596febcda23d8b32cd503f88490dc96e9906051dfb" Feb 20 08:27:24 crc kubenswrapper[5094]: I0220 08:27:24.030300 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pn85q" Feb 20 08:27:24 crc kubenswrapper[5094]: I0220 08:27:24.064057 5094 scope.go:117] "RemoveContainer" containerID="bb31d551d6e8d601a521138e3a8befce1c055afdd56e3413b1abcf1bf22d60fc" Feb 20 08:27:24 crc kubenswrapper[5094]: I0220 08:27:24.065636 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pn85q"] Feb 20 08:27:24 crc kubenswrapper[5094]: I0220 08:27:24.074125 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pn85q"] Feb 20 08:27:24 crc kubenswrapper[5094]: I0220 08:27:24.097942 5094 scope.go:117] "RemoveContainer" containerID="f0ee81ec9752dc160db9c095f7837ea36ac32ee4f2d219c3d951a3e54a76f625" Feb 20 08:27:24 crc kubenswrapper[5094]: I0220 08:27:24.122479 5094 scope.go:117] "RemoveContainer" containerID="e4fb2d59873364aee450a2596febcda23d8b32cd503f88490dc96e9906051dfb" Feb 20 08:27:24 crc kubenswrapper[5094]: E0220 08:27:24.123640 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e4fb2d59873364aee450a2596febcda23d8b32cd503f88490dc96e9906051dfb\": container with ID starting with e4fb2d59873364aee450a2596febcda23d8b32cd503f88490dc96e9906051dfb not found: ID does not exist" containerID="e4fb2d59873364aee450a2596febcda23d8b32cd503f88490dc96e9906051dfb" Feb 20 08:27:24 crc kubenswrapper[5094]: I0220 08:27:24.123802 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4fb2d59873364aee450a2596febcda23d8b32cd503f88490dc96e9906051dfb"} err="failed to get container status \"e4fb2d59873364aee450a2596febcda23d8b32cd503f88490dc96e9906051dfb\": rpc error: code = NotFound desc = could not find container \"e4fb2d59873364aee450a2596febcda23d8b32cd503f88490dc96e9906051dfb\": container with ID starting with e4fb2d59873364aee450a2596febcda23d8b32cd503f88490dc96e9906051dfb not found: ID does not exist" Feb 20 08:27:24 crc kubenswrapper[5094]: I0220 08:27:24.123892 5094 scope.go:117] "RemoveContainer" containerID="bb31d551d6e8d601a521138e3a8befce1c055afdd56e3413b1abcf1bf22d60fc" Feb 20 08:27:24 crc kubenswrapper[5094]: E0220 08:27:24.124595 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb31d551d6e8d601a521138e3a8befce1c055afdd56e3413b1abcf1bf22d60fc\": container with ID starting with bb31d551d6e8d601a521138e3a8befce1c055afdd56e3413b1abcf1bf22d60fc not found: ID does not exist" containerID="bb31d551d6e8d601a521138e3a8befce1c055afdd56e3413b1abcf1bf22d60fc" Feb 20 08:27:24 crc kubenswrapper[5094]: I0220 08:27:24.124662 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb31d551d6e8d601a521138e3a8befce1c055afdd56e3413b1abcf1bf22d60fc"} err="failed to get container status \"bb31d551d6e8d601a521138e3a8befce1c055afdd56e3413b1abcf1bf22d60fc\": rpc error: code = NotFound desc = could not find container \"bb31d551d6e8d601a521138e3a8befce1c055afdd56e3413b1abcf1bf22d60fc\": container with ID 
starting with bb31d551d6e8d601a521138e3a8befce1c055afdd56e3413b1abcf1bf22d60fc not found: ID does not exist" Feb 20 08:27:24 crc kubenswrapper[5094]: I0220 08:27:24.125039 5094 scope.go:117] "RemoveContainer" containerID="f0ee81ec9752dc160db9c095f7837ea36ac32ee4f2d219c3d951a3e54a76f625" Feb 20 08:27:24 crc kubenswrapper[5094]: E0220 08:27:24.127074 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0ee81ec9752dc160db9c095f7837ea36ac32ee4f2d219c3d951a3e54a76f625\": container with ID starting with f0ee81ec9752dc160db9c095f7837ea36ac32ee4f2d219c3d951a3e54a76f625 not found: ID does not exist" containerID="f0ee81ec9752dc160db9c095f7837ea36ac32ee4f2d219c3d951a3e54a76f625" Feb 20 08:27:24 crc kubenswrapper[5094]: I0220 08:27:24.127132 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0ee81ec9752dc160db9c095f7837ea36ac32ee4f2d219c3d951a3e54a76f625"} err="failed to get container status \"f0ee81ec9752dc160db9c095f7837ea36ac32ee4f2d219c3d951a3e54a76f625\": rpc error: code = NotFound desc = could not find container \"f0ee81ec9752dc160db9c095f7837ea36ac32ee4f2d219c3d951a3e54a76f625\": container with ID starting with f0ee81ec9752dc160db9c095f7837ea36ac32ee4f2d219c3d951a3e54a76f625 not found: ID does not exist" Feb 20 08:27:25 crc kubenswrapper[5094]: I0220 08:27:25.852496 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5727179b-3c74-4339-8714-80c77b9ac4c1" path="/var/lib/kubelet/pods/5727179b-3c74-4339-8714-80c77b9ac4c1/volumes" Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.469747 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h6g8l"] Feb 20 08:27:26 crc kubenswrapper[5094]: E0220 08:27:26.470034 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5727179b-3c74-4339-8714-80c77b9ac4c1" containerName="extract-utilities" Feb 20 08:27:26 crc 
kubenswrapper[5094]: I0220 08:27:26.470046 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5727179b-3c74-4339-8714-80c77b9ac4c1" containerName="extract-utilities" Feb 20 08:27:26 crc kubenswrapper[5094]: E0220 08:27:26.470062 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5727179b-3c74-4339-8714-80c77b9ac4c1" containerName="registry-server" Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.470068 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5727179b-3c74-4339-8714-80c77b9ac4c1" containerName="registry-server" Feb 20 08:27:26 crc kubenswrapper[5094]: E0220 08:27:26.470090 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5727179b-3c74-4339-8714-80c77b9ac4c1" containerName="extract-content" Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.470096 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5727179b-3c74-4339-8714-80c77b9ac4c1" containerName="extract-content" Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.470242 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5727179b-3c74-4339-8714-80c77b9ac4c1" containerName="registry-server" Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.471211 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h6g8l" Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.504978 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h6g8l"] Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.523130 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-catalog-content\") pod \"community-operators-h6g8l\" (UID: \"40cdda9b-f8ca-4731-b2ab-e982c3ec6893\") " pod="openshift-marketplace/community-operators-h6g8l" Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.523202 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-utilities\") pod \"community-operators-h6g8l\" (UID: \"40cdda9b-f8ca-4731-b2ab-e982c3ec6893\") " pod="openshift-marketplace/community-operators-h6g8l" Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.523330 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmt97\" (UniqueName: \"kubernetes.io/projected/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-kube-api-access-rmt97\") pod \"community-operators-h6g8l\" (UID: \"40cdda9b-f8ca-4731-b2ab-e982c3ec6893\") " pod="openshift-marketplace/community-operators-h6g8l" Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.625116 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-utilities\") pod \"community-operators-h6g8l\" (UID: \"40cdda9b-f8ca-4731-b2ab-e982c3ec6893\") " pod="openshift-marketplace/community-operators-h6g8l" Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.625191 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rmt97\" (UniqueName: \"kubernetes.io/projected/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-kube-api-access-rmt97\") pod \"community-operators-h6g8l\" (UID: \"40cdda9b-f8ca-4731-b2ab-e982c3ec6893\") " pod="openshift-marketplace/community-operators-h6g8l" Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.625271 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-catalog-content\") pod \"community-operators-h6g8l\" (UID: \"40cdda9b-f8ca-4731-b2ab-e982c3ec6893\") " pod="openshift-marketplace/community-operators-h6g8l" Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.625814 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-utilities\") pod \"community-operators-h6g8l\" (UID: \"40cdda9b-f8ca-4731-b2ab-e982c3ec6893\") " pod="openshift-marketplace/community-operators-h6g8l" Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.625868 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-catalog-content\") pod \"community-operators-h6g8l\" (UID: \"40cdda9b-f8ca-4731-b2ab-e982c3ec6893\") " pod="openshift-marketplace/community-operators-h6g8l" Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.642749 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmt97\" (UniqueName: \"kubernetes.io/projected/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-kube-api-access-rmt97\") pod \"community-operators-h6g8l\" (UID: \"40cdda9b-f8ca-4731-b2ab-e982c3ec6893\") " pod="openshift-marketplace/community-operators-h6g8l" Feb 20 08:27:26 crc kubenswrapper[5094]: I0220 08:27:26.806310 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h6g8l" Feb 20 08:27:27 crc kubenswrapper[5094]: I0220 08:27:27.087677 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h6g8l"] Feb 20 08:27:28 crc kubenswrapper[5094]: I0220 08:27:28.065667 5094 generic.go:334] "Generic (PLEG): container finished" podID="40cdda9b-f8ca-4731-b2ab-e982c3ec6893" containerID="2b2fefeb49093ee36074ed06293dc186b80167f359041c96cb12ca4d929a9354" exitCode=0 Feb 20 08:27:28 crc kubenswrapper[5094]: I0220 08:27:28.065766 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6g8l" event={"ID":"40cdda9b-f8ca-4731-b2ab-e982c3ec6893","Type":"ContainerDied","Data":"2b2fefeb49093ee36074ed06293dc186b80167f359041c96cb12ca4d929a9354"} Feb 20 08:27:28 crc kubenswrapper[5094]: I0220 08:27:28.066278 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6g8l" event={"ID":"40cdda9b-f8ca-4731-b2ab-e982c3ec6893","Type":"ContainerStarted","Data":"332979b6e60754867c0fa91afe6bdeb22d4fbcd58e4acb21f601fa64b4983b95"} Feb 20 08:27:29 crc kubenswrapper[5094]: I0220 08:27:29.078641 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6g8l" event={"ID":"40cdda9b-f8ca-4731-b2ab-e982c3ec6893","Type":"ContainerStarted","Data":"e3f0e12048f8f4b710bef856e9eb70b75648660c4f2f0c2507d4f0e195400f91"} Feb 20 08:27:30 crc kubenswrapper[5094]: I0220 08:27:30.087599 5094 generic.go:334] "Generic (PLEG): container finished" podID="40cdda9b-f8ca-4731-b2ab-e982c3ec6893" containerID="e3f0e12048f8f4b710bef856e9eb70b75648660c4f2f0c2507d4f0e195400f91" exitCode=0 Feb 20 08:27:30 crc kubenswrapper[5094]: I0220 08:27:30.087644 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6g8l" 
event={"ID":"40cdda9b-f8ca-4731-b2ab-e982c3ec6893","Type":"ContainerDied","Data":"e3f0e12048f8f4b710bef856e9eb70b75648660c4f2f0c2507d4f0e195400f91"} Feb 20 08:27:31 crc kubenswrapper[5094]: I0220 08:27:31.097255 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6g8l" event={"ID":"40cdda9b-f8ca-4731-b2ab-e982c3ec6893","Type":"ContainerStarted","Data":"2806195696dd702732fcb70c252a5dd4f41a8b33afe3f5188a6e81145e00ceff"} Feb 20 08:27:31 crc kubenswrapper[5094]: I0220 08:27:31.128518 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h6g8l" podStartSLOduration=2.495333651 podStartE2EDuration="5.128498178s" podCreationTimestamp="2026-02-20 08:27:26 +0000 UTC" firstStartedPulling="2026-02-20 08:27:28.068744001 +0000 UTC m=+6062.941370712" lastFinishedPulling="2026-02-20 08:27:30.701908508 +0000 UTC m=+6065.574535239" observedRunningTime="2026-02-20 08:27:31.1198877 +0000 UTC m=+6065.992514421" watchObservedRunningTime="2026-02-20 08:27:31.128498178 +0000 UTC m=+6066.001124889" Feb 20 08:27:34 crc kubenswrapper[5094]: I0220 08:27:34.106414 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:27:34 crc kubenswrapper[5094]: I0220 08:27:34.106984 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:27:36 crc kubenswrapper[5094]: I0220 08:27:36.806686 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-h6g8l" Feb 20 08:27:36 crc kubenswrapper[5094]: I0220 08:27:36.807195 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h6g8l" Feb 20 08:27:36 crc kubenswrapper[5094]: I0220 08:27:36.853210 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h6g8l" Feb 20 08:27:37 crc kubenswrapper[5094]: I0220 08:27:37.182450 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h6g8l" Feb 20 08:27:37 crc kubenswrapper[5094]: I0220 08:27:37.228942 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h6g8l"] Feb 20 08:27:39 crc kubenswrapper[5094]: I0220 08:27:39.156180 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h6g8l" podUID="40cdda9b-f8ca-4731-b2ab-e982c3ec6893" containerName="registry-server" containerID="cri-o://2806195696dd702732fcb70c252a5dd4f41a8b33afe3f5188a6e81145e00ceff" gracePeriod=2 Feb 20 08:27:39 crc kubenswrapper[5094]: I0220 08:27:39.606973 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h6g8l" Feb 20 08:27:39 crc kubenswrapper[5094]: I0220 08:27:39.633150 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-utilities\") pod \"40cdda9b-f8ca-4731-b2ab-e982c3ec6893\" (UID: \"40cdda9b-f8ca-4731-b2ab-e982c3ec6893\") " Feb 20 08:27:39 crc kubenswrapper[5094]: I0220 08:27:39.633276 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmt97\" (UniqueName: \"kubernetes.io/projected/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-kube-api-access-rmt97\") pod \"40cdda9b-f8ca-4731-b2ab-e982c3ec6893\" (UID: \"40cdda9b-f8ca-4731-b2ab-e982c3ec6893\") " Feb 20 08:27:39 crc kubenswrapper[5094]: I0220 08:27:39.633305 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-catalog-content\") pod \"40cdda9b-f8ca-4731-b2ab-e982c3ec6893\" (UID: \"40cdda9b-f8ca-4731-b2ab-e982c3ec6893\") " Feb 20 08:27:39 crc kubenswrapper[5094]: I0220 08:27:39.634467 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-utilities" (OuterVolumeSpecName: "utilities") pod "40cdda9b-f8ca-4731-b2ab-e982c3ec6893" (UID: "40cdda9b-f8ca-4731-b2ab-e982c3ec6893"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:27:39 crc kubenswrapper[5094]: I0220 08:27:39.639927 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-kube-api-access-rmt97" (OuterVolumeSpecName: "kube-api-access-rmt97") pod "40cdda9b-f8ca-4731-b2ab-e982c3ec6893" (UID: "40cdda9b-f8ca-4731-b2ab-e982c3ec6893"). InnerVolumeSpecName "kube-api-access-rmt97". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:27:39 crc kubenswrapper[5094]: I0220 08:27:39.689309 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40cdda9b-f8ca-4731-b2ab-e982c3ec6893" (UID: "40cdda9b-f8ca-4731-b2ab-e982c3ec6893"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:27:39 crc kubenswrapper[5094]: I0220 08:27:39.734751 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmt97\" (UniqueName: \"kubernetes.io/projected/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-kube-api-access-rmt97\") on node \"crc\" DevicePath \"\"" Feb 20 08:27:39 crc kubenswrapper[5094]: I0220 08:27:39.734792 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:27:39 crc kubenswrapper[5094]: I0220 08:27:39.734804 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40cdda9b-f8ca-4731-b2ab-e982c3ec6893-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:27:40 crc kubenswrapper[5094]: I0220 08:27:40.165180 5094 generic.go:334] "Generic (PLEG): container finished" podID="40cdda9b-f8ca-4731-b2ab-e982c3ec6893" containerID="2806195696dd702732fcb70c252a5dd4f41a8b33afe3f5188a6e81145e00ceff" exitCode=0 Feb 20 08:27:40 crc kubenswrapper[5094]: I0220 08:27:40.165229 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h6g8l" event={"ID":"40cdda9b-f8ca-4731-b2ab-e982c3ec6893","Type":"ContainerDied","Data":"2806195696dd702732fcb70c252a5dd4f41a8b33afe3f5188a6e81145e00ceff"} Feb 20 08:27:40 crc kubenswrapper[5094]: I0220 08:27:40.165265 5094 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-h6g8l" event={"ID":"40cdda9b-f8ca-4731-b2ab-e982c3ec6893","Type":"ContainerDied","Data":"332979b6e60754867c0fa91afe6bdeb22d4fbcd58e4acb21f601fa64b4983b95"} Feb 20 08:27:40 crc kubenswrapper[5094]: I0220 08:27:40.165290 5094 scope.go:117] "RemoveContainer" containerID="2806195696dd702732fcb70c252a5dd4f41a8b33afe3f5188a6e81145e00ceff" Feb 20 08:27:40 crc kubenswrapper[5094]: I0220 08:27:40.165230 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h6g8l" Feb 20 08:27:40 crc kubenswrapper[5094]: I0220 08:27:40.208767 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h6g8l"] Feb 20 08:27:40 crc kubenswrapper[5094]: I0220 08:27:40.208926 5094 scope.go:117] "RemoveContainer" containerID="e3f0e12048f8f4b710bef856e9eb70b75648660c4f2f0c2507d4f0e195400f91" Feb 20 08:27:40 crc kubenswrapper[5094]: I0220 08:27:40.215941 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h6g8l"] Feb 20 08:27:40 crc kubenswrapper[5094]: I0220 08:27:40.265871 5094 scope.go:117] "RemoveContainer" containerID="2b2fefeb49093ee36074ed06293dc186b80167f359041c96cb12ca4d929a9354" Feb 20 08:27:40 crc kubenswrapper[5094]: I0220 08:27:40.293040 5094 scope.go:117] "RemoveContainer" containerID="2806195696dd702732fcb70c252a5dd4f41a8b33afe3f5188a6e81145e00ceff" Feb 20 08:27:40 crc kubenswrapper[5094]: E0220 08:27:40.293506 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2806195696dd702732fcb70c252a5dd4f41a8b33afe3f5188a6e81145e00ceff\": container with ID starting with 2806195696dd702732fcb70c252a5dd4f41a8b33afe3f5188a6e81145e00ceff not found: ID does not exist" containerID="2806195696dd702732fcb70c252a5dd4f41a8b33afe3f5188a6e81145e00ceff" Feb 20 08:27:40 crc kubenswrapper[5094]: I0220 
08:27:40.293545 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2806195696dd702732fcb70c252a5dd4f41a8b33afe3f5188a6e81145e00ceff"} err="failed to get container status \"2806195696dd702732fcb70c252a5dd4f41a8b33afe3f5188a6e81145e00ceff\": rpc error: code = NotFound desc = could not find container \"2806195696dd702732fcb70c252a5dd4f41a8b33afe3f5188a6e81145e00ceff\": container with ID starting with 2806195696dd702732fcb70c252a5dd4f41a8b33afe3f5188a6e81145e00ceff not found: ID does not exist" Feb 20 08:27:40 crc kubenswrapper[5094]: I0220 08:27:40.293590 5094 scope.go:117] "RemoveContainer" containerID="e3f0e12048f8f4b710bef856e9eb70b75648660c4f2f0c2507d4f0e195400f91" Feb 20 08:27:40 crc kubenswrapper[5094]: E0220 08:27:40.294011 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3f0e12048f8f4b710bef856e9eb70b75648660c4f2f0c2507d4f0e195400f91\": container with ID starting with e3f0e12048f8f4b710bef856e9eb70b75648660c4f2f0c2507d4f0e195400f91 not found: ID does not exist" containerID="e3f0e12048f8f4b710bef856e9eb70b75648660c4f2f0c2507d4f0e195400f91" Feb 20 08:27:40 crc kubenswrapper[5094]: I0220 08:27:40.294039 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3f0e12048f8f4b710bef856e9eb70b75648660c4f2f0c2507d4f0e195400f91"} err="failed to get container status \"e3f0e12048f8f4b710bef856e9eb70b75648660c4f2f0c2507d4f0e195400f91\": rpc error: code = NotFound desc = could not find container \"e3f0e12048f8f4b710bef856e9eb70b75648660c4f2f0c2507d4f0e195400f91\": container with ID starting with e3f0e12048f8f4b710bef856e9eb70b75648660c4f2f0c2507d4f0e195400f91 not found: ID does not exist" Feb 20 08:27:40 crc kubenswrapper[5094]: I0220 08:27:40.294059 5094 scope.go:117] "RemoveContainer" containerID="2b2fefeb49093ee36074ed06293dc186b80167f359041c96cb12ca4d929a9354" Feb 20 08:27:40 crc 
kubenswrapper[5094]: E0220 08:27:40.294334 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b2fefeb49093ee36074ed06293dc186b80167f359041c96cb12ca4d929a9354\": container with ID starting with 2b2fefeb49093ee36074ed06293dc186b80167f359041c96cb12ca4d929a9354 not found: ID does not exist" containerID="2b2fefeb49093ee36074ed06293dc186b80167f359041c96cb12ca4d929a9354" Feb 20 08:27:40 crc kubenswrapper[5094]: I0220 08:27:40.294358 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b2fefeb49093ee36074ed06293dc186b80167f359041c96cb12ca4d929a9354"} err="failed to get container status \"2b2fefeb49093ee36074ed06293dc186b80167f359041c96cb12ca4d929a9354\": rpc error: code = NotFound desc = could not find container \"2b2fefeb49093ee36074ed06293dc186b80167f359041c96cb12ca4d929a9354\": container with ID starting with 2b2fefeb49093ee36074ed06293dc186b80167f359041c96cb12ca4d929a9354 not found: ID does not exist" Feb 20 08:27:41 crc kubenswrapper[5094]: I0220 08:27:41.852728 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40cdda9b-f8ca-4731-b2ab-e982c3ec6893" path="/var/lib/kubelet/pods/40cdda9b-f8ca-4731-b2ab-e982c3ec6893/volumes" Feb 20 08:28:04 crc kubenswrapper[5094]: I0220 08:28:04.107337 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:28:04 crc kubenswrapper[5094]: I0220 08:28:04.108067 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Feb 20 08:28:34 crc kubenswrapper[5094]: I0220 08:28:34.107586 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:28:34 crc kubenswrapper[5094]: I0220 08:28:34.108536 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:28:34 crc kubenswrapper[5094]: I0220 08:28:34.108625 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 08:28:34 crc kubenswrapper[5094]: I0220 08:28:34.109856 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3240a62ab36af910255d2e2cb0810a1e622f0e56dc51a3b3a6a36dc8406c37ea"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 08:28:34 crc kubenswrapper[5094]: I0220 08:28:34.109971 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://3240a62ab36af910255d2e2cb0810a1e622f0e56dc51a3b3a6a36dc8406c37ea" gracePeriod=600 Feb 20 08:28:34 crc kubenswrapper[5094]: I0220 08:28:34.655253 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" 
containerID="3240a62ab36af910255d2e2cb0810a1e622f0e56dc51a3b3a6a36dc8406c37ea" exitCode=0 Feb 20 08:28:34 crc kubenswrapper[5094]: I0220 08:28:34.655337 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"3240a62ab36af910255d2e2cb0810a1e622f0e56dc51a3b3a6a36dc8406c37ea"} Feb 20 08:28:34 crc kubenswrapper[5094]: I0220 08:28:34.655994 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050"} Feb 20 08:28:34 crc kubenswrapper[5094]: I0220 08:28:34.656017 5094 scope.go:117] "RemoveContainer" containerID="5f9be97697920a5242f67d50b707782c78e3e8fdb6a7078d972293613f9a8bda" Feb 20 08:28:36 crc kubenswrapper[5094]: I0220 08:28:36.459116 5094 scope.go:117] "RemoveContainer" containerID="dc0e175dcf3ab875f0111e29b9804a9472d9627cddd8835f9c61529f34f1c8d3" Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.088557 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cmrpp"] Feb 20 08:29:43 crc kubenswrapper[5094]: E0220 08:29:43.089978 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40cdda9b-f8ca-4731-b2ab-e982c3ec6893" containerName="extract-utilities" Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.090002 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="40cdda9b-f8ca-4731-b2ab-e982c3ec6893" containerName="extract-utilities" Feb 20 08:29:43 crc kubenswrapper[5094]: E0220 08:29:43.090021 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40cdda9b-f8ca-4731-b2ab-e982c3ec6893" containerName="registry-server" Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.090032 5094 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="40cdda9b-f8ca-4731-b2ab-e982c3ec6893" containerName="registry-server" Feb 20 08:29:43 crc kubenswrapper[5094]: E0220 08:29:43.090045 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40cdda9b-f8ca-4731-b2ab-e982c3ec6893" containerName="extract-content" Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.090057 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="40cdda9b-f8ca-4731-b2ab-e982c3ec6893" containerName="extract-content" Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.090298 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="40cdda9b-f8ca-4731-b2ab-e982c3ec6893" containerName="registry-server" Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.092006 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cmrpp" Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.105396 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cmrpp"] Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.178311 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd74t\" (UniqueName: \"kubernetes.io/projected/a9e3f247-149c-4eb7-9eff-7c13eb87a975-kube-api-access-fd74t\") pod \"certified-operators-cmrpp\" (UID: \"a9e3f247-149c-4eb7-9eff-7c13eb87a975\") " pod="openshift-marketplace/certified-operators-cmrpp" Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.178403 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9e3f247-149c-4eb7-9eff-7c13eb87a975-utilities\") pod \"certified-operators-cmrpp\" (UID: \"a9e3f247-149c-4eb7-9eff-7c13eb87a975\") " pod="openshift-marketplace/certified-operators-cmrpp" Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.178446 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9e3f247-149c-4eb7-9eff-7c13eb87a975-catalog-content\") pod \"certified-operators-cmrpp\" (UID: \"a9e3f247-149c-4eb7-9eff-7c13eb87a975\") " pod="openshift-marketplace/certified-operators-cmrpp" Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.279422 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9e3f247-149c-4eb7-9eff-7c13eb87a975-catalog-content\") pod \"certified-operators-cmrpp\" (UID: \"a9e3f247-149c-4eb7-9eff-7c13eb87a975\") " pod="openshift-marketplace/certified-operators-cmrpp" Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.279515 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd74t\" (UniqueName: \"kubernetes.io/projected/a9e3f247-149c-4eb7-9eff-7c13eb87a975-kube-api-access-fd74t\") pod \"certified-operators-cmrpp\" (UID: \"a9e3f247-149c-4eb7-9eff-7c13eb87a975\") " pod="openshift-marketplace/certified-operators-cmrpp" Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.279557 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9e3f247-149c-4eb7-9eff-7c13eb87a975-utilities\") pod \"certified-operators-cmrpp\" (UID: \"a9e3f247-149c-4eb7-9eff-7c13eb87a975\") " pod="openshift-marketplace/certified-operators-cmrpp" Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.280418 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9e3f247-149c-4eb7-9eff-7c13eb87a975-utilities\") pod \"certified-operators-cmrpp\" (UID: \"a9e3f247-149c-4eb7-9eff-7c13eb87a975\") " pod="openshift-marketplace/certified-operators-cmrpp" Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.280512 5094 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9e3f247-149c-4eb7-9eff-7c13eb87a975-catalog-content\") pod \"certified-operators-cmrpp\" (UID: \"a9e3f247-149c-4eb7-9eff-7c13eb87a975\") " pod="openshift-marketplace/certified-operators-cmrpp" Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.303607 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd74t\" (UniqueName: \"kubernetes.io/projected/a9e3f247-149c-4eb7-9eff-7c13eb87a975-kube-api-access-fd74t\") pod \"certified-operators-cmrpp\" (UID: \"a9e3f247-149c-4eb7-9eff-7c13eb87a975\") " pod="openshift-marketplace/certified-operators-cmrpp" Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.420861 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cmrpp" Feb 20 08:29:43 crc kubenswrapper[5094]: I0220 08:29:43.925798 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cmrpp"] Feb 20 08:29:44 crc kubenswrapper[5094]: I0220 08:29:44.286217 5094 generic.go:334] "Generic (PLEG): container finished" podID="a9e3f247-149c-4eb7-9eff-7c13eb87a975" containerID="939730f7eeb5ef990c74f994b9464fafb5722a575bcf5c0dbc9afd8f6d7709e6" exitCode=0 Feb 20 08:29:44 crc kubenswrapper[5094]: I0220 08:29:44.286279 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmrpp" event={"ID":"a9e3f247-149c-4eb7-9eff-7c13eb87a975","Type":"ContainerDied","Data":"939730f7eeb5ef990c74f994b9464fafb5722a575bcf5c0dbc9afd8f6d7709e6"} Feb 20 08:29:44 crc kubenswrapper[5094]: I0220 08:29:44.286317 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmrpp" event={"ID":"a9e3f247-149c-4eb7-9eff-7c13eb87a975","Type":"ContainerStarted","Data":"02149a89d5d5d34e333a8506120c9c4e6feb6b6b4482074736c18109ec9c31ae"} Feb 20 08:29:45 crc 
kubenswrapper[5094]: I0220 08:29:45.279813 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jjk54"] Feb 20 08:29:45 crc kubenswrapper[5094]: I0220 08:29:45.281556 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jjk54" Feb 20 08:29:45 crc kubenswrapper[5094]: I0220 08:29:45.295481 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmrpp" event={"ID":"a9e3f247-149c-4eb7-9eff-7c13eb87a975","Type":"ContainerStarted","Data":"431a40c5b8a92aa0e0b0d78933c1736fb71dc55652241da39e6461ff6fd59d6d"} Feb 20 08:29:45 crc kubenswrapper[5094]: I0220 08:29:45.296085 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jjk54"] Feb 20 08:29:45 crc kubenswrapper[5094]: I0220 08:29:45.415753 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-utilities\") pod \"redhat-operators-jjk54\" (UID: \"386a80fc-f69f-4bc8-bc43-8c3eba784c4e\") " pod="openshift-marketplace/redhat-operators-jjk54" Feb 20 08:29:45 crc kubenswrapper[5094]: I0220 08:29:45.415856 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-catalog-content\") pod \"redhat-operators-jjk54\" (UID: \"386a80fc-f69f-4bc8-bc43-8c3eba784c4e\") " pod="openshift-marketplace/redhat-operators-jjk54" Feb 20 08:29:45 crc kubenswrapper[5094]: I0220 08:29:45.416678 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9vr6\" (UniqueName: \"kubernetes.io/projected/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-kube-api-access-f9vr6\") pod \"redhat-operators-jjk54\" (UID: 
\"386a80fc-f69f-4bc8-bc43-8c3eba784c4e\") " pod="openshift-marketplace/redhat-operators-jjk54" Feb 20 08:29:45 crc kubenswrapper[5094]: I0220 08:29:45.518463 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9vr6\" (UniqueName: \"kubernetes.io/projected/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-kube-api-access-f9vr6\") pod \"redhat-operators-jjk54\" (UID: \"386a80fc-f69f-4bc8-bc43-8c3eba784c4e\") " pod="openshift-marketplace/redhat-operators-jjk54" Feb 20 08:29:45 crc kubenswrapper[5094]: I0220 08:29:45.518565 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-utilities\") pod \"redhat-operators-jjk54\" (UID: \"386a80fc-f69f-4bc8-bc43-8c3eba784c4e\") " pod="openshift-marketplace/redhat-operators-jjk54" Feb 20 08:29:45 crc kubenswrapper[5094]: I0220 08:29:45.518611 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-catalog-content\") pod \"redhat-operators-jjk54\" (UID: \"386a80fc-f69f-4bc8-bc43-8c3eba784c4e\") " pod="openshift-marketplace/redhat-operators-jjk54" Feb 20 08:29:45 crc kubenswrapper[5094]: I0220 08:29:45.519208 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-utilities\") pod \"redhat-operators-jjk54\" (UID: \"386a80fc-f69f-4bc8-bc43-8c3eba784c4e\") " pod="openshift-marketplace/redhat-operators-jjk54" Feb 20 08:29:45 crc kubenswrapper[5094]: I0220 08:29:45.519222 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-catalog-content\") pod \"redhat-operators-jjk54\" (UID: \"386a80fc-f69f-4bc8-bc43-8c3eba784c4e\") " 
pod="openshift-marketplace/redhat-operators-jjk54" Feb 20 08:29:45 crc kubenswrapper[5094]: I0220 08:29:45.542228 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9vr6\" (UniqueName: \"kubernetes.io/projected/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-kube-api-access-f9vr6\") pod \"redhat-operators-jjk54\" (UID: \"386a80fc-f69f-4bc8-bc43-8c3eba784c4e\") " pod="openshift-marketplace/redhat-operators-jjk54" Feb 20 08:29:45 crc kubenswrapper[5094]: I0220 08:29:45.658851 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jjk54" Feb 20 08:29:46 crc kubenswrapper[5094]: I0220 08:29:46.140112 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jjk54"] Feb 20 08:29:46 crc kubenswrapper[5094]: W0220 08:29:46.144123 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod386a80fc_f69f_4bc8_bc43_8c3eba784c4e.slice/crio-9e5e9a40fac3a47fbd176299b18411e902d13de4ae6e824acdcc6a2fb66c7a0b WatchSource:0}: Error finding container 9e5e9a40fac3a47fbd176299b18411e902d13de4ae6e824acdcc6a2fb66c7a0b: Status 404 returned error can't find the container with id 9e5e9a40fac3a47fbd176299b18411e902d13de4ae6e824acdcc6a2fb66c7a0b Feb 20 08:29:46 crc kubenswrapper[5094]: I0220 08:29:46.308773 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjk54" event={"ID":"386a80fc-f69f-4bc8-bc43-8c3eba784c4e","Type":"ContainerStarted","Data":"9e5e9a40fac3a47fbd176299b18411e902d13de4ae6e824acdcc6a2fb66c7a0b"} Feb 20 08:29:46 crc kubenswrapper[5094]: I0220 08:29:46.313661 5094 generic.go:334] "Generic (PLEG): container finished" podID="a9e3f247-149c-4eb7-9eff-7c13eb87a975" containerID="431a40c5b8a92aa0e0b0d78933c1736fb71dc55652241da39e6461ff6fd59d6d" exitCode=0 Feb 20 08:29:46 crc kubenswrapper[5094]: I0220 08:29:46.313736 5094 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmrpp" event={"ID":"a9e3f247-149c-4eb7-9eff-7c13eb87a975","Type":"ContainerDied","Data":"431a40c5b8a92aa0e0b0d78933c1736fb71dc55652241da39e6461ff6fd59d6d"} Feb 20 08:29:47 crc kubenswrapper[5094]: I0220 08:29:47.324865 5094 generic.go:334] "Generic (PLEG): container finished" podID="386a80fc-f69f-4bc8-bc43-8c3eba784c4e" containerID="3a733e5c205009cb2fd844f7de4a834d196c2b13a86d56a1de95a1ff048bfa54" exitCode=0 Feb 20 08:29:47 crc kubenswrapper[5094]: I0220 08:29:47.324969 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjk54" event={"ID":"386a80fc-f69f-4bc8-bc43-8c3eba784c4e","Type":"ContainerDied","Data":"3a733e5c205009cb2fd844f7de4a834d196c2b13a86d56a1de95a1ff048bfa54"} Feb 20 08:29:47 crc kubenswrapper[5094]: I0220 08:29:47.328849 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmrpp" event={"ID":"a9e3f247-149c-4eb7-9eff-7c13eb87a975","Type":"ContainerStarted","Data":"eac50eda1d7a67c01d32002b172303538da2343da19455ed4ab428e978e625d5"} Feb 20 08:29:47 crc kubenswrapper[5094]: I0220 08:29:47.368895 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cmrpp" podStartSLOduration=1.936447024 podStartE2EDuration="4.368876419s" podCreationTimestamp="2026-02-20 08:29:43 +0000 UTC" firstStartedPulling="2026-02-20 08:29:44.288372831 +0000 UTC m=+6199.160999572" lastFinishedPulling="2026-02-20 08:29:46.720802266 +0000 UTC m=+6201.593428967" observedRunningTime="2026-02-20 08:29:47.366154833 +0000 UTC m=+6202.238781544" watchObservedRunningTime="2026-02-20 08:29:47.368876419 +0000 UTC m=+6202.241503130" Feb 20 08:29:48 crc kubenswrapper[5094]: I0220 08:29:48.336351 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjk54" 
event={"ID":"386a80fc-f69f-4bc8-bc43-8c3eba784c4e","Type":"ContainerStarted","Data":"333f6a9a58287ad4b807a6cac2d3e9ce998c6f00485066a36639c0b4c3e5c733"} Feb 20 08:29:49 crc kubenswrapper[5094]: I0220 08:29:49.346169 5094 generic.go:334] "Generic (PLEG): container finished" podID="386a80fc-f69f-4bc8-bc43-8c3eba784c4e" containerID="333f6a9a58287ad4b807a6cac2d3e9ce998c6f00485066a36639c0b4c3e5c733" exitCode=0 Feb 20 08:29:49 crc kubenswrapper[5094]: I0220 08:29:49.346303 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjk54" event={"ID":"386a80fc-f69f-4bc8-bc43-8c3eba784c4e","Type":"ContainerDied","Data":"333f6a9a58287ad4b807a6cac2d3e9ce998c6f00485066a36639c0b4c3e5c733"} Feb 20 08:29:50 crc kubenswrapper[5094]: I0220 08:29:50.354731 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjk54" event={"ID":"386a80fc-f69f-4bc8-bc43-8c3eba784c4e","Type":"ContainerStarted","Data":"abe35586d5a9ae8d903350883629e148cd91b6ae8db2c363e0547eb2e6187125"} Feb 20 08:29:50 crc kubenswrapper[5094]: I0220 08:29:50.370791 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jjk54" podStartSLOduration=2.99574619 podStartE2EDuration="5.370772458s" podCreationTimestamp="2026-02-20 08:29:45 +0000 UTC" firstStartedPulling="2026-02-20 08:29:47.327000227 +0000 UTC m=+6202.199626938" lastFinishedPulling="2026-02-20 08:29:49.702026495 +0000 UTC m=+6204.574653206" observedRunningTime="2026-02-20 08:29:50.369244631 +0000 UTC m=+6205.241871342" watchObservedRunningTime="2026-02-20 08:29:50.370772458 +0000 UTC m=+6205.243399179" Feb 20 08:29:53 crc kubenswrapper[5094]: I0220 08:29:53.421881 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cmrpp" Feb 20 08:29:53 crc kubenswrapper[5094]: I0220 08:29:53.422198 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-cmrpp" Feb 20 08:29:53 crc kubenswrapper[5094]: I0220 08:29:53.479128 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cmrpp" Feb 20 08:29:54 crc kubenswrapper[5094]: I0220 08:29:54.439266 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cmrpp" Feb 20 08:29:54 crc kubenswrapper[5094]: I0220 08:29:54.871347 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cmrpp"] Feb 20 08:29:55 crc kubenswrapper[5094]: I0220 08:29:55.659883 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jjk54" Feb 20 08:29:55 crc kubenswrapper[5094]: I0220 08:29:55.659936 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jjk54" Feb 20 08:29:56 crc kubenswrapper[5094]: I0220 08:29:56.409408 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cmrpp" podUID="a9e3f247-149c-4eb7-9eff-7c13eb87a975" containerName="registry-server" containerID="cri-o://eac50eda1d7a67c01d32002b172303538da2343da19455ed4ab428e978e625d5" gracePeriod=2 Feb 20 08:29:56 crc kubenswrapper[5094]: I0220 08:29:56.704903 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jjk54" podUID="386a80fc-f69f-4bc8-bc43-8c3eba784c4e" containerName="registry-server" probeResult="failure" output=< Feb 20 08:29:56 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 08:29:56 crc kubenswrapper[5094]: > Feb 20 08:29:56 crc kubenswrapper[5094]: I0220 08:29:56.812079 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cmrpp" Feb 20 08:29:56 crc kubenswrapper[5094]: I0220 08:29:56.988462 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9e3f247-149c-4eb7-9eff-7c13eb87a975-catalog-content\") pod \"a9e3f247-149c-4eb7-9eff-7c13eb87a975\" (UID: \"a9e3f247-149c-4eb7-9eff-7c13eb87a975\") " Feb 20 08:29:56 crc kubenswrapper[5094]: I0220 08:29:56.988601 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9e3f247-149c-4eb7-9eff-7c13eb87a975-utilities\") pod \"a9e3f247-149c-4eb7-9eff-7c13eb87a975\" (UID: \"a9e3f247-149c-4eb7-9eff-7c13eb87a975\") " Feb 20 08:29:56 crc kubenswrapper[5094]: I0220 08:29:56.988633 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fd74t\" (UniqueName: \"kubernetes.io/projected/a9e3f247-149c-4eb7-9eff-7c13eb87a975-kube-api-access-fd74t\") pod \"a9e3f247-149c-4eb7-9eff-7c13eb87a975\" (UID: \"a9e3f247-149c-4eb7-9eff-7c13eb87a975\") " Feb 20 08:29:56 crc kubenswrapper[5094]: I0220 08:29:56.990532 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9e3f247-149c-4eb7-9eff-7c13eb87a975-utilities" (OuterVolumeSpecName: "utilities") pod "a9e3f247-149c-4eb7-9eff-7c13eb87a975" (UID: "a9e3f247-149c-4eb7-9eff-7c13eb87a975"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:29:56 crc kubenswrapper[5094]: I0220 08:29:56.994979 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9e3f247-149c-4eb7-9eff-7c13eb87a975-kube-api-access-fd74t" (OuterVolumeSpecName: "kube-api-access-fd74t") pod "a9e3f247-149c-4eb7-9eff-7c13eb87a975" (UID: "a9e3f247-149c-4eb7-9eff-7c13eb87a975"). InnerVolumeSpecName "kube-api-access-fd74t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.040764 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9e3f247-149c-4eb7-9eff-7c13eb87a975-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9e3f247-149c-4eb7-9eff-7c13eb87a975" (UID: "a9e3f247-149c-4eb7-9eff-7c13eb87a975"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.090680 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9e3f247-149c-4eb7-9eff-7c13eb87a975-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.090734 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9e3f247-149c-4eb7-9eff-7c13eb87a975-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.090745 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fd74t\" (UniqueName: \"kubernetes.io/projected/a9e3f247-149c-4eb7-9eff-7c13eb87a975-kube-api-access-fd74t\") on node \"crc\" DevicePath \"\"" Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.417805 5094 generic.go:334] "Generic (PLEG): container finished" podID="a9e3f247-149c-4eb7-9eff-7c13eb87a975" containerID="eac50eda1d7a67c01d32002b172303538da2343da19455ed4ab428e978e625d5" exitCode=0 Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.417844 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cmrpp" event={"ID":"a9e3f247-149c-4eb7-9eff-7c13eb87a975","Type":"ContainerDied","Data":"eac50eda1d7a67c01d32002b172303538da2343da19455ed4ab428e978e625d5"} Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.417875 5094 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-cmrpp" event={"ID":"a9e3f247-149c-4eb7-9eff-7c13eb87a975","Type":"ContainerDied","Data":"02149a89d5d5d34e333a8506120c9c4e6feb6b6b4482074736c18109ec9c31ae"} Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.417892 5094 scope.go:117] "RemoveContainer" containerID="eac50eda1d7a67c01d32002b172303538da2343da19455ed4ab428e978e625d5" Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.417906 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cmrpp" Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.445837 5094 scope.go:117] "RemoveContainer" containerID="431a40c5b8a92aa0e0b0d78933c1736fb71dc55652241da39e6461ff6fd59d6d" Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.458446 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cmrpp"] Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.467408 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cmrpp"] Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.472487 5094 scope.go:117] "RemoveContainer" containerID="939730f7eeb5ef990c74f994b9464fafb5722a575bcf5c0dbc9afd8f6d7709e6" Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.509415 5094 scope.go:117] "RemoveContainer" containerID="eac50eda1d7a67c01d32002b172303538da2343da19455ed4ab428e978e625d5" Feb 20 08:29:57 crc kubenswrapper[5094]: E0220 08:29:57.509842 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eac50eda1d7a67c01d32002b172303538da2343da19455ed4ab428e978e625d5\": container with ID starting with eac50eda1d7a67c01d32002b172303538da2343da19455ed4ab428e978e625d5 not found: ID does not exist" containerID="eac50eda1d7a67c01d32002b172303538da2343da19455ed4ab428e978e625d5" Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 
08:29:57.509897 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eac50eda1d7a67c01d32002b172303538da2343da19455ed4ab428e978e625d5"} err="failed to get container status \"eac50eda1d7a67c01d32002b172303538da2343da19455ed4ab428e978e625d5\": rpc error: code = NotFound desc = could not find container \"eac50eda1d7a67c01d32002b172303538da2343da19455ed4ab428e978e625d5\": container with ID starting with eac50eda1d7a67c01d32002b172303538da2343da19455ed4ab428e978e625d5 not found: ID does not exist" Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.509925 5094 scope.go:117] "RemoveContainer" containerID="431a40c5b8a92aa0e0b0d78933c1736fb71dc55652241da39e6461ff6fd59d6d" Feb 20 08:29:57 crc kubenswrapper[5094]: E0220 08:29:57.510215 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"431a40c5b8a92aa0e0b0d78933c1736fb71dc55652241da39e6461ff6fd59d6d\": container with ID starting with 431a40c5b8a92aa0e0b0d78933c1736fb71dc55652241da39e6461ff6fd59d6d not found: ID does not exist" containerID="431a40c5b8a92aa0e0b0d78933c1736fb71dc55652241da39e6461ff6fd59d6d" Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.510235 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"431a40c5b8a92aa0e0b0d78933c1736fb71dc55652241da39e6461ff6fd59d6d"} err="failed to get container status \"431a40c5b8a92aa0e0b0d78933c1736fb71dc55652241da39e6461ff6fd59d6d\": rpc error: code = NotFound desc = could not find container \"431a40c5b8a92aa0e0b0d78933c1736fb71dc55652241da39e6461ff6fd59d6d\": container with ID starting with 431a40c5b8a92aa0e0b0d78933c1736fb71dc55652241da39e6461ff6fd59d6d not found: ID does not exist" Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.510247 5094 scope.go:117] "RemoveContainer" containerID="939730f7eeb5ef990c74f994b9464fafb5722a575bcf5c0dbc9afd8f6d7709e6" Feb 20 08:29:57 crc 
kubenswrapper[5094]: E0220 08:29:57.510807 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"939730f7eeb5ef990c74f994b9464fafb5722a575bcf5c0dbc9afd8f6d7709e6\": container with ID starting with 939730f7eeb5ef990c74f994b9464fafb5722a575bcf5c0dbc9afd8f6d7709e6 not found: ID does not exist" containerID="939730f7eeb5ef990c74f994b9464fafb5722a575bcf5c0dbc9afd8f6d7709e6" Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.510847 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"939730f7eeb5ef990c74f994b9464fafb5722a575bcf5c0dbc9afd8f6d7709e6"} err="failed to get container status \"939730f7eeb5ef990c74f994b9464fafb5722a575bcf5c0dbc9afd8f6d7709e6\": rpc error: code = NotFound desc = could not find container \"939730f7eeb5ef990c74f994b9464fafb5722a575bcf5c0dbc9afd8f6d7709e6\": container with ID starting with 939730f7eeb5ef990c74f994b9464fafb5722a575bcf5c0dbc9afd8f6d7709e6 not found: ID does not exist" Feb 20 08:29:57 crc kubenswrapper[5094]: I0220 08:29:57.855877 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9e3f247-149c-4eb7-9eff-7c13eb87a975" path="/var/lib/kubelet/pods/a9e3f247-149c-4eb7-9eff-7c13eb87a975/volumes" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.162219 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh"] Feb 20 08:30:00 crc kubenswrapper[5094]: E0220 08:30:00.163088 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9e3f247-149c-4eb7-9eff-7c13eb87a975" containerName="extract-utilities" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.163128 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9e3f247-149c-4eb7-9eff-7c13eb87a975" containerName="extract-utilities" Feb 20 08:30:00 crc kubenswrapper[5094]: E0220 08:30:00.163187 5094 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="a9e3f247-149c-4eb7-9eff-7c13eb87a975" containerName="registry-server" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.163198 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9e3f247-149c-4eb7-9eff-7c13eb87a975" containerName="registry-server" Feb 20 08:30:00 crc kubenswrapper[5094]: E0220 08:30:00.163230 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9e3f247-149c-4eb7-9eff-7c13eb87a975" containerName="extract-content" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.163239 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9e3f247-149c-4eb7-9eff-7c13eb87a975" containerName="extract-content" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.163449 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9e3f247-149c-4eb7-9eff-7c13eb87a975" containerName="registry-server" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.164227 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.167612 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.167895 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.177184 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh"] Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.350462 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c7c75bd-9812-4d90-80ea-08eda0f926fc-config-volume\") pod 
\"collect-profiles-29526270-cm5bh\" (UID: \"5c7c75bd-9812-4d90-80ea-08eda0f926fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.350557 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c7c75bd-9812-4d90-80ea-08eda0f926fc-secret-volume\") pod \"collect-profiles-29526270-cm5bh\" (UID: \"5c7c75bd-9812-4d90-80ea-08eda0f926fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.350641 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46mmf\" (UniqueName: \"kubernetes.io/projected/5c7c75bd-9812-4d90-80ea-08eda0f926fc-kube-api-access-46mmf\") pod \"collect-profiles-29526270-cm5bh\" (UID: \"5c7c75bd-9812-4d90-80ea-08eda0f926fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.453197 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c7c75bd-9812-4d90-80ea-08eda0f926fc-config-volume\") pod \"collect-profiles-29526270-cm5bh\" (UID: \"5c7c75bd-9812-4d90-80ea-08eda0f926fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.453601 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c7c75bd-9812-4d90-80ea-08eda0f926fc-secret-volume\") pod \"collect-profiles-29526270-cm5bh\" (UID: \"5c7c75bd-9812-4d90-80ea-08eda0f926fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.453621 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-46mmf\" (UniqueName: \"kubernetes.io/projected/5c7c75bd-9812-4d90-80ea-08eda0f926fc-kube-api-access-46mmf\") pod \"collect-profiles-29526270-cm5bh\" (UID: \"5c7c75bd-9812-4d90-80ea-08eda0f926fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.455233 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c7c75bd-9812-4d90-80ea-08eda0f926fc-config-volume\") pod \"collect-profiles-29526270-cm5bh\" (UID: \"5c7c75bd-9812-4d90-80ea-08eda0f926fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.464320 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c7c75bd-9812-4d90-80ea-08eda0f926fc-secret-volume\") pod \"collect-profiles-29526270-cm5bh\" (UID: \"5c7c75bd-9812-4d90-80ea-08eda0f926fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.470283 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46mmf\" (UniqueName: \"kubernetes.io/projected/5c7c75bd-9812-4d90-80ea-08eda0f926fc-kube-api-access-46mmf\") pod \"collect-profiles-29526270-cm5bh\" (UID: \"5c7c75bd-9812-4d90-80ea-08eda0f926fc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.496640 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" Feb 20 08:30:00 crc kubenswrapper[5094]: I0220 08:30:00.718327 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh"] Feb 20 08:30:01 crc kubenswrapper[5094]: I0220 08:30:01.458663 5094 generic.go:334] "Generic (PLEG): container finished" podID="5c7c75bd-9812-4d90-80ea-08eda0f926fc" containerID="89a68b3798c7e61a71c5a1f766e1642edc8983858caba5c4db74959c3a8cdcec" exitCode=0 Feb 20 08:30:01 crc kubenswrapper[5094]: I0220 08:30:01.458760 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" event={"ID":"5c7c75bd-9812-4d90-80ea-08eda0f926fc","Type":"ContainerDied","Data":"89a68b3798c7e61a71c5a1f766e1642edc8983858caba5c4db74959c3a8cdcec"} Feb 20 08:30:01 crc kubenswrapper[5094]: I0220 08:30:01.458793 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" event={"ID":"5c7c75bd-9812-4d90-80ea-08eda0f926fc","Type":"ContainerStarted","Data":"fd34be79f2c5a696df0c96f7a0bfdd55ee04248c01f08074b0c8cbd32ebf1e54"} Feb 20 08:30:02 crc kubenswrapper[5094]: I0220 08:30:02.771585 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" Feb 20 08:30:02 crc kubenswrapper[5094]: I0220 08:30:02.792422 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c7c75bd-9812-4d90-80ea-08eda0f926fc-secret-volume\") pod \"5c7c75bd-9812-4d90-80ea-08eda0f926fc\" (UID: \"5c7c75bd-9812-4d90-80ea-08eda0f926fc\") " Feb 20 08:30:02 crc kubenswrapper[5094]: I0220 08:30:02.792468 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46mmf\" (UniqueName: \"kubernetes.io/projected/5c7c75bd-9812-4d90-80ea-08eda0f926fc-kube-api-access-46mmf\") pod \"5c7c75bd-9812-4d90-80ea-08eda0f926fc\" (UID: \"5c7c75bd-9812-4d90-80ea-08eda0f926fc\") " Feb 20 08:30:02 crc kubenswrapper[5094]: I0220 08:30:02.792536 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c7c75bd-9812-4d90-80ea-08eda0f926fc-config-volume\") pod \"5c7c75bd-9812-4d90-80ea-08eda0f926fc\" (UID: \"5c7c75bd-9812-4d90-80ea-08eda0f926fc\") " Feb 20 08:30:02 crc kubenswrapper[5094]: I0220 08:30:02.793577 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c7c75bd-9812-4d90-80ea-08eda0f926fc-config-volume" (OuterVolumeSpecName: "config-volume") pod "5c7c75bd-9812-4d90-80ea-08eda0f926fc" (UID: "5c7c75bd-9812-4d90-80ea-08eda0f926fc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:30:02 crc kubenswrapper[5094]: I0220 08:30:02.799061 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c7c75bd-9812-4d90-80ea-08eda0f926fc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5c7c75bd-9812-4d90-80ea-08eda0f926fc" (UID: "5c7c75bd-9812-4d90-80ea-08eda0f926fc"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:30:02 crc kubenswrapper[5094]: I0220 08:30:02.799107 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c7c75bd-9812-4d90-80ea-08eda0f926fc-kube-api-access-46mmf" (OuterVolumeSpecName: "kube-api-access-46mmf") pod "5c7c75bd-9812-4d90-80ea-08eda0f926fc" (UID: "5c7c75bd-9812-4d90-80ea-08eda0f926fc"). InnerVolumeSpecName "kube-api-access-46mmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:30:02 crc kubenswrapper[5094]: I0220 08:30:02.894132 5094 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c7c75bd-9812-4d90-80ea-08eda0f926fc-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 08:30:02 crc kubenswrapper[5094]: I0220 08:30:02.894184 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46mmf\" (UniqueName: \"kubernetes.io/projected/5c7c75bd-9812-4d90-80ea-08eda0f926fc-kube-api-access-46mmf\") on node \"crc\" DevicePath \"\"" Feb 20 08:30:02 crc kubenswrapper[5094]: I0220 08:30:02.894196 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c7c75bd-9812-4d90-80ea-08eda0f926fc-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 08:30:03 crc kubenswrapper[5094]: I0220 08:30:03.476119 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" event={"ID":"5c7c75bd-9812-4d90-80ea-08eda0f926fc","Type":"ContainerDied","Data":"fd34be79f2c5a696df0c96f7a0bfdd55ee04248c01f08074b0c8cbd32ebf1e54"} Feb 20 08:30:03 crc kubenswrapper[5094]: I0220 08:30:03.476158 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd34be79f2c5a696df0c96f7a0bfdd55ee04248c01f08074b0c8cbd32ebf1e54" Feb 20 08:30:03 crc kubenswrapper[5094]: I0220 08:30:03.476220 5094 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh" Feb 20 08:30:03 crc kubenswrapper[5094]: I0220 08:30:03.857889 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz"] Feb 20 08:30:03 crc kubenswrapper[5094]: I0220 08:30:03.868048 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526225-wq7wz"] Feb 20 08:30:05 crc kubenswrapper[5094]: I0220 08:30:05.716929 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jjk54" Feb 20 08:30:05 crc kubenswrapper[5094]: I0220 08:30:05.785234 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jjk54" Feb 20 08:30:05 crc kubenswrapper[5094]: I0220 08:30:05.849361 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c1d2dad-446d-40c2-aceb-de13411f5c93" path="/var/lib/kubelet/pods/1c1d2dad-446d-40c2-aceb-de13411f5c93/volumes" Feb 20 08:30:05 crc kubenswrapper[5094]: I0220 08:30:05.961341 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jjk54"] Feb 20 08:30:07 crc kubenswrapper[5094]: I0220 08:30:07.512947 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jjk54" podUID="386a80fc-f69f-4bc8-bc43-8c3eba784c4e" containerName="registry-server" containerID="cri-o://abe35586d5a9ae8d903350883629e148cd91b6ae8db2c363e0547eb2e6187125" gracePeriod=2 Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.169615 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jjk54" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.181989 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9vr6\" (UniqueName: \"kubernetes.io/projected/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-kube-api-access-f9vr6\") pod \"386a80fc-f69f-4bc8-bc43-8c3eba784c4e\" (UID: \"386a80fc-f69f-4bc8-bc43-8c3eba784c4e\") " Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.182131 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-catalog-content\") pod \"386a80fc-f69f-4bc8-bc43-8c3eba784c4e\" (UID: \"386a80fc-f69f-4bc8-bc43-8c3eba784c4e\") " Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.182156 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-utilities\") pod \"386a80fc-f69f-4bc8-bc43-8c3eba784c4e\" (UID: \"386a80fc-f69f-4bc8-bc43-8c3eba784c4e\") " Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.183175 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-utilities" (OuterVolumeSpecName: "utilities") pod "386a80fc-f69f-4bc8-bc43-8c3eba784c4e" (UID: "386a80fc-f69f-4bc8-bc43-8c3eba784c4e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.190207 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-kube-api-access-f9vr6" (OuterVolumeSpecName: "kube-api-access-f9vr6") pod "386a80fc-f69f-4bc8-bc43-8c3eba784c4e" (UID: "386a80fc-f69f-4bc8-bc43-8c3eba784c4e"). InnerVolumeSpecName "kube-api-access-f9vr6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.284262 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9vr6\" (UniqueName: \"kubernetes.io/projected/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-kube-api-access-f9vr6\") on node \"crc\" DevicePath \"\"" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.284631 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.311491 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "386a80fc-f69f-4bc8-bc43-8c3eba784c4e" (UID: "386a80fc-f69f-4bc8-bc43-8c3eba784c4e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.385495 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/386a80fc-f69f-4bc8-bc43-8c3eba784c4e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.524896 5094 generic.go:334] "Generic (PLEG): container finished" podID="386a80fc-f69f-4bc8-bc43-8c3eba784c4e" containerID="abe35586d5a9ae8d903350883629e148cd91b6ae8db2c363e0547eb2e6187125" exitCode=0 Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.524948 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jjk54" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.524970 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjk54" event={"ID":"386a80fc-f69f-4bc8-bc43-8c3eba784c4e","Type":"ContainerDied","Data":"abe35586d5a9ae8d903350883629e148cd91b6ae8db2c363e0547eb2e6187125"} Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.526186 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jjk54" event={"ID":"386a80fc-f69f-4bc8-bc43-8c3eba784c4e","Type":"ContainerDied","Data":"9e5e9a40fac3a47fbd176299b18411e902d13de4ae6e824acdcc6a2fb66c7a0b"} Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.526243 5094 scope.go:117] "RemoveContainer" containerID="abe35586d5a9ae8d903350883629e148cd91b6ae8db2c363e0547eb2e6187125" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.558259 5094 scope.go:117] "RemoveContainer" containerID="333f6a9a58287ad4b807a6cac2d3e9ce998c6f00485066a36639c0b4c3e5c733" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.560456 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jjk54"] Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.577567 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jjk54"] Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.597044 5094 scope.go:117] "RemoveContainer" containerID="3a733e5c205009cb2fd844f7de4a834d196c2b13a86d56a1de95a1ff048bfa54" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.631388 5094 scope.go:117] "RemoveContainer" containerID="abe35586d5a9ae8d903350883629e148cd91b6ae8db2c363e0547eb2e6187125" Feb 20 08:30:08 crc kubenswrapper[5094]: E0220 08:30:08.635514 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"abe35586d5a9ae8d903350883629e148cd91b6ae8db2c363e0547eb2e6187125\": container with ID starting with abe35586d5a9ae8d903350883629e148cd91b6ae8db2c363e0547eb2e6187125 not found: ID does not exist" containerID="abe35586d5a9ae8d903350883629e148cd91b6ae8db2c363e0547eb2e6187125" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.635579 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abe35586d5a9ae8d903350883629e148cd91b6ae8db2c363e0547eb2e6187125"} err="failed to get container status \"abe35586d5a9ae8d903350883629e148cd91b6ae8db2c363e0547eb2e6187125\": rpc error: code = NotFound desc = could not find container \"abe35586d5a9ae8d903350883629e148cd91b6ae8db2c363e0547eb2e6187125\": container with ID starting with abe35586d5a9ae8d903350883629e148cd91b6ae8db2c363e0547eb2e6187125 not found: ID does not exist" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.635617 5094 scope.go:117] "RemoveContainer" containerID="333f6a9a58287ad4b807a6cac2d3e9ce998c6f00485066a36639c0b4c3e5c733" Feb 20 08:30:08 crc kubenswrapper[5094]: E0220 08:30:08.636271 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"333f6a9a58287ad4b807a6cac2d3e9ce998c6f00485066a36639c0b4c3e5c733\": container with ID starting with 333f6a9a58287ad4b807a6cac2d3e9ce998c6f00485066a36639c0b4c3e5c733 not found: ID does not exist" containerID="333f6a9a58287ad4b807a6cac2d3e9ce998c6f00485066a36639c0b4c3e5c733" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.636447 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"333f6a9a58287ad4b807a6cac2d3e9ce998c6f00485066a36639c0b4c3e5c733"} err="failed to get container status \"333f6a9a58287ad4b807a6cac2d3e9ce998c6f00485066a36639c0b4c3e5c733\": rpc error: code = NotFound desc = could not find container \"333f6a9a58287ad4b807a6cac2d3e9ce998c6f00485066a36639c0b4c3e5c733\": container with ID 
starting with 333f6a9a58287ad4b807a6cac2d3e9ce998c6f00485066a36639c0b4c3e5c733 not found: ID does not exist" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.636626 5094 scope.go:117] "RemoveContainer" containerID="3a733e5c205009cb2fd844f7de4a834d196c2b13a86d56a1de95a1ff048bfa54" Feb 20 08:30:08 crc kubenswrapper[5094]: E0220 08:30:08.637241 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a733e5c205009cb2fd844f7de4a834d196c2b13a86d56a1de95a1ff048bfa54\": container with ID starting with 3a733e5c205009cb2fd844f7de4a834d196c2b13a86d56a1de95a1ff048bfa54 not found: ID does not exist" containerID="3a733e5c205009cb2fd844f7de4a834d196c2b13a86d56a1de95a1ff048bfa54" Feb 20 08:30:08 crc kubenswrapper[5094]: I0220 08:30:08.637281 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a733e5c205009cb2fd844f7de4a834d196c2b13a86d56a1de95a1ff048bfa54"} err="failed to get container status \"3a733e5c205009cb2fd844f7de4a834d196c2b13a86d56a1de95a1ff048bfa54\": rpc error: code = NotFound desc = could not find container \"3a733e5c205009cb2fd844f7de4a834d196c2b13a86d56a1de95a1ff048bfa54\": container with ID starting with 3a733e5c205009cb2fd844f7de4a834d196c2b13a86d56a1de95a1ff048bfa54 not found: ID does not exist" Feb 20 08:30:09 crc kubenswrapper[5094]: I0220 08:30:09.850506 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="386a80fc-f69f-4bc8-bc43-8c3eba784c4e" path="/var/lib/kubelet/pods/386a80fc-f69f-4bc8-bc43-8c3eba784c4e/volumes" Feb 20 08:30:34 crc kubenswrapper[5094]: I0220 08:30:34.107524 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:30:34 crc kubenswrapper[5094]: I0220 
08:30:34.108106 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:30:36 crc kubenswrapper[5094]: I0220 08:30:36.558408 5094 scope.go:117] "RemoveContainer" containerID="362eb9911b24b0b4b251a0b53f012f2ce0407b0cfe5ced9cb86222665657d8eb" Feb 20 08:31:04 crc kubenswrapper[5094]: I0220 08:31:04.107341 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:31:04 crc kubenswrapper[5094]: I0220 08:31:04.108000 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:31:34 crc kubenswrapper[5094]: I0220 08:31:34.106825 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:31:34 crc kubenswrapper[5094]: I0220 08:31:34.107528 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" Feb 20 08:31:34 crc kubenswrapper[5094]: I0220 08:31:34.107596 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 08:31:34 crc kubenswrapper[5094]: I0220 08:31:34.108411 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 08:31:34 crc kubenswrapper[5094]: I0220 08:31:34.108494 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" gracePeriod=600 Feb 20 08:31:34 crc kubenswrapper[5094]: E0220 08:31:34.235043 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:31:34 crc kubenswrapper[5094]: I0220 08:31:34.246877 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" exitCode=0 Feb 20 08:31:34 crc kubenswrapper[5094]: I0220 08:31:34.246936 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050"} Feb 20 08:31:34 crc kubenswrapper[5094]: I0220 08:31:34.247045 5094 scope.go:117] "RemoveContainer" containerID="3240a62ab36af910255d2e2cb0810a1e622f0e56dc51a3b3a6a36dc8406c37ea" Feb 20 08:31:34 crc kubenswrapper[5094]: I0220 08:31:34.248371 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:31:34 crc kubenswrapper[5094]: E0220 08:31:34.248965 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:31:36 crc kubenswrapper[5094]: I0220 08:31:36.641936 5094 scope.go:117] "RemoveContainer" containerID="5fe6cd402f52794a3175518b1d65628a3975facf339971987772005f254a31df" Feb 20 08:31:45 crc kubenswrapper[5094]: I0220 08:31:45.840838 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:31:45 crc kubenswrapper[5094]: E0220 08:31:45.841663 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:31:58 crc kubenswrapper[5094]: I0220 08:31:58.840953 5094 scope.go:117] "RemoveContainer" 
containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:31:58 crc kubenswrapper[5094]: E0220 08:31:58.842335 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:32:11 crc kubenswrapper[5094]: I0220 08:32:11.840836 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:32:11 crc kubenswrapper[5094]: E0220 08:32:11.841886 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:32:23 crc kubenswrapper[5094]: I0220 08:32:23.841232 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:32:23 crc kubenswrapper[5094]: E0220 08:32:23.842259 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:32:38 crc kubenswrapper[5094]: I0220 08:32:38.839977 5094 scope.go:117] 
"RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:32:38 crc kubenswrapper[5094]: E0220 08:32:38.840691 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:32:50 crc kubenswrapper[5094]: I0220 08:32:50.840258 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:32:50 crc kubenswrapper[5094]: E0220 08:32:50.841054 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:33:02 crc kubenswrapper[5094]: I0220 08:33:02.840425 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:33:02 crc kubenswrapper[5094]: E0220 08:33:02.841288 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:33:13 crc kubenswrapper[5094]: I0220 08:33:13.840574 
5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:33:13 crc kubenswrapper[5094]: E0220 08:33:13.841862 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:33:24 crc kubenswrapper[5094]: I0220 08:33:24.840296 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:33:24 crc kubenswrapper[5094]: E0220 08:33:24.841029 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:33:35 crc kubenswrapper[5094]: I0220 08:33:35.849047 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:33:35 crc kubenswrapper[5094]: E0220 08:33:35.850201 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:33:50 crc kubenswrapper[5094]: I0220 
08:33:50.839946 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:33:50 crc kubenswrapper[5094]: E0220 08:33:50.840620 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:34:04 crc kubenswrapper[5094]: I0220 08:34:04.840298 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:34:04 crc kubenswrapper[5094]: E0220 08:34:04.841084 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:34:08 crc kubenswrapper[5094]: I0220 08:34:08.068895 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-6d969f468d-fd5gv" podUID="059e3724-d657-4f2e-beec-f4f55e09e498" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.47:8080/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 08:34:19 crc kubenswrapper[5094]: I0220 08:34:19.840829 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:34:19 crc kubenswrapper[5094]: E0220 08:34:19.841780 5094 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:34:31 crc kubenswrapper[5094]: I0220 08:34:31.841151 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:34:31 crc kubenswrapper[5094]: E0220 08:34:31.842874 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:34:45 crc kubenswrapper[5094]: I0220 08:34:45.848363 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:34:45 crc kubenswrapper[5094]: E0220 08:34:45.849449 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:34:57 crc kubenswrapper[5094]: I0220 08:34:57.840586 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:34:57 crc kubenswrapper[5094]: E0220 08:34:57.842388 5094 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:35:09 crc kubenswrapper[5094]: I0220 08:35:09.840759 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:35:09 crc kubenswrapper[5094]: E0220 08:35:09.841675 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:35:17 crc kubenswrapper[5094]: I0220 08:35:17.086258 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-bwdbk"] Feb 20 08:35:17 crc kubenswrapper[5094]: I0220 08:35:17.095846 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-bwdbk"] Feb 20 08:35:17 crc kubenswrapper[5094]: I0220 08:35:17.850558 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e2665ce-2c09-43f9-8245-ed36e682e1e0" path="/var/lib/kubelet/pods/5e2665ce-2c09-43f9-8245-ed36e682e1e0/volumes" Feb 20 08:35:23 crc kubenswrapper[5094]: I0220 08:35:23.839943 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:35:23 crc kubenswrapper[5094]: E0220 08:35:23.840916 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:35:36 crc kubenswrapper[5094]: I0220 08:35:36.746830 5094 scope.go:117] "RemoveContainer" containerID="3a10c0a7a48b4e7a28b7f39fc5231d6ac90168dce54dc2c58472a7fe7bfce49e" Feb 20 08:35:37 crc kubenswrapper[5094]: I0220 08:35:37.841064 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:35:37 crc kubenswrapper[5094]: E0220 08:35:37.841475 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:35:52 crc kubenswrapper[5094]: I0220 08:35:52.840825 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:35:52 crc kubenswrapper[5094]: E0220 08:35:52.841674 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:36:04 crc kubenswrapper[5094]: I0220 08:36:04.840587 5094 scope.go:117] "RemoveContainer" 
containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:36:04 crc kubenswrapper[5094]: E0220 08:36:04.841431 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:36:16 crc kubenswrapper[5094]: I0220 08:36:16.842915 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:36:16 crc kubenswrapper[5094]: E0220 08:36:16.844592 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:36:30 crc kubenswrapper[5094]: I0220 08:36:30.840794 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:36:30 crc kubenswrapper[5094]: E0220 08:36:30.841920 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:36:43 crc kubenswrapper[5094]: I0220 08:36:43.840387 5094 scope.go:117] 
"RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:36:44 crc kubenswrapper[5094]: I0220 08:36:44.858298 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"4eb6a1e784b78bde35c2b1ca39f3fd3b16c3f1975e9f121306145f15615f4afb"} Feb 20 08:37:17 crc kubenswrapper[5094]: I0220 08:37:17.984881 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Feb 20 08:37:17 crc kubenswrapper[5094]: E0220 08:37:17.985682 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="386a80fc-f69f-4bc8-bc43-8c3eba784c4e" containerName="extract-content" Feb 20 08:37:17 crc kubenswrapper[5094]: I0220 08:37:17.985693 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="386a80fc-f69f-4bc8-bc43-8c3eba784c4e" containerName="extract-content" Feb 20 08:37:17 crc kubenswrapper[5094]: E0220 08:37:17.985742 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="386a80fc-f69f-4bc8-bc43-8c3eba784c4e" containerName="extract-utilities" Feb 20 08:37:17 crc kubenswrapper[5094]: I0220 08:37:17.985752 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="386a80fc-f69f-4bc8-bc43-8c3eba784c4e" containerName="extract-utilities" Feb 20 08:37:17 crc kubenswrapper[5094]: E0220 08:37:17.985766 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="386a80fc-f69f-4bc8-bc43-8c3eba784c4e" containerName="registry-server" Feb 20 08:37:17 crc kubenswrapper[5094]: I0220 08:37:17.985773 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="386a80fc-f69f-4bc8-bc43-8c3eba784c4e" containerName="registry-server" Feb 20 08:37:17 crc kubenswrapper[5094]: E0220 08:37:17.985784 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c7c75bd-9812-4d90-80ea-08eda0f926fc" containerName="collect-profiles" Feb 20 
08:37:17 crc kubenswrapper[5094]: I0220 08:37:17.985789 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c7c75bd-9812-4d90-80ea-08eda0f926fc" containerName="collect-profiles" Feb 20 08:37:17 crc kubenswrapper[5094]: I0220 08:37:17.985928 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="386a80fc-f69f-4bc8-bc43-8c3eba784c4e" containerName="registry-server" Feb 20 08:37:17 crc kubenswrapper[5094]: I0220 08:37:17.985939 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c7c75bd-9812-4d90-80ea-08eda0f926fc" containerName="collect-profiles" Feb 20 08:37:17 crc kubenswrapper[5094]: I0220 08:37:17.986435 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Feb 20 08:37:17 crc kubenswrapper[5094]: I0220 08:37:17.996585 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-d88p7" Feb 20 08:37:17 crc kubenswrapper[5094]: I0220 08:37:17.997916 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Feb 20 08:37:18 crc kubenswrapper[5094]: I0220 08:37:18.097759 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvljf\" (UniqueName: \"kubernetes.io/projected/38c66beb-a97f-470c-8999-e15f5c4a9b60-kube-api-access-rvljf\") pod \"mariadb-copy-data\" (UID: \"38c66beb-a97f-470c-8999-e15f5c4a9b60\") " pod="openstack/mariadb-copy-data" Feb 20 08:37:18 crc kubenswrapper[5094]: I0220 08:37:18.097821 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7\") pod \"mariadb-copy-data\" (UID: \"38c66beb-a97f-470c-8999-e15f5c4a9b60\") " pod="openstack/mariadb-copy-data" Feb 20 08:37:18 crc kubenswrapper[5094]: I0220 08:37:18.199057 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7\") pod \"mariadb-copy-data\" (UID: \"38c66beb-a97f-470c-8999-e15f5c4a9b60\") " pod="openstack/mariadb-copy-data" Feb 20 08:37:18 crc kubenswrapper[5094]: I0220 08:37:18.199210 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvljf\" (UniqueName: \"kubernetes.io/projected/38c66beb-a97f-470c-8999-e15f5c4a9b60-kube-api-access-rvljf\") pod \"mariadb-copy-data\" (UID: \"38c66beb-a97f-470c-8999-e15f5c4a9b60\") " pod="openstack/mariadb-copy-data" Feb 20 08:37:18 crc kubenswrapper[5094]: I0220 08:37:18.202901 5094 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 20 08:37:18 crc kubenswrapper[5094]: I0220 08:37:18.202931 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7\") pod \"mariadb-copy-data\" (UID: \"38c66beb-a97f-470c-8999-e15f5c4a9b60\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3a4207eaa54a39bb320d6003c0fa348ef31a94bbf0305c6e782ec2791dde8b18/globalmount\"" pod="openstack/mariadb-copy-data" Feb 20 08:37:18 crc kubenswrapper[5094]: I0220 08:37:18.221818 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvljf\" (UniqueName: \"kubernetes.io/projected/38c66beb-a97f-470c-8999-e15f5c4a9b60-kube-api-access-rvljf\") pod \"mariadb-copy-data\" (UID: \"38c66beb-a97f-470c-8999-e15f5c4a9b60\") " pod="openstack/mariadb-copy-data" Feb 20 08:37:18 crc kubenswrapper[5094]: I0220 08:37:18.232027 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7\") pod \"mariadb-copy-data\" (UID: \"38c66beb-a97f-470c-8999-e15f5c4a9b60\") " pod="openstack/mariadb-copy-data" Feb 20 08:37:18 crc kubenswrapper[5094]: I0220 08:37:18.307008 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Feb 20 08:37:18 crc kubenswrapper[5094]: I0220 08:37:18.809917 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Feb 20 08:37:19 crc kubenswrapper[5094]: I0220 08:37:19.134639 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"38c66beb-a97f-470c-8999-e15f5c4a9b60","Type":"ContainerStarted","Data":"0ce7542b1a8ed6f96aeda268ad15c227863bbd5946b3eaac7eb6134db4ce52f0"} Feb 20 08:37:19 crc kubenswrapper[5094]: I0220 08:37:19.134685 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"38c66beb-a97f-470c-8999-e15f5c4a9b60","Type":"ContainerStarted","Data":"9eef613ade19e2ed2f7272eef05d7fc30f774f3cc73f115f6f763788afc1cc96"} Feb 20 08:37:19 crc kubenswrapper[5094]: I0220 08:37:19.148811 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=3.148794034 podStartE2EDuration="3.148794034s" podCreationTimestamp="2026-02-20 08:37:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:37:19.147037672 +0000 UTC m=+6654.019664383" watchObservedRunningTime="2026-02-20 08:37:19.148794034 +0000 UTC m=+6654.021420745" Feb 20 08:37:20 crc kubenswrapper[5094]: I0220 08:37:20.716140 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rdh9h"] Feb 20 08:37:20 crc kubenswrapper[5094]: I0220 08:37:20.717727 
5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:20 crc kubenswrapper[5094]: I0220 08:37:20.733416 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rdh9h"] Feb 20 08:37:20 crc kubenswrapper[5094]: I0220 08:37:20.856689 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f9a7688-af6f-4953-b440-409492c949c9-catalog-content\") pod \"redhat-marketplace-rdh9h\" (UID: \"6f9a7688-af6f-4953-b440-409492c949c9\") " pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:20 crc kubenswrapper[5094]: I0220 08:37:20.856766 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f9a7688-af6f-4953-b440-409492c949c9-utilities\") pod \"redhat-marketplace-rdh9h\" (UID: \"6f9a7688-af6f-4953-b440-409492c949c9\") " pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:20 crc kubenswrapper[5094]: I0220 08:37:20.856823 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbjtv\" (UniqueName: \"kubernetes.io/projected/6f9a7688-af6f-4953-b440-409492c949c9-kube-api-access-cbjtv\") pod \"redhat-marketplace-rdh9h\" (UID: \"6f9a7688-af6f-4953-b440-409492c949c9\") " pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:20 crc kubenswrapper[5094]: I0220 08:37:20.958004 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f9a7688-af6f-4953-b440-409492c949c9-catalog-content\") pod \"redhat-marketplace-rdh9h\" (UID: \"6f9a7688-af6f-4953-b440-409492c949c9\") " pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:20 crc kubenswrapper[5094]: I0220 
08:37:20.958070 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f9a7688-af6f-4953-b440-409492c949c9-utilities\") pod \"redhat-marketplace-rdh9h\" (UID: \"6f9a7688-af6f-4953-b440-409492c949c9\") " pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:20 crc kubenswrapper[5094]: I0220 08:37:20.958494 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbjtv\" (UniqueName: \"kubernetes.io/projected/6f9a7688-af6f-4953-b440-409492c949c9-kube-api-access-cbjtv\") pod \"redhat-marketplace-rdh9h\" (UID: \"6f9a7688-af6f-4953-b440-409492c949c9\") " pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:20 crc kubenswrapper[5094]: I0220 08:37:20.959590 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f9a7688-af6f-4953-b440-409492c949c9-catalog-content\") pod \"redhat-marketplace-rdh9h\" (UID: \"6f9a7688-af6f-4953-b440-409492c949c9\") " pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:20 crc kubenswrapper[5094]: I0220 08:37:20.960015 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f9a7688-af6f-4953-b440-409492c949c9-utilities\") pod \"redhat-marketplace-rdh9h\" (UID: \"6f9a7688-af6f-4953-b440-409492c949c9\") " pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:20 crc kubenswrapper[5094]: I0220 08:37:20.978036 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbjtv\" (UniqueName: \"kubernetes.io/projected/6f9a7688-af6f-4953-b440-409492c949c9-kube-api-access-cbjtv\") pod \"redhat-marketplace-rdh9h\" (UID: \"6f9a7688-af6f-4953-b440-409492c949c9\") " pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:21 crc kubenswrapper[5094]: I0220 08:37:21.039743 5094 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:21 crc kubenswrapper[5094]: I0220 08:37:21.499075 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rdh9h"] Feb 20 08:37:22 crc kubenswrapper[5094]: I0220 08:37:22.185443 5094 generic.go:334] "Generic (PLEG): container finished" podID="6f9a7688-af6f-4953-b440-409492c949c9" containerID="c1a008da31d18f9721b30cdc04cddededd4183c6a72ed832366210c7f6a42513" exitCode=0 Feb 20 08:37:22 crc kubenswrapper[5094]: I0220 08:37:22.185569 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rdh9h" event={"ID":"6f9a7688-af6f-4953-b440-409492c949c9","Type":"ContainerDied","Data":"c1a008da31d18f9721b30cdc04cddededd4183c6a72ed832366210c7f6a42513"} Feb 20 08:37:22 crc kubenswrapper[5094]: I0220 08:37:22.185740 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rdh9h" event={"ID":"6f9a7688-af6f-4953-b440-409492c949c9","Type":"ContainerStarted","Data":"8c642e0fc4a13dfb17f0c0c42eaaf1f018f91fd34f83c2194b14e4acf8f25a6f"} Feb 20 08:37:22 crc kubenswrapper[5094]: I0220 08:37:22.187821 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 08:37:22 crc kubenswrapper[5094]: I0220 08:37:22.772164 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Feb 20 08:37:22 crc kubenswrapper[5094]: I0220 08:37:22.773272 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 20 08:37:22 crc kubenswrapper[5094]: I0220 08:37:22.780510 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 20 08:37:22 crc kubenswrapper[5094]: I0220 08:37:22.892065 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r72nv\" (UniqueName: \"kubernetes.io/projected/d771cf42-84a5-4783-8680-2ffad753c57e-kube-api-access-r72nv\") pod \"mariadb-client\" (UID: \"d771cf42-84a5-4783-8680-2ffad753c57e\") " pod="openstack/mariadb-client" Feb 20 08:37:22 crc kubenswrapper[5094]: I0220 08:37:22.993623 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r72nv\" (UniqueName: \"kubernetes.io/projected/d771cf42-84a5-4783-8680-2ffad753c57e-kube-api-access-r72nv\") pod \"mariadb-client\" (UID: \"d771cf42-84a5-4783-8680-2ffad753c57e\") " pod="openstack/mariadb-client" Feb 20 08:37:23 crc kubenswrapper[5094]: I0220 08:37:23.017575 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r72nv\" (UniqueName: \"kubernetes.io/projected/d771cf42-84a5-4783-8680-2ffad753c57e-kube-api-access-r72nv\") pod \"mariadb-client\" (UID: \"d771cf42-84a5-4783-8680-2ffad753c57e\") " pod="openstack/mariadb-client" Feb 20 08:37:23 crc kubenswrapper[5094]: I0220 08:37:23.165856 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 20 08:37:23 crc kubenswrapper[5094]: I0220 08:37:23.204001 5094 generic.go:334] "Generic (PLEG): container finished" podID="6f9a7688-af6f-4953-b440-409492c949c9" containerID="2cc82cc2beadfd24972780a90d31e67eb714f9ad234f7641910d3746337aaa06" exitCode=0 Feb 20 08:37:23 crc kubenswrapper[5094]: I0220 08:37:23.204032 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rdh9h" event={"ID":"6f9a7688-af6f-4953-b440-409492c949c9","Type":"ContainerDied","Data":"2cc82cc2beadfd24972780a90d31e67eb714f9ad234f7641910d3746337aaa06"} Feb 20 08:37:23 crc kubenswrapper[5094]: I0220 08:37:23.592771 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 20 08:37:24 crc kubenswrapper[5094]: I0220 08:37:24.211759 5094 generic.go:334] "Generic (PLEG): container finished" podID="d771cf42-84a5-4783-8680-2ffad753c57e" containerID="cfdeb3a947002392415700b896a4c45e60680753af31e8fcda15dd6191a2f1dc" exitCode=0 Feb 20 08:37:24 crc kubenswrapper[5094]: I0220 08:37:24.211796 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"d771cf42-84a5-4783-8680-2ffad753c57e","Type":"ContainerDied","Data":"cfdeb3a947002392415700b896a4c45e60680753af31e8fcda15dd6191a2f1dc"} Feb 20 08:37:24 crc kubenswrapper[5094]: I0220 08:37:24.212054 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"d771cf42-84a5-4783-8680-2ffad753c57e","Type":"ContainerStarted","Data":"470051e33cbb8af1d6882246b88c3dbe7592d48c358cb5b30ed369af587b5673"} Feb 20 08:37:24 crc kubenswrapper[5094]: I0220 08:37:24.213849 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rdh9h" event={"ID":"6f9a7688-af6f-4953-b440-409492c949c9","Type":"ContainerStarted","Data":"455520736b3d244bd242142a5565e950ec689596fd0a57c14bf0d96df2437b80"} Feb 20 08:37:24 crc 
kubenswrapper[5094]: I0220 08:37:24.241957 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rdh9h" podStartSLOduration=2.842336342 podStartE2EDuration="4.241939307s" podCreationTimestamp="2026-02-20 08:37:20 +0000 UTC" firstStartedPulling="2026-02-20 08:37:22.187600954 +0000 UTC m=+6657.060227665" lastFinishedPulling="2026-02-20 08:37:23.587203919 +0000 UTC m=+6658.459830630" observedRunningTime="2026-02-20 08:37:24.238005562 +0000 UTC m=+6659.110632283" watchObservedRunningTime="2026-02-20 08:37:24.241939307 +0000 UTC m=+6659.114566018" Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.564032 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.589226 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_d771cf42-84a5-4783-8680-2ffad753c57e/mariadb-client/0.log" Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.616292 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.621843 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.642496 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r72nv\" (UniqueName: \"kubernetes.io/projected/d771cf42-84a5-4783-8680-2ffad753c57e-kube-api-access-r72nv\") pod \"d771cf42-84a5-4783-8680-2ffad753c57e\" (UID: \"d771cf42-84a5-4783-8680-2ffad753c57e\") " Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.648077 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d771cf42-84a5-4783-8680-2ffad753c57e-kube-api-access-r72nv" (OuterVolumeSpecName: "kube-api-access-r72nv") pod "d771cf42-84a5-4783-8680-2ffad753c57e" (UID: 
"d771cf42-84a5-4783-8680-2ffad753c57e"). InnerVolumeSpecName "kube-api-access-r72nv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.744672 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r72nv\" (UniqueName: \"kubernetes.io/projected/d771cf42-84a5-4783-8680-2ffad753c57e-kube-api-access-r72nv\") on node \"crc\" DevicePath \"\"" Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.794596 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Feb 20 08:37:25 crc kubenswrapper[5094]: E0220 08:37:25.795251 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d771cf42-84a5-4783-8680-2ffad753c57e" containerName="mariadb-client" Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.795270 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d771cf42-84a5-4783-8680-2ffad753c57e" containerName="mariadb-client" Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.795579 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d771cf42-84a5-4783-8680-2ffad753c57e" containerName="mariadb-client" Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.796551 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.803031 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.846980 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp2gl\" (UniqueName: \"kubernetes.io/projected/5a2ead87-5f06-45fa-aa5d-347ff16f517e-kube-api-access-fp2gl\") pod \"mariadb-client\" (UID: \"5a2ead87-5f06-45fa-aa5d-347ff16f517e\") " pod="openstack/mariadb-client" Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.855346 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d771cf42-84a5-4783-8680-2ffad753c57e" path="/var/lib/kubelet/pods/d771cf42-84a5-4783-8680-2ffad753c57e/volumes" Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.948408 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp2gl\" (UniqueName: \"kubernetes.io/projected/5a2ead87-5f06-45fa-aa5d-347ff16f517e-kube-api-access-fp2gl\") pod \"mariadb-client\" (UID: \"5a2ead87-5f06-45fa-aa5d-347ff16f517e\") " pod="openstack/mariadb-client" Feb 20 08:37:25 crc kubenswrapper[5094]: I0220 08:37:25.969792 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp2gl\" (UniqueName: \"kubernetes.io/projected/5a2ead87-5f06-45fa-aa5d-347ff16f517e-kube-api-access-fp2gl\") pod \"mariadb-client\" (UID: \"5a2ead87-5f06-45fa-aa5d-347ff16f517e\") " pod="openstack/mariadb-client" Feb 20 08:37:26 crc kubenswrapper[5094]: I0220 08:37:26.119124 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Feb 20 08:37:26 crc kubenswrapper[5094]: I0220 08:37:26.258670 5094 scope.go:117] "RemoveContainer" containerID="cfdeb3a947002392415700b896a4c45e60680753af31e8fcda15dd6191a2f1dc" Feb 20 08:37:26 crc kubenswrapper[5094]: I0220 08:37:26.258742 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 20 08:37:26 crc kubenswrapper[5094]: I0220 08:37:26.606652 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Feb 20 08:37:26 crc kubenswrapper[5094]: W0220 08:37:26.612981 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a2ead87_5f06_45fa_aa5d_347ff16f517e.slice/crio-1fd0fe7ec6d9fbcb333d4c4918bc170668fef60f0fb6bdf09650d32851d2ffc4 WatchSource:0}: Error finding container 1fd0fe7ec6d9fbcb333d4c4918bc170668fef60f0fb6bdf09650d32851d2ffc4: Status 404 returned error can't find the container with id 1fd0fe7ec6d9fbcb333d4c4918bc170668fef60f0fb6bdf09650d32851d2ffc4 Feb 20 08:37:27 crc kubenswrapper[5094]: I0220 08:37:27.269382 5094 generic.go:334] "Generic (PLEG): container finished" podID="5a2ead87-5f06-45fa-aa5d-347ff16f517e" containerID="51c5d24049c628fe72ec29c7d6aad6b1b26637f9d1812b5f27f768d7d83239ed" exitCode=0 Feb 20 08:37:27 crc kubenswrapper[5094]: I0220 08:37:27.269446 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"5a2ead87-5f06-45fa-aa5d-347ff16f517e","Type":"ContainerDied","Data":"51c5d24049c628fe72ec29c7d6aad6b1b26637f9d1812b5f27f768d7d83239ed"} Feb 20 08:37:27 crc kubenswrapper[5094]: I0220 08:37:27.269839 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"5a2ead87-5f06-45fa-aa5d-347ff16f517e","Type":"ContainerStarted","Data":"1fd0fe7ec6d9fbcb333d4c4918bc170668fef60f0fb6bdf09650d32851d2ffc4"} Feb 20 08:37:28 crc kubenswrapper[5094]: 
I0220 08:37:28.648320 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 20 08:37:28 crc kubenswrapper[5094]: I0220 08:37:28.672100 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_5a2ead87-5f06-45fa-aa5d-347ff16f517e/mariadb-client/0.log" Feb 20 08:37:28 crc kubenswrapper[5094]: I0220 08:37:28.696110 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp2gl\" (UniqueName: \"kubernetes.io/projected/5a2ead87-5f06-45fa-aa5d-347ff16f517e-kube-api-access-fp2gl\") pod \"5a2ead87-5f06-45fa-aa5d-347ff16f517e\" (UID: \"5a2ead87-5f06-45fa-aa5d-347ff16f517e\") " Feb 20 08:37:28 crc kubenswrapper[5094]: I0220 08:37:28.701723 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a2ead87-5f06-45fa-aa5d-347ff16f517e-kube-api-access-fp2gl" (OuterVolumeSpecName: "kube-api-access-fp2gl") pod "5a2ead87-5f06-45fa-aa5d-347ff16f517e" (UID: "5a2ead87-5f06-45fa-aa5d-347ff16f517e"). InnerVolumeSpecName "kube-api-access-fp2gl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:37:28 crc kubenswrapper[5094]: I0220 08:37:28.702428 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Feb 20 08:37:28 crc kubenswrapper[5094]: I0220 08:37:28.713592 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Feb 20 08:37:28 crc kubenswrapper[5094]: I0220 08:37:28.797624 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp2gl\" (UniqueName: \"kubernetes.io/projected/5a2ead87-5f06-45fa-aa5d-347ff16f517e-kube-api-access-fp2gl\") on node \"crc\" DevicePath \"\"" Feb 20 08:37:29 crc kubenswrapper[5094]: I0220 08:37:29.292734 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fd0fe7ec6d9fbcb333d4c4918bc170668fef60f0fb6bdf09650d32851d2ffc4" Feb 20 08:37:29 crc kubenswrapper[5094]: I0220 08:37:29.292830 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Feb 20 08:37:29 crc kubenswrapper[5094]: I0220 08:37:29.850560 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a2ead87-5f06-45fa-aa5d-347ff16f517e" path="/var/lib/kubelet/pods/5a2ead87-5f06-45fa-aa5d-347ff16f517e/volumes" Feb 20 08:37:31 crc kubenswrapper[5094]: I0220 08:37:31.041175 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:31 crc kubenswrapper[5094]: I0220 08:37:31.041285 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:31 crc kubenswrapper[5094]: I0220 08:37:31.089025 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:31 crc kubenswrapper[5094]: I0220 08:37:31.366773 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:31 crc kubenswrapper[5094]: I0220 08:37:31.415103 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rdh9h"] Feb 20 08:37:33 crc kubenswrapper[5094]: I0220 08:37:33.327656 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rdh9h" podUID="6f9a7688-af6f-4953-b440-409492c949c9" containerName="registry-server" containerID="cri-o://455520736b3d244bd242142a5565e950ec689596fd0a57c14bf0d96df2437b80" gracePeriod=2 Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.028143 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.188652 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbjtv\" (UniqueName: \"kubernetes.io/projected/6f9a7688-af6f-4953-b440-409492c949c9-kube-api-access-cbjtv\") pod \"6f9a7688-af6f-4953-b440-409492c949c9\" (UID: \"6f9a7688-af6f-4953-b440-409492c949c9\") " Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.188783 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f9a7688-af6f-4953-b440-409492c949c9-utilities\") pod \"6f9a7688-af6f-4953-b440-409492c949c9\" (UID: \"6f9a7688-af6f-4953-b440-409492c949c9\") " Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.188945 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f9a7688-af6f-4953-b440-409492c949c9-catalog-content\") pod \"6f9a7688-af6f-4953-b440-409492c949c9\" (UID: \"6f9a7688-af6f-4953-b440-409492c949c9\") " Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.190499 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/6f9a7688-af6f-4953-b440-409492c949c9-utilities" (OuterVolumeSpecName: "utilities") pod "6f9a7688-af6f-4953-b440-409492c949c9" (UID: "6f9a7688-af6f-4953-b440-409492c949c9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.193907 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f9a7688-af6f-4953-b440-409492c949c9-kube-api-access-cbjtv" (OuterVolumeSpecName: "kube-api-access-cbjtv") pod "6f9a7688-af6f-4953-b440-409492c949c9" (UID: "6f9a7688-af6f-4953-b440-409492c949c9"). InnerVolumeSpecName "kube-api-access-cbjtv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.216513 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f9a7688-af6f-4953-b440-409492c949c9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f9a7688-af6f-4953-b440-409492c949c9" (UID: "6f9a7688-af6f-4953-b440-409492c949c9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.290536 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f9a7688-af6f-4953-b440-409492c949c9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.290574 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbjtv\" (UniqueName: \"kubernetes.io/projected/6f9a7688-af6f-4953-b440-409492c949c9-kube-api-access-cbjtv\") on node \"crc\" DevicePath \"\"" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.290585 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f9a7688-af6f-4953-b440-409492c949c9-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.340014 5094 generic.go:334] "Generic (PLEG): container finished" podID="6f9a7688-af6f-4953-b440-409492c949c9" containerID="455520736b3d244bd242142a5565e950ec689596fd0a57c14bf0d96df2437b80" exitCode=0 Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.340087 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rdh9h" event={"ID":"6f9a7688-af6f-4953-b440-409492c949c9","Type":"ContainerDied","Data":"455520736b3d244bd242142a5565e950ec689596fd0a57c14bf0d96df2437b80"} Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.340129 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rdh9h" event={"ID":"6f9a7688-af6f-4953-b440-409492c949c9","Type":"ContainerDied","Data":"8c642e0fc4a13dfb17f0c0c42eaaf1f018f91fd34f83c2194b14e4acf8f25a6f"} Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.340133 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rdh9h" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.340157 5094 scope.go:117] "RemoveContainer" containerID="455520736b3d244bd242142a5565e950ec689596fd0a57c14bf0d96df2437b80" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.367959 5094 scope.go:117] "RemoveContainer" containerID="2cc82cc2beadfd24972780a90d31e67eb714f9ad234f7641910d3746337aaa06" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.395002 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rdh9h"] Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.402057 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rdh9h"] Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.411509 5094 scope.go:117] "RemoveContainer" containerID="c1a008da31d18f9721b30cdc04cddededd4183c6a72ed832366210c7f6a42513" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.434946 5094 scope.go:117] "RemoveContainer" containerID="455520736b3d244bd242142a5565e950ec689596fd0a57c14bf0d96df2437b80" Feb 20 08:37:34 crc kubenswrapper[5094]: E0220 08:37:34.435407 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"455520736b3d244bd242142a5565e950ec689596fd0a57c14bf0d96df2437b80\": container with ID starting with 455520736b3d244bd242142a5565e950ec689596fd0a57c14bf0d96df2437b80 not found: ID does not exist" containerID="455520736b3d244bd242142a5565e950ec689596fd0a57c14bf0d96df2437b80" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.435437 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"455520736b3d244bd242142a5565e950ec689596fd0a57c14bf0d96df2437b80"} err="failed to get container status \"455520736b3d244bd242142a5565e950ec689596fd0a57c14bf0d96df2437b80\": rpc error: code = NotFound desc = could not find container 
\"455520736b3d244bd242142a5565e950ec689596fd0a57c14bf0d96df2437b80\": container with ID starting with 455520736b3d244bd242142a5565e950ec689596fd0a57c14bf0d96df2437b80 not found: ID does not exist" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.435457 5094 scope.go:117] "RemoveContainer" containerID="2cc82cc2beadfd24972780a90d31e67eb714f9ad234f7641910d3746337aaa06" Feb 20 08:37:34 crc kubenswrapper[5094]: E0220 08:37:34.435798 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cc82cc2beadfd24972780a90d31e67eb714f9ad234f7641910d3746337aaa06\": container with ID starting with 2cc82cc2beadfd24972780a90d31e67eb714f9ad234f7641910d3746337aaa06 not found: ID does not exist" containerID="2cc82cc2beadfd24972780a90d31e67eb714f9ad234f7641910d3746337aaa06" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.435834 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cc82cc2beadfd24972780a90d31e67eb714f9ad234f7641910d3746337aaa06"} err="failed to get container status \"2cc82cc2beadfd24972780a90d31e67eb714f9ad234f7641910d3746337aaa06\": rpc error: code = NotFound desc = could not find container \"2cc82cc2beadfd24972780a90d31e67eb714f9ad234f7641910d3746337aaa06\": container with ID starting with 2cc82cc2beadfd24972780a90d31e67eb714f9ad234f7641910d3746337aaa06 not found: ID does not exist" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.435852 5094 scope.go:117] "RemoveContainer" containerID="c1a008da31d18f9721b30cdc04cddededd4183c6a72ed832366210c7f6a42513" Feb 20 08:37:34 crc kubenswrapper[5094]: E0220 08:37:34.436176 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1a008da31d18f9721b30cdc04cddededd4183c6a72ed832366210c7f6a42513\": container with ID starting with c1a008da31d18f9721b30cdc04cddededd4183c6a72ed832366210c7f6a42513 not found: ID does not exist" 
containerID="c1a008da31d18f9721b30cdc04cddededd4183c6a72ed832366210c7f6a42513" Feb 20 08:37:34 crc kubenswrapper[5094]: I0220 08:37:34.436236 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1a008da31d18f9721b30cdc04cddededd4183c6a72ed832366210c7f6a42513"} err="failed to get container status \"c1a008da31d18f9721b30cdc04cddededd4183c6a72ed832366210c7f6a42513\": rpc error: code = NotFound desc = could not find container \"c1a008da31d18f9721b30cdc04cddededd4183c6a72ed832366210c7f6a42513\": container with ID starting with c1a008da31d18f9721b30cdc04cddededd4183c6a72ed832366210c7f6a42513 not found: ID does not exist" Feb 20 08:37:35 crc kubenswrapper[5094]: I0220 08:37:35.851229 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f9a7688-af6f-4953-b440-409492c949c9" path="/var/lib/kubelet/pods/6f9a7688-af6f-4953-b440-409492c949c9/volumes" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.740933 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vs8s6"] Feb 20 08:37:44 crc kubenswrapper[5094]: E0220 08:37:44.741815 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a2ead87-5f06-45fa-aa5d-347ff16f517e" containerName="mariadb-client" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.741874 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a2ead87-5f06-45fa-aa5d-347ff16f517e" containerName="mariadb-client" Feb 20 08:37:44 crc kubenswrapper[5094]: E0220 08:37:44.741909 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9a7688-af6f-4953-b440-409492c949c9" containerName="extract-utilities" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.741919 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9a7688-af6f-4953-b440-409492c949c9" containerName="extract-utilities" Feb 20 08:37:44 crc kubenswrapper[5094]: E0220 08:37:44.741938 5094 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="6f9a7688-af6f-4953-b440-409492c949c9" containerName="registry-server" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.741947 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9a7688-af6f-4953-b440-409492c949c9" containerName="registry-server" Feb 20 08:37:44 crc kubenswrapper[5094]: E0220 08:37:44.741965 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9a7688-af6f-4953-b440-409492c949c9" containerName="extract-content" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.741971 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9a7688-af6f-4953-b440-409492c949c9" containerName="extract-content" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.742202 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9a7688-af6f-4953-b440-409492c949c9" containerName="registry-server" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.742221 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a2ead87-5f06-45fa-aa5d-347ff16f517e" containerName="mariadb-client" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.743570 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.784792 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf546d1c-5402-468b-a510-df011545b4bf-catalog-content\") pod \"community-operators-vs8s6\" (UID: \"bf546d1c-5402-468b-a510-df011545b4bf\") " pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.784922 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcn9j\" (UniqueName: \"kubernetes.io/projected/bf546d1c-5402-468b-a510-df011545b4bf-kube-api-access-jcn9j\") pod \"community-operators-vs8s6\" (UID: \"bf546d1c-5402-468b-a510-df011545b4bf\") " pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.784970 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf546d1c-5402-468b-a510-df011545b4bf-utilities\") pod \"community-operators-vs8s6\" (UID: \"bf546d1c-5402-468b-a510-df011545b4bf\") " pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.785816 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vs8s6"] Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.886358 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf546d1c-5402-468b-a510-df011545b4bf-catalog-content\") pod \"community-operators-vs8s6\" (UID: \"bf546d1c-5402-468b-a510-df011545b4bf\") " pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.886503 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jcn9j\" (UniqueName: \"kubernetes.io/projected/bf546d1c-5402-468b-a510-df011545b4bf-kube-api-access-jcn9j\") pod \"community-operators-vs8s6\" (UID: \"bf546d1c-5402-468b-a510-df011545b4bf\") " pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.886716 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf546d1c-5402-468b-a510-df011545b4bf-utilities\") pod \"community-operators-vs8s6\" (UID: \"bf546d1c-5402-468b-a510-df011545b4bf\") " pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.887073 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf546d1c-5402-468b-a510-df011545b4bf-catalog-content\") pod \"community-operators-vs8s6\" (UID: \"bf546d1c-5402-468b-a510-df011545b4bf\") " pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.887247 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf546d1c-5402-468b-a510-df011545b4bf-utilities\") pod \"community-operators-vs8s6\" (UID: \"bf546d1c-5402-468b-a510-df011545b4bf\") " pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:44 crc kubenswrapper[5094]: I0220 08:37:44.907115 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcn9j\" (UniqueName: \"kubernetes.io/projected/bf546d1c-5402-468b-a510-df011545b4bf-kube-api-access-jcn9j\") pod \"community-operators-vs8s6\" (UID: \"bf546d1c-5402-468b-a510-df011545b4bf\") " pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:45 crc kubenswrapper[5094]: I0220 08:37:45.065475 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:45 crc kubenswrapper[5094]: I0220 08:37:45.523622 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vs8s6"] Feb 20 08:37:46 crc kubenswrapper[5094]: I0220 08:37:46.452445 5094 generic.go:334] "Generic (PLEG): container finished" podID="bf546d1c-5402-468b-a510-df011545b4bf" containerID="f20fba63f7ce465e08719424a8bc32bb153d0d6e565a93b992825ed7e94c9e02" exitCode=0 Feb 20 08:37:46 crc kubenswrapper[5094]: I0220 08:37:46.452545 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vs8s6" event={"ID":"bf546d1c-5402-468b-a510-df011545b4bf","Type":"ContainerDied","Data":"f20fba63f7ce465e08719424a8bc32bb153d0d6e565a93b992825ed7e94c9e02"} Feb 20 08:37:46 crc kubenswrapper[5094]: I0220 08:37:46.452890 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vs8s6" event={"ID":"bf546d1c-5402-468b-a510-df011545b4bf","Type":"ContainerStarted","Data":"b2e4678a0b8ece41a89baa1faf5b87ba135db8df0ae1a2369768bfdb8c3b0a46"} Feb 20 08:37:47 crc kubenswrapper[5094]: I0220 08:37:47.462113 5094 generic.go:334] "Generic (PLEG): container finished" podID="bf546d1c-5402-468b-a510-df011545b4bf" containerID="b31d2dc45159f42d33732eb6b6ef11de26eb07de6167955691efff5af1506f0b" exitCode=0 Feb 20 08:37:47 crc kubenswrapper[5094]: I0220 08:37:47.462237 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vs8s6" event={"ID":"bf546d1c-5402-468b-a510-df011545b4bf","Type":"ContainerDied","Data":"b31d2dc45159f42d33732eb6b6ef11de26eb07de6167955691efff5af1506f0b"} Feb 20 08:37:48 crc kubenswrapper[5094]: I0220 08:37:48.472499 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vs8s6" 
event={"ID":"bf546d1c-5402-468b-a510-df011545b4bf","Type":"ContainerStarted","Data":"4b2c720f16849b4a5e5ba3ed04da3530f344c6b52f3458788d8c97bf92c15bcd"} Feb 20 08:37:48 crc kubenswrapper[5094]: I0220 08:37:48.501093 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vs8s6" podStartSLOduration=3.11129748 podStartE2EDuration="4.501066588s" podCreationTimestamp="2026-02-20 08:37:44 +0000 UTC" firstStartedPulling="2026-02-20 08:37:46.454624335 +0000 UTC m=+6681.327251046" lastFinishedPulling="2026-02-20 08:37:47.844393453 +0000 UTC m=+6682.717020154" observedRunningTime="2026-02-20 08:37:48.491619721 +0000 UTC m=+6683.364246472" watchObservedRunningTime="2026-02-20 08:37:48.501066588 +0000 UTC m=+6683.373693349" Feb 20 08:37:55 crc kubenswrapper[5094]: I0220 08:37:55.067115 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:55 crc kubenswrapper[5094]: I0220 08:37:55.067810 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:55 crc kubenswrapper[5094]: I0220 08:37:55.116337 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:55 crc kubenswrapper[5094]: I0220 08:37:55.573312 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:55 crc kubenswrapper[5094]: I0220 08:37:55.620376 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vs8s6"] Feb 20 08:37:57 crc kubenswrapper[5094]: I0220 08:37:57.541627 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vs8s6" podUID="bf546d1c-5402-468b-a510-df011545b4bf" containerName="registry-server" 
containerID="cri-o://4b2c720f16849b4a5e5ba3ed04da3530f344c6b52f3458788d8c97bf92c15bcd" gracePeriod=2 Feb 20 08:37:57 crc kubenswrapper[5094]: I0220 08:37:57.986999 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:57 crc kubenswrapper[5094]: I0220 08:37:57.996614 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf546d1c-5402-468b-a510-df011545b4bf-catalog-content\") pod \"bf546d1c-5402-468b-a510-df011545b4bf\" (UID: \"bf546d1c-5402-468b-a510-df011545b4bf\") " Feb 20 08:37:57 crc kubenswrapper[5094]: I0220 08:37:57.996724 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf546d1c-5402-468b-a510-df011545b4bf-utilities\") pod \"bf546d1c-5402-468b-a510-df011545b4bf\" (UID: \"bf546d1c-5402-468b-a510-df011545b4bf\") " Feb 20 08:37:57 crc kubenswrapper[5094]: I0220 08:37:57.996789 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcn9j\" (UniqueName: \"kubernetes.io/projected/bf546d1c-5402-468b-a510-df011545b4bf-kube-api-access-jcn9j\") pod \"bf546d1c-5402-468b-a510-df011545b4bf\" (UID: \"bf546d1c-5402-468b-a510-df011545b4bf\") " Feb 20 08:37:57 crc kubenswrapper[5094]: I0220 08:37:57.997527 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf546d1c-5402-468b-a510-df011545b4bf-utilities" (OuterVolumeSpecName: "utilities") pod "bf546d1c-5402-468b-a510-df011545b4bf" (UID: "bf546d1c-5402-468b-a510-df011545b4bf"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.002575 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf546d1c-5402-468b-a510-df011545b4bf-kube-api-access-jcn9j" (OuterVolumeSpecName: "kube-api-access-jcn9j") pod "bf546d1c-5402-468b-a510-df011545b4bf" (UID: "bf546d1c-5402-468b-a510-df011545b4bf"). InnerVolumeSpecName "kube-api-access-jcn9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.098851 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf546d1c-5402-468b-a510-df011545b4bf-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.098891 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcn9j\" (UniqueName: \"kubernetes.io/projected/bf546d1c-5402-468b-a510-df011545b4bf-kube-api-access-jcn9j\") on node \"crc\" DevicePath \"\"" Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.371191 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf546d1c-5402-468b-a510-df011545b4bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf546d1c-5402-468b-a510-df011545b4bf" (UID: "bf546d1c-5402-468b-a510-df011545b4bf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.401573 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf546d1c-5402-468b-a510-df011545b4bf-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.549721 5094 generic.go:334] "Generic (PLEG): container finished" podID="bf546d1c-5402-468b-a510-df011545b4bf" containerID="4b2c720f16849b4a5e5ba3ed04da3530f344c6b52f3458788d8c97bf92c15bcd" exitCode=0 Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.549814 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vs8s6" Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.549813 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vs8s6" event={"ID":"bf546d1c-5402-468b-a510-df011545b4bf","Type":"ContainerDied","Data":"4b2c720f16849b4a5e5ba3ed04da3530f344c6b52f3458788d8c97bf92c15bcd"} Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.550131 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vs8s6" event={"ID":"bf546d1c-5402-468b-a510-df011545b4bf","Type":"ContainerDied","Data":"b2e4678a0b8ece41a89baa1faf5b87ba135db8df0ae1a2369768bfdb8c3b0a46"} Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.550156 5094 scope.go:117] "RemoveContainer" containerID="4b2c720f16849b4a5e5ba3ed04da3530f344c6b52f3458788d8c97bf92c15bcd" Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.568827 5094 scope.go:117] "RemoveContainer" containerID="b31d2dc45159f42d33732eb6b6ef11de26eb07de6167955691efff5af1506f0b" Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.586384 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vs8s6"] Feb 20 08:37:58 crc kubenswrapper[5094]: 
I0220 08:37:58.594085 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vs8s6"] Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.605630 5094 scope.go:117] "RemoveContainer" containerID="f20fba63f7ce465e08719424a8bc32bb153d0d6e565a93b992825ed7e94c9e02" Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.627437 5094 scope.go:117] "RemoveContainer" containerID="4b2c720f16849b4a5e5ba3ed04da3530f344c6b52f3458788d8c97bf92c15bcd" Feb 20 08:37:58 crc kubenswrapper[5094]: E0220 08:37:58.628127 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b2c720f16849b4a5e5ba3ed04da3530f344c6b52f3458788d8c97bf92c15bcd\": container with ID starting with 4b2c720f16849b4a5e5ba3ed04da3530f344c6b52f3458788d8c97bf92c15bcd not found: ID does not exist" containerID="4b2c720f16849b4a5e5ba3ed04da3530f344c6b52f3458788d8c97bf92c15bcd" Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.628161 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b2c720f16849b4a5e5ba3ed04da3530f344c6b52f3458788d8c97bf92c15bcd"} err="failed to get container status \"4b2c720f16849b4a5e5ba3ed04da3530f344c6b52f3458788d8c97bf92c15bcd\": rpc error: code = NotFound desc = could not find container \"4b2c720f16849b4a5e5ba3ed04da3530f344c6b52f3458788d8c97bf92c15bcd\": container with ID starting with 4b2c720f16849b4a5e5ba3ed04da3530f344c6b52f3458788d8c97bf92c15bcd not found: ID does not exist" Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.628184 5094 scope.go:117] "RemoveContainer" containerID="b31d2dc45159f42d33732eb6b6ef11de26eb07de6167955691efff5af1506f0b" Feb 20 08:37:58 crc kubenswrapper[5094]: E0220 08:37:58.628492 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b31d2dc45159f42d33732eb6b6ef11de26eb07de6167955691efff5af1506f0b\": container 
with ID starting with b31d2dc45159f42d33732eb6b6ef11de26eb07de6167955691efff5af1506f0b not found: ID does not exist" containerID="b31d2dc45159f42d33732eb6b6ef11de26eb07de6167955691efff5af1506f0b" Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.628513 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b31d2dc45159f42d33732eb6b6ef11de26eb07de6167955691efff5af1506f0b"} err="failed to get container status \"b31d2dc45159f42d33732eb6b6ef11de26eb07de6167955691efff5af1506f0b\": rpc error: code = NotFound desc = could not find container \"b31d2dc45159f42d33732eb6b6ef11de26eb07de6167955691efff5af1506f0b\": container with ID starting with b31d2dc45159f42d33732eb6b6ef11de26eb07de6167955691efff5af1506f0b not found: ID does not exist" Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.628527 5094 scope.go:117] "RemoveContainer" containerID="f20fba63f7ce465e08719424a8bc32bb153d0d6e565a93b992825ed7e94c9e02" Feb 20 08:37:58 crc kubenswrapper[5094]: E0220 08:37:58.628899 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f20fba63f7ce465e08719424a8bc32bb153d0d6e565a93b992825ed7e94c9e02\": container with ID starting with f20fba63f7ce465e08719424a8bc32bb153d0d6e565a93b992825ed7e94c9e02 not found: ID does not exist" containerID="f20fba63f7ce465e08719424a8bc32bb153d0d6e565a93b992825ed7e94c9e02" Feb 20 08:37:58 crc kubenswrapper[5094]: I0220 08:37:58.628926 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f20fba63f7ce465e08719424a8bc32bb153d0d6e565a93b992825ed7e94c9e02"} err="failed to get container status \"f20fba63f7ce465e08719424a8bc32bb153d0d6e565a93b992825ed7e94c9e02\": rpc error: code = NotFound desc = could not find container \"f20fba63f7ce465e08719424a8bc32bb153d0d6e565a93b992825ed7e94c9e02\": container with ID starting with f20fba63f7ce465e08719424a8bc32bb153d0d6e565a93b992825ed7e94c9e02 not 
found: ID does not exist" Feb 20 08:37:59 crc kubenswrapper[5094]: I0220 08:37:59.848072 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf546d1c-5402-468b-a510-df011545b4bf" path="/var/lib/kubelet/pods/bf546d1c-5402-468b-a510-df011545b4bf/volumes" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.599930 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 20 08:38:00 crc kubenswrapper[5094]: E0220 08:38:00.600307 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf546d1c-5402-468b-a510-df011545b4bf" containerName="extract-utilities" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.600334 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf546d1c-5402-468b-a510-df011545b4bf" containerName="extract-utilities" Feb 20 08:38:00 crc kubenswrapper[5094]: E0220 08:38:00.600355 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf546d1c-5402-468b-a510-df011545b4bf" containerName="registry-server" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.600363 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf546d1c-5402-468b-a510-df011545b4bf" containerName="registry-server" Feb 20 08:38:00 crc kubenswrapper[5094]: E0220 08:38:00.600381 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf546d1c-5402-468b-a510-df011545b4bf" containerName="extract-content" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.600389 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf546d1c-5402-468b-a510-df011545b4bf" containerName="extract-content" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.600563 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf546d1c-5402-468b-a510-df011545b4bf" containerName="registry-server" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.601431 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.603465 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-cccp9" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.603889 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.611182 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.613961 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.652625 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.654821 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.663171 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.671915 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.674045 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.681053 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.733891 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/464f9fe3-85bf-4e78-adc3-3feedbaf1dac-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.734549 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/464f9fe3-85bf-4e78-adc3-3feedbaf1dac-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.734641 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/464f9fe3-85bf-4e78-adc3-3feedbaf1dac-config\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.734688 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rp6m\" (UniqueName: \"kubernetes.io/projected/464f9fe3-85bf-4e78-adc3-3feedbaf1dac-kube-api-access-7rp6m\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.734802 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a68ddb08-cf6f-40ba-8b6c-c58d8c550e21\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a68ddb08-cf6f-40ba-8b6c-c58d8c550e21\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.734867 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/464f9fe3-85bf-4e78-adc3-3feedbaf1dac-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.797835 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.800496 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.808131 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-khxrk" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.809828 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.813432 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.815555 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.817635 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.823398 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.824896 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.835945 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/464f9fe3-85bf-4e78-adc3-3feedbaf1dac-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836006 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79dfde5a-85a9-437f-979d-1fdb99a1bb5f-config\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836035 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7n6t\" (UniqueName: \"kubernetes.io/projected/79dfde5a-85a9-437f-979d-1fdb99a1bb5f-kube-api-access-w7n6t\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836070 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/464f9fe3-85bf-4e78-adc3-3feedbaf1dac-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836092 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvjj7\" (UniqueName: \"kubernetes.io/projected/5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf-kube-api-access-gvjj7\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836135 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf-config\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836166 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836189 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/464f9fe3-85bf-4e78-adc3-3feedbaf1dac-config\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836409 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rp6m\" (UniqueName: \"kubernetes.io/projected/464f9fe3-85bf-4e78-adc3-3feedbaf1dac-kube-api-access-7rp6m\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836435 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/79dfde5a-85a9-437f-979d-1fdb99a1bb5f-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836464 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a68ddb08-cf6f-40ba-8b6c-c58d8c550e21\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a68ddb08-cf6f-40ba-8b6c-c58d8c550e21\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836507 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79dfde5a-85a9-437f-979d-1fdb99a1bb5f-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836533 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836575 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836598 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/79dfde5a-85a9-437f-979d-1fdb99a1bb5f-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836623 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/464f9fe3-85bf-4e78-adc3-3feedbaf1dac-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836654 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-086d02ff-33e5-4489-acc4-387156439c87\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-086d02ff-33e5-4489-acc4-387156439c87\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.836681 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-8c8db6a8-4d9e-4fff-b7d7-1b00dbd62a06\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c8db6a8-4d9e-4fff-b7d7-1b00dbd62a06\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.838002 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.840032 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/464f9fe3-85bf-4e78-adc3-3feedbaf1dac-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.840437 5094 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/464f9fe3-85bf-4e78-adc3-3feedbaf1dac-config\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.844596 5094 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.844641 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a68ddb08-cf6f-40ba-8b6c-c58d8c550e21\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a68ddb08-cf6f-40ba-8b6c-c58d8c550e21\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/af7a81776acc7ca130746af725e9a7d819304a2a8d3e4ea5c067c8428d995108/globalmount\"" pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.845931 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/464f9fe3-85bf-4e78-adc3-3feedbaf1dac-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.846389 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/464f9fe3-85bf-4e78-adc3-3feedbaf1dac-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.860254 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.863690 5094 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-7rp6m\" (UniqueName: \"kubernetes.io/projected/464f9fe3-85bf-4e78-adc3-3feedbaf1dac-kube-api-access-7rp6m\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.869529 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.911589 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a68ddb08-cf6f-40ba-8b6c-c58d8c550e21\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a68ddb08-cf6f-40ba-8b6c-c58d8c550e21\") pod \"ovsdbserver-nb-0\" (UID: \"464f9fe3-85bf-4e78-adc3-3feedbaf1dac\") " pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.932138 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.940381 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6ff74bc5-95bf-47fd-969e-cecbf1317e5d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.940444 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d2ec44bc-99d8-46fe-bce0-e66607ad84f0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d2ec44bc-99d8-46fe-bce0-e66607ad84f0\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.940495 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-086d02ff-33e5-4489-acc4-387156439c87\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-086d02ff-33e5-4489-acc4-387156439c87\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.940521 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ff74bc5-95bf-47fd-969e-cecbf1317e5d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.940553 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-8c8db6a8-4d9e-4fff-b7d7-1b00dbd62a06\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c8db6a8-4d9e-4fff-b7d7-1b00dbd62a06\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.940596 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9e6d0be3-167e-49e9-8450-a563f9115817-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.940650 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e6d0be3-167e-49e9-8450-a563f9115817-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.940674 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6ff74bc5-95bf-47fd-969e-cecbf1317e5d-config\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.940730 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6jvv\" (UniqueName: \"kubernetes.io/projected/9e6d0be3-167e-49e9-8450-a563f9115817-kube-api-access-v6jvv\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.940777 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79dfde5a-85a9-437f-979d-1fdb99a1bb5f-config\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.940804 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7n6t\" (UniqueName: \"kubernetes.io/projected/79dfde5a-85a9-437f-979d-1fdb99a1bb5f-kube-api-access-w7n6t\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.940826 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e6d0be3-167e-49e9-8450-a563f9115817-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.940860 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/daa57141-e76b-43a8-b363-2a1c7129d7c2-scripts\") pod \"ovsdbserver-sb-1\" (UID: 
\"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.940891 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e6d0be3-167e-49e9-8450-a563f9115817-config\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.940925 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvjj7\" (UniqueName: \"kubernetes.io/projected/5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf-kube-api-access-gvjj7\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.941019 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daa57141-e76b-43a8-b363-2a1c7129d7c2-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.941057 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6ff74bc5-95bf-47fd-969e-cecbf1317e5d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.941107 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf-config\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 
08:38:00.941150 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.941176 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4755z\" (UniqueName: \"kubernetes.io/projected/6ff74bc5-95bf-47fd-969e-cecbf1317e5d-kube-api-access-4755z\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.941213 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79dfde5a-85a9-437f-979d-1fdb99a1bb5f-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.941269 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg4dh\" (UniqueName: \"kubernetes.io/projected/daa57141-e76b-43a8-b363-2a1c7129d7c2-kube-api-access-wg4dh\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.941315 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-74762bbd-7d2d-49b1-96c4-758e964293fc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74762bbd-7d2d-49b1-96c4-758e964293fc\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.941334 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/daa57141-e76b-43a8-b363-2a1c7129d7c2-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.941363 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79dfde5a-85a9-437f-979d-1fdb99a1bb5f-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.941391 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daa57141-e76b-43a8-b363-2a1c7129d7c2-config\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.941414 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.941454 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.941477 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/79dfde5a-85a9-437f-979d-1fdb99a1bb5f-ovsdb-rundir\") pod 
\"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.941692 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a53cb370-1d59-4b93-af68-e80a25bccf14\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a53cb370-1d59-4b93-af68-e80a25bccf14\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.944972 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf-config\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.946915 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.947773 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.948598 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.949135 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79dfde5a-85a9-437f-979d-1fdb99a1bb5f-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.949393 5094 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.949424 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-086d02ff-33e5-4489-acc4-387156439c87\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-086d02ff-33e5-4489-acc4-387156439c87\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4f917b20bcaaf63d298ed333d7ec9bccd5f615564cff30af71dc3e2e9860eebf/globalmount\"" pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.950070 5094 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.950099 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-8c8db6a8-4d9e-4fff-b7d7-1b00dbd62a06\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c8db6a8-4d9e-4fff-b7d7-1b00dbd62a06\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3e7eacfc8f768e9fa5f5a033cdbc14ac604c8bf71d6df88886a02074aa564c63/globalmount\"" pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.951468 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79dfde5a-85a9-437f-979d-1fdb99a1bb5f-config\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.952600 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/79dfde5a-85a9-437f-979d-1fdb99a1bb5f-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.961214 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79dfde5a-85a9-437f-979d-1fdb99a1bb5f-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.970369 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7n6t\" (UniqueName: \"kubernetes.io/projected/79dfde5a-85a9-437f-979d-1fdb99a1bb5f-kube-api-access-w7n6t\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " 
pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.972823 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvjj7\" (UniqueName: \"kubernetes.io/projected/5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf-kube-api-access-gvjj7\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:00 crc kubenswrapper[5094]: I0220 08:38:00.988023 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-8c8db6a8-4d9e-4fff-b7d7-1b00dbd62a06\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-8c8db6a8-4d9e-4fff-b7d7-1b00dbd62a06\") pod \"ovsdbserver-nb-1\" (UID: \"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf\") " pod="openstack/ovsdbserver-nb-1" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:00.992496 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-086d02ff-33e5-4489-acc4-387156439c87\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-086d02ff-33e5-4489-acc4-387156439c87\") pod \"ovsdbserver-nb-2\" (UID: \"79dfde5a-85a9-437f-979d-1fdb99a1bb5f\") " pod="openstack/ovsdbserver-nb-2" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.043407 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg4dh\" (UniqueName: \"kubernetes.io/projected/daa57141-e76b-43a8-b363-2a1c7129d7c2-kube-api-access-wg4dh\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.043765 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-74762bbd-7d2d-49b1-96c4-758e964293fc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74762bbd-7d2d-49b1-96c4-758e964293fc\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" Feb 20 
08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.043789 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/daa57141-e76b-43a8-b363-2a1c7129d7c2-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.043811 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daa57141-e76b-43a8-b363-2a1c7129d7c2-config\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.043837 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a53cb370-1d59-4b93-af68-e80a25bccf14\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a53cb370-1d59-4b93-af68-e80a25bccf14\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.043860 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6ff74bc5-95bf-47fd-969e-cecbf1317e5d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.043879 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d2ec44bc-99d8-46fe-bce0-e66607ad84f0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d2ec44bc-99d8-46fe-bce0-e66607ad84f0\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.044550 5094 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ff74bc5-95bf-47fd-969e-cecbf1317e5d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.044882 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9e6d0be3-167e-49e9-8450-a563f9115817-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.044649 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/daa57141-e76b-43a8-b363-2a1c7129d7c2-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.044954 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e6d0be3-167e-49e9-8450-a563f9115817-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.044968 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ff74bc5-95bf-47fd-969e-cecbf1317e5d-config\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.045226 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daa57141-e76b-43a8-b363-2a1c7129d7c2-config\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" 
Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.045290 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6ff74bc5-95bf-47fd-969e-cecbf1317e5d-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.045518 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9e6d0be3-167e-49e9-8450-a563f9115817-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.045830 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9e6d0be3-167e-49e9-8450-a563f9115817-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.045865 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6jvv\" (UniqueName: \"kubernetes.io/projected/9e6d0be3-167e-49e9-8450-a563f9115817-kube-api-access-v6jvv\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.045915 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e6d0be3-167e-49e9-8450-a563f9115817-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.045936 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/daa57141-e76b-43a8-b363-2a1c7129d7c2-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.046189 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ff74bc5-95bf-47fd-969e-cecbf1317e5d-config\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.046256 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e6d0be3-167e-49e9-8450-a563f9115817-config\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.046296 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daa57141-e76b-43a8-b363-2a1c7129d7c2-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.046313 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6ff74bc5-95bf-47fd-969e-cecbf1317e5d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.046715 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/daa57141-e76b-43a8-b363-2a1c7129d7c2-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 
08:38:01.046840 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4755z\" (UniqueName: \"kubernetes.io/projected/6ff74bc5-95bf-47fd-969e-cecbf1317e5d-kube-api-access-4755z\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.047265 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6ff74bc5-95bf-47fd-969e-cecbf1317e5d-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.047379 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e6d0be3-167e-49e9-8450-a563f9115817-config\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.047679 5094 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.047712 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d2ec44bc-99d8-46fe-bce0-e66607ad84f0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d2ec44bc-99d8-46fe-bce0-e66607ad84f0\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3de976b4516acfae8f0ae7592c7947fd04a70506f0f82f5f85f87b0700dab6c1/globalmount\"" pod="openstack/ovsdbserver-sb-0" Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.047723 5094 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.047742 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-74762bbd-7d2d-49b1-96c4-758e964293fc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74762bbd-7d2d-49b1-96c4-758e964293fc\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/01a95fc33ca55a11f0de736abe42c39664c673b1158062cf6e2d13e8309e9d6d/globalmount\"" pod="openstack/ovsdbserver-sb-1"
Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.047856 5094 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.047902 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a53cb370-1d59-4b93-af68-e80a25bccf14\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a53cb370-1d59-4b93-af68-e80a25bccf14\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/56f10ad7da66177b62cc526e0d3d231fa0fafed9b03ca85a0c04fc0671d7fe8b/globalmount\"" pod="openstack/ovsdbserver-sb-2"
Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.049882 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ff74bc5-95bf-47fd-969e-cecbf1317e5d-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0"
Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.050399 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e6d0be3-167e-49e9-8450-a563f9115817-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2"
Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.056597 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/daa57141-e76b-43a8-b363-2a1c7129d7c2-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1"
Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.123399 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg4dh\" (UniqueName: \"kubernetes.io/projected/daa57141-e76b-43a8-b363-2a1c7129d7c2-kube-api-access-wg4dh\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1"
Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.128820 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6jvv\" (UniqueName: \"kubernetes.io/projected/9e6d0be3-167e-49e9-8450-a563f9115817-kube-api-access-v6jvv\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2"
Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.133913 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4755z\" (UniqueName: \"kubernetes.io/projected/6ff74bc5-95bf-47fd-969e-cecbf1317e5d-kube-api-access-4755z\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0"
Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.142453 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a53cb370-1d59-4b93-af68-e80a25bccf14\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a53cb370-1d59-4b93-af68-e80a25bccf14\") pod \"ovsdbserver-sb-2\" (UID: \"9e6d0be3-167e-49e9-8450-a563f9115817\") " pod="openstack/ovsdbserver-sb-2"
Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.146817 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d2ec44bc-99d8-46fe-bce0-e66607ad84f0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d2ec44bc-99d8-46fe-bce0-e66607ad84f0\") pod \"ovsdbserver-sb-0\" (UID: \"6ff74bc5-95bf-47fd-969e-cecbf1317e5d\") " pod="openstack/ovsdbserver-sb-0"
Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.148356 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-74762bbd-7d2d-49b1-96c4-758e964293fc\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-74762bbd-7d2d-49b1-96c4-758e964293fc\") pod \"ovsdbserver-sb-1\" (UID: \"daa57141-e76b-43a8-b363-2a1c7129d7c2\") " pod="openstack/ovsdbserver-sb-1"
Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.223283 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2"
Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.230223 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1"
Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.274434 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1"
Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.288983 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2"
Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.419067 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.651108 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.709430 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"]
Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.812459 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"]
Feb 20 08:38:01 crc kubenswrapper[5094]: W0220 08:38:01.815381 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddaa57141_e76b_43a8_b363_2a1c7129d7c2.slice/crio-f400088e97c917837c1fd23b0a74f1e67f33f87d312992ecc52236a504ce04ae WatchSource:0}: Error finding container f400088e97c917837c1fd23b0a74f1e67f33f87d312992ecc52236a504ce04ae: Status 404 returned error can't find the container with id f400088e97c917837c1fd23b0a74f1e67f33f87d312992ecc52236a504ce04ae
Feb 20 08:38:01 crc kubenswrapper[5094]: I0220 08:38:01.940722 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"]
Feb 20 08:38:01 crc kubenswrapper[5094]: W0220 08:38:01.945031 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a5fd3fa_b5c3_4e02_a9e6_26be7e747baf.slice/crio-06dfae5e8e75efed8d1bb35bd399b9e54698c1efe8466c5efcdcbf6e7aa38df7 WatchSource:0}: Error finding container 06dfae5e8e75efed8d1bb35bd399b9e54698c1efe8466c5efcdcbf6e7aa38df7: Status 404 returned error can't find the container with id 06dfae5e8e75efed8d1bb35bd399b9e54698c1efe8466c5efcdcbf6e7aa38df7
Feb 20 08:38:02 crc kubenswrapper[5094]: I0220 08:38:02.104993 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 20 08:38:02 crc kubenswrapper[5094]: I0220 08:38:02.595972 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf","Type":"ContainerStarted","Data":"06dfae5e8e75efed8d1bb35bd399b9e54698c1efe8466c5efcdcbf6e7aa38df7"}
Feb 20 08:38:02 crc kubenswrapper[5094]: I0220 08:38:02.597673 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6ff74bc5-95bf-47fd-969e-cecbf1317e5d","Type":"ContainerStarted","Data":"f01f91db9ecefd5a772f69460286181a4eb87ea9321f41fcf8762ab79ec8209e"}
Feb 20 08:38:02 crc kubenswrapper[5094]: I0220 08:38:02.598889 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"daa57141-e76b-43a8-b363-2a1c7129d7c2","Type":"ContainerStarted","Data":"f400088e97c917837c1fd23b0a74f1e67f33f87d312992ecc52236a504ce04ae"}
Feb 20 08:38:02 crc kubenswrapper[5094]: I0220 08:38:02.600130 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"9e6d0be3-167e-49e9-8450-a563f9115817","Type":"ContainerStarted","Data":"07ac9677cab8c7dc7d9f8fb253454cf9f1ccb83ba8bee4cca6e7b4cc7a2ab0ee"}
Feb 20 08:38:02 crc kubenswrapper[5094]: I0220 08:38:02.601557 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"464f9fe3-85bf-4e78-adc3-3feedbaf1dac","Type":"ContainerStarted","Data":"f7f1f7e993447666b52826c9f29a6032a140b4d9da2ada87940301108b4b71f4"}
Feb 20 08:38:02 crc kubenswrapper[5094]: I0220 08:38:02.671443 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"]
Feb 20 08:38:03 crc kubenswrapper[5094]: I0220 08:38:03.616985 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"79dfde5a-85a9-437f-979d-1fdb99a1bb5f","Type":"ContainerStarted","Data":"41c8000ed5958ae552e5896097b5d15e857e74ee85c49d153c5cf5700209b743"}
Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.643833 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"79dfde5a-85a9-437f-979d-1fdb99a1bb5f","Type":"ContainerStarted","Data":"8d75a82b08cf0e4ed65386619c69af8dad6fc6ac01d03ab70938690b266fd92f"}
Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.645662 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"464f9fe3-85bf-4e78-adc3-3feedbaf1dac","Type":"ContainerStarted","Data":"23c515e4903b122b59853efaa4b81a82e3af81ae56b30822173437a630639389"}
Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.645682 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"464f9fe3-85bf-4e78-adc3-3feedbaf1dac","Type":"ContainerStarted","Data":"cad877db9087f42fae5beb889ae0251566dcf2cb5740cd06889aa6e148238bb6"}
Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.648058 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf","Type":"ContainerStarted","Data":"bba0a9f73377e804cd503c5cafe076f419799ed3bb1954426a828acf7ef836af"}
Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.648098 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf","Type":"ContainerStarted","Data":"d4bdf6c215bd679fb0bb92e8eac97b0a597ad3bbb7f3ca15c939b5140aacfab6"}
Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.657544 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6ff74bc5-95bf-47fd-969e-cecbf1317e5d","Type":"ContainerStarted","Data":"e5442ddc216d0fffaf41f19267974c16eb734de160956a981ffde266c3dfb6ff"}
Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.657595 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"6ff74bc5-95bf-47fd-969e-cecbf1317e5d","Type":"ContainerStarted","Data":"894019acdbeb1b3c8070ec89b10e9a16d33b0aec1d0ff9f59d4b97c89bf52360"}
Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.659612 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"daa57141-e76b-43a8-b363-2a1c7129d7c2","Type":"ContainerStarted","Data":"6dd3c4055c326a927f1cf9ca2c3c9c971d515a2985402433d11bad89c3adc4c0"}
Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.659644 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"daa57141-e76b-43a8-b363-2a1c7129d7c2","Type":"ContainerStarted","Data":"6c639df003d8c9072fcb16082e4d25ba1ad201993e8cb56f59b7dda97b6a8df5"}
Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.665188 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.5034545489999998 podStartE2EDuration="7.665173924s" podCreationTimestamp="2026-02-20 08:37:59 +0000 UTC" firstStartedPulling="2026-02-20 08:38:01.656202148 +0000 UTC m=+6696.528828859" lastFinishedPulling="2026-02-20 08:38:05.817921523 +0000 UTC m=+6700.690548234" observedRunningTime="2026-02-20 08:38:06.664857347 +0000 UTC m=+6701.537484058" watchObservedRunningTime="2026-02-20 08:38:06.665173924 +0000 UTC m=+6701.537800635"
Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.665244 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"9e6d0be3-167e-49e9-8450-a563f9115817","Type":"ContainerStarted","Data":"4c13e72dcc45e8131fc1636a705a643d6f7623e809f34bdf7f0898d98c9ded45"}
Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.665294 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"9e6d0be3-167e-49e9-8450-a563f9115817","Type":"ContainerStarted","Data":"c52b3f7de314c2fed27b6a3ca93e11553d22d024cb7cff3ad9b2ba6cb7843fe6"}
Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.694151 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=3.810295625 podStartE2EDuration="7.694134542s" podCreationTimestamp="2026-02-20 08:37:59 +0000 UTC" firstStartedPulling="2026-02-20 08:38:01.948989205 +0000 UTC m=+6696.821615916" lastFinishedPulling="2026-02-20 08:38:05.832828122 +0000 UTC m=+6700.705454833" observedRunningTime="2026-02-20 08:38:06.690411642 +0000 UTC m=+6701.563038383" watchObservedRunningTime="2026-02-20 08:38:06.694134542 +0000 UTC m=+6701.566761253"
Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.714116 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.715736389 podStartE2EDuration="7.714067381s" podCreationTimestamp="2026-02-20 08:37:59 +0000 UTC" firstStartedPulling="2026-02-20 08:38:01.819933129 +0000 UTC m=+6696.692559840" lastFinishedPulling="2026-02-20 08:38:05.818264121 +0000 UTC m=+6700.690890832" observedRunningTime="2026-02-20 08:38:06.712541554 +0000 UTC m=+6701.585168265" watchObservedRunningTime="2026-02-20 08:38:06.714067381 +0000 UTC m=+6701.586694092"
Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.729827 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.029957792 podStartE2EDuration="7.72981176s" podCreationTimestamp="2026-02-20 08:37:59 +0000 UTC" firstStartedPulling="2026-02-20 08:38:02.119451578 +0000 UTC m=+6696.992078289" lastFinishedPulling="2026-02-20 08:38:05.819305546 +0000 UTC m=+6700.691932257" observedRunningTime="2026-02-20 08:38:06.726105131 +0000 UTC m=+6701.598731842" watchObservedRunningTime="2026-02-20 08:38:06.72981176 +0000 UTC m=+6701.602438471"
Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.743363 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.641114193 podStartE2EDuration="7.743347166s" podCreationTimestamp="2026-02-20 08:37:59 +0000 UTC" firstStartedPulling="2026-02-20 08:38:01.715488095 +0000 UTC m=+6696.588114806" lastFinishedPulling="2026-02-20 08:38:05.817721068 +0000 UTC m=+6700.690347779" observedRunningTime="2026-02-20 08:38:06.743031338 +0000 UTC m=+6701.615658059" watchObservedRunningTime="2026-02-20 08:38:06.743347166 +0000 UTC m=+6701.615973877"
Feb 20 08:38:06 crc kubenswrapper[5094]: I0220 08:38:06.933213 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Feb 20 08:38:07 crc kubenswrapper[5094]: I0220 08:38:07.224811 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2"
Feb 20 08:38:07 crc kubenswrapper[5094]: I0220 08:38:07.230956 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1"
Feb 20 08:38:07 crc kubenswrapper[5094]: I0220 08:38:07.275312 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1"
Feb 20 08:38:07 crc kubenswrapper[5094]: I0220 08:38:07.419552 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Feb 20 08:38:07 crc kubenswrapper[5094]: I0220 08:38:07.674266 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"79dfde5a-85a9-437f-979d-1fdb99a1bb5f","Type":"ContainerStarted","Data":"f7d4b556c6eff4b7976324b90fe5d2082b5e471da261cf1356c01b752ea25135"}
Feb 20 08:38:07 crc kubenswrapper[5094]: I0220 08:38:07.695308 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=5.194745235 podStartE2EDuration="8.695289317s" podCreationTimestamp="2026-02-20 08:37:59 +0000 UTC" firstStartedPulling="2026-02-20 08:38:02.691238509 +0000 UTC m=+6697.563865220" lastFinishedPulling="2026-02-20 08:38:06.191782591 +0000 UTC m=+6701.064409302" observedRunningTime="2026-02-20 08:38:07.689396036 +0000 UTC m=+6702.562022747" watchObservedRunningTime="2026-02-20 08:38:07.695289317 +0000 UTC m=+6702.567916028"
Feb 20 08:38:09 crc kubenswrapper[5094]: I0220 08:38:09.987390 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Feb 20 08:38:09 crc kubenswrapper[5094]: I0220 08:38:09.987726 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Feb 20 08:38:10 crc kubenswrapper[5094]: I0220 08:38:10.268253 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2"
Feb 20 08:38:10 crc kubenswrapper[5094]: I0220 08:38:10.268961 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2"
Feb 20 08:38:10 crc kubenswrapper[5094]: I0220 08:38:10.283070 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1"
Feb 20 08:38:10 crc kubenswrapper[5094]: I0220 08:38:10.283352 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1"
Feb 20 08:38:10 crc kubenswrapper[5094]: I0220 08:38:10.289693 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2"
Feb 20 08:38:10 crc kubenswrapper[5094]: I0220 08:38:10.316752 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1"
Feb 20 08:38:10 crc kubenswrapper[5094]: I0220 08:38:10.317091 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1"
Feb 20 08:38:10 crc kubenswrapper[5094]: I0220 08:38:10.335624 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2"
Feb 20 08:38:10 crc kubenswrapper[5094]: I0220 08:38:10.457584 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Feb 20 08:38:10 crc kubenswrapper[5094]: I0220 08:38:10.458267 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Feb 20 08:38:10 crc kubenswrapper[5094]: I0220 08:38:10.701466 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.293695 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.293951 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.310803 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.351079 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.454017 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.512806 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64f8ffdb7f-bsqkd"]
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.514164 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.517309 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.528472 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64f8ffdb7f-bsqkd"]
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.644422 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64f8ffdb7f-bsqkd"]
Feb 20 08:38:11 crc kubenswrapper[5094]: E0220 08:38:11.645237 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-79prf ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd" podUID="bc60d174-9065-440e-b292-7b9646fdc03c"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.645366 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-ovsdbserver-sb\") pod \"dnsmasq-dns-64f8ffdb7f-bsqkd\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") " pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.645444 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-dns-svc\") pod \"dnsmasq-dns-64f8ffdb7f-bsqkd\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") " pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.645516 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79prf\" (UniqueName: \"kubernetes.io/projected/bc60d174-9065-440e-b292-7b9646fdc03c-kube-api-access-79prf\") pod \"dnsmasq-dns-64f8ffdb7f-bsqkd\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") " pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.645579 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-config\") pod \"dnsmasq-dns-64f8ffdb7f-bsqkd\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") " pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.671968 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68dcfc97c7-fzntw"]
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.673151 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.674891 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.686215 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68dcfc97c7-fzntw"]
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.708775 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.720490 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.746735 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-dns-svc\") pod \"dnsmasq-dns-64f8ffdb7f-bsqkd\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") " pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.746836 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79prf\" (UniqueName: \"kubernetes.io/projected/bc60d174-9065-440e-b292-7b9646fdc03c-kube-api-access-79prf\") pod \"dnsmasq-dns-64f8ffdb7f-bsqkd\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") " pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.746882 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-config\") pod \"dnsmasq-dns-64f8ffdb7f-bsqkd\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") " pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.746924 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-ovsdbserver-sb\") pod \"dnsmasq-dns-64f8ffdb7f-bsqkd\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") " pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.747646 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-dns-svc\") pod \"dnsmasq-dns-64f8ffdb7f-bsqkd\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") " pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.747648 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-ovsdbserver-sb\") pod \"dnsmasq-dns-64f8ffdb7f-bsqkd\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") " pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.747975 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-config\") pod \"dnsmasq-dns-64f8ffdb7f-bsqkd\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") " pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.763415 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79prf\" (UniqueName: \"kubernetes.io/projected/bc60d174-9065-440e-b292-7b9646fdc03c-kube-api-access-79prf\") pod \"dnsmasq-dns-64f8ffdb7f-bsqkd\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") " pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.847865 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79prf\" (UniqueName: \"kubernetes.io/projected/bc60d174-9065-440e-b292-7b9646fdc03c-kube-api-access-79prf\") pod \"bc60d174-9065-440e-b292-7b9646fdc03c\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") "
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.847921 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-dns-svc\") pod \"bc60d174-9065-440e-b292-7b9646fdc03c\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") "
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.847983 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-ovsdbserver-sb\") pod \"bc60d174-9065-440e-b292-7b9646fdc03c\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") "
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.848142 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-config\") pod \"bc60d174-9065-440e-b292-7b9646fdc03c\" (UID: \"bc60d174-9065-440e-b292-7b9646fdc03c\") "
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.848338 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5tlr\" (UniqueName: \"kubernetes.io/projected/59402637-ea41-4c78-a455-361d55c5422a-kube-api-access-k5tlr\") pod \"dnsmasq-dns-68dcfc97c7-fzntw\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.848364 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcfc97c7-fzntw\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.848550 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-config" (OuterVolumeSpecName: "config") pod "bc60d174-9065-440e-b292-7b9646fdc03c" (UID: "bc60d174-9065-440e-b292-7b9646fdc03c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.848598 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bc60d174-9065-440e-b292-7b9646fdc03c" (UID: "bc60d174-9065-440e-b292-7b9646fdc03c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.848999 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcfc97c7-fzntw\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.849206 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-dns-svc\") pod \"dnsmasq-dns-68dcfc97c7-fzntw\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.849281 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-config\") pod \"dnsmasq-dns-68dcfc97c7-fzntw\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.849661 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-config\") on node \"crc\" DevicePath \"\""
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.849693 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.849735 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bc60d174-9065-440e-b292-7b9646fdc03c" (UID: "bc60d174-9065-440e-b292-7b9646fdc03c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.852432 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc60d174-9065-440e-b292-7b9646fdc03c-kube-api-access-79prf" (OuterVolumeSpecName: "kube-api-access-79prf") pod "bc60d174-9065-440e-b292-7b9646fdc03c" (UID: "bc60d174-9065-440e-b292-7b9646fdc03c"). InnerVolumeSpecName "kube-api-access-79prf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.950891 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcfc97c7-fzntw\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.950986 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-dns-svc\") pod \"dnsmasq-dns-68dcfc97c7-fzntw\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.951022 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-config\") pod \"dnsmasq-dns-68dcfc97c7-fzntw\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.951064 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5tlr\" (UniqueName: \"kubernetes.io/projected/59402637-ea41-4c78-a455-361d55c5422a-kube-api-access-k5tlr\") pod \"dnsmasq-dns-68dcfc97c7-fzntw\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.951087 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcfc97c7-fzntw\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.951445 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79prf\" (UniqueName: \"kubernetes.io/projected/bc60d174-9065-440e-b292-7b9646fdc03c-kube-api-access-79prf\") on node \"crc\" DevicePath \"\""
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.951464 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc60d174-9065-440e-b292-7b9646fdc03c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.952158 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-config\") pod \"dnsmasq-dns-68dcfc97c7-fzntw\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.952233 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-dns-svc\") pod \"dnsmasq-dns-68dcfc97c7-fzntw\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.952357 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-ovsdbserver-sb\") pod \"dnsmasq-dns-68dcfc97c7-fzntw\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.952415 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-ovsdbserver-nb\") pod \"dnsmasq-dns-68dcfc97c7-fzntw\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.968970 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5tlr\" (UniqueName: \"kubernetes.io/projected/59402637-ea41-4c78-a455-361d55c5422a-kube-api-access-k5tlr\") pod \"dnsmasq-dns-68dcfc97c7-fzntw\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw"
Feb 20 08:38:11 crc kubenswrapper[5094]: I0220 08:38:11.990523 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw"
Feb 20 08:38:12 crc kubenswrapper[5094]: I0220 08:38:12.475050 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68dcfc97c7-fzntw"]
Feb 20 08:38:12 crc kubenswrapper[5094]: W0220 08:38:12.477755 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59402637_ea41_4c78_a455_361d55c5422a.slice/crio-cc810b39c5c9fcbe3da7e8305b82e45f0c0fd18ad2de7d721bc7779ae68beb9b WatchSource:0}: Error finding container cc810b39c5c9fcbe3da7e8305b82e45f0c0fd18ad2de7d721bc7779ae68beb9b: Status 404 returned error can't find the container with id cc810b39c5c9fcbe3da7e8305b82e45f0c0fd18ad2de7d721bc7779ae68beb9b
Feb 20 08:38:12 crc kubenswrapper[5094]: I0220 08:38:12.720039 5094 generic.go:334] "Generic (PLEG): container finished" podID="59402637-ea41-4c78-a455-361d55c5422a" containerID="63ebc0843e1f18b14a59e973034594be94e63f23e4b14fd38a292a888d5971cd" exitCode=0
Feb 20 08:38:12 crc kubenswrapper[5094]: I0220 08:38:12.720206 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" event={"ID":"59402637-ea41-4c78-a455-361d55c5422a","Type":"ContainerDied","Data":"63ebc0843e1f18b14a59e973034594be94e63f23e4b14fd38a292a888d5971cd"}
Feb 20 08:38:12 crc kubenswrapper[5094]: I0220 08:38:12.720259 5094 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64f8ffdb7f-bsqkd" Feb 20 08:38:12 crc kubenswrapper[5094]: I0220 08:38:12.720266 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" event={"ID":"59402637-ea41-4c78-a455-361d55c5422a","Type":"ContainerStarted","Data":"cc810b39c5c9fcbe3da7e8305b82e45f0c0fd18ad2de7d721bc7779ae68beb9b"} Feb 20 08:38:12 crc kubenswrapper[5094]: I0220 08:38:12.823909 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64f8ffdb7f-bsqkd"] Feb 20 08:38:12 crc kubenswrapper[5094]: I0220 08:38:12.824362 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64f8ffdb7f-bsqkd"] Feb 20 08:38:13 crc kubenswrapper[5094]: I0220 08:38:13.733524 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" event={"ID":"59402637-ea41-4c78-a455-361d55c5422a","Type":"ContainerStarted","Data":"c629842f4857fc824451a2868ac4ca4ec36a7daeeb169573db6f9b5405d05a43"} Feb 20 08:38:13 crc kubenswrapper[5094]: I0220 08:38:13.733793 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" Feb 20 08:38:13 crc kubenswrapper[5094]: I0220 08:38:13.771601 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" podStartSLOduration=2.771570752 podStartE2EDuration="2.771570752s" podCreationTimestamp="2026-02-20 08:38:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:38:13.767037833 +0000 UTC m=+6708.639664564" watchObservedRunningTime="2026-02-20 08:38:13.771570752 +0000 UTC m=+6708.644197503" Feb 20 08:38:13 crc kubenswrapper[5094]: I0220 08:38:13.854335 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc60d174-9065-440e-b292-7b9646fdc03c" 
path="/var/lib/kubelet/pods/bc60d174-9065-440e-b292-7b9646fdc03c/volumes" Feb 20 08:38:15 crc kubenswrapper[5094]: I0220 08:38:15.978348 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.214349 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.217009 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.219954 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.239616 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.321685 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/4111d2dd-641f-4113-8751-4151d435e934-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"4111d2dd-641f-4113-8751-4151d435e934\") " pod="openstack/ovn-copy-data" Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.322255 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trgg5\" (UniqueName: \"kubernetes.io/projected/4111d2dd-641f-4113-8751-4151d435e934-kube-api-access-trgg5\") pod \"ovn-copy-data\" (UID: \"4111d2dd-641f-4113-8751-4151d435e934\") " pod="openstack/ovn-copy-data" Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.322347 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87\") pod \"ovn-copy-data\" (UID: 
\"4111d2dd-641f-4113-8751-4151d435e934\") " pod="openstack/ovn-copy-data" Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.424096 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/4111d2dd-641f-4113-8751-4151d435e934-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"4111d2dd-641f-4113-8751-4151d435e934\") " pod="openstack/ovn-copy-data" Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.424181 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trgg5\" (UniqueName: \"kubernetes.io/projected/4111d2dd-641f-4113-8751-4151d435e934-kube-api-access-trgg5\") pod \"ovn-copy-data\" (UID: \"4111d2dd-641f-4113-8751-4151d435e934\") " pod="openstack/ovn-copy-data" Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.424237 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87\") pod \"ovn-copy-data\" (UID: \"4111d2dd-641f-4113-8751-4151d435e934\") " pod="openstack/ovn-copy-data" Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.427384 5094 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.427421 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87\") pod \"ovn-copy-data\" (UID: \"4111d2dd-641f-4113-8751-4151d435e934\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5ddeed08d804d11f30f0c2d3797e874048b40f01e45c1c9db13016d536198515/globalmount\"" pod="openstack/ovn-copy-data" Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.432797 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/4111d2dd-641f-4113-8751-4151d435e934-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"4111d2dd-641f-4113-8751-4151d435e934\") " pod="openstack/ovn-copy-data" Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.447219 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trgg5\" (UniqueName: \"kubernetes.io/projected/4111d2dd-641f-4113-8751-4151d435e934-kube-api-access-trgg5\") pod \"ovn-copy-data\" (UID: \"4111d2dd-641f-4113-8751-4151d435e934\") " pod="openstack/ovn-copy-data" Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.468961 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87\") pod \"ovn-copy-data\" (UID: \"4111d2dd-641f-4113-8751-4151d435e934\") " pod="openstack/ovn-copy-data" Feb 20 08:38:19 crc kubenswrapper[5094]: I0220 08:38:19.546518 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Feb 20 08:38:20 crc kubenswrapper[5094]: I0220 08:38:20.119038 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Feb 20 08:38:20 crc kubenswrapper[5094]: W0220 08:38:20.127700 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4111d2dd_641f_4113_8751_4151d435e934.slice/crio-12052248c5408eacb24c4b1b460564df0f8c856ad57e93b0f1d03df46cc8aee4 WatchSource:0}: Error finding container 12052248c5408eacb24c4b1b460564df0f8c856ad57e93b0f1d03df46cc8aee4: Status 404 returned error can't find the container with id 12052248c5408eacb24c4b1b460564df0f8c856ad57e93b0f1d03df46cc8aee4 Feb 20 08:38:20 crc kubenswrapper[5094]: I0220 08:38:20.794938 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"4111d2dd-641f-4113-8751-4151d435e934","Type":"ContainerStarted","Data":"0a962e0ed36b20eeee3760655985d477e38322eaa7ec12060bc14e41416dcf5e"} Feb 20 08:38:20 crc kubenswrapper[5094]: I0220 08:38:20.795009 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"4111d2dd-641f-4113-8751-4151d435e934","Type":"ContainerStarted","Data":"12052248c5408eacb24c4b1b460564df0f8c856ad57e93b0f1d03df46cc8aee4"} Feb 20 08:38:20 crc kubenswrapper[5094]: I0220 08:38:20.827170 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=2.5960703450000002 podStartE2EDuration="2.827144596s" podCreationTimestamp="2026-02-20 08:38:18 +0000 UTC" firstStartedPulling="2026-02-20 08:38:20.13385138 +0000 UTC m=+6715.006478131" lastFinishedPulling="2026-02-20 08:38:20.364925671 +0000 UTC m=+6715.237552382" observedRunningTime="2026-02-20 08:38:20.814574583 +0000 UTC m=+6715.687201324" watchObservedRunningTime="2026-02-20 08:38:20.827144596 +0000 UTC m=+6715.699771337" Feb 20 08:38:21 crc 
kubenswrapper[5094]: I0220 08:38:21.991987 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.056783 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c98fc4f89-fwwbb"] Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.057057 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" podUID="6f4c545b-01fc-4e08-994c-7d24a10a963e" containerName="dnsmasq-dns" containerID="cri-o://a7aa04cfead057bdf297d9b737afb89cfa0c54a45dbf6dfc5c84ca574ec804e3" gracePeriod=10 Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.550116 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.587413 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f4c545b-01fc-4e08-994c-7d24a10a963e-dns-svc\") pod \"6f4c545b-01fc-4e08-994c-7d24a10a963e\" (UID: \"6f4c545b-01fc-4e08-994c-7d24a10a963e\") " Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.587688 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f4c545b-01fc-4e08-994c-7d24a10a963e-config\") pod \"6f4c545b-01fc-4e08-994c-7d24a10a963e\" (UID: \"6f4c545b-01fc-4e08-994c-7d24a10a963e\") " Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.587962 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9shd2\" (UniqueName: \"kubernetes.io/projected/6f4c545b-01fc-4e08-994c-7d24a10a963e-kube-api-access-9shd2\") pod \"6f4c545b-01fc-4e08-994c-7d24a10a963e\" (UID: \"6f4c545b-01fc-4e08-994c-7d24a10a963e\") " Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.606989 5094 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f4c545b-01fc-4e08-994c-7d24a10a963e-kube-api-access-9shd2" (OuterVolumeSpecName: "kube-api-access-9shd2") pod "6f4c545b-01fc-4e08-994c-7d24a10a963e" (UID: "6f4c545b-01fc-4e08-994c-7d24a10a963e"). InnerVolumeSpecName "kube-api-access-9shd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.627943 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f4c545b-01fc-4e08-994c-7d24a10a963e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6f4c545b-01fc-4e08-994c-7d24a10a963e" (UID: "6f4c545b-01fc-4e08-994c-7d24a10a963e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.629884 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f4c545b-01fc-4e08-994c-7d24a10a963e-config" (OuterVolumeSpecName: "config") pod "6f4c545b-01fc-4e08-994c-7d24a10a963e" (UID: "6f4c545b-01fc-4e08-994c-7d24a10a963e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.690217 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9shd2\" (UniqueName: \"kubernetes.io/projected/6f4c545b-01fc-4e08-994c-7d24a10a963e-kube-api-access-9shd2\") on node \"crc\" DevicePath \"\"" Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.690412 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6f4c545b-01fc-4e08-994c-7d24a10a963e-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.690469 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f4c545b-01fc-4e08-994c-7d24a10a963e-config\") on node \"crc\" DevicePath \"\"" Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.811788 5094 generic.go:334] "Generic (PLEG): container finished" podID="6f4c545b-01fc-4e08-994c-7d24a10a963e" containerID="a7aa04cfead057bdf297d9b737afb89cfa0c54a45dbf6dfc5c84ca574ec804e3" exitCode=0 Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.811849 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.811845 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" event={"ID":"6f4c545b-01fc-4e08-994c-7d24a10a963e","Type":"ContainerDied","Data":"a7aa04cfead057bdf297d9b737afb89cfa0c54a45dbf6dfc5c84ca574ec804e3"} Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.812276 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c98fc4f89-fwwbb" event={"ID":"6f4c545b-01fc-4e08-994c-7d24a10a963e","Type":"ContainerDied","Data":"980a56a2f88743ff53bcdb60a139ba0c40f893ee3421dbdc2711a580ef52fcca"} Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.812297 5094 scope.go:117] "RemoveContainer" containerID="a7aa04cfead057bdf297d9b737afb89cfa0c54a45dbf6dfc5c84ca574ec804e3" Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.830116 5094 scope.go:117] "RemoveContainer" containerID="6dc775b2c5f76a93f5e36b63c468418d60dce69b4e0ea3323ccd0a9f35c92e84" Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.841861 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c98fc4f89-fwwbb"] Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.847980 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c98fc4f89-fwwbb"] Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.850286 5094 scope.go:117] "RemoveContainer" containerID="a7aa04cfead057bdf297d9b737afb89cfa0c54a45dbf6dfc5c84ca574ec804e3" Feb 20 08:38:22 crc kubenswrapper[5094]: E0220 08:38:22.850747 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7aa04cfead057bdf297d9b737afb89cfa0c54a45dbf6dfc5c84ca574ec804e3\": container with ID starting with a7aa04cfead057bdf297d9b737afb89cfa0c54a45dbf6dfc5c84ca574ec804e3 not found: ID does not exist" 
containerID="a7aa04cfead057bdf297d9b737afb89cfa0c54a45dbf6dfc5c84ca574ec804e3" Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.850864 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7aa04cfead057bdf297d9b737afb89cfa0c54a45dbf6dfc5c84ca574ec804e3"} err="failed to get container status \"a7aa04cfead057bdf297d9b737afb89cfa0c54a45dbf6dfc5c84ca574ec804e3\": rpc error: code = NotFound desc = could not find container \"a7aa04cfead057bdf297d9b737afb89cfa0c54a45dbf6dfc5c84ca574ec804e3\": container with ID starting with a7aa04cfead057bdf297d9b737afb89cfa0c54a45dbf6dfc5c84ca574ec804e3 not found: ID does not exist" Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.850959 5094 scope.go:117] "RemoveContainer" containerID="6dc775b2c5f76a93f5e36b63c468418d60dce69b4e0ea3323ccd0a9f35c92e84" Feb 20 08:38:22 crc kubenswrapper[5094]: E0220 08:38:22.851458 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dc775b2c5f76a93f5e36b63c468418d60dce69b4e0ea3323ccd0a9f35c92e84\": container with ID starting with 6dc775b2c5f76a93f5e36b63c468418d60dce69b4e0ea3323ccd0a9f35c92e84 not found: ID does not exist" containerID="6dc775b2c5f76a93f5e36b63c468418d60dce69b4e0ea3323ccd0a9f35c92e84" Feb 20 08:38:22 crc kubenswrapper[5094]: I0220 08:38:22.851560 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dc775b2c5f76a93f5e36b63c468418d60dce69b4e0ea3323ccd0a9f35c92e84"} err="failed to get container status \"6dc775b2c5f76a93f5e36b63c468418d60dce69b4e0ea3323ccd0a9f35c92e84\": rpc error: code = NotFound desc = could not find container \"6dc775b2c5f76a93f5e36b63c468418d60dce69b4e0ea3323ccd0a9f35c92e84\": container with ID starting with 6dc775b2c5f76a93f5e36b63c468418d60dce69b4e0ea3323ccd0a9f35c92e84 not found: ID does not exist" Feb 20 08:38:23 crc kubenswrapper[5094]: I0220 08:38:23.851454 5094 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f4c545b-01fc-4e08-994c-7d24a10a963e" path="/var/lib/kubelet/pods/6f4c545b-01fc-4e08-994c-7d24a10a963e/volumes" Feb 20 08:38:27 crc kubenswrapper[5094]: E0220 08:38:27.936901 5094 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.188:34014->38.102.83.188:42807: write tcp 38.102.83.188:34014->38.102.83.188:42807: write: broken pipe Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.291866 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 20 08:38:29 crc kubenswrapper[5094]: E0220 08:38:29.292336 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4c545b-01fc-4e08-994c-7d24a10a963e" containerName="dnsmasq-dns" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.292356 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4c545b-01fc-4e08-994c-7d24a10a963e" containerName="dnsmasq-dns" Feb 20 08:38:29 crc kubenswrapper[5094]: E0220 08:38:29.292391 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4c545b-01fc-4e08-994c-7d24a10a963e" containerName="init" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.292399 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4c545b-01fc-4e08-994c-7d24a10a963e" containerName="init" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.292585 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f4c545b-01fc-4e08-994c-7d24a10a963e" containerName="dnsmasq-dns" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.294082 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.295891 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.295917 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-g4jlf" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.297190 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.306933 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.408601 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4bd6bf8e-8e67-4de1-a294-6b5d50f1797a-scripts\") pod \"ovn-northd-0\" (UID: \"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a\") " pod="openstack/ovn-northd-0" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.409041 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bd6bf8e-8e67-4de1-a294-6b5d50f1797a-config\") pod \"ovn-northd-0\" (UID: \"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a\") " pod="openstack/ovn-northd-0" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.409074 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4bd6bf8e-8e67-4de1-a294-6b5d50f1797a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a\") " pod="openstack/ovn-northd-0" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.409148 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znfq5\" 
(UniqueName: \"kubernetes.io/projected/4bd6bf8e-8e67-4de1-a294-6b5d50f1797a-kube-api-access-znfq5\") pod \"ovn-northd-0\" (UID: \"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a\") " pod="openstack/ovn-northd-0" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.409202 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd6bf8e-8e67-4de1-a294-6b5d50f1797a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a\") " pod="openstack/ovn-northd-0" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.510362 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4bd6bf8e-8e67-4de1-a294-6b5d50f1797a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a\") " pod="openstack/ovn-northd-0" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.510457 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znfq5\" (UniqueName: \"kubernetes.io/projected/4bd6bf8e-8e67-4de1-a294-6b5d50f1797a-kube-api-access-znfq5\") pod \"ovn-northd-0\" (UID: \"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a\") " pod="openstack/ovn-northd-0" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.510481 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd6bf8e-8e67-4de1-a294-6b5d50f1797a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a\") " pod="openstack/ovn-northd-0" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.510538 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4bd6bf8e-8e67-4de1-a294-6b5d50f1797a-scripts\") pod \"ovn-northd-0\" (UID: \"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a\") " 
pod="openstack/ovn-northd-0" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.510566 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bd6bf8e-8e67-4de1-a294-6b5d50f1797a-config\") pod \"ovn-northd-0\" (UID: \"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a\") " pod="openstack/ovn-northd-0" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.510954 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4bd6bf8e-8e67-4de1-a294-6b5d50f1797a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a\") " pod="openstack/ovn-northd-0" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.512800 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4bd6bf8e-8e67-4de1-a294-6b5d50f1797a-scripts\") pod \"ovn-northd-0\" (UID: \"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a\") " pod="openstack/ovn-northd-0" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.512815 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bd6bf8e-8e67-4de1-a294-6b5d50f1797a-config\") pod \"ovn-northd-0\" (UID: \"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a\") " pod="openstack/ovn-northd-0" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.517468 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd6bf8e-8e67-4de1-a294-6b5d50f1797a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a\") " pod="openstack/ovn-northd-0" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.526461 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znfq5\" (UniqueName: \"kubernetes.io/projected/4bd6bf8e-8e67-4de1-a294-6b5d50f1797a-kube-api-access-znfq5\") pod 
\"ovn-northd-0\" (UID: \"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a\") " pod="openstack/ovn-northd-0" Feb 20 08:38:29 crc kubenswrapper[5094]: I0220 08:38:29.616066 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 20 08:38:30 crc kubenswrapper[5094]: I0220 08:38:30.187788 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 20 08:38:30 crc kubenswrapper[5094]: I0220 08:38:30.908025 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a","Type":"ContainerStarted","Data":"123d94f8371eb846e916485504943df8a24d926aa8884f217c14b7c3e87aac28"} Feb 20 08:38:31 crc kubenswrapper[5094]: I0220 08:38:31.918104 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a","Type":"ContainerStarted","Data":"0c180cd6c00e3610429ef6452551fc6bec59df4886e740d49346a96f85490f71"} Feb 20 08:38:31 crc kubenswrapper[5094]: I0220 08:38:31.918478 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"4bd6bf8e-8e67-4de1-a294-6b5d50f1797a","Type":"ContainerStarted","Data":"a3978845dbbac7c8217dc5684fa73e1b55aaa287ef2ea2f0928de9dc6434e1ab"} Feb 20 08:38:31 crc kubenswrapper[5094]: I0220 08:38:31.919863 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 20 08:38:31 crc kubenswrapper[5094]: I0220 08:38:31.939442 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.27140524 podStartE2EDuration="2.939408957s" podCreationTimestamp="2026-02-20 08:38:29 +0000 UTC" firstStartedPulling="2026-02-20 08:38:30.200213938 +0000 UTC m=+6725.072840649" lastFinishedPulling="2026-02-20 08:38:30.868217665 +0000 UTC m=+6725.740844366" observedRunningTime="2026-02-20 08:38:31.934154291 +0000 UTC m=+6726.806781052" 
watchObservedRunningTime="2026-02-20 08:38:31.939408957 +0000 UTC m=+6726.812035708" Feb 20 08:38:37 crc kubenswrapper[5094]: I0220 08:38:37.849845 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-l2kgb"] Feb 20 08:38:37 crc kubenswrapper[5094]: I0220 08:38:37.851634 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-l2kgb" Feb 20 08:38:37 crc kubenswrapper[5094]: I0220 08:38:37.858396 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-l2kgb"] Feb 20 08:38:37 crc kubenswrapper[5094]: I0220 08:38:37.943117 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5af6-account-create-update-9f77r"] Feb 20 08:38:37 crc kubenswrapper[5094]: I0220 08:38:37.944157 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5af6-account-create-update-9f77r" Feb 20 08:38:37 crc kubenswrapper[5094]: I0220 08:38:37.946373 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 20 08:38:37 crc kubenswrapper[5094]: I0220 08:38:37.956844 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec59a7fc-e360-4e39-8c57-cfaa43d23566-operator-scripts\") pod \"keystone-db-create-l2kgb\" (UID: \"ec59a7fc-e360-4e39-8c57-cfaa43d23566\") " pod="openstack/keystone-db-create-l2kgb" Feb 20 08:38:37 crc kubenswrapper[5094]: I0220 08:38:37.957299 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wj4s\" (UniqueName: \"kubernetes.io/projected/ec59a7fc-e360-4e39-8c57-cfaa43d23566-kube-api-access-7wj4s\") pod \"keystone-db-create-l2kgb\" (UID: \"ec59a7fc-e360-4e39-8c57-cfaa43d23566\") " pod="openstack/keystone-db-create-l2kgb" Feb 20 08:38:37 crc kubenswrapper[5094]: I0220 08:38:37.957489 
5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5af6-account-create-update-9f77r"] Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.058656 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wj4s\" (UniqueName: \"kubernetes.io/projected/ec59a7fc-e360-4e39-8c57-cfaa43d23566-kube-api-access-7wj4s\") pod \"keystone-db-create-l2kgb\" (UID: \"ec59a7fc-e360-4e39-8c57-cfaa43d23566\") " pod="openstack/keystone-db-create-l2kgb" Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.058747 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d1678de-0344-47d5-98bb-d9ffd63912e7-operator-scripts\") pod \"keystone-5af6-account-create-update-9f77r\" (UID: \"1d1678de-0344-47d5-98bb-d9ffd63912e7\") " pod="openstack/keystone-5af6-account-create-update-9f77r" Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.058802 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g9h5\" (UniqueName: \"kubernetes.io/projected/1d1678de-0344-47d5-98bb-d9ffd63912e7-kube-api-access-2g9h5\") pod \"keystone-5af6-account-create-update-9f77r\" (UID: \"1d1678de-0344-47d5-98bb-d9ffd63912e7\") " pod="openstack/keystone-5af6-account-create-update-9f77r" Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.058848 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec59a7fc-e360-4e39-8c57-cfaa43d23566-operator-scripts\") pod \"keystone-db-create-l2kgb\" (UID: \"ec59a7fc-e360-4e39-8c57-cfaa43d23566\") " pod="openstack/keystone-db-create-l2kgb" Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.059495 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ec59a7fc-e360-4e39-8c57-cfaa43d23566-operator-scripts\") pod \"keystone-db-create-l2kgb\" (UID: \"ec59a7fc-e360-4e39-8c57-cfaa43d23566\") " pod="openstack/keystone-db-create-l2kgb" Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.076176 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wj4s\" (UniqueName: \"kubernetes.io/projected/ec59a7fc-e360-4e39-8c57-cfaa43d23566-kube-api-access-7wj4s\") pod \"keystone-db-create-l2kgb\" (UID: \"ec59a7fc-e360-4e39-8c57-cfaa43d23566\") " pod="openstack/keystone-db-create-l2kgb" Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.159831 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g9h5\" (UniqueName: \"kubernetes.io/projected/1d1678de-0344-47d5-98bb-d9ffd63912e7-kube-api-access-2g9h5\") pod \"keystone-5af6-account-create-update-9f77r\" (UID: \"1d1678de-0344-47d5-98bb-d9ffd63912e7\") " pod="openstack/keystone-5af6-account-create-update-9f77r" Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.160289 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d1678de-0344-47d5-98bb-d9ffd63912e7-operator-scripts\") pod \"keystone-5af6-account-create-update-9f77r\" (UID: \"1d1678de-0344-47d5-98bb-d9ffd63912e7\") " pod="openstack/keystone-5af6-account-create-update-9f77r" Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.161081 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d1678de-0344-47d5-98bb-d9ffd63912e7-operator-scripts\") pod \"keystone-5af6-account-create-update-9f77r\" (UID: \"1d1678de-0344-47d5-98bb-d9ffd63912e7\") " pod="openstack/keystone-5af6-account-create-update-9f77r" Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.170007 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-l2kgb" Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.181694 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g9h5\" (UniqueName: \"kubernetes.io/projected/1d1678de-0344-47d5-98bb-d9ffd63912e7-kube-api-access-2g9h5\") pod \"keystone-5af6-account-create-update-9f77r\" (UID: \"1d1678de-0344-47d5-98bb-d9ffd63912e7\") " pod="openstack/keystone-5af6-account-create-update-9f77r" Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.260002 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5af6-account-create-update-9f77r" Feb 20 08:38:38 crc kubenswrapper[5094]: W0220 08:38:38.613502 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec59a7fc_e360_4e39_8c57_cfaa43d23566.slice/crio-7d96dce0fca24fe420e2d9aacc437cea985c151685fc2474b7c889a87e6ea238 WatchSource:0}: Error finding container 7d96dce0fca24fe420e2d9aacc437cea985c151685fc2474b7c889a87e6ea238: Status 404 returned error can't find the container with id 7d96dce0fca24fe420e2d9aacc437cea985c151685fc2474b7c889a87e6ea238 Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.616136 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-l2kgb"] Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.703316 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5af6-account-create-update-9f77r"] Feb 20 08:38:38 crc kubenswrapper[5094]: W0220 08:38:38.712928 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d1678de_0344_47d5_98bb_d9ffd63912e7.slice/crio-f7c8fee7f1cfc445472a648944b8f6e3d207b38fb0473114436526e8f7d1d17a WatchSource:0}: Error finding container f7c8fee7f1cfc445472a648944b8f6e3d207b38fb0473114436526e8f7d1d17a: Status 404 returned 
error can't find the container with id f7c8fee7f1cfc445472a648944b8f6e3d207b38fb0473114436526e8f7d1d17a Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.974641 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5af6-account-create-update-9f77r" event={"ID":"1d1678de-0344-47d5-98bb-d9ffd63912e7","Type":"ContainerStarted","Data":"f7c8fee7f1cfc445472a648944b8f6e3d207b38fb0473114436526e8f7d1d17a"} Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.976187 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-l2kgb" event={"ID":"ec59a7fc-e360-4e39-8c57-cfaa43d23566","Type":"ContainerStarted","Data":"cd80f63c99e6776fe7e02e1fb80f9d75dc8541b9606921e566c011df6c0e65ac"} Feb 20 08:38:38 crc kubenswrapper[5094]: I0220 08:38:38.976262 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-l2kgb" event={"ID":"ec59a7fc-e360-4e39-8c57-cfaa43d23566","Type":"ContainerStarted","Data":"7d96dce0fca24fe420e2d9aacc437cea985c151685fc2474b7c889a87e6ea238"} Feb 20 08:38:39 crc kubenswrapper[5094]: I0220 08:38:39.004861 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-l2kgb" podStartSLOduration=2.004828478 podStartE2EDuration="2.004828478s" podCreationTimestamp="2026-02-20 08:38:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:38:38.995307639 +0000 UTC m=+6733.867934390" watchObservedRunningTime="2026-02-20 08:38:39.004828478 +0000 UTC m=+6733.877455199" Feb 20 08:38:39 crc kubenswrapper[5094]: I0220 08:38:39.990397 5094 generic.go:334] "Generic (PLEG): container finished" podID="1d1678de-0344-47d5-98bb-d9ffd63912e7" containerID="52db5b53565602a22b540482712ac73023427fa1b0c5c5dd0a43d58c9fbc73b5" exitCode=0 Feb 20 08:38:39 crc kubenswrapper[5094]: I0220 08:38:39.990603 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-5af6-account-create-update-9f77r" event={"ID":"1d1678de-0344-47d5-98bb-d9ffd63912e7","Type":"ContainerDied","Data":"52db5b53565602a22b540482712ac73023427fa1b0c5c5dd0a43d58c9fbc73b5"} Feb 20 08:38:39 crc kubenswrapper[5094]: I0220 08:38:39.994458 5094 generic.go:334] "Generic (PLEG): container finished" podID="ec59a7fc-e360-4e39-8c57-cfaa43d23566" containerID="cd80f63c99e6776fe7e02e1fb80f9d75dc8541b9606921e566c011df6c0e65ac" exitCode=0 Feb 20 08:38:39 crc kubenswrapper[5094]: I0220 08:38:39.994527 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-l2kgb" event={"ID":"ec59a7fc-e360-4e39-8c57-cfaa43d23566","Type":"ContainerDied","Data":"cd80f63c99e6776fe7e02e1fb80f9d75dc8541b9606921e566c011df6c0e65ac"} Feb 20 08:38:41 crc kubenswrapper[5094]: I0220 08:38:41.383068 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-l2kgb" Feb 20 08:38:41 crc kubenswrapper[5094]: I0220 08:38:41.389211 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5af6-account-create-update-9f77r" Feb 20 08:38:41 crc kubenswrapper[5094]: I0220 08:38:41.426497 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec59a7fc-e360-4e39-8c57-cfaa43d23566-operator-scripts\") pod \"ec59a7fc-e360-4e39-8c57-cfaa43d23566\" (UID: \"ec59a7fc-e360-4e39-8c57-cfaa43d23566\") " Feb 20 08:38:41 crc kubenswrapper[5094]: I0220 08:38:41.426549 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d1678de-0344-47d5-98bb-d9ffd63912e7-operator-scripts\") pod \"1d1678de-0344-47d5-98bb-d9ffd63912e7\" (UID: \"1d1678de-0344-47d5-98bb-d9ffd63912e7\") " Feb 20 08:38:41 crc kubenswrapper[5094]: I0220 08:38:41.426582 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2g9h5\" (UniqueName: \"kubernetes.io/projected/1d1678de-0344-47d5-98bb-d9ffd63912e7-kube-api-access-2g9h5\") pod \"1d1678de-0344-47d5-98bb-d9ffd63912e7\" (UID: \"1d1678de-0344-47d5-98bb-d9ffd63912e7\") " Feb 20 08:38:41 crc kubenswrapper[5094]: I0220 08:38:41.426639 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wj4s\" (UniqueName: \"kubernetes.io/projected/ec59a7fc-e360-4e39-8c57-cfaa43d23566-kube-api-access-7wj4s\") pod \"ec59a7fc-e360-4e39-8c57-cfaa43d23566\" (UID: \"ec59a7fc-e360-4e39-8c57-cfaa43d23566\") " Feb 20 08:38:41 crc kubenswrapper[5094]: I0220 08:38:41.427469 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec59a7fc-e360-4e39-8c57-cfaa43d23566-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ec59a7fc-e360-4e39-8c57-cfaa43d23566" (UID: "ec59a7fc-e360-4e39-8c57-cfaa43d23566"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:38:41 crc kubenswrapper[5094]: I0220 08:38:41.427488 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d1678de-0344-47d5-98bb-d9ffd63912e7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1d1678de-0344-47d5-98bb-d9ffd63912e7" (UID: "1d1678de-0344-47d5-98bb-d9ffd63912e7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:38:41 crc kubenswrapper[5094]: I0220 08:38:41.441074 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec59a7fc-e360-4e39-8c57-cfaa43d23566-kube-api-access-7wj4s" (OuterVolumeSpecName: "kube-api-access-7wj4s") pod "ec59a7fc-e360-4e39-8c57-cfaa43d23566" (UID: "ec59a7fc-e360-4e39-8c57-cfaa43d23566"). InnerVolumeSpecName "kube-api-access-7wj4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:38:41 crc kubenswrapper[5094]: I0220 08:38:41.441164 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d1678de-0344-47d5-98bb-d9ffd63912e7-kube-api-access-2g9h5" (OuterVolumeSpecName: "kube-api-access-2g9h5") pod "1d1678de-0344-47d5-98bb-d9ffd63912e7" (UID: "1d1678de-0344-47d5-98bb-d9ffd63912e7"). InnerVolumeSpecName "kube-api-access-2g9h5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:38:41 crc kubenswrapper[5094]: I0220 08:38:41.528375 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ec59a7fc-e360-4e39-8c57-cfaa43d23566-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:38:41 crc kubenswrapper[5094]: I0220 08:38:41.528406 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d1678de-0344-47d5-98bb-d9ffd63912e7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:38:41 crc kubenswrapper[5094]: I0220 08:38:41.528417 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2g9h5\" (UniqueName: \"kubernetes.io/projected/1d1678de-0344-47d5-98bb-d9ffd63912e7-kube-api-access-2g9h5\") on node \"crc\" DevicePath \"\"" Feb 20 08:38:41 crc kubenswrapper[5094]: I0220 08:38:41.528427 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wj4s\" (UniqueName: \"kubernetes.io/projected/ec59a7fc-e360-4e39-8c57-cfaa43d23566-kube-api-access-7wj4s\") on node \"crc\" DevicePath \"\"" Feb 20 08:38:42 crc kubenswrapper[5094]: I0220 08:38:42.009316 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5af6-account-create-update-9f77r" event={"ID":"1d1678de-0344-47d5-98bb-d9ffd63912e7","Type":"ContainerDied","Data":"f7c8fee7f1cfc445472a648944b8f6e3d207b38fb0473114436526e8f7d1d17a"} Feb 20 08:38:42 crc kubenswrapper[5094]: I0220 08:38:42.009359 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7c8fee7f1cfc445472a648944b8f6e3d207b38fb0473114436526e8f7d1d17a" Feb 20 08:38:42 crc kubenswrapper[5094]: I0220 08:38:42.009363 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5af6-account-create-update-9f77r" Feb 20 08:38:42 crc kubenswrapper[5094]: I0220 08:38:42.010674 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-l2kgb" event={"ID":"ec59a7fc-e360-4e39-8c57-cfaa43d23566","Type":"ContainerDied","Data":"7d96dce0fca24fe420e2d9aacc437cea985c151685fc2474b7c889a87e6ea238"} Feb 20 08:38:42 crc kubenswrapper[5094]: I0220 08:38:42.010711 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d96dce0fca24fe420e2d9aacc437cea985c151685fc2474b7c889a87e6ea238" Feb 20 08:38:42 crc kubenswrapper[5094]: I0220 08:38:42.010765 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-l2kgb" Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.415484 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-bh8cv"] Feb 20 08:38:43 crc kubenswrapper[5094]: E0220 08:38:43.416803 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec59a7fc-e360-4e39-8c57-cfaa43d23566" containerName="mariadb-database-create" Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.416878 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec59a7fc-e360-4e39-8c57-cfaa43d23566" containerName="mariadb-database-create" Feb 20 08:38:43 crc kubenswrapper[5094]: E0220 08:38:43.416932 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d1678de-0344-47d5-98bb-d9ffd63912e7" containerName="mariadb-account-create-update" Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.416980 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d1678de-0344-47d5-98bb-d9ffd63912e7" containerName="mariadb-account-create-update" Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.417204 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec59a7fc-e360-4e39-8c57-cfaa43d23566" containerName="mariadb-database-create" 
Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.417279 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d1678de-0344-47d5-98bb-d9ffd63912e7" containerName="mariadb-account-create-update" Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.417912 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-bh8cv" Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.431404 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.434301 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.434350 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.434442 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-knkl6" Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.440875 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-bh8cv"] Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.462884 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81601ce5-f2ae-4f57-a829-6b235b7ae4df-config-data\") pod \"keystone-db-sync-bh8cv\" (UID: \"81601ce5-f2ae-4f57-a829-6b235b7ae4df\") " pod="openstack/keystone-db-sync-bh8cv" Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.462942 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81601ce5-f2ae-4f57-a829-6b235b7ae4df-combined-ca-bundle\") pod \"keystone-db-sync-bh8cv\" (UID: \"81601ce5-f2ae-4f57-a829-6b235b7ae4df\") " pod="openstack/keystone-db-sync-bh8cv" 
Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.463107 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7zvc\" (UniqueName: \"kubernetes.io/projected/81601ce5-f2ae-4f57-a829-6b235b7ae4df-kube-api-access-s7zvc\") pod \"keystone-db-sync-bh8cv\" (UID: \"81601ce5-f2ae-4f57-a829-6b235b7ae4df\") " pod="openstack/keystone-db-sync-bh8cv" Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.565083 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81601ce5-f2ae-4f57-a829-6b235b7ae4df-config-data\") pod \"keystone-db-sync-bh8cv\" (UID: \"81601ce5-f2ae-4f57-a829-6b235b7ae4df\") " pod="openstack/keystone-db-sync-bh8cv" Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.565560 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81601ce5-f2ae-4f57-a829-6b235b7ae4df-combined-ca-bundle\") pod \"keystone-db-sync-bh8cv\" (UID: \"81601ce5-f2ae-4f57-a829-6b235b7ae4df\") " pod="openstack/keystone-db-sync-bh8cv" Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.565784 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7zvc\" (UniqueName: \"kubernetes.io/projected/81601ce5-f2ae-4f57-a829-6b235b7ae4df-kube-api-access-s7zvc\") pod \"keystone-db-sync-bh8cv\" (UID: \"81601ce5-f2ae-4f57-a829-6b235b7ae4df\") " pod="openstack/keystone-db-sync-bh8cv" Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.570289 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81601ce5-f2ae-4f57-a829-6b235b7ae4df-combined-ca-bundle\") pod \"keystone-db-sync-bh8cv\" (UID: \"81601ce5-f2ae-4f57-a829-6b235b7ae4df\") " pod="openstack/keystone-db-sync-bh8cv" Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.570441 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81601ce5-f2ae-4f57-a829-6b235b7ae4df-config-data\") pod \"keystone-db-sync-bh8cv\" (UID: \"81601ce5-f2ae-4f57-a829-6b235b7ae4df\") " pod="openstack/keystone-db-sync-bh8cv" Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.581351 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7zvc\" (UniqueName: \"kubernetes.io/projected/81601ce5-f2ae-4f57-a829-6b235b7ae4df-kube-api-access-s7zvc\") pod \"keystone-db-sync-bh8cv\" (UID: \"81601ce5-f2ae-4f57-a829-6b235b7ae4df\") " pod="openstack/keystone-db-sync-bh8cv" Feb 20 08:38:43 crc kubenswrapper[5094]: I0220 08:38:43.734265 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-bh8cv" Feb 20 08:38:44 crc kubenswrapper[5094]: I0220 08:38:44.229650 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-bh8cv"] Feb 20 08:38:45 crc kubenswrapper[5094]: I0220 08:38:45.059120 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bh8cv" event={"ID":"81601ce5-f2ae-4f57-a829-6b235b7ae4df","Type":"ContainerStarted","Data":"498c87e28265cbefbe45ff4c9051d34828ab6173c31904d7e27419e27276c4e5"} Feb 20 08:38:49 crc kubenswrapper[5094]: I0220 08:38:49.684636 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 20 08:38:50 crc kubenswrapper[5094]: I0220 08:38:50.094943 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bh8cv" event={"ID":"81601ce5-f2ae-4f57-a829-6b235b7ae4df","Type":"ContainerStarted","Data":"8731ce151a95e8d35ad8f8cec8f98989c881d304429dbe888614942996f9f454"} Feb 20 08:38:50 crc kubenswrapper[5094]: I0220 08:38:50.113767 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-bh8cv" 
podStartSLOduration=2.357895886 podStartE2EDuration="7.113739559s" podCreationTimestamp="2026-02-20 08:38:43 +0000 UTC" firstStartedPulling="2026-02-20 08:38:44.230173473 +0000 UTC m=+6739.102800224" lastFinishedPulling="2026-02-20 08:38:48.986017186 +0000 UTC m=+6743.858643897" observedRunningTime="2026-02-20 08:38:50.110263475 +0000 UTC m=+6744.982890196" watchObservedRunningTime="2026-02-20 08:38:50.113739559 +0000 UTC m=+6744.986366280" Feb 20 08:38:51 crc kubenswrapper[5094]: I0220 08:38:51.103484 5094 generic.go:334] "Generic (PLEG): container finished" podID="81601ce5-f2ae-4f57-a829-6b235b7ae4df" containerID="8731ce151a95e8d35ad8f8cec8f98989c881d304429dbe888614942996f9f454" exitCode=0 Feb 20 08:38:51 crc kubenswrapper[5094]: I0220 08:38:51.103571 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bh8cv" event={"ID":"81601ce5-f2ae-4f57-a829-6b235b7ae4df","Type":"ContainerDied","Data":"8731ce151a95e8d35ad8f8cec8f98989c881d304429dbe888614942996f9f454"} Feb 20 08:38:52 crc kubenswrapper[5094]: I0220 08:38:52.400298 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-bh8cv" Feb 20 08:38:52 crc kubenswrapper[5094]: I0220 08:38:52.508316 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81601ce5-f2ae-4f57-a829-6b235b7ae4df-config-data\") pod \"81601ce5-f2ae-4f57-a829-6b235b7ae4df\" (UID: \"81601ce5-f2ae-4f57-a829-6b235b7ae4df\") " Feb 20 08:38:52 crc kubenswrapper[5094]: I0220 08:38:52.508398 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81601ce5-f2ae-4f57-a829-6b235b7ae4df-combined-ca-bundle\") pod \"81601ce5-f2ae-4f57-a829-6b235b7ae4df\" (UID: \"81601ce5-f2ae-4f57-a829-6b235b7ae4df\") " Feb 20 08:38:52 crc kubenswrapper[5094]: I0220 08:38:52.508426 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7zvc\" (UniqueName: \"kubernetes.io/projected/81601ce5-f2ae-4f57-a829-6b235b7ae4df-kube-api-access-s7zvc\") pod \"81601ce5-f2ae-4f57-a829-6b235b7ae4df\" (UID: \"81601ce5-f2ae-4f57-a829-6b235b7ae4df\") " Feb 20 08:38:52 crc kubenswrapper[5094]: I0220 08:38:52.512869 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81601ce5-f2ae-4f57-a829-6b235b7ae4df-kube-api-access-s7zvc" (OuterVolumeSpecName: "kube-api-access-s7zvc") pod "81601ce5-f2ae-4f57-a829-6b235b7ae4df" (UID: "81601ce5-f2ae-4f57-a829-6b235b7ae4df"). InnerVolumeSpecName "kube-api-access-s7zvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:38:52 crc kubenswrapper[5094]: I0220 08:38:52.530735 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81601ce5-f2ae-4f57-a829-6b235b7ae4df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "81601ce5-f2ae-4f57-a829-6b235b7ae4df" (UID: "81601ce5-f2ae-4f57-a829-6b235b7ae4df"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:38:52 crc kubenswrapper[5094]: I0220 08:38:52.547039 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81601ce5-f2ae-4f57-a829-6b235b7ae4df-config-data" (OuterVolumeSpecName: "config-data") pod "81601ce5-f2ae-4f57-a829-6b235b7ae4df" (UID: "81601ce5-f2ae-4f57-a829-6b235b7ae4df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:38:52 crc kubenswrapper[5094]: I0220 08:38:52.610404 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/81601ce5-f2ae-4f57-a829-6b235b7ae4df-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:38:52 crc kubenswrapper[5094]: I0220 08:38:52.610435 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81601ce5-f2ae-4f57-a829-6b235b7ae4df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:38:52 crc kubenswrapper[5094]: I0220 08:38:52.610477 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7zvc\" (UniqueName: \"kubernetes.io/projected/81601ce5-f2ae-4f57-a829-6b235b7ae4df-kube-api-access-s7zvc\") on node \"crc\" DevicePath \"\"" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.121203 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-bh8cv" event={"ID":"81601ce5-f2ae-4f57-a829-6b235b7ae4df","Type":"ContainerDied","Data":"498c87e28265cbefbe45ff4c9051d34828ab6173c31904d7e27419e27276c4e5"} Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.121529 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="498c87e28265cbefbe45ff4c9051d34828ab6173c31904d7e27419e27276c4e5" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.121274 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-bh8cv" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.334408 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77c94f6b7-twntj"] Feb 20 08:38:53 crc kubenswrapper[5094]: E0220 08:38:53.334833 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81601ce5-f2ae-4f57-a829-6b235b7ae4df" containerName="keystone-db-sync" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.334855 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="81601ce5-f2ae-4f57-a829-6b235b7ae4df" containerName="keystone-db-sync" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.335085 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="81601ce5-f2ae-4f57-a829-6b235b7ae4df" containerName="keystone-db-sync" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.336159 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77c94f6b7-twntj" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.357206 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77c94f6b7-twntj"] Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.401439 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-sfb9q"] Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.402465 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.406531 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.406682 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.407902 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-knkl6" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.408088 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.411947 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.416526 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-sfb9q"] Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.420815 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-dns-svc\") pod \"dnsmasq-dns-77c94f6b7-twntj\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " pod="openstack/dnsmasq-dns-77c94f6b7-twntj" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.420915 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-config\") pod \"dnsmasq-dns-77c94f6b7-twntj\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " pod="openstack/dnsmasq-dns-77c94f6b7-twntj" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.420980 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xcqs7\" (UniqueName: \"kubernetes.io/projected/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-kube-api-access-xcqs7\") pod \"dnsmasq-dns-77c94f6b7-twntj\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " pod="openstack/dnsmasq-dns-77c94f6b7-twntj" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.421011 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-ovsdbserver-sb\") pod \"dnsmasq-dns-77c94f6b7-twntj\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " pod="openstack/dnsmasq-dns-77c94f6b7-twntj" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.421044 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-ovsdbserver-nb\") pod \"dnsmasq-dns-77c94f6b7-twntj\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " pod="openstack/dnsmasq-dns-77c94f6b7-twntj" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.522295 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2b5t\" (UniqueName: \"kubernetes.io/projected/80bba0dd-119d-47bb-8526-df2e59c5b132-kube-api-access-s2b5t\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.522367 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-ovsdbserver-nb\") pod \"dnsmasq-dns-77c94f6b7-twntj\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " pod="openstack/dnsmasq-dns-77c94f6b7-twntj" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.522407 5094 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-dns-svc\") pod \"dnsmasq-dns-77c94f6b7-twntj\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " pod="openstack/dnsmasq-dns-77c94f6b7-twntj" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.522436 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-credential-keys\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.522478 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-config\") pod \"dnsmasq-dns-77c94f6b7-twntj\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " pod="openstack/dnsmasq-dns-77c94f6b7-twntj" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.522502 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-combined-ca-bundle\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.522622 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-config-data\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.522639 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-fernet-keys\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.522679 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-scripts\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.522766 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcqs7\" (UniqueName: \"kubernetes.io/projected/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-kube-api-access-xcqs7\") pod \"dnsmasq-dns-77c94f6b7-twntj\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " pod="openstack/dnsmasq-dns-77c94f6b7-twntj" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.522800 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-ovsdbserver-sb\") pod \"dnsmasq-dns-77c94f6b7-twntj\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " pod="openstack/dnsmasq-dns-77c94f6b7-twntj" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.523796 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-ovsdbserver-sb\") pod \"dnsmasq-dns-77c94f6b7-twntj\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " pod="openstack/dnsmasq-dns-77c94f6b7-twntj" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.524288 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-ovsdbserver-nb\") pod \"dnsmasq-dns-77c94f6b7-twntj\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " pod="openstack/dnsmasq-dns-77c94f6b7-twntj" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.525338 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-dns-svc\") pod \"dnsmasq-dns-77c94f6b7-twntj\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " pod="openstack/dnsmasq-dns-77c94f6b7-twntj" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.525543 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-config\") pod \"dnsmasq-dns-77c94f6b7-twntj\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " pod="openstack/dnsmasq-dns-77c94f6b7-twntj" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.542523 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcqs7\" (UniqueName: \"kubernetes.io/projected/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-kube-api-access-xcqs7\") pod \"dnsmasq-dns-77c94f6b7-twntj\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " pod="openstack/dnsmasq-dns-77c94f6b7-twntj" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.624134 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-config-data\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.624563 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-fernet-keys\") pod \"keystone-bootstrap-sfb9q\" (UID: 
\"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.624595 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-scripts\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.624682 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2b5t\" (UniqueName: \"kubernetes.io/projected/80bba0dd-119d-47bb-8526-df2e59c5b132-kube-api-access-s2b5t\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.624780 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-credential-keys\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.624879 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-combined-ca-bundle\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.628427 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-scripts\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:53 crc 
kubenswrapper[5094]: I0220 08:38:53.628677 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-combined-ca-bundle\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.629443 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-credential-keys\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.629520 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-config-data\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.630121 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-fernet-keys\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.644750 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2b5t\" (UniqueName: \"kubernetes.io/projected/80bba0dd-119d-47bb-8526-df2e59c5b132-kube-api-access-s2b5t\") pod \"keystone-bootstrap-sfb9q\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.658959 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77c94f6b7-twntj" Feb 20 08:38:53 crc kubenswrapper[5094]: I0220 08:38:53.719167 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:54 crc kubenswrapper[5094]: W0220 08:38:54.148447 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3b60dac_2fbe_46ba_acc9_92058e10f2d1.slice/crio-b9d473e434d2dbf9a062a256fefe028e1a75e895c91d6257edb1d3df1f31c7ef WatchSource:0}: Error finding container b9d473e434d2dbf9a062a256fefe028e1a75e895c91d6257edb1d3df1f31c7ef: Status 404 returned error can't find the container with id b9d473e434d2dbf9a062a256fefe028e1a75e895c91d6257edb1d3df1f31c7ef Feb 20 08:38:54 crc kubenswrapper[5094]: I0220 08:38:54.148772 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77c94f6b7-twntj"] Feb 20 08:38:54 crc kubenswrapper[5094]: I0220 08:38:54.222117 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-sfb9q"] Feb 20 08:38:54 crc kubenswrapper[5094]: W0220 08:38:54.227489 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80bba0dd_119d_47bb_8526_df2e59c5b132.slice/crio-b57adec3ffe67650fa5cd28f4d1178094049974ea52c13f89f722a21ff6f9743 WatchSource:0}: Error finding container b57adec3ffe67650fa5cd28f4d1178094049974ea52c13f89f722a21ff6f9743: Status 404 returned error can't find the container with id b57adec3ffe67650fa5cd28f4d1178094049974ea52c13f89f722a21ff6f9743 Feb 20 08:38:55 crc kubenswrapper[5094]: I0220 08:38:55.134225 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sfb9q" event={"ID":"80bba0dd-119d-47bb-8526-df2e59c5b132","Type":"ContainerStarted","Data":"3bce3a1b265a86956611d3f0e735c171240869a005a9b1236b337c2fd156e6f9"} Feb 20 08:38:55 crc 
kubenswrapper[5094]: I0220 08:38:55.134502 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sfb9q" event={"ID":"80bba0dd-119d-47bb-8526-df2e59c5b132","Type":"ContainerStarted","Data":"b57adec3ffe67650fa5cd28f4d1178094049974ea52c13f89f722a21ff6f9743"} Feb 20 08:38:55 crc kubenswrapper[5094]: I0220 08:38:55.136653 5094 generic.go:334] "Generic (PLEG): container finished" podID="e3b60dac-2fbe-46ba-acc9-92058e10f2d1" containerID="d7d23a98ee3bcf78f157fef71692c3b42c0ccd4e53f68bb8466090a9c903b801" exitCode=0 Feb 20 08:38:55 crc kubenswrapper[5094]: I0220 08:38:55.136693 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77c94f6b7-twntj" event={"ID":"e3b60dac-2fbe-46ba-acc9-92058e10f2d1","Type":"ContainerDied","Data":"d7d23a98ee3bcf78f157fef71692c3b42c0ccd4e53f68bb8466090a9c903b801"} Feb 20 08:38:55 crc kubenswrapper[5094]: I0220 08:38:55.136731 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77c94f6b7-twntj" event={"ID":"e3b60dac-2fbe-46ba-acc9-92058e10f2d1","Type":"ContainerStarted","Data":"b9d473e434d2dbf9a062a256fefe028e1a75e895c91d6257edb1d3df1f31c7ef"} Feb 20 08:38:55 crc kubenswrapper[5094]: I0220 08:38:55.156034 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-sfb9q" podStartSLOduration=2.156014187 podStartE2EDuration="2.156014187s" podCreationTimestamp="2026-02-20 08:38:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:38:55.150218408 +0000 UTC m=+6750.022845119" watchObservedRunningTime="2026-02-20 08:38:55.156014187 +0000 UTC m=+6750.028640898" Feb 20 08:38:56 crc kubenswrapper[5094]: I0220 08:38:56.146006 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77c94f6b7-twntj" 
event={"ID":"e3b60dac-2fbe-46ba-acc9-92058e10f2d1","Type":"ContainerStarted","Data":"24d6cfafc2015bc19bb6e88020cb0449cf6f98d40157ba914e379542fe8d395a"} Feb 20 08:38:56 crc kubenswrapper[5094]: I0220 08:38:56.169947 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77c94f6b7-twntj" podStartSLOduration=3.16992676 podStartE2EDuration="3.16992676s" podCreationTimestamp="2026-02-20 08:38:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:38:56.169368887 +0000 UTC m=+6751.041995598" watchObservedRunningTime="2026-02-20 08:38:56.16992676 +0000 UTC m=+6751.042553471" Feb 20 08:38:57 crc kubenswrapper[5094]: I0220 08:38:57.155343 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77c94f6b7-twntj" Feb 20 08:38:58 crc kubenswrapper[5094]: I0220 08:38:58.163203 5094 generic.go:334] "Generic (PLEG): container finished" podID="80bba0dd-119d-47bb-8526-df2e59c5b132" containerID="3bce3a1b265a86956611d3f0e735c171240869a005a9b1236b337c2fd156e6f9" exitCode=0 Feb 20 08:38:58 crc kubenswrapper[5094]: I0220 08:38:58.163286 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sfb9q" event={"ID":"80bba0dd-119d-47bb-8526-df2e59c5b132","Type":"ContainerDied","Data":"3bce3a1b265a86956611d3f0e735c171240869a005a9b1236b337c2fd156e6f9"} Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.475575 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.633152 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-scripts\") pod \"80bba0dd-119d-47bb-8526-df2e59c5b132\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.633244 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-credential-keys\") pod \"80bba0dd-119d-47bb-8526-df2e59c5b132\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.633291 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-combined-ca-bundle\") pod \"80bba0dd-119d-47bb-8526-df2e59c5b132\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.633335 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-fernet-keys\") pod \"80bba0dd-119d-47bb-8526-df2e59c5b132\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.633402 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-config-data\") pod \"80bba0dd-119d-47bb-8526-df2e59c5b132\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.633429 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2b5t\" (UniqueName: 
\"kubernetes.io/projected/80bba0dd-119d-47bb-8526-df2e59c5b132-kube-api-access-s2b5t\") pod \"80bba0dd-119d-47bb-8526-df2e59c5b132\" (UID: \"80bba0dd-119d-47bb-8526-df2e59c5b132\") " Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.639840 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80bba0dd-119d-47bb-8526-df2e59c5b132-kube-api-access-s2b5t" (OuterVolumeSpecName: "kube-api-access-s2b5t") pod "80bba0dd-119d-47bb-8526-df2e59c5b132" (UID: "80bba0dd-119d-47bb-8526-df2e59c5b132"). InnerVolumeSpecName "kube-api-access-s2b5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.639943 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "80bba0dd-119d-47bb-8526-df2e59c5b132" (UID: "80bba0dd-119d-47bb-8526-df2e59c5b132"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.640839 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "80bba0dd-119d-47bb-8526-df2e59c5b132" (UID: "80bba0dd-119d-47bb-8526-df2e59c5b132"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.651555 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-scripts" (OuterVolumeSpecName: "scripts") pod "80bba0dd-119d-47bb-8526-df2e59c5b132" (UID: "80bba0dd-119d-47bb-8526-df2e59c5b132"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.656819 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-config-data" (OuterVolumeSpecName: "config-data") pod "80bba0dd-119d-47bb-8526-df2e59c5b132" (UID: "80bba0dd-119d-47bb-8526-df2e59c5b132"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.666536 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80bba0dd-119d-47bb-8526-df2e59c5b132" (UID: "80bba0dd-119d-47bb-8526-df2e59c5b132"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.735568 5094 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.735876 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.736028 5094 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.736134 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 
08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.736247 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2b5t\" (UniqueName: \"kubernetes.io/projected/80bba0dd-119d-47bb-8526-df2e59c5b132-kube-api-access-s2b5t\") on node \"crc\" DevicePath \"\"" Feb 20 08:38:59 crc kubenswrapper[5094]: I0220 08:38:59.736369 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80bba0dd-119d-47bb-8526-df2e59c5b132-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.179580 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-sfb9q" event={"ID":"80bba0dd-119d-47bb-8526-df2e59c5b132","Type":"ContainerDied","Data":"b57adec3ffe67650fa5cd28f4d1178094049974ea52c13f89f722a21ff6f9743"} Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.179616 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b57adec3ffe67650fa5cd28f4d1178094049974ea52c13f89f722a21ff6f9743" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.179871 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-sfb9q" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.360856 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-sfb9q"] Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.365848 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-sfb9q"] Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.439078 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-869rz"] Feb 20 08:39:00 crc kubenswrapper[5094]: E0220 08:39:00.439787 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80bba0dd-119d-47bb-8526-df2e59c5b132" containerName="keystone-bootstrap" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.439808 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="80bba0dd-119d-47bb-8526-df2e59c5b132" containerName="keystone-bootstrap" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.440194 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="80bba0dd-119d-47bb-8526-df2e59c5b132" containerName="keystone-bootstrap" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.440919 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.448072 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.448394 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.448580 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.454470 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-knkl6" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.458288 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.470655 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-869rz"] Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.549152 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-scripts\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.549207 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-fernet-keys\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.549260 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-config-data\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.549285 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-combined-ca-bundle\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.549302 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsn6l\" (UniqueName: \"kubernetes.io/projected/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-kube-api-access-fsn6l\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.549360 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-credential-keys\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.650640 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-scripts\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.650940 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-fernet-keys\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.651096 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-config-data\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.652091 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-combined-ca-bundle\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.652146 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsn6l\" (UniqueName: \"kubernetes.io/projected/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-kube-api-access-fsn6l\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.652424 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-credential-keys\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.654849 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-scripts\") pod \"keystone-bootstrap-869rz\" 
(UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.655123 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-combined-ca-bundle\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.655759 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-config-data\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.656814 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-credential-keys\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.658645 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-fernet-keys\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: I0220 08:39:00.668543 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsn6l\" (UniqueName: \"kubernetes.io/projected/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-kube-api-access-fsn6l\") pod \"keystone-bootstrap-869rz\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:00 crc kubenswrapper[5094]: 
I0220 08:39:00.762012 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:01 crc kubenswrapper[5094]: I0220 08:39:01.196244 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-869rz"] Feb 20 08:39:01 crc kubenswrapper[5094]: I0220 08:39:01.853342 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80bba0dd-119d-47bb-8526-df2e59c5b132" path="/var/lib/kubelet/pods/80bba0dd-119d-47bb-8526-df2e59c5b132/volumes" Feb 20 08:39:02 crc kubenswrapper[5094]: I0220 08:39:02.203918 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-869rz" event={"ID":"5ab956a6-a68a-4da9-9065-6f09fb2a8f28","Type":"ContainerStarted","Data":"130895fd2e1c68f426f440f2c22e59759f2e0edcb44074b22c5181ef4b193c92"} Feb 20 08:39:02 crc kubenswrapper[5094]: I0220 08:39:02.203967 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-869rz" event={"ID":"5ab956a6-a68a-4da9-9065-6f09fb2a8f28","Type":"ContainerStarted","Data":"e176cf819b6fe2d61f86091fa78711313240cd2e99aa818e6e9adb05b61379d4"} Feb 20 08:39:02 crc kubenswrapper[5094]: I0220 08:39:02.233624 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-869rz" podStartSLOduration=2.233597322 podStartE2EDuration="2.233597322s" podCreationTimestamp="2026-02-20 08:39:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:39:02.227772052 +0000 UTC m=+6757.100398763" watchObservedRunningTime="2026-02-20 08:39:02.233597322 +0000 UTC m=+6757.106224073" Feb 20 08:39:03 crc kubenswrapper[5094]: I0220 08:39:03.660930 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77c94f6b7-twntj" Feb 20 08:39:03 crc kubenswrapper[5094]: I0220 08:39:03.742608 5094 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68dcfc97c7-fzntw"] Feb 20 08:39:03 crc kubenswrapper[5094]: I0220 08:39:03.744282 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" podUID="59402637-ea41-4c78-a455-361d55c5422a" containerName="dnsmasq-dns" containerID="cri-o://c629842f4857fc824451a2868ac4ca4ec36a7daeeb169573db6f9b5405d05a43" gracePeriod=10 Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.106881 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.106935 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.224972 5094 generic.go:334] "Generic (PLEG): container finished" podID="59402637-ea41-4c78-a455-361d55c5422a" containerID="c629842f4857fc824451a2868ac4ca4ec36a7daeeb169573db6f9b5405d05a43" exitCode=0 Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.225038 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" event={"ID":"59402637-ea41-4c78-a455-361d55c5422a","Type":"ContainerDied","Data":"c629842f4857fc824451a2868ac4ca4ec36a7daeeb169573db6f9b5405d05a43"} Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.225062 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" 
event={"ID":"59402637-ea41-4c78-a455-361d55c5422a","Type":"ContainerDied","Data":"cc810b39c5c9fcbe3da7e8305b82e45f0c0fd18ad2de7d721bc7779ae68beb9b"} Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.225072 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc810b39c5c9fcbe3da7e8305b82e45f0c0fd18ad2de7d721bc7779ae68beb9b" Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.226805 5094 generic.go:334] "Generic (PLEG): container finished" podID="5ab956a6-a68a-4da9-9065-6f09fb2a8f28" containerID="130895fd2e1c68f426f440f2c22e59759f2e0edcb44074b22c5181ef4b193c92" exitCode=0 Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.226843 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-869rz" event={"ID":"5ab956a6-a68a-4da9-9065-6f09fb2a8f28","Type":"ContainerDied","Data":"130895fd2e1c68f426f440f2c22e59759f2e0edcb44074b22c5181ef4b193c92"} Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.233890 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.310479 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-ovsdbserver-nb\") pod \"59402637-ea41-4c78-a455-361d55c5422a\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.310567 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-dns-svc\") pod \"59402637-ea41-4c78-a455-361d55c5422a\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.310611 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5tlr\" (UniqueName: \"kubernetes.io/projected/59402637-ea41-4c78-a455-361d55c5422a-kube-api-access-k5tlr\") pod \"59402637-ea41-4c78-a455-361d55c5422a\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.310783 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-config\") pod \"59402637-ea41-4c78-a455-361d55c5422a\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.310820 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-ovsdbserver-sb\") pod \"59402637-ea41-4c78-a455-361d55c5422a\" (UID: \"59402637-ea41-4c78-a455-361d55c5422a\") " Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.327988 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/59402637-ea41-4c78-a455-361d55c5422a-kube-api-access-k5tlr" (OuterVolumeSpecName: "kube-api-access-k5tlr") pod "59402637-ea41-4c78-a455-361d55c5422a" (UID: "59402637-ea41-4c78-a455-361d55c5422a"). InnerVolumeSpecName "kube-api-access-k5tlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.354526 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "59402637-ea41-4c78-a455-361d55c5422a" (UID: "59402637-ea41-4c78-a455-361d55c5422a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.392117 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "59402637-ea41-4c78-a455-361d55c5422a" (UID: "59402637-ea41-4c78-a455-361d55c5422a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.398130 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-config" (OuterVolumeSpecName: "config") pod "59402637-ea41-4c78-a455-361d55c5422a" (UID: "59402637-ea41-4c78-a455-361d55c5422a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.400481 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "59402637-ea41-4c78-a455-361d55c5422a" (UID: "59402637-ea41-4c78-a455-361d55c5422a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.415076 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-config\") on node \"crc\" DevicePath \"\"" Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.415166 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.415182 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.415236 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/59402637-ea41-4c78-a455-361d55c5422a-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 08:39:04 crc kubenswrapper[5094]: I0220 08:39:04.415253 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5tlr\" (UniqueName: \"kubernetes.io/projected/59402637-ea41-4c78-a455-361d55c5422a-kube-api-access-k5tlr\") on node \"crc\" DevicePath \"\"" Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.233538 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68dcfc97c7-fzntw" Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.268243 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68dcfc97c7-fzntw"] Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.273234 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68dcfc97c7-fzntw"] Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.580502 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.734391 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsn6l\" (UniqueName: \"kubernetes.io/projected/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-kube-api-access-fsn6l\") pod \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.734433 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-combined-ca-bundle\") pod \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.734537 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-fernet-keys\") pod \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.734565 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-credential-keys\") pod \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\" (UID: 
\"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.734583 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-config-data\") pod \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.734687 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-scripts\") pod \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\" (UID: \"5ab956a6-a68a-4da9-9065-6f09fb2a8f28\") " Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.739652 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-scripts" (OuterVolumeSpecName: "scripts") pod "5ab956a6-a68a-4da9-9065-6f09fb2a8f28" (UID: "5ab956a6-a68a-4da9-9065-6f09fb2a8f28"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.739684 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5ab956a6-a68a-4da9-9065-6f09fb2a8f28" (UID: "5ab956a6-a68a-4da9-9065-6f09fb2a8f28"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.739843 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-kube-api-access-fsn6l" (OuterVolumeSpecName: "kube-api-access-fsn6l") pod "5ab956a6-a68a-4da9-9065-6f09fb2a8f28" (UID: "5ab956a6-a68a-4da9-9065-6f09fb2a8f28"). InnerVolumeSpecName "kube-api-access-fsn6l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.739891 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5ab956a6-a68a-4da9-9065-6f09fb2a8f28" (UID: "5ab956a6-a68a-4da9-9065-6f09fb2a8f28"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.757264 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ab956a6-a68a-4da9-9065-6f09fb2a8f28" (UID: "5ab956a6-a68a-4da9-9065-6f09fb2a8f28"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.761110 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-config-data" (OuterVolumeSpecName: "config-data") pod "5ab956a6-a68a-4da9-9065-6f09fb2a8f28" (UID: "5ab956a6-a68a-4da9-9065-6f09fb2a8f28"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.836527 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsn6l\" (UniqueName: \"kubernetes.io/projected/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-kube-api-access-fsn6l\") on node \"crc\" DevicePath \"\"" Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.836570 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.836583 5094 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.836595 5094 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.836610 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.836621 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ab956a6-a68a-4da9-9065-6f09fb2a8f28-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:39:05 crc kubenswrapper[5094]: I0220 08:39:05.849843 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59402637-ea41-4c78-a455-361d55c5422a" path="/var/lib/kubelet/pods/59402637-ea41-4c78-a455-361d55c5422a/volumes" Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.240299 5094 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/keystone-bootstrap-869rz" event={"ID":"5ab956a6-a68a-4da9-9065-6f09fb2a8f28","Type":"ContainerDied","Data":"e176cf819b6fe2d61f86091fa78711313240cd2e99aa818e6e9adb05b61379d4"} Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.240336 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e176cf819b6fe2d61f86091fa78711313240cd2e99aa818e6e9adb05b61379d4" Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.240347 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-869rz" Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.666252 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6f444df446-vdhbp"] Feb 20 08:39:06 crc kubenswrapper[5094]: E0220 08:39:06.667129 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59402637-ea41-4c78-a455-361d55c5422a" containerName="dnsmasq-dns" Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.667157 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="59402637-ea41-4c78-a455-361d55c5422a" containerName="dnsmasq-dns" Feb 20 08:39:06 crc kubenswrapper[5094]: E0220 08:39:06.667199 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ab956a6-a68a-4da9-9065-6f09fb2a8f28" containerName="keystone-bootstrap" Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.667211 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ab956a6-a68a-4da9-9065-6f09fb2a8f28" containerName="keystone-bootstrap" Feb 20 08:39:06 crc kubenswrapper[5094]: E0220 08:39:06.667248 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59402637-ea41-4c78-a455-361d55c5422a" containerName="init" Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.667260 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="59402637-ea41-4c78-a455-361d55c5422a" containerName="init" Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.667492 
5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="59402637-ea41-4c78-a455-361d55c5422a" containerName="dnsmasq-dns" Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.667521 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ab956a6-a68a-4da9-9065-6f09fb2a8f28" containerName="keystone-bootstrap" Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.668216 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6f444df446-vdhbp" Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.670362 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-knkl6" Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.670831 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.677782 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.682497 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.684147 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6f444df446-vdhbp"] Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.749352 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/167ab003-3908-4714-95b2-bfad7c1e1e00-combined-ca-bundle\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp" Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.749438 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/167ab003-3908-4714-95b2-bfad7c1e1e00-scripts\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp" Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.749479 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsk9g\" (UniqueName: \"kubernetes.io/projected/167ab003-3908-4714-95b2-bfad7c1e1e00-kube-api-access-zsk9g\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp" Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.749503 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/167ab003-3908-4714-95b2-bfad7c1e1e00-credential-keys\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp" Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.749533 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/167ab003-3908-4714-95b2-bfad7c1e1e00-fernet-keys\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp" Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.749620 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/167ab003-3908-4714-95b2-bfad7c1e1e00-config-data\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp" Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.851294 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/167ab003-3908-4714-95b2-bfad7c1e1e00-fernet-keys\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp" Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.851417 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/167ab003-3908-4714-95b2-bfad7c1e1e00-config-data\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp" Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.851464 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/167ab003-3908-4714-95b2-bfad7c1e1e00-combined-ca-bundle\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp" Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.851515 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/167ab003-3908-4714-95b2-bfad7c1e1e00-scripts\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp" Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.851555 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsk9g\" (UniqueName: \"kubernetes.io/projected/167ab003-3908-4714-95b2-bfad7c1e1e00-kube-api-access-zsk9g\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp" Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.851580 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/167ab003-3908-4714-95b2-bfad7c1e1e00-credential-keys\") pod 
\"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp" Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.855543 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/167ab003-3908-4714-95b2-bfad7c1e1e00-scripts\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp" Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.855833 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/167ab003-3908-4714-95b2-bfad7c1e1e00-fernet-keys\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp" Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.857042 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/167ab003-3908-4714-95b2-bfad7c1e1e00-combined-ca-bundle\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp" Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.857149 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/167ab003-3908-4714-95b2-bfad7c1e1e00-credential-keys\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp" Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.858877 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/167ab003-3908-4714-95b2-bfad7c1e1e00-config-data\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp" Feb 20 08:39:06 crc 
kubenswrapper[5094]: I0220 08:39:06.871351 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsk9g\" (UniqueName: \"kubernetes.io/projected/167ab003-3908-4714-95b2-bfad7c1e1e00-kube-api-access-zsk9g\") pod \"keystone-6f444df446-vdhbp\" (UID: \"167ab003-3908-4714-95b2-bfad7c1e1e00\") " pod="openstack/keystone-6f444df446-vdhbp" Feb 20 08:39:06 crc kubenswrapper[5094]: I0220 08:39:06.986144 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6f444df446-vdhbp" Feb 20 08:39:07 crc kubenswrapper[5094]: I0220 08:39:07.386606 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6f444df446-vdhbp"] Feb 20 08:39:08 crc kubenswrapper[5094]: I0220 08:39:08.256084 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6f444df446-vdhbp" event={"ID":"167ab003-3908-4714-95b2-bfad7c1e1e00","Type":"ContainerStarted","Data":"d8b59a4889dd70a1cfabcd0744e95db17f9a524587e18d6a759a3a34faa924f2"} Feb 20 08:39:08 crc kubenswrapper[5094]: I0220 08:39:08.256544 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-6f444df446-vdhbp" Feb 20 08:39:08 crc kubenswrapper[5094]: I0220 08:39:08.256559 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6f444df446-vdhbp" event={"ID":"167ab003-3908-4714-95b2-bfad7c1e1e00","Type":"ContainerStarted","Data":"732f63a2ef73a7882303d350cbe2352e9029f0e590d815ed7f259e3a8763489e"} Feb 20 08:39:08 crc kubenswrapper[5094]: I0220 08:39:08.273267 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6f444df446-vdhbp" podStartSLOduration=2.273245365 podStartE2EDuration="2.273245365s" podCreationTimestamp="2026-02-20 08:39:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:39:08.272567258 +0000 UTC m=+6763.145193969" 
watchObservedRunningTime="2026-02-20 08:39:08.273245365 +0000 UTC m=+6763.145872076" Feb 20 08:39:34 crc kubenswrapper[5094]: I0220 08:39:34.107078 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:39:34 crc kubenswrapper[5094]: I0220 08:39:34.107897 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:39:38 crc kubenswrapper[5094]: I0220 08:39:38.474187 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6f444df446-vdhbp" Feb 20 08:39:42 crc kubenswrapper[5094]: I0220 08:39:42.817822 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 20 08:39:42 crc kubenswrapper[5094]: I0220 08:39:42.819335 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 20 08:39:42 crc kubenswrapper[5094]: I0220 08:39:42.825257 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 20 08:39:42 crc kubenswrapper[5094]: I0220 08:39:42.825592 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 20 08:39:42 crc kubenswrapper[5094]: I0220 08:39:42.831470 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-9882x" Feb 20 08:39:42 crc kubenswrapper[5094]: I0220 08:39:42.841377 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 20 08:39:42 crc kubenswrapper[5094]: I0220 08:39:42.987320 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b4afe958-0e78-49e9-b05a-08ff4c42f602-openstack-config\") pod \"openstackclient\" (UID: \"b4afe958-0e78-49e9-b05a-08ff4c42f602\") " pod="openstack/openstackclient" Feb 20 08:39:42 crc kubenswrapper[5094]: I0220 08:39:42.987382 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd86c\" (UniqueName: \"kubernetes.io/projected/b4afe958-0e78-49e9-b05a-08ff4c42f602-kube-api-access-kd86c\") pod \"openstackclient\" (UID: \"b4afe958-0e78-49e9-b05a-08ff4c42f602\") " pod="openstack/openstackclient" Feb 20 08:39:42 crc kubenswrapper[5094]: I0220 08:39:42.987568 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b4afe958-0e78-49e9-b05a-08ff4c42f602-openstack-config-secret\") pod \"openstackclient\" (UID: \"b4afe958-0e78-49e9-b05a-08ff4c42f602\") " pod="openstack/openstackclient" Feb 20 08:39:43 crc kubenswrapper[5094]: I0220 08:39:43.089653 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd86c\" (UniqueName: \"kubernetes.io/projected/b4afe958-0e78-49e9-b05a-08ff4c42f602-kube-api-access-kd86c\") pod \"openstackclient\" (UID: \"b4afe958-0e78-49e9-b05a-08ff4c42f602\") " pod="openstack/openstackclient" Feb 20 08:39:43 crc kubenswrapper[5094]: I0220 08:39:43.089879 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b4afe958-0e78-49e9-b05a-08ff4c42f602-openstack-config-secret\") pod \"openstackclient\" (UID: \"b4afe958-0e78-49e9-b05a-08ff4c42f602\") " pod="openstack/openstackclient" Feb 20 08:39:43 crc kubenswrapper[5094]: I0220 08:39:43.089948 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b4afe958-0e78-49e9-b05a-08ff4c42f602-openstack-config\") pod \"openstackclient\" (UID: \"b4afe958-0e78-49e9-b05a-08ff4c42f602\") " pod="openstack/openstackclient" Feb 20 08:39:43 crc kubenswrapper[5094]: I0220 08:39:43.091006 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b4afe958-0e78-49e9-b05a-08ff4c42f602-openstack-config\") pod \"openstackclient\" (UID: \"b4afe958-0e78-49e9-b05a-08ff4c42f602\") " pod="openstack/openstackclient" Feb 20 08:39:43 crc kubenswrapper[5094]: I0220 08:39:43.103088 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b4afe958-0e78-49e9-b05a-08ff4c42f602-openstack-config-secret\") pod \"openstackclient\" (UID: \"b4afe958-0e78-49e9-b05a-08ff4c42f602\") " pod="openstack/openstackclient" Feb 20 08:39:43 crc kubenswrapper[5094]: I0220 08:39:43.109486 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd86c\" (UniqueName: 
\"kubernetes.io/projected/b4afe958-0e78-49e9-b05a-08ff4c42f602-kube-api-access-kd86c\") pod \"openstackclient\" (UID: \"b4afe958-0e78-49e9-b05a-08ff4c42f602\") " pod="openstack/openstackclient" Feb 20 08:39:43 crc kubenswrapper[5094]: I0220 08:39:43.140983 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 20 08:39:43 crc kubenswrapper[5094]: I0220 08:39:43.583435 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 20 08:39:43 crc kubenswrapper[5094]: W0220 08:39:43.593645 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4afe958_0e78_49e9_b05a_08ff4c42f602.slice/crio-f5f4f088f9e5532097f51ae10dffeeb6d9fd8dbc6bda70e84718e13d2139d04f WatchSource:0}: Error finding container f5f4f088f9e5532097f51ae10dffeeb6d9fd8dbc6bda70e84718e13d2139d04f: Status 404 returned error can't find the container with id f5f4f088f9e5532097f51ae10dffeeb6d9fd8dbc6bda70e84718e13d2139d04f Feb 20 08:39:44 crc kubenswrapper[5094]: I0220 08:39:44.589187 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b4afe958-0e78-49e9-b05a-08ff4c42f602","Type":"ContainerStarted","Data":"f5f4f088f9e5532097f51ae10dffeeb6d9fd8dbc6bda70e84718e13d2139d04f"} Feb 20 08:39:50 crc kubenswrapper[5094]: I0220 08:39:50.061921 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z7m9s"] Feb 20 08:39:50 crc kubenswrapper[5094]: I0220 08:39:50.063822 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z7m9s" Feb 20 08:39:50 crc kubenswrapper[5094]: I0220 08:39:50.069679 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z7m9s"] Feb 20 08:39:50 crc kubenswrapper[5094]: I0220 08:39:50.213900 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/223d1ebb-1681-477e-b91a-43dc2ce65d74-utilities\") pod \"redhat-operators-z7m9s\" (UID: \"223d1ebb-1681-477e-b91a-43dc2ce65d74\") " pod="openshift-marketplace/redhat-operators-z7m9s" Feb 20 08:39:50 crc kubenswrapper[5094]: I0220 08:39:50.213956 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/223d1ebb-1681-477e-b91a-43dc2ce65d74-catalog-content\") pod \"redhat-operators-z7m9s\" (UID: \"223d1ebb-1681-477e-b91a-43dc2ce65d74\") " pod="openshift-marketplace/redhat-operators-z7m9s" Feb 20 08:39:50 crc kubenswrapper[5094]: I0220 08:39:50.214125 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdcb9\" (UniqueName: \"kubernetes.io/projected/223d1ebb-1681-477e-b91a-43dc2ce65d74-kube-api-access-hdcb9\") pod \"redhat-operators-z7m9s\" (UID: \"223d1ebb-1681-477e-b91a-43dc2ce65d74\") " pod="openshift-marketplace/redhat-operators-z7m9s" Feb 20 08:39:50 crc kubenswrapper[5094]: I0220 08:39:50.316198 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/223d1ebb-1681-477e-b91a-43dc2ce65d74-utilities\") pod \"redhat-operators-z7m9s\" (UID: \"223d1ebb-1681-477e-b91a-43dc2ce65d74\") " pod="openshift-marketplace/redhat-operators-z7m9s" Feb 20 08:39:50 crc kubenswrapper[5094]: I0220 08:39:50.316257 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/223d1ebb-1681-477e-b91a-43dc2ce65d74-catalog-content\") pod \"redhat-operators-z7m9s\" (UID: \"223d1ebb-1681-477e-b91a-43dc2ce65d74\") " pod="openshift-marketplace/redhat-operators-z7m9s" Feb 20 08:39:50 crc kubenswrapper[5094]: I0220 08:39:50.316308 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdcb9\" (UniqueName: \"kubernetes.io/projected/223d1ebb-1681-477e-b91a-43dc2ce65d74-kube-api-access-hdcb9\") pod \"redhat-operators-z7m9s\" (UID: \"223d1ebb-1681-477e-b91a-43dc2ce65d74\") " pod="openshift-marketplace/redhat-operators-z7m9s" Feb 20 08:39:50 crc kubenswrapper[5094]: I0220 08:39:50.316816 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/223d1ebb-1681-477e-b91a-43dc2ce65d74-utilities\") pod \"redhat-operators-z7m9s\" (UID: \"223d1ebb-1681-477e-b91a-43dc2ce65d74\") " pod="openshift-marketplace/redhat-operators-z7m9s" Feb 20 08:39:50 crc kubenswrapper[5094]: I0220 08:39:50.316856 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/223d1ebb-1681-477e-b91a-43dc2ce65d74-catalog-content\") pod \"redhat-operators-z7m9s\" (UID: \"223d1ebb-1681-477e-b91a-43dc2ce65d74\") " pod="openshift-marketplace/redhat-operators-z7m9s" Feb 20 08:39:50 crc kubenswrapper[5094]: I0220 08:39:50.335933 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdcb9\" (UniqueName: \"kubernetes.io/projected/223d1ebb-1681-477e-b91a-43dc2ce65d74-kube-api-access-hdcb9\") pod \"redhat-operators-z7m9s\" (UID: \"223d1ebb-1681-477e-b91a-43dc2ce65d74\") " pod="openshift-marketplace/redhat-operators-z7m9s" Feb 20 08:39:50 crc kubenswrapper[5094]: I0220 08:39:50.390357 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z7m9s" Feb 20 08:39:55 crc kubenswrapper[5094]: W0220 08:39:55.126951 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod223d1ebb_1681_477e_b91a_43dc2ce65d74.slice/crio-ae06670f68a94f466d0e5d0b0bc69e3312f585a59895b367fe60d78b78182119 WatchSource:0}: Error finding container ae06670f68a94f466d0e5d0b0bc69e3312f585a59895b367fe60d78b78182119: Status 404 returned error can't find the container with id ae06670f68a94f466d0e5d0b0bc69e3312f585a59895b367fe60d78b78182119 Feb 20 08:39:55 crc kubenswrapper[5094]: I0220 08:39:55.127673 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z7m9s"] Feb 20 08:39:55 crc kubenswrapper[5094]: I0220 08:39:55.685468 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"b4afe958-0e78-49e9-b05a-08ff4c42f602","Type":"ContainerStarted","Data":"cb4acd794ebf0ed75ac1a7120cae6e99da13e366872ae910d88b143eaec6e181"} Feb 20 08:39:55 crc kubenswrapper[5094]: I0220 08:39:55.689024 5094 generic.go:334] "Generic (PLEG): container finished" podID="223d1ebb-1681-477e-b91a-43dc2ce65d74" containerID="14f85aca3f8bff89ae81140f75bdd696b9244b57f19d94ca92a8e902738cc1a6" exitCode=0 Feb 20 08:39:55 crc kubenswrapper[5094]: I0220 08:39:55.689091 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7m9s" event={"ID":"223d1ebb-1681-477e-b91a-43dc2ce65d74","Type":"ContainerDied","Data":"14f85aca3f8bff89ae81140f75bdd696b9244b57f19d94ca92a8e902738cc1a6"} Feb 20 08:39:55 crc kubenswrapper[5094]: I0220 08:39:55.689119 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7m9s" event={"ID":"223d1ebb-1681-477e-b91a-43dc2ce65d74","Type":"ContainerStarted","Data":"ae06670f68a94f466d0e5d0b0bc69e3312f585a59895b367fe60d78b78182119"} Feb 20 08:39:55 
crc kubenswrapper[5094]: I0220 08:39:55.711449 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.568347089 podStartE2EDuration="13.71135276s" podCreationTimestamp="2026-02-20 08:39:42 +0000 UTC" firstStartedPulling="2026-02-20 08:39:43.600171236 +0000 UTC m=+6798.472797947" lastFinishedPulling="2026-02-20 08:39:54.743176907 +0000 UTC m=+6809.615803618" observedRunningTime="2026-02-20 08:39:55.710285384 +0000 UTC m=+6810.582912105" watchObservedRunningTime="2026-02-20 08:39:55.71135276 +0000 UTC m=+6810.583979471" Feb 20 08:39:56 crc kubenswrapper[5094]: I0220 08:39:56.700070 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7m9s" event={"ID":"223d1ebb-1681-477e-b91a-43dc2ce65d74","Type":"ContainerStarted","Data":"c074de8d9209e605a1080b4e00b6dafcd4ebcb435aaf09dbb76bc4d663278186"} Feb 20 08:39:57 crc kubenswrapper[5094]: I0220 08:39:57.725942 5094 generic.go:334] "Generic (PLEG): container finished" podID="223d1ebb-1681-477e-b91a-43dc2ce65d74" containerID="c074de8d9209e605a1080b4e00b6dafcd4ebcb435aaf09dbb76bc4d663278186" exitCode=0 Feb 20 08:39:57 crc kubenswrapper[5094]: I0220 08:39:57.725992 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7m9s" event={"ID":"223d1ebb-1681-477e-b91a-43dc2ce65d74","Type":"ContainerDied","Data":"c074de8d9209e605a1080b4e00b6dafcd4ebcb435aaf09dbb76bc4d663278186"} Feb 20 08:39:58 crc kubenswrapper[5094]: I0220 08:39:58.735415 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7m9s" event={"ID":"223d1ebb-1681-477e-b91a-43dc2ce65d74","Type":"ContainerStarted","Data":"6398de9a8c4eab7d55cf276621ec04d25a41e78ffe65c9b84589bd748f48f245"} Feb 20 08:39:58 crc kubenswrapper[5094]: I0220 08:39:58.758237 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z7m9s" 
podStartSLOduration=6.369382878 podStartE2EDuration="8.758215692s" podCreationTimestamp="2026-02-20 08:39:50 +0000 UTC" firstStartedPulling="2026-02-20 08:39:55.691188994 +0000 UTC m=+6810.563815735" lastFinishedPulling="2026-02-20 08:39:58.080021838 +0000 UTC m=+6812.952648549" observedRunningTime="2026-02-20 08:39:58.75067862 +0000 UTC m=+6813.623305331" watchObservedRunningTime="2026-02-20 08:39:58.758215692 +0000 UTC m=+6813.630842403" Feb 20 08:40:00 crc kubenswrapper[5094]: I0220 08:40:00.391201 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z7m9s" Feb 20 08:40:00 crc kubenswrapper[5094]: I0220 08:40:00.391288 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z7m9s" Feb 20 08:40:01 crc kubenswrapper[5094]: I0220 08:40:01.472522 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z7m9s" podUID="223d1ebb-1681-477e-b91a-43dc2ce65d74" containerName="registry-server" probeResult="failure" output=< Feb 20 08:40:01 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 08:40:01 crc kubenswrapper[5094]: > Feb 20 08:40:04 crc kubenswrapper[5094]: I0220 08:40:04.106996 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:40:04 crc kubenswrapper[5094]: I0220 08:40:04.108104 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:40:04 crc 
kubenswrapper[5094]: I0220 08:40:04.108232 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 08:40:04 crc kubenswrapper[5094]: I0220 08:40:04.109583 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4eb6a1e784b78bde35c2b1ca39f3fd3b16c3f1975e9f121306145f15615f4afb"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 08:40:04 crc kubenswrapper[5094]: I0220 08:40:04.109688 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://4eb6a1e784b78bde35c2b1ca39f3fd3b16c3f1975e9f121306145f15615f4afb" gracePeriod=600 Feb 20 08:40:04 crc kubenswrapper[5094]: I0220 08:40:04.797470 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="4eb6a1e784b78bde35c2b1ca39f3fd3b16c3f1975e9f121306145f15615f4afb" exitCode=0 Feb 20 08:40:04 crc kubenswrapper[5094]: I0220 08:40:04.797606 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"4eb6a1e784b78bde35c2b1ca39f3fd3b16c3f1975e9f121306145f15615f4afb"} Feb 20 08:40:04 crc kubenswrapper[5094]: I0220 08:40:04.797977 5094 scope.go:117] "RemoveContainer" containerID="16e83e904bac6d8e9c4aec4a2426678c94e08d32bb72d0f88611b5c8327c2050" Feb 20 08:40:05 crc kubenswrapper[5094]: I0220 08:40:05.809004 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec"} Feb 20 08:40:09 crc kubenswrapper[5094]: I0220 08:40:09.451534 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6" podUID="fe469d05-edeb-4d23-b06b-6bdbfc646e99" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.49:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 08:40:09 crc kubenswrapper[5094]: I0220 08:40:09.452816 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-dljs6" podUID="fe469d05-edeb-4d23-b06b-6bdbfc646e99" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.49:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 08:40:10 crc kubenswrapper[5094]: I0220 08:40:10.458531 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z7m9s" Feb 20 08:40:10 crc kubenswrapper[5094]: I0220 08:40:10.540136 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z7m9s" Feb 20 08:40:10 crc kubenswrapper[5094]: I0220 08:40:10.732065 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z7m9s"] Feb 20 08:40:12 crc kubenswrapper[5094]: I0220 08:40:12.460086 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z7m9s" podUID="223d1ebb-1681-477e-b91a-43dc2ce65d74" containerName="registry-server" containerID="cri-o://6398de9a8c4eab7d55cf276621ec04d25a41e78ffe65c9b84589bd748f48f245" gracePeriod=2 Feb 20 08:40:12 crc kubenswrapper[5094]: I0220 08:40:12.880405 5094 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z7m9s" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.001995 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/223d1ebb-1681-477e-b91a-43dc2ce65d74-utilities\") pod \"223d1ebb-1681-477e-b91a-43dc2ce65d74\" (UID: \"223d1ebb-1681-477e-b91a-43dc2ce65d74\") " Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.002044 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdcb9\" (UniqueName: \"kubernetes.io/projected/223d1ebb-1681-477e-b91a-43dc2ce65d74-kube-api-access-hdcb9\") pod \"223d1ebb-1681-477e-b91a-43dc2ce65d74\" (UID: \"223d1ebb-1681-477e-b91a-43dc2ce65d74\") " Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.002100 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/223d1ebb-1681-477e-b91a-43dc2ce65d74-catalog-content\") pod \"223d1ebb-1681-477e-b91a-43dc2ce65d74\" (UID: \"223d1ebb-1681-477e-b91a-43dc2ce65d74\") " Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.003802 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/223d1ebb-1681-477e-b91a-43dc2ce65d74-utilities" (OuterVolumeSpecName: "utilities") pod "223d1ebb-1681-477e-b91a-43dc2ce65d74" (UID: "223d1ebb-1681-477e-b91a-43dc2ce65d74"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.005492 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/223d1ebb-1681-477e-b91a-43dc2ce65d74-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.008906 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/223d1ebb-1681-477e-b91a-43dc2ce65d74-kube-api-access-hdcb9" (OuterVolumeSpecName: "kube-api-access-hdcb9") pod "223d1ebb-1681-477e-b91a-43dc2ce65d74" (UID: "223d1ebb-1681-477e-b91a-43dc2ce65d74"). InnerVolumeSpecName "kube-api-access-hdcb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.108161 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdcb9\" (UniqueName: \"kubernetes.io/projected/223d1ebb-1681-477e-b91a-43dc2ce65d74-kube-api-access-hdcb9\") on node \"crc\" DevicePath \"\"" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.163925 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/223d1ebb-1681-477e-b91a-43dc2ce65d74-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "223d1ebb-1681-477e-b91a-43dc2ce65d74" (UID: "223d1ebb-1681-477e-b91a-43dc2ce65d74"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.209137 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/223d1ebb-1681-477e-b91a-43dc2ce65d74-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.475278 5094 generic.go:334] "Generic (PLEG): container finished" podID="223d1ebb-1681-477e-b91a-43dc2ce65d74" containerID="6398de9a8c4eab7d55cf276621ec04d25a41e78ffe65c9b84589bd748f48f245" exitCode=0 Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.475401 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z7m9s" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.475396 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7m9s" event={"ID":"223d1ebb-1681-477e-b91a-43dc2ce65d74","Type":"ContainerDied","Data":"6398de9a8c4eab7d55cf276621ec04d25a41e78ffe65c9b84589bd748f48f245"} Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.475956 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z7m9s" event={"ID":"223d1ebb-1681-477e-b91a-43dc2ce65d74","Type":"ContainerDied","Data":"ae06670f68a94f466d0e5d0b0bc69e3312f585a59895b367fe60d78b78182119"} Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.475983 5094 scope.go:117] "RemoveContainer" containerID="6398de9a8c4eab7d55cf276621ec04d25a41e78ffe65c9b84589bd748f48f245" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.506438 5094 scope.go:117] "RemoveContainer" containerID="c074de8d9209e605a1080b4e00b6dafcd4ebcb435aaf09dbb76bc4d663278186" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.528040 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z7m9s"] Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 
08:40:13.544437 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z7m9s"] Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.564940 5094 scope.go:117] "RemoveContainer" containerID="14f85aca3f8bff89ae81140f75bdd696b9244b57f19d94ca92a8e902738cc1a6" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.595352 5094 scope.go:117] "RemoveContainer" containerID="6398de9a8c4eab7d55cf276621ec04d25a41e78ffe65c9b84589bd748f48f245" Feb 20 08:40:13 crc kubenswrapper[5094]: E0220 08:40:13.596320 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6398de9a8c4eab7d55cf276621ec04d25a41e78ffe65c9b84589bd748f48f245\": container with ID starting with 6398de9a8c4eab7d55cf276621ec04d25a41e78ffe65c9b84589bd748f48f245 not found: ID does not exist" containerID="6398de9a8c4eab7d55cf276621ec04d25a41e78ffe65c9b84589bd748f48f245" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.596386 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6398de9a8c4eab7d55cf276621ec04d25a41e78ffe65c9b84589bd748f48f245"} err="failed to get container status \"6398de9a8c4eab7d55cf276621ec04d25a41e78ffe65c9b84589bd748f48f245\": rpc error: code = NotFound desc = could not find container \"6398de9a8c4eab7d55cf276621ec04d25a41e78ffe65c9b84589bd748f48f245\": container with ID starting with 6398de9a8c4eab7d55cf276621ec04d25a41e78ffe65c9b84589bd748f48f245 not found: ID does not exist" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.596424 5094 scope.go:117] "RemoveContainer" containerID="c074de8d9209e605a1080b4e00b6dafcd4ebcb435aaf09dbb76bc4d663278186" Feb 20 08:40:13 crc kubenswrapper[5094]: E0220 08:40:13.597066 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c074de8d9209e605a1080b4e00b6dafcd4ebcb435aaf09dbb76bc4d663278186\": container with ID 
starting with c074de8d9209e605a1080b4e00b6dafcd4ebcb435aaf09dbb76bc4d663278186 not found: ID does not exist" containerID="c074de8d9209e605a1080b4e00b6dafcd4ebcb435aaf09dbb76bc4d663278186" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.597111 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c074de8d9209e605a1080b4e00b6dafcd4ebcb435aaf09dbb76bc4d663278186"} err="failed to get container status \"c074de8d9209e605a1080b4e00b6dafcd4ebcb435aaf09dbb76bc4d663278186\": rpc error: code = NotFound desc = could not find container \"c074de8d9209e605a1080b4e00b6dafcd4ebcb435aaf09dbb76bc4d663278186\": container with ID starting with c074de8d9209e605a1080b4e00b6dafcd4ebcb435aaf09dbb76bc4d663278186 not found: ID does not exist" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.597139 5094 scope.go:117] "RemoveContainer" containerID="14f85aca3f8bff89ae81140f75bdd696b9244b57f19d94ca92a8e902738cc1a6" Feb 20 08:40:13 crc kubenswrapper[5094]: E0220 08:40:13.597415 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14f85aca3f8bff89ae81140f75bdd696b9244b57f19d94ca92a8e902738cc1a6\": container with ID starting with 14f85aca3f8bff89ae81140f75bdd696b9244b57f19d94ca92a8e902738cc1a6 not found: ID does not exist" containerID="14f85aca3f8bff89ae81140f75bdd696b9244b57f19d94ca92a8e902738cc1a6" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.597447 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14f85aca3f8bff89ae81140f75bdd696b9244b57f19d94ca92a8e902738cc1a6"} err="failed to get container status \"14f85aca3f8bff89ae81140f75bdd696b9244b57f19d94ca92a8e902738cc1a6\": rpc error: code = NotFound desc = could not find container \"14f85aca3f8bff89ae81140f75bdd696b9244b57f19d94ca92a8e902738cc1a6\": container with ID starting with 14f85aca3f8bff89ae81140f75bdd696b9244b57f19d94ca92a8e902738cc1a6 not found: 
ID does not exist" Feb 20 08:40:13 crc kubenswrapper[5094]: I0220 08:40:13.853852 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="223d1ebb-1681-477e-b91a-43dc2ce65d74" path="/var/lib/kubelet/pods/223d1ebb-1681-477e-b91a-43dc2ce65d74/volumes" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.337323 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sq9f8"] Feb 20 08:40:44 crc kubenswrapper[5094]: E0220 08:40:44.338944 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="223d1ebb-1681-477e-b91a-43dc2ce65d74" containerName="extract-content" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.338979 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="223d1ebb-1681-477e-b91a-43dc2ce65d74" containerName="extract-content" Feb 20 08:40:44 crc kubenswrapper[5094]: E0220 08:40:44.339027 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="223d1ebb-1681-477e-b91a-43dc2ce65d74" containerName="registry-server" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.339039 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="223d1ebb-1681-477e-b91a-43dc2ce65d74" containerName="registry-server" Feb 20 08:40:44 crc kubenswrapper[5094]: E0220 08:40:44.339083 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="223d1ebb-1681-477e-b91a-43dc2ce65d74" containerName="extract-utilities" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.339097 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="223d1ebb-1681-477e-b91a-43dc2ce65d74" containerName="extract-utilities" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.339664 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="223d1ebb-1681-477e-b91a-43dc2ce65d74" containerName="registry-server" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.354977 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/certified-operators-sq9f8"] Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.355163 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.503132 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgps6\" (UniqueName: \"kubernetes.io/projected/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-kube-api-access-kgps6\") pod \"certified-operators-sq9f8\" (UID: \"1220b209-cf9a-473e-8c43-e3fbd4ead7ee\") " pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.503180 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-catalog-content\") pod \"certified-operators-sq9f8\" (UID: \"1220b209-cf9a-473e-8c43-e3fbd4ead7ee\") " pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.503221 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-utilities\") pod \"certified-operators-sq9f8\" (UID: \"1220b209-cf9a-473e-8c43-e3fbd4ead7ee\") " pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.604241 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgps6\" (UniqueName: \"kubernetes.io/projected/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-kube-api-access-kgps6\") pod \"certified-operators-sq9f8\" (UID: \"1220b209-cf9a-473e-8c43-e3fbd4ead7ee\") " pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.604292 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-catalog-content\") pod \"certified-operators-sq9f8\" (UID: \"1220b209-cf9a-473e-8c43-e3fbd4ead7ee\") " pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.604339 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-utilities\") pod \"certified-operators-sq9f8\" (UID: \"1220b209-cf9a-473e-8c43-e3fbd4ead7ee\") " pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.604917 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-catalog-content\") pod \"certified-operators-sq9f8\" (UID: \"1220b209-cf9a-473e-8c43-e3fbd4ead7ee\") " pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.604926 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-utilities\") pod \"certified-operators-sq9f8\" (UID: \"1220b209-cf9a-473e-8c43-e3fbd4ead7ee\") " pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.635986 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgps6\" (UniqueName: \"kubernetes.io/projected/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-kube-api-access-kgps6\") pod \"certified-operators-sq9f8\" (UID: \"1220b209-cf9a-473e-8c43-e3fbd4ead7ee\") " pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:44 crc kubenswrapper[5094]: I0220 08:40:44.689492 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:45 crc kubenswrapper[5094]: I0220 08:40:45.196028 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sq9f8"] Feb 20 08:40:45 crc kubenswrapper[5094]: I0220 08:40:45.777326 5094 generic.go:334] "Generic (PLEG): container finished" podID="1220b209-cf9a-473e-8c43-e3fbd4ead7ee" containerID="e4d0143a0bfc23a67fe065f30ca7bef32ef97af74469926b449f1947e4dc0ff0" exitCode=0 Feb 20 08:40:45 crc kubenswrapper[5094]: I0220 08:40:45.777381 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sq9f8" event={"ID":"1220b209-cf9a-473e-8c43-e3fbd4ead7ee","Type":"ContainerDied","Data":"e4d0143a0bfc23a67fe065f30ca7bef32ef97af74469926b449f1947e4dc0ff0"} Feb 20 08:40:45 crc kubenswrapper[5094]: I0220 08:40:45.777406 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sq9f8" event={"ID":"1220b209-cf9a-473e-8c43-e3fbd4ead7ee","Type":"ContainerStarted","Data":"f38c2ad1ee049bab464b7e571d472f3a493e9dc3b0801372a7334e7bc03d4f4c"} Feb 20 08:40:46 crc kubenswrapper[5094]: I0220 08:40:46.794546 5094 generic.go:334] "Generic (PLEG): container finished" podID="1220b209-cf9a-473e-8c43-e3fbd4ead7ee" containerID="48a7dde60317d8dc89c284c17c89d10ba212905b23f5c8b567ef7dfc86280761" exitCode=0 Feb 20 08:40:46 crc kubenswrapper[5094]: I0220 08:40:46.794633 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sq9f8" event={"ID":"1220b209-cf9a-473e-8c43-e3fbd4ead7ee","Type":"ContainerDied","Data":"48a7dde60317d8dc89c284c17c89d10ba212905b23f5c8b567ef7dfc86280761"} Feb 20 08:40:47 crc kubenswrapper[5094]: I0220 08:40:47.810562 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sq9f8" 
event={"ID":"1220b209-cf9a-473e-8c43-e3fbd4ead7ee","Type":"ContainerStarted","Data":"9c2ad71cd007d4580cdda3852435a5f8e42251cab198cc6edfcc601ea5c9cf6c"} Feb 20 08:40:47 crc kubenswrapper[5094]: I0220 08:40:47.835567 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sq9f8" podStartSLOduration=2.4335994579999998 podStartE2EDuration="3.83554649s" podCreationTimestamp="2026-02-20 08:40:44 +0000 UTC" firstStartedPulling="2026-02-20 08:40:45.779647819 +0000 UTC m=+6860.652274570" lastFinishedPulling="2026-02-20 08:40:47.181594891 +0000 UTC m=+6862.054221602" observedRunningTime="2026-02-20 08:40:47.829969396 +0000 UTC m=+6862.702596107" watchObservedRunningTime="2026-02-20 08:40:47.83554649 +0000 UTC m=+6862.708173201" Feb 20 08:40:54 crc kubenswrapper[5094]: I0220 08:40:54.691406 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:54 crc kubenswrapper[5094]: I0220 08:40:54.692101 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:54 crc kubenswrapper[5094]: I0220 08:40:54.735823 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:54 crc kubenswrapper[5094]: I0220 08:40:54.932554 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:54 crc kubenswrapper[5094]: I0220 08:40:54.992721 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sq9f8"] Feb 20 08:40:56 crc kubenswrapper[5094]: I0220 08:40:56.891075 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sq9f8" podUID="1220b209-cf9a-473e-8c43-e3fbd4ead7ee" containerName="registry-server" 
containerID="cri-o://9c2ad71cd007d4580cdda3852435a5f8e42251cab198cc6edfcc601ea5c9cf6c" gracePeriod=2 Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.302449 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.426891 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-utilities\") pod \"1220b209-cf9a-473e-8c43-e3fbd4ead7ee\" (UID: \"1220b209-cf9a-473e-8c43-e3fbd4ead7ee\") " Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.426948 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgps6\" (UniqueName: \"kubernetes.io/projected/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-kube-api-access-kgps6\") pod \"1220b209-cf9a-473e-8c43-e3fbd4ead7ee\" (UID: \"1220b209-cf9a-473e-8c43-e3fbd4ead7ee\") " Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.427019 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-catalog-content\") pod \"1220b209-cf9a-473e-8c43-e3fbd4ead7ee\" (UID: \"1220b209-cf9a-473e-8c43-e3fbd4ead7ee\") " Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.427951 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-utilities" (OuterVolumeSpecName: "utilities") pod "1220b209-cf9a-473e-8c43-e3fbd4ead7ee" (UID: "1220b209-cf9a-473e-8c43-e3fbd4ead7ee"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.432554 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-kube-api-access-kgps6" (OuterVolumeSpecName: "kube-api-access-kgps6") pod "1220b209-cf9a-473e-8c43-e3fbd4ead7ee" (UID: "1220b209-cf9a-473e-8c43-e3fbd4ead7ee"). InnerVolumeSpecName "kube-api-access-kgps6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.518341 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1220b209-cf9a-473e-8c43-e3fbd4ead7ee" (UID: "1220b209-cf9a-473e-8c43-e3fbd4ead7ee"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.529044 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.529077 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.529089 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgps6\" (UniqueName: \"kubernetes.io/projected/1220b209-cf9a-473e-8c43-e3fbd4ead7ee-kube-api-access-kgps6\") on node \"crc\" DevicePath \"\"" Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.899822 5094 generic.go:334] "Generic (PLEG): container finished" podID="1220b209-cf9a-473e-8c43-e3fbd4ead7ee" 
containerID="9c2ad71cd007d4580cdda3852435a5f8e42251cab198cc6edfcc601ea5c9cf6c" exitCode=0 Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.899869 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sq9f8" event={"ID":"1220b209-cf9a-473e-8c43-e3fbd4ead7ee","Type":"ContainerDied","Data":"9c2ad71cd007d4580cdda3852435a5f8e42251cab198cc6edfcc601ea5c9cf6c"} Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.900173 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sq9f8" event={"ID":"1220b209-cf9a-473e-8c43-e3fbd4ead7ee","Type":"ContainerDied","Data":"f38c2ad1ee049bab464b7e571d472f3a493e9dc3b0801372a7334e7bc03d4f4c"} Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.900019 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sq9f8" Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.901112 5094 scope.go:117] "RemoveContainer" containerID="9c2ad71cd007d4580cdda3852435a5f8e42251cab198cc6edfcc601ea5c9cf6c" Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.943867 5094 scope.go:117] "RemoveContainer" containerID="48a7dde60317d8dc89c284c17c89d10ba212905b23f5c8b567ef7dfc86280761" Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.943989 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sq9f8"] Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.952987 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sq9f8"] Feb 20 08:40:57 crc kubenswrapper[5094]: I0220 08:40:57.975653 5094 scope.go:117] "RemoveContainer" containerID="e4d0143a0bfc23a67fe065f30ca7bef32ef97af74469926b449f1947e4dc0ff0" Feb 20 08:40:58 crc kubenswrapper[5094]: I0220 08:40:58.016795 5094 scope.go:117] "RemoveContainer" containerID="9c2ad71cd007d4580cdda3852435a5f8e42251cab198cc6edfcc601ea5c9cf6c" Feb 20 
08:40:58 crc kubenswrapper[5094]: E0220 08:40:58.017110 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c2ad71cd007d4580cdda3852435a5f8e42251cab198cc6edfcc601ea5c9cf6c\": container with ID starting with 9c2ad71cd007d4580cdda3852435a5f8e42251cab198cc6edfcc601ea5c9cf6c not found: ID does not exist" containerID="9c2ad71cd007d4580cdda3852435a5f8e42251cab198cc6edfcc601ea5c9cf6c" Feb 20 08:40:58 crc kubenswrapper[5094]: I0220 08:40:58.017148 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c2ad71cd007d4580cdda3852435a5f8e42251cab198cc6edfcc601ea5c9cf6c"} err="failed to get container status \"9c2ad71cd007d4580cdda3852435a5f8e42251cab198cc6edfcc601ea5c9cf6c\": rpc error: code = NotFound desc = could not find container \"9c2ad71cd007d4580cdda3852435a5f8e42251cab198cc6edfcc601ea5c9cf6c\": container with ID starting with 9c2ad71cd007d4580cdda3852435a5f8e42251cab198cc6edfcc601ea5c9cf6c not found: ID does not exist" Feb 20 08:40:58 crc kubenswrapper[5094]: I0220 08:40:58.017167 5094 scope.go:117] "RemoveContainer" containerID="48a7dde60317d8dc89c284c17c89d10ba212905b23f5c8b567ef7dfc86280761" Feb 20 08:40:58 crc kubenswrapper[5094]: E0220 08:40:58.017681 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48a7dde60317d8dc89c284c17c89d10ba212905b23f5c8b567ef7dfc86280761\": container with ID starting with 48a7dde60317d8dc89c284c17c89d10ba212905b23f5c8b567ef7dfc86280761 not found: ID does not exist" containerID="48a7dde60317d8dc89c284c17c89d10ba212905b23f5c8b567ef7dfc86280761" Feb 20 08:40:58 crc kubenswrapper[5094]: I0220 08:40:58.017767 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48a7dde60317d8dc89c284c17c89d10ba212905b23f5c8b567ef7dfc86280761"} err="failed to get container status 
\"48a7dde60317d8dc89c284c17c89d10ba212905b23f5c8b567ef7dfc86280761\": rpc error: code = NotFound desc = could not find container \"48a7dde60317d8dc89c284c17c89d10ba212905b23f5c8b567ef7dfc86280761\": container with ID starting with 48a7dde60317d8dc89c284c17c89d10ba212905b23f5c8b567ef7dfc86280761 not found: ID does not exist" Feb 20 08:40:58 crc kubenswrapper[5094]: I0220 08:40:58.017789 5094 scope.go:117] "RemoveContainer" containerID="e4d0143a0bfc23a67fe065f30ca7bef32ef97af74469926b449f1947e4dc0ff0" Feb 20 08:40:58 crc kubenswrapper[5094]: E0220 08:40:58.018000 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4d0143a0bfc23a67fe065f30ca7bef32ef97af74469926b449f1947e4dc0ff0\": container with ID starting with e4d0143a0bfc23a67fe065f30ca7bef32ef97af74469926b449f1947e4dc0ff0 not found: ID does not exist" containerID="e4d0143a0bfc23a67fe065f30ca7bef32ef97af74469926b449f1947e4dc0ff0" Feb 20 08:40:58 crc kubenswrapper[5094]: I0220 08:40:58.018023 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4d0143a0bfc23a67fe065f30ca7bef32ef97af74469926b449f1947e4dc0ff0"} err="failed to get container status \"e4d0143a0bfc23a67fe065f30ca7bef32ef97af74469926b449f1947e4dc0ff0\": rpc error: code = NotFound desc = could not find container \"e4d0143a0bfc23a67fe065f30ca7bef32ef97af74469926b449f1947e4dc0ff0\": container with ID starting with e4d0143a0bfc23a67fe065f30ca7bef32ef97af74469926b449f1947e4dc0ff0 not found: ID does not exist" Feb 20 08:40:59 crc kubenswrapper[5094]: I0220 08:40:59.855885 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1220b209-cf9a-473e-8c43-e3fbd4ead7ee" path="/var/lib/kubelet/pods/1220b209-cf9a-473e-8c43-e3fbd4ead7ee/volumes" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.494288 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-2tqmv"] Feb 20 08:41:21 crc 
kubenswrapper[5094]: E0220 08:41:21.496405 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1220b209-cf9a-473e-8c43-e3fbd4ead7ee" containerName="registry-server" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.496498 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1220b209-cf9a-473e-8c43-e3fbd4ead7ee" containerName="registry-server" Feb 20 08:41:21 crc kubenswrapper[5094]: E0220 08:41:21.496598 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1220b209-cf9a-473e-8c43-e3fbd4ead7ee" containerName="extract-content" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.496685 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1220b209-cf9a-473e-8c43-e3fbd4ead7ee" containerName="extract-content" Feb 20 08:41:21 crc kubenswrapper[5094]: E0220 08:41:21.496786 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1220b209-cf9a-473e-8c43-e3fbd4ead7ee" containerName="extract-utilities" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.496852 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1220b209-cf9a-473e-8c43-e3fbd4ead7ee" containerName="extract-utilities" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.497107 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="1220b209-cf9a-473e-8c43-e3fbd4ead7ee" containerName="registry-server" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.497918 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2tqmv" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.505072 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-0a46-account-create-update-6bs8s"] Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.506929 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-0a46-account-create-update-6bs8s" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.509073 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.512924 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-2tqmv"] Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.520780 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0a46-account-create-update-6bs8s"] Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.592309 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dvzk\" (UniqueName: \"kubernetes.io/projected/61baa8c9-4a07-47bc-94c8-b7e3f3846ff2-kube-api-access-2dvzk\") pod \"barbican-db-create-2tqmv\" (UID: \"61baa8c9-4a07-47bc-94c8-b7e3f3846ff2\") " pod="openstack/barbican-db-create-2tqmv" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.592632 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61baa8c9-4a07-47bc-94c8-b7e3f3846ff2-operator-scripts\") pod \"barbican-db-create-2tqmv\" (UID: \"61baa8c9-4a07-47bc-94c8-b7e3f3846ff2\") " pod="openstack/barbican-db-create-2tqmv" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.592771 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb71d5b0-a19d-4900-be92-77b1abeaf856-operator-scripts\") pod \"barbican-0a46-account-create-update-6bs8s\" (UID: \"eb71d5b0-a19d-4900-be92-77b1abeaf856\") " pod="openstack/barbican-0a46-account-create-update-6bs8s" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.592910 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-bpgk2\" (UniqueName: \"kubernetes.io/projected/eb71d5b0-a19d-4900-be92-77b1abeaf856-kube-api-access-bpgk2\") pod \"barbican-0a46-account-create-update-6bs8s\" (UID: \"eb71d5b0-a19d-4900-be92-77b1abeaf856\") " pod="openstack/barbican-0a46-account-create-update-6bs8s" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.694814 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb71d5b0-a19d-4900-be92-77b1abeaf856-operator-scripts\") pod \"barbican-0a46-account-create-update-6bs8s\" (UID: \"eb71d5b0-a19d-4900-be92-77b1abeaf856\") " pod="openstack/barbican-0a46-account-create-update-6bs8s" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.694990 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpgk2\" (UniqueName: \"kubernetes.io/projected/eb71d5b0-a19d-4900-be92-77b1abeaf856-kube-api-access-bpgk2\") pod \"barbican-0a46-account-create-update-6bs8s\" (UID: \"eb71d5b0-a19d-4900-be92-77b1abeaf856\") " pod="openstack/barbican-0a46-account-create-update-6bs8s" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.695101 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dvzk\" (UniqueName: \"kubernetes.io/projected/61baa8c9-4a07-47bc-94c8-b7e3f3846ff2-kube-api-access-2dvzk\") pod \"barbican-db-create-2tqmv\" (UID: \"61baa8c9-4a07-47bc-94c8-b7e3f3846ff2\") " pod="openstack/barbican-db-create-2tqmv" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.695213 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61baa8c9-4a07-47bc-94c8-b7e3f3846ff2-operator-scripts\") pod \"barbican-db-create-2tqmv\" (UID: \"61baa8c9-4a07-47bc-94c8-b7e3f3846ff2\") " pod="openstack/barbican-db-create-2tqmv" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.695765 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb71d5b0-a19d-4900-be92-77b1abeaf856-operator-scripts\") pod \"barbican-0a46-account-create-update-6bs8s\" (UID: \"eb71d5b0-a19d-4900-be92-77b1abeaf856\") " pod="openstack/barbican-0a46-account-create-update-6bs8s" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.696777 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61baa8c9-4a07-47bc-94c8-b7e3f3846ff2-operator-scripts\") pod \"barbican-db-create-2tqmv\" (UID: \"61baa8c9-4a07-47bc-94c8-b7e3f3846ff2\") " pod="openstack/barbican-db-create-2tqmv" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.714448 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dvzk\" (UniqueName: \"kubernetes.io/projected/61baa8c9-4a07-47bc-94c8-b7e3f3846ff2-kube-api-access-2dvzk\") pod \"barbican-db-create-2tqmv\" (UID: \"61baa8c9-4a07-47bc-94c8-b7e3f3846ff2\") " pod="openstack/barbican-db-create-2tqmv" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.718293 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpgk2\" (UniqueName: \"kubernetes.io/projected/eb71d5b0-a19d-4900-be92-77b1abeaf856-kube-api-access-bpgk2\") pod \"barbican-0a46-account-create-update-6bs8s\" (UID: \"eb71d5b0-a19d-4900-be92-77b1abeaf856\") " pod="openstack/barbican-0a46-account-create-update-6bs8s" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.822755 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2tqmv" Feb 20 08:41:21 crc kubenswrapper[5094]: I0220 08:41:21.840723 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-0a46-account-create-update-6bs8s" Feb 20 08:41:22 crc kubenswrapper[5094]: I0220 08:41:22.320931 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-0a46-account-create-update-6bs8s"] Feb 20 08:41:22 crc kubenswrapper[5094]: W0220 08:41:22.330391 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb71d5b0_a19d_4900_be92_77b1abeaf856.slice/crio-7bbef8849fb28ef72e101d5f6b345cb48ee27d5c052f39c6a32091246ba8624a WatchSource:0}: Error finding container 7bbef8849fb28ef72e101d5f6b345cb48ee27d5c052f39c6a32091246ba8624a: Status 404 returned error can't find the container with id 7bbef8849fb28ef72e101d5f6b345cb48ee27d5c052f39c6a32091246ba8624a Feb 20 08:41:22 crc kubenswrapper[5094]: I0220 08:41:22.371032 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-2tqmv"] Feb 20 08:41:23 crc kubenswrapper[5094]: I0220 08:41:23.119381 5094 generic.go:334] "Generic (PLEG): container finished" podID="eb71d5b0-a19d-4900-be92-77b1abeaf856" containerID="4f87cc562d40739a0734989e8f19246c6cf1e1144b307f5249bd8e950afcfbb0" exitCode=0 Feb 20 08:41:23 crc kubenswrapper[5094]: I0220 08:41:23.119470 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0a46-account-create-update-6bs8s" event={"ID":"eb71d5b0-a19d-4900-be92-77b1abeaf856","Type":"ContainerDied","Data":"4f87cc562d40739a0734989e8f19246c6cf1e1144b307f5249bd8e950afcfbb0"} Feb 20 08:41:23 crc kubenswrapper[5094]: I0220 08:41:23.119505 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0a46-account-create-update-6bs8s" event={"ID":"eb71d5b0-a19d-4900-be92-77b1abeaf856","Type":"ContainerStarted","Data":"7bbef8849fb28ef72e101d5f6b345cb48ee27d5c052f39c6a32091246ba8624a"} Feb 20 08:41:23 crc kubenswrapper[5094]: I0220 08:41:23.121471 5094 generic.go:334] "Generic (PLEG): container finished" 
podID="61baa8c9-4a07-47bc-94c8-b7e3f3846ff2" containerID="4231927e6f52319c4c7cbbaa5766e18430942afbbae151ea27a85c1b2eed2b12" exitCode=0 Feb 20 08:41:23 crc kubenswrapper[5094]: I0220 08:41:23.121515 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2tqmv" event={"ID":"61baa8c9-4a07-47bc-94c8-b7e3f3846ff2","Type":"ContainerDied","Data":"4231927e6f52319c4c7cbbaa5766e18430942afbbae151ea27a85c1b2eed2b12"} Feb 20 08:41:23 crc kubenswrapper[5094]: I0220 08:41:23.121537 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2tqmv" event={"ID":"61baa8c9-4a07-47bc-94c8-b7e3f3846ff2","Type":"ContainerStarted","Data":"b22b1fea60091c52338c1c42e8f642f2a793eb4270ff5da8d70b8a55ab170ead"} Feb 20 08:41:24 crc kubenswrapper[5094]: I0220 08:41:24.583040 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0a46-account-create-update-6bs8s" Feb 20 08:41:24 crc kubenswrapper[5094]: I0220 08:41:24.591579 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-2tqmv" Feb 20 08:41:24 crc kubenswrapper[5094]: I0220 08:41:24.647551 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpgk2\" (UniqueName: \"kubernetes.io/projected/eb71d5b0-a19d-4900-be92-77b1abeaf856-kube-api-access-bpgk2\") pod \"eb71d5b0-a19d-4900-be92-77b1abeaf856\" (UID: \"eb71d5b0-a19d-4900-be92-77b1abeaf856\") " Feb 20 08:41:24 crc kubenswrapper[5094]: I0220 08:41:24.647942 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61baa8c9-4a07-47bc-94c8-b7e3f3846ff2-operator-scripts\") pod \"61baa8c9-4a07-47bc-94c8-b7e3f3846ff2\" (UID: \"61baa8c9-4a07-47bc-94c8-b7e3f3846ff2\") " Feb 20 08:41:24 crc kubenswrapper[5094]: I0220 08:41:24.648030 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dvzk\" (UniqueName: \"kubernetes.io/projected/61baa8c9-4a07-47bc-94c8-b7e3f3846ff2-kube-api-access-2dvzk\") pod \"61baa8c9-4a07-47bc-94c8-b7e3f3846ff2\" (UID: \"61baa8c9-4a07-47bc-94c8-b7e3f3846ff2\") " Feb 20 08:41:24 crc kubenswrapper[5094]: I0220 08:41:24.648147 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb71d5b0-a19d-4900-be92-77b1abeaf856-operator-scripts\") pod \"eb71d5b0-a19d-4900-be92-77b1abeaf856\" (UID: \"eb71d5b0-a19d-4900-be92-77b1abeaf856\") " Feb 20 08:41:24 crc kubenswrapper[5094]: I0220 08:41:24.648569 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61baa8c9-4a07-47bc-94c8-b7e3f3846ff2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "61baa8c9-4a07-47bc-94c8-b7e3f3846ff2" (UID: "61baa8c9-4a07-47bc-94c8-b7e3f3846ff2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:41:24 crc kubenswrapper[5094]: I0220 08:41:24.648797 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb71d5b0-a19d-4900-be92-77b1abeaf856-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eb71d5b0-a19d-4900-be92-77b1abeaf856" (UID: "eb71d5b0-a19d-4900-be92-77b1abeaf856"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:41:24 crc kubenswrapper[5094]: I0220 08:41:24.654069 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61baa8c9-4a07-47bc-94c8-b7e3f3846ff2-kube-api-access-2dvzk" (OuterVolumeSpecName: "kube-api-access-2dvzk") pod "61baa8c9-4a07-47bc-94c8-b7e3f3846ff2" (UID: "61baa8c9-4a07-47bc-94c8-b7e3f3846ff2"). InnerVolumeSpecName "kube-api-access-2dvzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:41:24 crc kubenswrapper[5094]: I0220 08:41:24.654148 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb71d5b0-a19d-4900-be92-77b1abeaf856-kube-api-access-bpgk2" (OuterVolumeSpecName: "kube-api-access-bpgk2") pod "eb71d5b0-a19d-4900-be92-77b1abeaf856" (UID: "eb71d5b0-a19d-4900-be92-77b1abeaf856"). InnerVolumeSpecName "kube-api-access-bpgk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:41:24 crc kubenswrapper[5094]: I0220 08:41:24.749430 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpgk2\" (UniqueName: \"kubernetes.io/projected/eb71d5b0-a19d-4900-be92-77b1abeaf856-kube-api-access-bpgk2\") on node \"crc\" DevicePath \"\"" Feb 20 08:41:24 crc kubenswrapper[5094]: I0220 08:41:24.749460 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/61baa8c9-4a07-47bc-94c8-b7e3f3846ff2-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:41:24 crc kubenswrapper[5094]: I0220 08:41:24.749469 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dvzk\" (UniqueName: \"kubernetes.io/projected/61baa8c9-4a07-47bc-94c8-b7e3f3846ff2-kube-api-access-2dvzk\") on node \"crc\" DevicePath \"\"" Feb 20 08:41:24 crc kubenswrapper[5094]: I0220 08:41:24.749479 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb71d5b0-a19d-4900-be92-77b1abeaf856-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:41:25 crc kubenswrapper[5094]: I0220 08:41:25.144035 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2tqmv" event={"ID":"61baa8c9-4a07-47bc-94c8-b7e3f3846ff2","Type":"ContainerDied","Data":"b22b1fea60091c52338c1c42e8f642f2a793eb4270ff5da8d70b8a55ab170ead"} Feb 20 08:41:25 crc kubenswrapper[5094]: I0220 08:41:25.144098 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b22b1fea60091c52338c1c42e8f642f2a793eb4270ff5da8d70b8a55ab170ead" Feb 20 08:41:25 crc kubenswrapper[5094]: I0220 08:41:25.144140 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-2tqmv" Feb 20 08:41:25 crc kubenswrapper[5094]: I0220 08:41:25.145425 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-0a46-account-create-update-6bs8s" event={"ID":"eb71d5b0-a19d-4900-be92-77b1abeaf856","Type":"ContainerDied","Data":"7bbef8849fb28ef72e101d5f6b345cb48ee27d5c052f39c6a32091246ba8624a"} Feb 20 08:41:25 crc kubenswrapper[5094]: I0220 08:41:25.145470 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bbef8849fb28ef72e101d5f6b345cb48ee27d5c052f39c6a32091246ba8624a" Feb 20 08:41:25 crc kubenswrapper[5094]: I0220 08:41:25.145485 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-0a46-account-create-update-6bs8s" Feb 20 08:41:26 crc kubenswrapper[5094]: I0220 08:41:26.798218 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-9jfqj"] Feb 20 08:41:26 crc kubenswrapper[5094]: E0220 08:41:26.798945 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61baa8c9-4a07-47bc-94c8-b7e3f3846ff2" containerName="mariadb-database-create" Feb 20 08:41:26 crc kubenswrapper[5094]: I0220 08:41:26.798961 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="61baa8c9-4a07-47bc-94c8-b7e3f3846ff2" containerName="mariadb-database-create" Feb 20 08:41:26 crc kubenswrapper[5094]: E0220 08:41:26.798988 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb71d5b0-a19d-4900-be92-77b1abeaf856" containerName="mariadb-account-create-update" Feb 20 08:41:26 crc kubenswrapper[5094]: I0220 08:41:26.798998 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb71d5b0-a19d-4900-be92-77b1abeaf856" containerName="mariadb-account-create-update" Feb 20 08:41:26 crc kubenswrapper[5094]: I0220 08:41:26.799200 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="61baa8c9-4a07-47bc-94c8-b7e3f3846ff2" 
containerName="mariadb-database-create" Feb 20 08:41:26 crc kubenswrapper[5094]: I0220 08:41:26.799231 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb71d5b0-a19d-4900-be92-77b1abeaf856" containerName="mariadb-account-create-update" Feb 20 08:41:26 crc kubenswrapper[5094]: I0220 08:41:26.799953 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-9jfqj" Feb 20 08:41:26 crc kubenswrapper[5094]: I0220 08:41:26.802027 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-6xjzv" Feb 20 08:41:26 crc kubenswrapper[5094]: I0220 08:41:26.802336 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 20 08:41:26 crc kubenswrapper[5094]: I0220 08:41:26.829405 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-9jfqj"] Feb 20 08:41:26 crc kubenswrapper[5094]: I0220 08:41:26.886739 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b241ede-085a-44b3-857b-f64e36b7b14f-combined-ca-bundle\") pod \"barbican-db-sync-9jfqj\" (UID: \"7b241ede-085a-44b3-857b-f64e36b7b14f\") " pod="openstack/barbican-db-sync-9jfqj" Feb 20 08:41:26 crc kubenswrapper[5094]: I0220 08:41:26.887066 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7b241ede-085a-44b3-857b-f64e36b7b14f-db-sync-config-data\") pod \"barbican-db-sync-9jfqj\" (UID: \"7b241ede-085a-44b3-857b-f64e36b7b14f\") " pod="openstack/barbican-db-sync-9jfqj" Feb 20 08:41:26 crc kubenswrapper[5094]: I0220 08:41:26.887203 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hhmf\" (UniqueName: 
\"kubernetes.io/projected/7b241ede-085a-44b3-857b-f64e36b7b14f-kube-api-access-9hhmf\") pod \"barbican-db-sync-9jfqj\" (UID: \"7b241ede-085a-44b3-857b-f64e36b7b14f\") " pod="openstack/barbican-db-sync-9jfqj"
Feb 20 08:41:26 crc kubenswrapper[5094]: I0220 08:41:26.988870 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b241ede-085a-44b3-857b-f64e36b7b14f-combined-ca-bundle\") pod \"barbican-db-sync-9jfqj\" (UID: \"7b241ede-085a-44b3-857b-f64e36b7b14f\") " pod="openstack/barbican-db-sync-9jfqj"
Feb 20 08:41:26 crc kubenswrapper[5094]: I0220 08:41:26.989113 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7b241ede-085a-44b3-857b-f64e36b7b14f-db-sync-config-data\") pod \"barbican-db-sync-9jfqj\" (UID: \"7b241ede-085a-44b3-857b-f64e36b7b14f\") " pod="openstack/barbican-db-sync-9jfqj"
Feb 20 08:41:26 crc kubenswrapper[5094]: I0220 08:41:26.989210 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hhmf\" (UniqueName: \"kubernetes.io/projected/7b241ede-085a-44b3-857b-f64e36b7b14f-kube-api-access-9hhmf\") pod \"barbican-db-sync-9jfqj\" (UID: \"7b241ede-085a-44b3-857b-f64e36b7b14f\") " pod="openstack/barbican-db-sync-9jfqj"
Feb 20 08:41:27 crc kubenswrapper[5094]: I0220 08:41:27.009830 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b241ede-085a-44b3-857b-f64e36b7b14f-combined-ca-bundle\") pod \"barbican-db-sync-9jfqj\" (UID: \"7b241ede-085a-44b3-857b-f64e36b7b14f\") " pod="openstack/barbican-db-sync-9jfqj"
Feb 20 08:41:27 crc kubenswrapper[5094]: I0220 08:41:27.011660 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7b241ede-085a-44b3-857b-f64e36b7b14f-db-sync-config-data\") pod \"barbican-db-sync-9jfqj\" (UID: \"7b241ede-085a-44b3-857b-f64e36b7b14f\") " pod="openstack/barbican-db-sync-9jfqj"
Feb 20 08:41:27 crc kubenswrapper[5094]: I0220 08:41:27.012333 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hhmf\" (UniqueName: \"kubernetes.io/projected/7b241ede-085a-44b3-857b-f64e36b7b14f-kube-api-access-9hhmf\") pod \"barbican-db-sync-9jfqj\" (UID: \"7b241ede-085a-44b3-857b-f64e36b7b14f\") " pod="openstack/barbican-db-sync-9jfqj"
Feb 20 08:41:27 crc kubenswrapper[5094]: I0220 08:41:27.121339 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-9jfqj"
Feb 20 08:41:27 crc kubenswrapper[5094]: I0220 08:41:27.600226 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-9jfqj"]
Feb 20 08:41:28 crc kubenswrapper[5094]: I0220 08:41:28.176150 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9jfqj" event={"ID":"7b241ede-085a-44b3-857b-f64e36b7b14f","Type":"ContainerStarted","Data":"8c39505d2d712b06c3ef0abbde14162515e52096b4c8288ab6a391e5274a44d7"}
Feb 20 08:41:32 crc kubenswrapper[5094]: I0220 08:41:32.223809 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9jfqj" event={"ID":"7b241ede-085a-44b3-857b-f64e36b7b14f","Type":"ContainerStarted","Data":"ba2973a00772608356b5c6835b97549aca8d7d662bffa9fb35da2b972f2aa0f4"}
Feb 20 08:41:32 crc kubenswrapper[5094]: I0220 08:41:32.257466 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-9jfqj" podStartSLOduration=2.299088501 podStartE2EDuration="6.257437292s" podCreationTimestamp="2026-02-20 08:41:26 +0000 UTC" firstStartedPulling="2026-02-20 08:41:27.607967628 +0000 UTC m=+6902.480594339" lastFinishedPulling="2026-02-20 08:41:31.566316409 +0000 UTC m=+6906.438943130" observedRunningTime="2026-02-20 08:41:32.251147691 +0000 UTC m=+6907.123774442" watchObservedRunningTime="2026-02-20 08:41:32.257437292 +0000 UTC m=+6907.130064053"
Feb 20 08:41:34 crc kubenswrapper[5094]: I0220 08:41:34.254202 5094 generic.go:334] "Generic (PLEG): container finished" podID="7b241ede-085a-44b3-857b-f64e36b7b14f" containerID="ba2973a00772608356b5c6835b97549aca8d7d662bffa9fb35da2b972f2aa0f4" exitCode=0
Feb 20 08:41:34 crc kubenswrapper[5094]: I0220 08:41:34.254422 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9jfqj" event={"ID":"7b241ede-085a-44b3-857b-f64e36b7b14f","Type":"ContainerDied","Data":"ba2973a00772608356b5c6835b97549aca8d7d662bffa9fb35da2b972f2aa0f4"}
Feb 20 08:41:35 crc kubenswrapper[5094]: I0220 08:41:35.590093 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-9jfqj"
Feb 20 08:41:35 crc kubenswrapper[5094]: I0220 08:41:35.657272 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7b241ede-085a-44b3-857b-f64e36b7b14f-db-sync-config-data\") pod \"7b241ede-085a-44b3-857b-f64e36b7b14f\" (UID: \"7b241ede-085a-44b3-857b-f64e36b7b14f\") "
Feb 20 08:41:35 crc kubenswrapper[5094]: I0220 08:41:35.657375 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b241ede-085a-44b3-857b-f64e36b7b14f-combined-ca-bundle\") pod \"7b241ede-085a-44b3-857b-f64e36b7b14f\" (UID: \"7b241ede-085a-44b3-857b-f64e36b7b14f\") "
Feb 20 08:41:35 crc kubenswrapper[5094]: I0220 08:41:35.657412 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hhmf\" (UniqueName: \"kubernetes.io/projected/7b241ede-085a-44b3-857b-f64e36b7b14f-kube-api-access-9hhmf\") pod \"7b241ede-085a-44b3-857b-f64e36b7b14f\" (UID: \"7b241ede-085a-44b3-857b-f64e36b7b14f\") "
Feb 20 08:41:35 crc kubenswrapper[5094]: I0220 08:41:35.662625 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b241ede-085a-44b3-857b-f64e36b7b14f-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7b241ede-085a-44b3-857b-f64e36b7b14f" (UID: "7b241ede-085a-44b3-857b-f64e36b7b14f"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:41:35 crc kubenswrapper[5094]: I0220 08:41:35.663037 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b241ede-085a-44b3-857b-f64e36b7b14f-kube-api-access-9hhmf" (OuterVolumeSpecName: "kube-api-access-9hhmf") pod "7b241ede-085a-44b3-857b-f64e36b7b14f" (UID: "7b241ede-085a-44b3-857b-f64e36b7b14f"). InnerVolumeSpecName "kube-api-access-9hhmf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:41:35 crc kubenswrapper[5094]: I0220 08:41:35.683981 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b241ede-085a-44b3-857b-f64e36b7b14f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b241ede-085a-44b3-857b-f64e36b7b14f" (UID: "7b241ede-085a-44b3-857b-f64e36b7b14f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:41:35 crc kubenswrapper[5094]: I0220 08:41:35.759488 5094 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7b241ede-085a-44b3-857b-f64e36b7b14f-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 08:41:35 crc kubenswrapper[5094]: I0220 08:41:35.759527 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b241ede-085a-44b3-857b-f64e36b7b14f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 08:41:35 crc kubenswrapper[5094]: I0220 08:41:35.759538 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hhmf\" (UniqueName: \"kubernetes.io/projected/7b241ede-085a-44b3-857b-f64e36b7b14f-kube-api-access-9hhmf\") on node \"crc\" DevicePath \"\""
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.276985 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9jfqj" event={"ID":"7b241ede-085a-44b3-857b-f64e36b7b14f","Type":"ContainerDied","Data":"8c39505d2d712b06c3ef0abbde14162515e52096b4c8288ab6a391e5274a44d7"}
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.277049 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c39505d2d712b06c3ef0abbde14162515e52096b4c8288ab6a391e5274a44d7"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.277015 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-9jfqj"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.516839 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-84b95bd745-mrk5m"]
Feb 20 08:41:36 crc kubenswrapper[5094]: E0220 08:41:36.517469 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b241ede-085a-44b3-857b-f64e36b7b14f" containerName="barbican-db-sync"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.517489 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b241ede-085a-44b3-857b-f64e36b7b14f" containerName="barbican-db-sync"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.517689 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b241ede-085a-44b3-857b-f64e36b7b14f" containerName="barbican-db-sync"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.518859 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-84b95bd745-mrk5m"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.521489 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.521911 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.521919 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-6xjzv"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.552041 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-84b95bd745-mrk5m"]
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.567436 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7f6984ff88-5xqtx"]
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.581923 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.583898 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13560fbf-48aa-45ac-8c10-067377d1adfa-combined-ca-bundle\") pod \"barbican-worker-84b95bd745-mrk5m\" (UID: \"13560fbf-48aa-45ac-8c10-067377d1adfa\") " pod="openstack/barbican-worker-84b95bd745-mrk5m"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.583964 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13560fbf-48aa-45ac-8c10-067377d1adfa-config-data-custom\") pod \"barbican-worker-84b95bd745-mrk5m\" (UID: \"13560fbf-48aa-45ac-8c10-067377d1adfa\") " pod="openstack/barbican-worker-84b95bd745-mrk5m"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.584012 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13560fbf-48aa-45ac-8c10-067377d1adfa-config-data\") pod \"barbican-worker-84b95bd745-mrk5m\" (UID: \"13560fbf-48aa-45ac-8c10-067377d1adfa\") " pod="openstack/barbican-worker-84b95bd745-mrk5m"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.584048 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13560fbf-48aa-45ac-8c10-067377d1adfa-logs\") pod \"barbican-worker-84b95bd745-mrk5m\" (UID: \"13560fbf-48aa-45ac-8c10-067377d1adfa\") " pod="openstack/barbican-worker-84b95bd745-mrk5m"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.584145 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpgw9\" (UniqueName: \"kubernetes.io/projected/13560fbf-48aa-45ac-8c10-067377d1adfa-kube-api-access-tpgw9\") pod \"barbican-worker-84b95bd745-mrk5m\" (UID: \"13560fbf-48aa-45ac-8c10-067377d1adfa\") " pod="openstack/barbican-worker-84b95bd745-mrk5m"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.586956 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.600642 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7f6984ff88-5xqtx"]
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.619877 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58dc5b9855-b6gdq"]
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.624100 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.637367 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dc5b9855-b6gdq"]
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.685733 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-dns-svc\") pod \"dnsmasq-dns-58dc5b9855-b6gdq\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.685786 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxtws\" (UniqueName: \"kubernetes.io/projected/b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a-kube-api-access-dxtws\") pod \"barbican-keystone-listener-7f6984ff88-5xqtx\" (UID: \"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a\") " pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.685822 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a-config-data-custom\") pod \"barbican-keystone-listener-7f6984ff88-5xqtx\" (UID: \"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a\") " pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.685842 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-config\") pod \"dnsmasq-dns-58dc5b9855-b6gdq\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.685883 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13560fbf-48aa-45ac-8c10-067377d1adfa-combined-ca-bundle\") pod \"barbican-worker-84b95bd745-mrk5m\" (UID: \"13560fbf-48aa-45ac-8c10-067377d1adfa\") " pod="openstack/barbican-worker-84b95bd745-mrk5m"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.685995 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13560fbf-48aa-45ac-8c10-067377d1adfa-config-data-custom\") pod \"barbican-worker-84b95bd745-mrk5m\" (UID: \"13560fbf-48aa-45ac-8c10-067377d1adfa\") " pod="openstack/barbican-worker-84b95bd745-mrk5m"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.686064 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a-logs\") pod \"barbican-keystone-listener-7f6984ff88-5xqtx\" (UID: \"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a\") " pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.686104 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a-config-data\") pod \"barbican-keystone-listener-7f6984ff88-5xqtx\" (UID: \"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a\") " pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.686134 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13560fbf-48aa-45ac-8c10-067377d1adfa-config-data\") pod \"barbican-worker-84b95bd745-mrk5m\" (UID: \"13560fbf-48aa-45ac-8c10-067377d1adfa\") " pod="openstack/barbican-worker-84b95bd745-mrk5m"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.686155 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-ovsdbserver-nb\") pod \"dnsmasq-dns-58dc5b9855-b6gdq\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.686204 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13560fbf-48aa-45ac-8c10-067377d1adfa-logs\") pod \"barbican-worker-84b95bd745-mrk5m\" (UID: \"13560fbf-48aa-45ac-8c10-067377d1adfa\") " pod="openstack/barbican-worker-84b95bd745-mrk5m"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.686280 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-ovsdbserver-sb\") pod \"dnsmasq-dns-58dc5b9855-b6gdq\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.686325 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a-combined-ca-bundle\") pod \"barbican-keystone-listener-7f6984ff88-5xqtx\" (UID: \"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a\") " pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.686390 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpcqp\" (UniqueName: \"kubernetes.io/projected/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-kube-api-access-hpcqp\") pod \"dnsmasq-dns-58dc5b9855-b6gdq\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.686440 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpgw9\" (UniqueName: \"kubernetes.io/projected/13560fbf-48aa-45ac-8c10-067377d1adfa-kube-api-access-tpgw9\") pod \"barbican-worker-84b95bd745-mrk5m\" (UID: \"13560fbf-48aa-45ac-8c10-067377d1adfa\") " pod="openstack/barbican-worker-84b95bd745-mrk5m"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.687027 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13560fbf-48aa-45ac-8c10-067377d1adfa-logs\") pod \"barbican-worker-84b95bd745-mrk5m\" (UID: \"13560fbf-48aa-45ac-8c10-067377d1adfa\") " pod="openstack/barbican-worker-84b95bd745-mrk5m"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.691341 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13560fbf-48aa-45ac-8c10-067377d1adfa-config-data-custom\") pod \"barbican-worker-84b95bd745-mrk5m\" (UID: \"13560fbf-48aa-45ac-8c10-067377d1adfa\") " pod="openstack/barbican-worker-84b95bd745-mrk5m"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.692101 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13560fbf-48aa-45ac-8c10-067377d1adfa-config-data\") pod \"barbican-worker-84b95bd745-mrk5m\" (UID: \"13560fbf-48aa-45ac-8c10-067377d1adfa\") " pod="openstack/barbican-worker-84b95bd745-mrk5m"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.696210 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13560fbf-48aa-45ac-8c10-067377d1adfa-combined-ca-bundle\") pod \"barbican-worker-84b95bd745-mrk5m\" (UID: \"13560fbf-48aa-45ac-8c10-067377d1adfa\") " pod="openstack/barbican-worker-84b95bd745-mrk5m"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.712437 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpgw9\" (UniqueName: \"kubernetes.io/projected/13560fbf-48aa-45ac-8c10-067377d1adfa-kube-api-access-tpgw9\") pod \"barbican-worker-84b95bd745-mrk5m\" (UID: \"13560fbf-48aa-45ac-8c10-067377d1adfa\") " pod="openstack/barbican-worker-84b95bd745-mrk5m"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.737751 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-69fdd7dd98-bm4fc"]
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.739098 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-69fdd7dd98-bm4fc"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.743671 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.759946 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-69fdd7dd98-bm4fc"]
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.788840 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-ovsdbserver-sb\") pod \"dnsmasq-dns-58dc5b9855-b6gdq\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.788892 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a-combined-ca-bundle\") pod \"barbican-keystone-listener-7f6984ff88-5xqtx\" (UID: \"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a\") " pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.788933 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpcqp\" (UniqueName: \"kubernetes.io/projected/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-kube-api-access-hpcqp\") pod \"dnsmasq-dns-58dc5b9855-b6gdq\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.788955 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e777e53-5dbe-4779-bc99-90bbf12cea8f-config-data\") pod \"barbican-api-69fdd7dd98-bm4fc\" (UID: \"3e777e53-5dbe-4779-bc99-90bbf12cea8f\") " pod="openstack/barbican-api-69fdd7dd98-bm4fc"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.788984 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-dns-svc\") pod \"dnsmasq-dns-58dc5b9855-b6gdq\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.789004 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp9wn\" (UniqueName: \"kubernetes.io/projected/3e777e53-5dbe-4779-bc99-90bbf12cea8f-kube-api-access-vp9wn\") pod \"barbican-api-69fdd7dd98-bm4fc\" (UID: \"3e777e53-5dbe-4779-bc99-90bbf12cea8f\") " pod="openstack/barbican-api-69fdd7dd98-bm4fc"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.789024 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxtws\" (UniqueName: \"kubernetes.io/projected/b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a-kube-api-access-dxtws\") pod \"barbican-keystone-listener-7f6984ff88-5xqtx\" (UID: \"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a\") " pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.789052 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a-config-data-custom\") pod \"barbican-keystone-listener-7f6984ff88-5xqtx\" (UID: \"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a\") " pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.789072 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-config\") pod \"dnsmasq-dns-58dc5b9855-b6gdq\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.789109 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a-logs\") pod \"barbican-keystone-listener-7f6984ff88-5xqtx\" (UID: \"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a\") " pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.789134 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e777e53-5dbe-4779-bc99-90bbf12cea8f-combined-ca-bundle\") pod \"barbican-api-69fdd7dd98-bm4fc\" (UID: \"3e777e53-5dbe-4779-bc99-90bbf12cea8f\") " pod="openstack/barbican-api-69fdd7dd98-bm4fc"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.789157 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a-config-data\") pod \"barbican-keystone-listener-7f6984ff88-5xqtx\" (UID: \"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a\") " pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.789185 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e777e53-5dbe-4779-bc99-90bbf12cea8f-config-data-custom\") pod \"barbican-api-69fdd7dd98-bm4fc\" (UID: \"3e777e53-5dbe-4779-bc99-90bbf12cea8f\") " pod="openstack/barbican-api-69fdd7dd98-bm4fc"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.789204 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-ovsdbserver-nb\") pod \"dnsmasq-dns-58dc5b9855-b6gdq\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.789230 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e777e53-5dbe-4779-bc99-90bbf12cea8f-logs\") pod \"barbican-api-69fdd7dd98-bm4fc\" (UID: \"3e777e53-5dbe-4779-bc99-90bbf12cea8f\") " pod="openstack/barbican-api-69fdd7dd98-bm4fc"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.789773 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-ovsdbserver-sb\") pod \"dnsmasq-dns-58dc5b9855-b6gdq\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.790536 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a-logs\") pod \"barbican-keystone-listener-7f6984ff88-5xqtx\" (UID: \"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a\") " pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.790721 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-config\") pod \"dnsmasq-dns-58dc5b9855-b6gdq\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.791667 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-ovsdbserver-nb\") pod \"dnsmasq-dns-58dc5b9855-b6gdq\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.791738 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-dns-svc\") pod \"dnsmasq-dns-58dc5b9855-b6gdq\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.793518 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a-config-data-custom\") pod \"barbican-keystone-listener-7f6984ff88-5xqtx\" (UID: \"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a\") " pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.795280 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a-combined-ca-bundle\") pod \"barbican-keystone-listener-7f6984ff88-5xqtx\" (UID: \"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a\") " pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.801130 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a-config-data\") pod \"barbican-keystone-listener-7f6984ff88-5xqtx\" (UID: \"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a\") " pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.813663 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpcqp\" (UniqueName: \"kubernetes.io/projected/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-kube-api-access-hpcqp\") pod \"dnsmasq-dns-58dc5b9855-b6gdq\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.832187 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxtws\" (UniqueName: \"kubernetes.io/projected/b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a-kube-api-access-dxtws\") pod \"barbican-keystone-listener-7f6984ff88-5xqtx\" (UID: \"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a\") " pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.852183 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-84b95bd745-mrk5m"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.890578 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e777e53-5dbe-4779-bc99-90bbf12cea8f-config-data-custom\") pod \"barbican-api-69fdd7dd98-bm4fc\" (UID: \"3e777e53-5dbe-4779-bc99-90bbf12cea8f\") " pod="openstack/barbican-api-69fdd7dd98-bm4fc"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.890631 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e777e53-5dbe-4779-bc99-90bbf12cea8f-logs\") pod \"barbican-api-69fdd7dd98-bm4fc\" (UID: \"3e777e53-5dbe-4779-bc99-90bbf12cea8f\") " pod="openstack/barbican-api-69fdd7dd98-bm4fc"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.890815 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e777e53-5dbe-4779-bc99-90bbf12cea8f-config-data\") pod \"barbican-api-69fdd7dd98-bm4fc\" (UID: \"3e777e53-5dbe-4779-bc99-90bbf12cea8f\") " pod="openstack/barbican-api-69fdd7dd98-bm4fc"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.890857 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vp9wn\" (UniqueName: \"kubernetes.io/projected/3e777e53-5dbe-4779-bc99-90bbf12cea8f-kube-api-access-vp9wn\") pod \"barbican-api-69fdd7dd98-bm4fc\" (UID: \"3e777e53-5dbe-4779-bc99-90bbf12cea8f\") " pod="openstack/barbican-api-69fdd7dd98-bm4fc"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.890930 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e777e53-5dbe-4779-bc99-90bbf12cea8f-combined-ca-bundle\") pod \"barbican-api-69fdd7dd98-bm4fc\" (UID: \"3e777e53-5dbe-4779-bc99-90bbf12cea8f\") " pod="openstack/barbican-api-69fdd7dd98-bm4fc"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.893163 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e777e53-5dbe-4779-bc99-90bbf12cea8f-logs\") pod \"barbican-api-69fdd7dd98-bm4fc\" (UID: \"3e777e53-5dbe-4779-bc99-90bbf12cea8f\") " pod="openstack/barbican-api-69fdd7dd98-bm4fc"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.894831 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e777e53-5dbe-4779-bc99-90bbf12cea8f-combined-ca-bundle\") pod \"barbican-api-69fdd7dd98-bm4fc\" (UID: \"3e777e53-5dbe-4779-bc99-90bbf12cea8f\") " pod="openstack/barbican-api-69fdd7dd98-bm4fc"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.899336 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3e777e53-5dbe-4779-bc99-90bbf12cea8f-config-data-custom\") pod \"barbican-api-69fdd7dd98-bm4fc\" (UID: \"3e777e53-5dbe-4779-bc99-90bbf12cea8f\") " pod="openstack/barbican-api-69fdd7dd98-bm4fc"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.900059 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e777e53-5dbe-4779-bc99-90bbf12cea8f-config-data\") pod \"barbican-api-69fdd7dd98-bm4fc\" (UID: \"3e777e53-5dbe-4779-bc99-90bbf12cea8f\") " pod="openstack/barbican-api-69fdd7dd98-bm4fc"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.907880 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.911574 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp9wn\" (UniqueName: \"kubernetes.io/projected/3e777e53-5dbe-4779-bc99-90bbf12cea8f-kube-api-access-vp9wn\") pod \"barbican-api-69fdd7dd98-bm4fc\" (UID: \"3e777e53-5dbe-4779-bc99-90bbf12cea8f\") " pod="openstack/barbican-api-69fdd7dd98-bm4fc"
Feb 20 08:41:36 crc kubenswrapper[5094]: I0220 08:41:36.953347 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq"
Feb 20 08:41:37 crc kubenswrapper[5094]: I0220 08:41:37.084606 5094 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/barbican-api-69fdd7dd98-bm4fc" Feb 20 08:41:37 crc kubenswrapper[5094]: I0220 08:41:37.372644 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-84b95bd745-mrk5m"] Feb 20 08:41:37 crc kubenswrapper[5094]: I0220 08:41:37.379191 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dc5b9855-b6gdq"] Feb 20 08:41:37 crc kubenswrapper[5094]: W0220 08:41:37.380595 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdb5c820_23c9_42e7_9c70_d8f504f47ff5.slice/crio-30f5698d42ea6d5782a8328b6af34f86a1ed299fd9bf068ef4d5c02cdc98c3f0 WatchSource:0}: Error finding container 30f5698d42ea6d5782a8328b6af34f86a1ed299fd9bf068ef4d5c02cdc98c3f0: Status 404 returned error can't find the container with id 30f5698d42ea6d5782a8328b6af34f86a1ed299fd9bf068ef4d5c02cdc98c3f0 Feb 20 08:41:37 crc kubenswrapper[5094]: W0220 08:41:37.381084 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13560fbf_48aa_45ac_8c10_067377d1adfa.slice/crio-49b8b108407c6012bd43bb4c6fb589f3a202da485ad943015fd4518a0191ef50 WatchSource:0}: Error finding container 49b8b108407c6012bd43bb4c6fb589f3a202da485ad943015fd4518a0191ef50: Status 404 returned error can't find the container with id 49b8b108407c6012bd43bb4c6fb589f3a202da485ad943015fd4518a0191ef50 Feb 20 08:41:37 crc kubenswrapper[5094]: I0220 08:41:37.474410 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7f6984ff88-5xqtx"] Feb 20 08:41:37 crc kubenswrapper[5094]: W0220 08:41:37.476883 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb128e8c6_6bcb_4e4b_b648_d3f932ad0a0a.slice/crio-fcb8498d9b27364dbe5c39be0b8a85e70da8f5883c78440f5eff58c473a6b61d WatchSource:0}: Error 
finding container fcb8498d9b27364dbe5c39be0b8a85e70da8f5883c78440f5eff58c473a6b61d: Status 404 returned error can't find the container with id fcb8498d9b27364dbe5c39be0b8a85e70da8f5883c78440f5eff58c473a6b61d Feb 20 08:41:37 crc kubenswrapper[5094]: I0220 08:41:37.652243 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-69fdd7dd98-bm4fc"] Feb 20 08:41:38 crc kubenswrapper[5094]: I0220 08:41:38.302533 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69fdd7dd98-bm4fc" event={"ID":"3e777e53-5dbe-4779-bc99-90bbf12cea8f","Type":"ContainerStarted","Data":"2d550ec529357c21f1fc2dd66c42f495d790148c60f79e531c19862c0c74ae22"} Feb 20 08:41:38 crc kubenswrapper[5094]: I0220 08:41:38.303144 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69fdd7dd98-bm4fc" event={"ID":"3e777e53-5dbe-4779-bc99-90bbf12cea8f","Type":"ContainerStarted","Data":"2e7e50414a7da88d85240944a44c96b6efa122453231347a8284e7b90a4f99c5"} Feb 20 08:41:38 crc kubenswrapper[5094]: I0220 08:41:38.303155 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69fdd7dd98-bm4fc" event={"ID":"3e777e53-5dbe-4779-bc99-90bbf12cea8f","Type":"ContainerStarted","Data":"6505a9d3fc5a6422418e299d1ee2d5edda6daae2fce25af07a0cd0dd6262e891"} Feb 20 08:41:38 crc kubenswrapper[5094]: I0220 08:41:38.304220 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-69fdd7dd98-bm4fc" Feb 20 08:41:38 crc kubenswrapper[5094]: I0220 08:41:38.304244 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-69fdd7dd98-bm4fc" Feb 20 08:41:38 crc kubenswrapper[5094]: I0220 08:41:38.306679 5094 generic.go:334] "Generic (PLEG): container finished" podID="fdb5c820-23c9-42e7-9c70-d8f504f47ff5" containerID="faf72b2a28fa9464260dd616d409fc89190c54954c251671336a3448ddcf3235" exitCode=0 Feb 20 08:41:38 crc kubenswrapper[5094]: I0220 08:41:38.306801 
5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" event={"ID":"fdb5c820-23c9-42e7-9c70-d8f504f47ff5","Type":"ContainerDied","Data":"faf72b2a28fa9464260dd616d409fc89190c54954c251671336a3448ddcf3235"} Feb 20 08:41:38 crc kubenswrapper[5094]: I0220 08:41:38.306826 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" event={"ID":"fdb5c820-23c9-42e7-9c70-d8f504f47ff5","Type":"ContainerStarted","Data":"30f5698d42ea6d5782a8328b6af34f86a1ed299fd9bf068ef4d5c02cdc98c3f0"} Feb 20 08:41:38 crc kubenswrapper[5094]: I0220 08:41:38.309563 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx" event={"ID":"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a","Type":"ContainerStarted","Data":"fcb8498d9b27364dbe5c39be0b8a85e70da8f5883c78440f5eff58c473a6b61d"} Feb 20 08:41:38 crc kubenswrapper[5094]: I0220 08:41:38.324655 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-84b95bd745-mrk5m" event={"ID":"13560fbf-48aa-45ac-8c10-067377d1adfa","Type":"ContainerStarted","Data":"49b8b108407c6012bd43bb4c6fb589f3a202da485ad943015fd4518a0191ef50"} Feb 20 08:41:38 crc kubenswrapper[5094]: I0220 08:41:38.333517 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-69fdd7dd98-bm4fc" podStartSLOduration=2.333494462 podStartE2EDuration="2.333494462s" podCreationTimestamp="2026-02-20 08:41:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:41:38.330400098 +0000 UTC m=+6913.203026819" watchObservedRunningTime="2026-02-20 08:41:38.333494462 +0000 UTC m=+6913.206121183" Feb 20 08:41:39 crc kubenswrapper[5094]: I0220 08:41:39.334317 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" 
event={"ID":"fdb5c820-23c9-42e7-9c70-d8f504f47ff5","Type":"ContainerStarted","Data":"8bb9de6e4383ec58a31d9dfe384382f6766f2e29c7eeaa03946d4109134ffc85"} Feb 20 08:41:39 crc kubenswrapper[5094]: I0220 08:41:39.334858 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" Feb 20 08:41:39 crc kubenswrapper[5094]: I0220 08:41:39.336982 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx" event={"ID":"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a","Type":"ContainerStarted","Data":"3d15e5fff529a2b063ebbf609f9d126ad00fc95612bcd84704e79dfd39910a97"} Feb 20 08:41:39 crc kubenswrapper[5094]: I0220 08:41:39.337042 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx" event={"ID":"b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a","Type":"ContainerStarted","Data":"2baab9f100ae06234de01222fcf22f1bd7f6a5d3cde7b0291340ff39c51c1896"} Feb 20 08:41:39 crc kubenswrapper[5094]: I0220 08:41:39.339041 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-84b95bd745-mrk5m" event={"ID":"13560fbf-48aa-45ac-8c10-067377d1adfa","Type":"ContainerStarted","Data":"436e0386deb8a8833e0385750a15de7fac619644cccc1c8b1de2d7d3f00c3e3c"} Feb 20 08:41:39 crc kubenswrapper[5094]: I0220 08:41:39.339132 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-84b95bd745-mrk5m" event={"ID":"13560fbf-48aa-45ac-8c10-067377d1adfa","Type":"ContainerStarted","Data":"247355ce6bfad6ca0c0d9842bdce1e5f689f717fadc309b92b2429a59f3ffa26"} Feb 20 08:41:39 crc kubenswrapper[5094]: I0220 08:41:39.363394 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" podStartSLOduration=3.363374669 podStartE2EDuration="3.363374669s" podCreationTimestamp="2026-02-20 08:41:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:41:39.354997687 +0000 UTC m=+6914.227624438" watchObservedRunningTime="2026-02-20 08:41:39.363374669 +0000 UTC m=+6914.236001380" Feb 20 08:41:39 crc kubenswrapper[5094]: I0220 08:41:39.378529 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-84b95bd745-mrk5m" podStartSLOduration=2.098396563 podStartE2EDuration="3.378510693s" podCreationTimestamp="2026-02-20 08:41:36 +0000 UTC" firstStartedPulling="2026-02-20 08:41:37.383927368 +0000 UTC m=+6912.256554079" lastFinishedPulling="2026-02-20 08:41:38.664041498 +0000 UTC m=+6913.536668209" observedRunningTime="2026-02-20 08:41:39.376212118 +0000 UTC m=+6914.248838859" watchObservedRunningTime="2026-02-20 08:41:39.378510693 +0000 UTC m=+6914.251137404" Feb 20 08:41:39 crc kubenswrapper[5094]: I0220 08:41:39.399586 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7f6984ff88-5xqtx" podStartSLOduration=2.215128963 podStartE2EDuration="3.39956539s" podCreationTimestamp="2026-02-20 08:41:36 +0000 UTC" firstStartedPulling="2026-02-20 08:41:37.480461801 +0000 UTC m=+6912.353088512" lastFinishedPulling="2026-02-20 08:41:38.664898228 +0000 UTC m=+6913.537524939" observedRunningTime="2026-02-20 08:41:39.39332312 +0000 UTC m=+6914.265949871" watchObservedRunningTime="2026-02-20 08:41:39.39956539 +0000 UTC m=+6914.272192101" Feb 20 08:41:43 crc kubenswrapper[5094]: I0220 08:41:43.551770 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-69fdd7dd98-bm4fc" Feb 20 08:41:44 crc kubenswrapper[5094]: I0220 08:41:44.889952 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-69fdd7dd98-bm4fc" Feb 20 08:41:46 crc kubenswrapper[5094]: I0220 08:41:46.955869 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.012646 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77c94f6b7-twntj"] Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.012935 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77c94f6b7-twntj" podUID="e3b60dac-2fbe-46ba-acc9-92058e10f2d1" containerName="dnsmasq-dns" containerID="cri-o://24d6cfafc2015bc19bb6e88020cb0449cf6f98d40157ba914e379542fe8d395a" gracePeriod=10 Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.417126 5094 generic.go:334] "Generic (PLEG): container finished" podID="e3b60dac-2fbe-46ba-acc9-92058e10f2d1" containerID="24d6cfafc2015bc19bb6e88020cb0449cf6f98d40157ba914e379542fe8d395a" exitCode=0 Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.417643 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77c94f6b7-twntj" event={"ID":"e3b60dac-2fbe-46ba-acc9-92058e10f2d1","Type":"ContainerDied","Data":"24d6cfafc2015bc19bb6e88020cb0449cf6f98d40157ba914e379542fe8d395a"} Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.549353 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77c94f6b7-twntj" Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.590792 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-dns-svc\") pod \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.590906 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-ovsdbserver-nb\") pod \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.590961 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-config\") pod \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.590986 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcqs7\" (UniqueName: \"kubernetes.io/projected/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-kube-api-access-xcqs7\") pod \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.591043 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-ovsdbserver-sb\") pod \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\" (UID: \"e3b60dac-2fbe-46ba-acc9-92058e10f2d1\") " Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.625678 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-kube-api-access-xcqs7" (OuterVolumeSpecName: "kube-api-access-xcqs7") pod "e3b60dac-2fbe-46ba-acc9-92058e10f2d1" (UID: "e3b60dac-2fbe-46ba-acc9-92058e10f2d1"). InnerVolumeSpecName "kube-api-access-xcqs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.658762 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e3b60dac-2fbe-46ba-acc9-92058e10f2d1" (UID: "e3b60dac-2fbe-46ba-acc9-92058e10f2d1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.658819 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e3b60dac-2fbe-46ba-acc9-92058e10f2d1" (UID: "e3b60dac-2fbe-46ba-acc9-92058e10f2d1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.659329 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e3b60dac-2fbe-46ba-acc9-92058e10f2d1" (UID: "e3b60dac-2fbe-46ba-acc9-92058e10f2d1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.659346 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-config" (OuterVolumeSpecName: "config") pod "e3b60dac-2fbe-46ba-acc9-92058e10f2d1" (UID: "e3b60dac-2fbe-46ba-acc9-92058e10f2d1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.692884 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.692915 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.692925 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-config\") on node \"crc\" DevicePath \"\"" Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.692934 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcqs7\" (UniqueName: \"kubernetes.io/projected/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-kube-api-access-xcqs7\") on node \"crc\" DevicePath \"\"" Feb 20 08:41:47 crc kubenswrapper[5094]: I0220 08:41:47.692944 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e3b60dac-2fbe-46ba-acc9-92058e10f2d1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 08:41:48 crc kubenswrapper[5094]: I0220 08:41:48.426843 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77c94f6b7-twntj" event={"ID":"e3b60dac-2fbe-46ba-acc9-92058e10f2d1","Type":"ContainerDied","Data":"b9d473e434d2dbf9a062a256fefe028e1a75e895c91d6257edb1d3df1f31c7ef"} Feb 20 08:41:48 crc kubenswrapper[5094]: I0220 08:41:48.427252 5094 scope.go:117] "RemoveContainer" containerID="24d6cfafc2015bc19bb6e88020cb0449cf6f98d40157ba914e379542fe8d395a" Feb 20 08:41:48 crc kubenswrapper[5094]: I0220 08:41:48.426897 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77c94f6b7-twntj" Feb 20 08:41:48 crc kubenswrapper[5094]: I0220 08:41:48.455291 5094 scope.go:117] "RemoveContainer" containerID="d7d23a98ee3bcf78f157fef71692c3b42c0ccd4e53f68bb8466090a9c903b801" Feb 20 08:41:48 crc kubenswrapper[5094]: I0220 08:41:48.456807 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77c94f6b7-twntj"] Feb 20 08:41:48 crc kubenswrapper[5094]: I0220 08:41:48.464528 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77c94f6b7-twntj"] Feb 20 08:41:49 crc kubenswrapper[5094]: I0220 08:41:49.852430 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3b60dac-2fbe-46ba-acc9-92058e10f2d1" path="/var/lib/kubelet/pods/e3b60dac-2fbe-46ba-acc9-92058e10f2d1/volumes" Feb 20 08:41:51 crc kubenswrapper[5094]: I0220 08:41:51.829102 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-rbwv2"] Feb 20 08:41:51 crc kubenswrapper[5094]: E0220 08:41:51.829845 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b60dac-2fbe-46ba-acc9-92058e10f2d1" containerName="init" Feb 20 08:41:51 crc kubenswrapper[5094]: I0220 08:41:51.829859 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b60dac-2fbe-46ba-acc9-92058e10f2d1" containerName="init" Feb 20 08:41:51 crc kubenswrapper[5094]: E0220 08:41:51.829884 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b60dac-2fbe-46ba-acc9-92058e10f2d1" containerName="dnsmasq-dns" Feb 20 08:41:51 crc kubenswrapper[5094]: I0220 08:41:51.829891 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b60dac-2fbe-46ba-acc9-92058e10f2d1" containerName="dnsmasq-dns" Feb 20 08:41:51 crc kubenswrapper[5094]: I0220 08:41:51.830041 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3b60dac-2fbe-46ba-acc9-92058e10f2d1" containerName="dnsmasq-dns" Feb 20 08:41:51 crc kubenswrapper[5094]: I0220 
08:41:51.830603 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rbwv2" Feb 20 08:41:51 crc kubenswrapper[5094]: I0220 08:41:51.865532 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-rbwv2"] Feb 20 08:41:51 crc kubenswrapper[5094]: I0220 08:41:51.934229 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-fc68-account-create-update-ctxrq"] Feb 20 08:41:51 crc kubenswrapper[5094]: I0220 08:41:51.935268 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fc68-account-create-update-ctxrq" Feb 20 08:41:51 crc kubenswrapper[5094]: I0220 08:41:51.937728 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 20 08:41:51 crc kubenswrapper[5094]: I0220 08:41:51.969516 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f3a4acd-5b68-467c-b024-b518d0f4d27e-operator-scripts\") pod \"neutron-db-create-rbwv2\" (UID: \"0f3a4acd-5b68-467c-b024-b518d0f4d27e\") " pod="openstack/neutron-db-create-rbwv2" Feb 20 08:41:51 crc kubenswrapper[5094]: I0220 08:41:51.969611 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n24tg\" (UniqueName: \"kubernetes.io/projected/0f3a4acd-5b68-467c-b024-b518d0f4d27e-kube-api-access-n24tg\") pod \"neutron-db-create-rbwv2\" (UID: \"0f3a4acd-5b68-467c-b024-b518d0f4d27e\") " pod="openstack/neutron-db-create-rbwv2" Feb 20 08:41:51 crc kubenswrapper[5094]: I0220 08:41:51.977830 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fc68-account-create-update-ctxrq"] Feb 20 08:41:52 crc kubenswrapper[5094]: I0220 08:41:52.070756 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/d8c2373d-6a69-460a-8622-d001dc53efc0-operator-scripts\") pod \"neutron-fc68-account-create-update-ctxrq\" (UID: \"d8c2373d-6a69-460a-8622-d001dc53efc0\") " pod="openstack/neutron-fc68-account-create-update-ctxrq" Feb 20 08:41:52 crc kubenswrapper[5094]: I0220 08:41:52.070841 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f3a4acd-5b68-467c-b024-b518d0f4d27e-operator-scripts\") pod \"neutron-db-create-rbwv2\" (UID: \"0f3a4acd-5b68-467c-b024-b518d0f4d27e\") " pod="openstack/neutron-db-create-rbwv2" Feb 20 08:41:52 crc kubenswrapper[5094]: I0220 08:41:52.071183 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n24tg\" (UniqueName: \"kubernetes.io/projected/0f3a4acd-5b68-467c-b024-b518d0f4d27e-kube-api-access-n24tg\") pod \"neutron-db-create-rbwv2\" (UID: \"0f3a4acd-5b68-467c-b024-b518d0f4d27e\") " pod="openstack/neutron-db-create-rbwv2" Feb 20 08:41:52 crc kubenswrapper[5094]: I0220 08:41:52.071473 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f3a4acd-5b68-467c-b024-b518d0f4d27e-operator-scripts\") pod \"neutron-db-create-rbwv2\" (UID: \"0f3a4acd-5b68-467c-b024-b518d0f4d27e\") " pod="openstack/neutron-db-create-rbwv2" Feb 20 08:41:52 crc kubenswrapper[5094]: I0220 08:41:52.071675 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57dns\" (UniqueName: \"kubernetes.io/projected/d8c2373d-6a69-460a-8622-d001dc53efc0-kube-api-access-57dns\") pod \"neutron-fc68-account-create-update-ctxrq\" (UID: \"d8c2373d-6a69-460a-8622-d001dc53efc0\") " pod="openstack/neutron-fc68-account-create-update-ctxrq" Feb 20 08:41:52 crc kubenswrapper[5094]: I0220 08:41:52.097183 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-n24tg\" (UniqueName: \"kubernetes.io/projected/0f3a4acd-5b68-467c-b024-b518d0f4d27e-kube-api-access-n24tg\") pod \"neutron-db-create-rbwv2\" (UID: \"0f3a4acd-5b68-467c-b024-b518d0f4d27e\") " pod="openstack/neutron-db-create-rbwv2" Feb 20 08:41:52 crc kubenswrapper[5094]: I0220 08:41:52.151458 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rbwv2" Feb 20 08:41:52 crc kubenswrapper[5094]: I0220 08:41:52.174234 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57dns\" (UniqueName: \"kubernetes.io/projected/d8c2373d-6a69-460a-8622-d001dc53efc0-kube-api-access-57dns\") pod \"neutron-fc68-account-create-update-ctxrq\" (UID: \"d8c2373d-6a69-460a-8622-d001dc53efc0\") " pod="openstack/neutron-fc68-account-create-update-ctxrq" Feb 20 08:41:52 crc kubenswrapper[5094]: I0220 08:41:52.174340 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8c2373d-6a69-460a-8622-d001dc53efc0-operator-scripts\") pod \"neutron-fc68-account-create-update-ctxrq\" (UID: \"d8c2373d-6a69-460a-8622-d001dc53efc0\") " pod="openstack/neutron-fc68-account-create-update-ctxrq" Feb 20 08:41:52 crc kubenswrapper[5094]: I0220 08:41:52.175288 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8c2373d-6a69-460a-8622-d001dc53efc0-operator-scripts\") pod \"neutron-fc68-account-create-update-ctxrq\" (UID: \"d8c2373d-6a69-460a-8622-d001dc53efc0\") " pod="openstack/neutron-fc68-account-create-update-ctxrq" Feb 20 08:41:52 crc kubenswrapper[5094]: I0220 08:41:52.195059 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57dns\" (UniqueName: \"kubernetes.io/projected/d8c2373d-6a69-460a-8622-d001dc53efc0-kube-api-access-57dns\") pod \"neutron-fc68-account-create-update-ctxrq\" (UID: 
\"d8c2373d-6a69-460a-8622-d001dc53efc0\") " pod="openstack/neutron-fc68-account-create-update-ctxrq" Feb 20 08:41:52 crc kubenswrapper[5094]: I0220 08:41:52.279518 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fc68-account-create-update-ctxrq" Feb 20 08:41:52 crc kubenswrapper[5094]: I0220 08:41:52.705485 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-rbwv2"] Feb 20 08:41:52 crc kubenswrapper[5094]: W0220 08:41:52.710515 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f3a4acd_5b68_467c_b024_b518d0f4d27e.slice/crio-e274473d0fe0cf43be09d8165480434f8a2dbce8b367babc397e1b196b593a2b WatchSource:0}: Error finding container e274473d0fe0cf43be09d8165480434f8a2dbce8b367babc397e1b196b593a2b: Status 404 returned error can't find the container with id e274473d0fe0cf43be09d8165480434f8a2dbce8b367babc397e1b196b593a2b Feb 20 08:41:52 crc kubenswrapper[5094]: W0220 08:41:52.846368 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8c2373d_6a69_460a_8622_d001dc53efc0.slice/crio-6ecb69a45ac19b4312d699ee9bdc5737b998f021d79728732b5e67f0dfd777ff WatchSource:0}: Error finding container 6ecb69a45ac19b4312d699ee9bdc5737b998f021d79728732b5e67f0dfd777ff: Status 404 returned error can't find the container with id 6ecb69a45ac19b4312d699ee9bdc5737b998f021d79728732b5e67f0dfd777ff Feb 20 08:41:52 crc kubenswrapper[5094]: I0220 08:41:52.857238 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fc68-account-create-update-ctxrq"] Feb 20 08:41:53 crc kubenswrapper[5094]: I0220 08:41:53.470346 5094 generic.go:334] "Generic (PLEG): container finished" podID="d8c2373d-6a69-460a-8622-d001dc53efc0" containerID="4495a0b785b56a81800453fd2516a41bac0676f202c2358f07c81e7849110742" exitCode=0 Feb 20 08:41:53 crc 
kubenswrapper[5094]: I0220 08:41:53.470435 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fc68-account-create-update-ctxrq" event={"ID":"d8c2373d-6a69-460a-8622-d001dc53efc0","Type":"ContainerDied","Data":"4495a0b785b56a81800453fd2516a41bac0676f202c2358f07c81e7849110742"}
Feb 20 08:41:53 crc kubenswrapper[5094]: I0220 08:41:53.470486 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fc68-account-create-update-ctxrq" event={"ID":"d8c2373d-6a69-460a-8622-d001dc53efc0","Type":"ContainerStarted","Data":"6ecb69a45ac19b4312d699ee9bdc5737b998f021d79728732b5e67f0dfd777ff"}
Feb 20 08:41:53 crc kubenswrapper[5094]: I0220 08:41:53.472900 5094 generic.go:334] "Generic (PLEG): container finished" podID="0f3a4acd-5b68-467c-b024-b518d0f4d27e" containerID="5af35aa0d974ec2be3d578b66402a33233be4efbd611deaf5976f2b6d54c4e72" exitCode=0
Feb 20 08:41:53 crc kubenswrapper[5094]: I0220 08:41:53.472941 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rbwv2" event={"ID":"0f3a4acd-5b68-467c-b024-b518d0f4d27e","Type":"ContainerDied","Data":"5af35aa0d974ec2be3d578b66402a33233be4efbd611deaf5976f2b6d54c4e72"}
Feb 20 08:41:53 crc kubenswrapper[5094]: I0220 08:41:53.472962 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rbwv2" event={"ID":"0f3a4acd-5b68-467c-b024-b518d0f4d27e","Type":"ContainerStarted","Data":"e274473d0fe0cf43be09d8165480434f8a2dbce8b367babc397e1b196b593a2b"}
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.014622 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rbwv2"
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.023542 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fc68-account-create-update-ctxrq"
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.138652 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8c2373d-6a69-460a-8622-d001dc53efc0-operator-scripts\") pod \"d8c2373d-6a69-460a-8622-d001dc53efc0\" (UID: \"d8c2373d-6a69-460a-8622-d001dc53efc0\") "
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.138835 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n24tg\" (UniqueName: \"kubernetes.io/projected/0f3a4acd-5b68-467c-b024-b518d0f4d27e-kube-api-access-n24tg\") pod \"0f3a4acd-5b68-467c-b024-b518d0f4d27e\" (UID: \"0f3a4acd-5b68-467c-b024-b518d0f4d27e\") "
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.138898 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f3a4acd-5b68-467c-b024-b518d0f4d27e-operator-scripts\") pod \"0f3a4acd-5b68-467c-b024-b518d0f4d27e\" (UID: \"0f3a4acd-5b68-467c-b024-b518d0f4d27e\") "
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.139053 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57dns\" (UniqueName: \"kubernetes.io/projected/d8c2373d-6a69-460a-8622-d001dc53efc0-kube-api-access-57dns\") pod \"d8c2373d-6a69-460a-8622-d001dc53efc0\" (UID: \"d8c2373d-6a69-460a-8622-d001dc53efc0\") "
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.139607 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8c2373d-6a69-460a-8622-d001dc53efc0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d8c2373d-6a69-460a-8622-d001dc53efc0" (UID: "d8c2373d-6a69-460a-8622-d001dc53efc0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.139673 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f3a4acd-5b68-467c-b024-b518d0f4d27e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0f3a4acd-5b68-467c-b024-b518d0f4d27e" (UID: "0f3a4acd-5b68-467c-b024-b518d0f4d27e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.144126 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f3a4acd-5b68-467c-b024-b518d0f4d27e-kube-api-access-n24tg" (OuterVolumeSpecName: "kube-api-access-n24tg") pod "0f3a4acd-5b68-467c-b024-b518d0f4d27e" (UID: "0f3a4acd-5b68-467c-b024-b518d0f4d27e"). InnerVolumeSpecName "kube-api-access-n24tg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.146521 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8c2373d-6a69-460a-8622-d001dc53efc0-kube-api-access-57dns" (OuterVolumeSpecName: "kube-api-access-57dns") pod "d8c2373d-6a69-460a-8622-d001dc53efc0" (UID: "d8c2373d-6a69-460a-8622-d001dc53efc0"). InnerVolumeSpecName "kube-api-access-57dns". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.242093 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57dns\" (UniqueName: \"kubernetes.io/projected/d8c2373d-6a69-460a-8622-d001dc53efc0-kube-api-access-57dns\") on node \"crc\" DevicePath \"\""
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.242151 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8c2373d-6a69-460a-8622-d001dc53efc0-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.242161 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n24tg\" (UniqueName: \"kubernetes.io/projected/0f3a4acd-5b68-467c-b024-b518d0f4d27e-kube-api-access-n24tg\") on node \"crc\" DevicePath \"\""
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.242171 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f3a4acd-5b68-467c-b024-b518d0f4d27e-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.489462 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-rbwv2" event={"ID":"0f3a4acd-5b68-467c-b024-b518d0f4d27e","Type":"ContainerDied","Data":"e274473d0fe0cf43be09d8165480434f8a2dbce8b367babc397e1b196b593a2b"}
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.489882 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e274473d0fe0cf43be09d8165480434f8a2dbce8b367babc397e1b196b593a2b"
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.489487 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-rbwv2"
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.491557 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fc68-account-create-update-ctxrq" event={"ID":"d8c2373d-6a69-460a-8622-d001dc53efc0","Type":"ContainerDied","Data":"6ecb69a45ac19b4312d699ee9bdc5737b998f021d79728732b5e67f0dfd777ff"}
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.491605 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ecb69a45ac19b4312d699ee9bdc5737b998f021d79728732b5e67f0dfd777ff"
Feb 20 08:41:55 crc kubenswrapper[5094]: I0220 08:41:55.491626 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fc68-account-create-update-ctxrq"
Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.167969 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-j7lxk"]
Feb 20 08:41:57 crc kubenswrapper[5094]: E0220 08:41:57.168350 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f3a4acd-5b68-467c-b024-b518d0f4d27e" containerName="mariadb-database-create"
Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.168364 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f3a4acd-5b68-467c-b024-b518d0f4d27e" containerName="mariadb-database-create"
Feb 20 08:41:57 crc kubenswrapper[5094]: E0220 08:41:57.168400 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8c2373d-6a69-460a-8622-d001dc53efc0" containerName="mariadb-account-create-update"
Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.168409 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8c2373d-6a69-460a-8622-d001dc53efc0" containerName="mariadb-account-create-update"
Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.168587 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f3a4acd-5b68-467c-b024-b518d0f4d27e" containerName="mariadb-database-create"
Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.168600 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8c2373d-6a69-460a-8622-d001dc53efc0" containerName="mariadb-account-create-update"
Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.169261 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-j7lxk"
Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.171341 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.171443 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-v6cxj"
Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.171519 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.188396 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-j7lxk"]
Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.281000 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bcbd09e1-8a1b-468e-9238-0691cafda43e-config\") pod \"neutron-db-sync-j7lxk\" (UID: \"bcbd09e1-8a1b-468e-9238-0691cafda43e\") " pod="openstack/neutron-db-sync-j7lxk"
Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.281085 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcbd09e1-8a1b-468e-9238-0691cafda43e-combined-ca-bundle\") pod \"neutron-db-sync-j7lxk\" (UID: \"bcbd09e1-8a1b-468e-9238-0691cafda43e\") " pod="openstack/neutron-db-sync-j7lxk"
Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.281130 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhp7b\" (UniqueName: \"kubernetes.io/projected/bcbd09e1-8a1b-468e-9238-0691cafda43e-kube-api-access-xhp7b\") pod \"neutron-db-sync-j7lxk\" (UID: \"bcbd09e1-8a1b-468e-9238-0691cafda43e\") " pod="openstack/neutron-db-sync-j7lxk"
Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.383319 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bcbd09e1-8a1b-468e-9238-0691cafda43e-config\") pod \"neutron-db-sync-j7lxk\" (UID: \"bcbd09e1-8a1b-468e-9238-0691cafda43e\") " pod="openstack/neutron-db-sync-j7lxk"
Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.383372 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcbd09e1-8a1b-468e-9238-0691cafda43e-combined-ca-bundle\") pod \"neutron-db-sync-j7lxk\" (UID: \"bcbd09e1-8a1b-468e-9238-0691cafda43e\") " pod="openstack/neutron-db-sync-j7lxk"
Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.383400 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhp7b\" (UniqueName: \"kubernetes.io/projected/bcbd09e1-8a1b-468e-9238-0691cafda43e-kube-api-access-xhp7b\") pod \"neutron-db-sync-j7lxk\" (UID: \"bcbd09e1-8a1b-468e-9238-0691cafda43e\") " pod="openstack/neutron-db-sync-j7lxk"
Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.394479 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcbd09e1-8a1b-468e-9238-0691cafda43e-combined-ca-bundle\") pod \"neutron-db-sync-j7lxk\" (UID: \"bcbd09e1-8a1b-468e-9238-0691cafda43e\") " pod="openstack/neutron-db-sync-j7lxk"
Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.395369 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bcbd09e1-8a1b-468e-9238-0691cafda43e-config\") pod \"neutron-db-sync-j7lxk\" (UID: \"bcbd09e1-8a1b-468e-9238-0691cafda43e\") " pod="openstack/neutron-db-sync-j7lxk"
Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.407883 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhp7b\" (UniqueName: \"kubernetes.io/projected/bcbd09e1-8a1b-468e-9238-0691cafda43e-kube-api-access-xhp7b\") pod \"neutron-db-sync-j7lxk\" (UID: \"bcbd09e1-8a1b-468e-9238-0691cafda43e\") " pod="openstack/neutron-db-sync-j7lxk"
Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.492880 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-j7lxk"
Feb 20 08:41:57 crc kubenswrapper[5094]: I0220 08:41:57.967760 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-j7lxk"]
Feb 20 08:41:58 crc kubenswrapper[5094]: I0220 08:41:58.516775 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-j7lxk" event={"ID":"bcbd09e1-8a1b-468e-9238-0691cafda43e","Type":"ContainerStarted","Data":"710b367e8a0475d2f89ae71b4ffcf7ae41da63c436f5361dbf27d4bd07bdf660"}
Feb 20 08:41:58 crc kubenswrapper[5094]: I0220 08:41:58.517107 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-j7lxk" event={"ID":"bcbd09e1-8a1b-468e-9238-0691cafda43e","Type":"ContainerStarted","Data":"06453b626b934944ded65ddde039629a8276c6c1665fc2259f6c73b4d5e297ff"}
Feb 20 08:41:58 crc kubenswrapper[5094]: I0220 08:41:58.542898 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-j7lxk" podStartSLOduration=1.542879184 podStartE2EDuration="1.542879184s" podCreationTimestamp="2026-02-20 08:41:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:41:58.534144353 +0000 UTC m=+6933.406771064" watchObservedRunningTime="2026-02-20 08:41:58.542879184 +0000 UTC m=+6933.415505895"
Feb 20 08:42:03 crc kubenswrapper[5094]: I0220 08:42:03.569734 5094 generic.go:334] "Generic (PLEG): container finished" podID="bcbd09e1-8a1b-468e-9238-0691cafda43e" containerID="710b367e8a0475d2f89ae71b4ffcf7ae41da63c436f5361dbf27d4bd07bdf660" exitCode=0
Feb 20 08:42:03 crc kubenswrapper[5094]: I0220 08:42:03.569898 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-j7lxk" event={"ID":"bcbd09e1-8a1b-468e-9238-0691cafda43e","Type":"ContainerDied","Data":"710b367e8a0475d2f89ae71b4ffcf7ae41da63c436f5361dbf27d4bd07bdf660"}
Feb 20 08:42:04 crc kubenswrapper[5094]: I0220 08:42:04.971766 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-j7lxk"
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.124973 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcbd09e1-8a1b-468e-9238-0691cafda43e-combined-ca-bundle\") pod \"bcbd09e1-8a1b-468e-9238-0691cafda43e\" (UID: \"bcbd09e1-8a1b-468e-9238-0691cafda43e\") "
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.125071 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bcbd09e1-8a1b-468e-9238-0691cafda43e-config\") pod \"bcbd09e1-8a1b-468e-9238-0691cafda43e\" (UID: \"bcbd09e1-8a1b-468e-9238-0691cafda43e\") "
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.125218 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhp7b\" (UniqueName: \"kubernetes.io/projected/bcbd09e1-8a1b-468e-9238-0691cafda43e-kube-api-access-xhp7b\") pod \"bcbd09e1-8a1b-468e-9238-0691cafda43e\" (UID: \"bcbd09e1-8a1b-468e-9238-0691cafda43e\") "
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.130339 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcbd09e1-8a1b-468e-9238-0691cafda43e-kube-api-access-xhp7b" (OuterVolumeSpecName: "kube-api-access-xhp7b") pod "bcbd09e1-8a1b-468e-9238-0691cafda43e" (UID: "bcbd09e1-8a1b-468e-9238-0691cafda43e"). InnerVolumeSpecName "kube-api-access-xhp7b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.148660 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcbd09e1-8a1b-468e-9238-0691cafda43e-config" (OuterVolumeSpecName: "config") pod "bcbd09e1-8a1b-468e-9238-0691cafda43e" (UID: "bcbd09e1-8a1b-468e-9238-0691cafda43e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.149816 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcbd09e1-8a1b-468e-9238-0691cafda43e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bcbd09e1-8a1b-468e-9238-0691cafda43e" (UID: "bcbd09e1-8a1b-468e-9238-0691cafda43e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.227531 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcbd09e1-8a1b-468e-9238-0691cafda43e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.227571 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/bcbd09e1-8a1b-468e-9238-0691cafda43e-config\") on node \"crc\" DevicePath \"\""
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.227585 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhp7b\" (UniqueName: \"kubernetes.io/projected/bcbd09e1-8a1b-468e-9238-0691cafda43e-kube-api-access-xhp7b\") on node \"crc\" DevicePath \"\""
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.590613 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-j7lxk"
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.590605 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-j7lxk" event={"ID":"bcbd09e1-8a1b-468e-9238-0691cafda43e","Type":"ContainerDied","Data":"06453b626b934944ded65ddde039629a8276c6c1665fc2259f6c73b4d5e297ff"}
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.591106 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06453b626b934944ded65ddde039629a8276c6c1665fc2259f6c73b4d5e297ff"
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.743732 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-549654cbff-7zg62"]
Feb 20 08:42:05 crc kubenswrapper[5094]: E0220 08:42:05.744250 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcbd09e1-8a1b-468e-9238-0691cafda43e" containerName="neutron-db-sync"
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.744276 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcbd09e1-8a1b-468e-9238-0691cafda43e" containerName="neutron-db-sync"
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.744560 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcbd09e1-8a1b-468e-9238-0691cafda43e" containerName="neutron-db-sync"
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.746094 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-549654cbff-7zg62"
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.762082 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-549654cbff-7zg62"]
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.839061 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-ovsdbserver-sb\") pod \"dnsmasq-dns-549654cbff-7zg62\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " pod="openstack/dnsmasq-dns-549654cbff-7zg62"
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.839142 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-dns-svc\") pod \"dnsmasq-dns-549654cbff-7zg62\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " pod="openstack/dnsmasq-dns-549654cbff-7zg62"
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.839321 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9dbc\" (UniqueName: \"kubernetes.io/projected/14b1e378-3ae1-4707-8c14-3ee7ad292a55-kube-api-access-x9dbc\") pod \"dnsmasq-dns-549654cbff-7zg62\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " pod="openstack/dnsmasq-dns-549654cbff-7zg62"
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.839476 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-config\") pod \"dnsmasq-dns-549654cbff-7zg62\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " pod="openstack/dnsmasq-dns-549654cbff-7zg62"
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.839595 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-ovsdbserver-nb\") pod \"dnsmasq-dns-549654cbff-7zg62\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " pod="openstack/dnsmasq-dns-549654cbff-7zg62"
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.882874 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-585ff4fdf7-llqts"]
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.884530 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-585ff4fdf7-llqts"
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.889474 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.890130 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-v6cxj"
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.890284 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.900263 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-585ff4fdf7-llqts"]
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.942144 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9dbc\" (UniqueName: \"kubernetes.io/projected/14b1e378-3ae1-4707-8c14-3ee7ad292a55-kube-api-access-x9dbc\") pod \"dnsmasq-dns-549654cbff-7zg62\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " pod="openstack/dnsmasq-dns-549654cbff-7zg62"
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.942250 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-config\") pod \"dnsmasq-dns-549654cbff-7zg62\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " pod="openstack/dnsmasq-dns-549654cbff-7zg62"
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.942327 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-ovsdbserver-nb\") pod \"dnsmasq-dns-549654cbff-7zg62\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " pod="openstack/dnsmasq-dns-549654cbff-7zg62"
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.945387 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-ovsdbserver-sb\") pod \"dnsmasq-dns-549654cbff-7zg62\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " pod="openstack/dnsmasq-dns-549654cbff-7zg62"
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.945684 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-dns-svc\") pod \"dnsmasq-dns-549654cbff-7zg62\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " pod="openstack/dnsmasq-dns-549654cbff-7zg62"
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.946412 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-config\") pod \"dnsmasq-dns-549654cbff-7zg62\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " pod="openstack/dnsmasq-dns-549654cbff-7zg62"
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.946870 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-dns-svc\") pod \"dnsmasq-dns-549654cbff-7zg62\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " pod="openstack/dnsmasq-dns-549654cbff-7zg62"
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.947319 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-ovsdbserver-sb\") pod \"dnsmasq-dns-549654cbff-7zg62\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " pod="openstack/dnsmasq-dns-549654cbff-7zg62"
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.948526 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-ovsdbserver-nb\") pod \"dnsmasq-dns-549654cbff-7zg62\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " pod="openstack/dnsmasq-dns-549654cbff-7zg62"
Feb 20 08:42:05 crc kubenswrapper[5094]: I0220 08:42:05.969982 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9dbc\" (UniqueName: \"kubernetes.io/projected/14b1e378-3ae1-4707-8c14-3ee7ad292a55-kube-api-access-x9dbc\") pod \"dnsmasq-dns-549654cbff-7zg62\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " pod="openstack/dnsmasq-dns-549654cbff-7zg62"
Feb 20 08:42:06 crc kubenswrapper[5094]: I0220 08:42:06.046963 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr9kv\" (UniqueName: \"kubernetes.io/projected/78fff8ae-90d4-490d-b302-45fce0bd0101-kube-api-access-zr9kv\") pod \"neutron-585ff4fdf7-llqts\" (UID: \"78fff8ae-90d4-490d-b302-45fce0bd0101\") " pod="openstack/neutron-585ff4fdf7-llqts"
Feb 20 08:42:06 crc kubenswrapper[5094]: I0220 08:42:06.047045 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/78fff8ae-90d4-490d-b302-45fce0bd0101-httpd-config\") pod \"neutron-585ff4fdf7-llqts\" (UID: \"78fff8ae-90d4-490d-b302-45fce0bd0101\") " pod="openstack/neutron-585ff4fdf7-llqts"
Feb 20 08:42:06 crc kubenswrapper[5094]: I0220 08:42:06.047258 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/78fff8ae-90d4-490d-b302-45fce0bd0101-config\") pod \"neutron-585ff4fdf7-llqts\" (UID: \"78fff8ae-90d4-490d-b302-45fce0bd0101\") " pod="openstack/neutron-585ff4fdf7-llqts"
Feb 20 08:42:06 crc kubenswrapper[5094]: I0220 08:42:06.047355 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78fff8ae-90d4-490d-b302-45fce0bd0101-combined-ca-bundle\") pod \"neutron-585ff4fdf7-llqts\" (UID: \"78fff8ae-90d4-490d-b302-45fce0bd0101\") " pod="openstack/neutron-585ff4fdf7-llqts"
Feb 20 08:42:06 crc kubenswrapper[5094]: I0220 08:42:06.129151 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-549654cbff-7zg62"
Feb 20 08:42:06 crc kubenswrapper[5094]: I0220 08:42:06.149424 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr9kv\" (UniqueName: \"kubernetes.io/projected/78fff8ae-90d4-490d-b302-45fce0bd0101-kube-api-access-zr9kv\") pod \"neutron-585ff4fdf7-llqts\" (UID: \"78fff8ae-90d4-490d-b302-45fce0bd0101\") " pod="openstack/neutron-585ff4fdf7-llqts"
Feb 20 08:42:06 crc kubenswrapper[5094]: I0220 08:42:06.149514 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/78fff8ae-90d4-490d-b302-45fce0bd0101-httpd-config\") pod \"neutron-585ff4fdf7-llqts\" (UID: \"78fff8ae-90d4-490d-b302-45fce0bd0101\") " pod="openstack/neutron-585ff4fdf7-llqts"
Feb 20 08:42:06 crc kubenswrapper[5094]: I0220 08:42:06.149579 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/78fff8ae-90d4-490d-b302-45fce0bd0101-config\") pod \"neutron-585ff4fdf7-llqts\" (UID: \"78fff8ae-90d4-490d-b302-45fce0bd0101\") " pod="openstack/neutron-585ff4fdf7-llqts"
Feb 20 08:42:06 crc kubenswrapper[5094]: I0220 08:42:06.149640 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78fff8ae-90d4-490d-b302-45fce0bd0101-combined-ca-bundle\") pod \"neutron-585ff4fdf7-llqts\" (UID: \"78fff8ae-90d4-490d-b302-45fce0bd0101\") " pod="openstack/neutron-585ff4fdf7-llqts"
Feb 20 08:42:06 crc kubenswrapper[5094]: I0220 08:42:06.154012 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78fff8ae-90d4-490d-b302-45fce0bd0101-combined-ca-bundle\") pod \"neutron-585ff4fdf7-llqts\" (UID: \"78fff8ae-90d4-490d-b302-45fce0bd0101\") " pod="openstack/neutron-585ff4fdf7-llqts"
Feb 20 08:42:06 crc kubenswrapper[5094]: I0220 08:42:06.154254 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/78fff8ae-90d4-490d-b302-45fce0bd0101-httpd-config\") pod \"neutron-585ff4fdf7-llqts\" (UID: \"78fff8ae-90d4-490d-b302-45fce0bd0101\") " pod="openstack/neutron-585ff4fdf7-llqts"
Feb 20 08:42:06 crc kubenswrapper[5094]: I0220 08:42:06.154631 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/78fff8ae-90d4-490d-b302-45fce0bd0101-config\") pod \"neutron-585ff4fdf7-llqts\" (UID: \"78fff8ae-90d4-490d-b302-45fce0bd0101\") " pod="openstack/neutron-585ff4fdf7-llqts"
Feb 20 08:42:06 crc kubenswrapper[5094]: I0220 08:42:06.179648 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr9kv\" (UniqueName: \"kubernetes.io/projected/78fff8ae-90d4-490d-b302-45fce0bd0101-kube-api-access-zr9kv\") pod \"neutron-585ff4fdf7-llqts\" (UID: \"78fff8ae-90d4-490d-b302-45fce0bd0101\") " pod="openstack/neutron-585ff4fdf7-llqts"
Feb 20 08:42:06 crc kubenswrapper[5094]: I0220 08:42:06.216483 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-585ff4fdf7-llqts"
Feb 20 08:42:06 crc kubenswrapper[5094]: I0220 08:42:06.596806 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-549654cbff-7zg62"]
Feb 20 08:42:06 crc kubenswrapper[5094]: W0220 08:42:06.599904 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14b1e378_3ae1_4707_8c14_3ee7ad292a55.slice/crio-4b216a605a8b79d7c7ddf65eb3de811aab7c907beb4645b0cfc8890ec094ca61 WatchSource:0}: Error finding container 4b216a605a8b79d7c7ddf65eb3de811aab7c907beb4645b0cfc8890ec094ca61: Status 404 returned error can't find the container with id 4b216a605a8b79d7c7ddf65eb3de811aab7c907beb4645b0cfc8890ec094ca61
Feb 20 08:42:06 crc kubenswrapper[5094]: I0220 08:42:06.877462 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-585ff4fdf7-llqts"]
Feb 20 08:42:06 crc kubenswrapper[5094]: W0220 08:42:06.879372 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78fff8ae_90d4_490d_b302_45fce0bd0101.slice/crio-9f0b8f65b9e3b8ce6b00bb72d6d184a0c589d3d3c50e319f06e67b43c2a422a0 WatchSource:0}: Error finding container 9f0b8f65b9e3b8ce6b00bb72d6d184a0c589d3d3c50e319f06e67b43c2a422a0: Status 404 returned error can't find the container with id 9f0b8f65b9e3b8ce6b00bb72d6d184a0c589d3d3c50e319f06e67b43c2a422a0
Feb 20 08:42:07 crc kubenswrapper[5094]: I0220 08:42:07.606567 5094 generic.go:334] "Generic (PLEG): container finished" podID="14b1e378-3ae1-4707-8c14-3ee7ad292a55" containerID="a8b2fd7d2d6131025e57a52935b313eca72cf89c2a15b9432a7f1f536f05a672" exitCode=0
Feb 20 08:42:07 crc kubenswrapper[5094]: I0220 08:42:07.606917 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-549654cbff-7zg62" event={"ID":"14b1e378-3ae1-4707-8c14-3ee7ad292a55","Type":"ContainerDied","Data":"a8b2fd7d2d6131025e57a52935b313eca72cf89c2a15b9432a7f1f536f05a672"}
Feb 20 08:42:07 crc kubenswrapper[5094]: I0220 08:42:07.606943 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-549654cbff-7zg62" event={"ID":"14b1e378-3ae1-4707-8c14-3ee7ad292a55","Type":"ContainerStarted","Data":"4b216a605a8b79d7c7ddf65eb3de811aab7c907beb4645b0cfc8890ec094ca61"}
Feb 20 08:42:07 crc kubenswrapper[5094]: I0220 08:42:07.609694 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-585ff4fdf7-llqts" event={"ID":"78fff8ae-90d4-490d-b302-45fce0bd0101","Type":"ContainerStarted","Data":"a17ae4487e7812976ea9b81c300e3a145fb7794bb0b360255b78bde10a86c7e5"}
Feb 20 08:42:07 crc kubenswrapper[5094]: I0220 08:42:07.609813 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-585ff4fdf7-llqts" event={"ID":"78fff8ae-90d4-490d-b302-45fce0bd0101","Type":"ContainerStarted","Data":"184344a649be3445ec7813f76772cf06714a504a32568a8084582827a77a9e06"}
Feb 20 08:42:07 crc kubenswrapper[5094]: I0220 08:42:07.609826 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-585ff4fdf7-llqts" event={"ID":"78fff8ae-90d4-490d-b302-45fce0bd0101","Type":"ContainerStarted","Data":"9f0b8f65b9e3b8ce6b00bb72d6d184a0c589d3d3c50e319f06e67b43c2a422a0"}
Feb 20 08:42:07 crc kubenswrapper[5094]: I0220 08:42:07.609931 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-585ff4fdf7-llqts"
Feb 20 08:42:08 crc kubenswrapper[5094]: I0220 08:42:08.620554 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-549654cbff-7zg62" event={"ID":"14b1e378-3ae1-4707-8c14-3ee7ad292a55","Type":"ContainerStarted","Data":"09b2ae2e83651f531f10fff7937847937cdbc6ce024c4c5e0334ed9056460e09"}
Feb 20 08:42:08 crc kubenswrapper[5094]: I0220 08:42:08.642879 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-585ff4fdf7-llqts" podStartSLOduration=3.642854681 podStartE2EDuration="3.642854681s" podCreationTimestamp="2026-02-20 08:42:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:42:07.646004738 +0000 UTC m=+6942.518631459" watchObservedRunningTime="2026-02-20 08:42:08.642854681 +0000 UTC m=+6943.515481402"
Feb 20 08:42:08 crc kubenswrapper[5094]: I0220 08:42:08.644616 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-549654cbff-7zg62" podStartSLOduration=3.644605483 podStartE2EDuration="3.644605483s" podCreationTimestamp="2026-02-20 08:42:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:42:08.6386574 +0000 UTC m=+6943.511284121" watchObservedRunningTime="2026-02-20 08:42:08.644605483 +0000 UTC m=+6943.517232204"
Feb 20 08:42:09 crc kubenswrapper[5094]: I0220 08:42:09.627613 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-549654cbff-7zg62"
Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.131849 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-549654cbff-7zg62"
Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.206802 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dc5b9855-b6gdq"]
Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.207422 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" podUID="fdb5c820-23c9-42e7-9c70-d8f504f47ff5" containerName="dnsmasq-dns" containerID="cri-o://8bb9de6e4383ec58a31d9dfe384382f6766f2e29c7eeaa03946d4109134ffc85" gracePeriod=10
Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.684447 5094 util.go:48] "No ready sandbox
for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.698813 5094 generic.go:334] "Generic (PLEG): container finished" podID="fdb5c820-23c9-42e7-9c70-d8f504f47ff5" containerID="8bb9de6e4383ec58a31d9dfe384382f6766f2e29c7eeaa03946d4109134ffc85" exitCode=0 Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.698855 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" event={"ID":"fdb5c820-23c9-42e7-9c70-d8f504f47ff5","Type":"ContainerDied","Data":"8bb9de6e4383ec58a31d9dfe384382f6766f2e29c7eeaa03946d4109134ffc85"} Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.698879 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.698891 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dc5b9855-b6gdq" event={"ID":"fdb5c820-23c9-42e7-9c70-d8f504f47ff5","Type":"ContainerDied","Data":"30f5698d42ea6d5782a8328b6af34f86a1ed299fd9bf068ef4d5c02cdc98c3f0"} Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.698910 5094 scope.go:117] "RemoveContainer" containerID="8bb9de6e4383ec58a31d9dfe384382f6766f2e29c7eeaa03946d4109134ffc85" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.737653 5094 scope.go:117] "RemoveContainer" containerID="faf72b2a28fa9464260dd616d409fc89190c54954c251671336a3448ddcf3235" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.759141 5094 scope.go:117] "RemoveContainer" containerID="8bb9de6e4383ec58a31d9dfe384382f6766f2e29c7eeaa03946d4109134ffc85" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.759967 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-ovsdbserver-sb\") pod 
\"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.760849 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-dns-svc\") pod \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.760924 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpcqp\" (UniqueName: \"kubernetes.io/projected/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-kube-api-access-hpcqp\") pod \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.760944 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-config\") pod \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.760993 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-ovsdbserver-nb\") pod \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\" (UID: \"fdb5c820-23c9-42e7-9c70-d8f504f47ff5\") " Feb 20 08:42:16 crc kubenswrapper[5094]: E0220 08:42:16.761740 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bb9de6e4383ec58a31d9dfe384382f6766f2e29c7eeaa03946d4109134ffc85\": container with ID starting with 8bb9de6e4383ec58a31d9dfe384382f6766f2e29c7eeaa03946d4109134ffc85 not found: ID does not exist" containerID="8bb9de6e4383ec58a31d9dfe384382f6766f2e29c7eeaa03946d4109134ffc85" Feb 20 08:42:16 crc kubenswrapper[5094]: 
I0220 08:42:16.761798 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bb9de6e4383ec58a31d9dfe384382f6766f2e29c7eeaa03946d4109134ffc85"} err="failed to get container status \"8bb9de6e4383ec58a31d9dfe384382f6766f2e29c7eeaa03946d4109134ffc85\": rpc error: code = NotFound desc = could not find container \"8bb9de6e4383ec58a31d9dfe384382f6766f2e29c7eeaa03946d4109134ffc85\": container with ID starting with 8bb9de6e4383ec58a31d9dfe384382f6766f2e29c7eeaa03946d4109134ffc85 not found: ID does not exist" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.761828 5094 scope.go:117] "RemoveContainer" containerID="faf72b2a28fa9464260dd616d409fc89190c54954c251671336a3448ddcf3235" Feb 20 08:42:16 crc kubenswrapper[5094]: E0220 08:42:16.762801 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faf72b2a28fa9464260dd616d409fc89190c54954c251671336a3448ddcf3235\": container with ID starting with faf72b2a28fa9464260dd616d409fc89190c54954c251671336a3448ddcf3235 not found: ID does not exist" containerID="faf72b2a28fa9464260dd616d409fc89190c54954c251671336a3448ddcf3235" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.762845 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faf72b2a28fa9464260dd616d409fc89190c54954c251671336a3448ddcf3235"} err="failed to get container status \"faf72b2a28fa9464260dd616d409fc89190c54954c251671336a3448ddcf3235\": rpc error: code = NotFound desc = could not find container \"faf72b2a28fa9464260dd616d409fc89190c54954c251671336a3448ddcf3235\": container with ID starting with faf72b2a28fa9464260dd616d409fc89190c54954c251671336a3448ddcf3235 not found: ID does not exist" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.767396 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-kube-api-access-hpcqp" (OuterVolumeSpecName: "kube-api-access-hpcqp") pod "fdb5c820-23c9-42e7-9c70-d8f504f47ff5" (UID: "fdb5c820-23c9-42e7-9c70-d8f504f47ff5"). InnerVolumeSpecName "kube-api-access-hpcqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.805934 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fdb5c820-23c9-42e7-9c70-d8f504f47ff5" (UID: "fdb5c820-23c9-42e7-9c70-d8f504f47ff5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.805971 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fdb5c820-23c9-42e7-9c70-d8f504f47ff5" (UID: "fdb5c820-23c9-42e7-9c70-d8f504f47ff5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.809249 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-config" (OuterVolumeSpecName: "config") pod "fdb5c820-23c9-42e7-9c70-d8f504f47ff5" (UID: "fdb5c820-23c9-42e7-9c70-d8f504f47ff5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.811771 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fdb5c820-23c9-42e7-9c70-d8f504f47ff5" (UID: "fdb5c820-23c9-42e7-9c70-d8f504f47ff5"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.863408 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpcqp\" (UniqueName: \"kubernetes.io/projected/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-kube-api-access-hpcqp\") on node \"crc\" DevicePath \"\"" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.863442 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-config\") on node \"crc\" DevicePath \"\"" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.863452 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.863462 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 08:42:16 crc kubenswrapper[5094]: I0220 08:42:16.863471 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fdb5c820-23c9-42e7-9c70-d8f504f47ff5-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 08:42:17 crc kubenswrapper[5094]: I0220 08:42:17.027320 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dc5b9855-b6gdq"] Feb 20 08:42:17 crc kubenswrapper[5094]: I0220 08:42:17.042322 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58dc5b9855-b6gdq"] Feb 20 08:42:17 crc kubenswrapper[5094]: I0220 08:42:17.848964 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdb5c820-23c9-42e7-9c70-d8f504f47ff5" path="/var/lib/kubelet/pods/fdb5c820-23c9-42e7-9c70-d8f504f47ff5/volumes" Feb 20 08:42:34 crc kubenswrapper[5094]: 
I0220 08:42:34.106920 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:42:34 crc kubenswrapper[5094]: I0220 08:42:34.107908 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:42:36 crc kubenswrapper[5094]: I0220 08:42:36.228664 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-585ff4fdf7-llqts" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.605048 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-bgw44"] Feb 20 08:42:43 crc kubenswrapper[5094]: E0220 08:42:43.605882 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdb5c820-23c9-42e7-9c70-d8f504f47ff5" containerName="init" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.605902 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdb5c820-23c9-42e7-9c70-d8f504f47ff5" containerName="init" Feb 20 08:42:43 crc kubenswrapper[5094]: E0220 08:42:43.605922 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdb5c820-23c9-42e7-9c70-d8f504f47ff5" containerName="dnsmasq-dns" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.605931 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdb5c820-23c9-42e7-9c70-d8f504f47ff5" containerName="dnsmasq-dns" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.606156 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdb5c820-23c9-42e7-9c70-d8f504f47ff5" 
containerName="dnsmasq-dns" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.606855 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-bgw44" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.623945 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-bgw44"] Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.706364 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-80d8-account-create-update-2lhxz"] Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.707367 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-80d8-account-create-update-2lhxz" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.709500 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.722130 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-80d8-account-create-update-2lhxz"] Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.742503 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m5xv\" (UniqueName: \"kubernetes.io/projected/5d39890b-bbcb-4fcb-9f5e-6f74782fc661-kube-api-access-5m5xv\") pod \"glance-db-create-bgw44\" (UID: \"5d39890b-bbcb-4fcb-9f5e-6f74782fc661\") " pod="openstack/glance-db-create-bgw44" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.745372 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d39890b-bbcb-4fcb-9f5e-6f74782fc661-operator-scripts\") pod \"glance-db-create-bgw44\" (UID: \"5d39890b-bbcb-4fcb-9f5e-6f74782fc661\") " pod="openstack/glance-db-create-bgw44" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.846221 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0fbd49a-25e7-44de-a81d-f324feba0dff-operator-scripts\") pod \"glance-80d8-account-create-update-2lhxz\" (UID: \"f0fbd49a-25e7-44de-a81d-f324feba0dff\") " pod="openstack/glance-80d8-account-create-update-2lhxz" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.846410 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d39890b-bbcb-4fcb-9f5e-6f74782fc661-operator-scripts\") pod \"glance-db-create-bgw44\" (UID: \"5d39890b-bbcb-4fcb-9f5e-6f74782fc661\") " pod="openstack/glance-db-create-bgw44" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.846541 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m5xv\" (UniqueName: \"kubernetes.io/projected/5d39890b-bbcb-4fcb-9f5e-6f74782fc661-kube-api-access-5m5xv\") pod \"glance-db-create-bgw44\" (UID: \"5d39890b-bbcb-4fcb-9f5e-6f74782fc661\") " pod="openstack/glance-db-create-bgw44" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.846686 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbr6b\" (UniqueName: \"kubernetes.io/projected/f0fbd49a-25e7-44de-a81d-f324feba0dff-kube-api-access-pbr6b\") pod \"glance-80d8-account-create-update-2lhxz\" (UID: \"f0fbd49a-25e7-44de-a81d-f324feba0dff\") " pod="openstack/glance-80d8-account-create-update-2lhxz" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.847272 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d39890b-bbcb-4fcb-9f5e-6f74782fc661-operator-scripts\") pod \"glance-db-create-bgw44\" (UID: \"5d39890b-bbcb-4fcb-9f5e-6f74782fc661\") " pod="openstack/glance-db-create-bgw44" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.866945 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m5xv\" (UniqueName: \"kubernetes.io/projected/5d39890b-bbcb-4fcb-9f5e-6f74782fc661-kube-api-access-5m5xv\") pod \"glance-db-create-bgw44\" (UID: \"5d39890b-bbcb-4fcb-9f5e-6f74782fc661\") " pod="openstack/glance-db-create-bgw44" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.923871 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-bgw44" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.948888 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbr6b\" (UniqueName: \"kubernetes.io/projected/f0fbd49a-25e7-44de-a81d-f324feba0dff-kube-api-access-pbr6b\") pod \"glance-80d8-account-create-update-2lhxz\" (UID: \"f0fbd49a-25e7-44de-a81d-f324feba0dff\") " pod="openstack/glance-80d8-account-create-update-2lhxz" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.949025 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0fbd49a-25e7-44de-a81d-f324feba0dff-operator-scripts\") pod \"glance-80d8-account-create-update-2lhxz\" (UID: \"f0fbd49a-25e7-44de-a81d-f324feba0dff\") " pod="openstack/glance-80d8-account-create-update-2lhxz" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.950933 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0fbd49a-25e7-44de-a81d-f324feba0dff-operator-scripts\") pod \"glance-80d8-account-create-update-2lhxz\" (UID: \"f0fbd49a-25e7-44de-a81d-f324feba0dff\") " pod="openstack/glance-80d8-account-create-update-2lhxz" Feb 20 08:42:43 crc kubenswrapper[5094]: I0220 08:42:43.971990 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbr6b\" (UniqueName: 
\"kubernetes.io/projected/f0fbd49a-25e7-44de-a81d-f324feba0dff-kube-api-access-pbr6b\") pod \"glance-80d8-account-create-update-2lhxz\" (UID: \"f0fbd49a-25e7-44de-a81d-f324feba0dff\") " pod="openstack/glance-80d8-account-create-update-2lhxz" Feb 20 08:42:44 crc kubenswrapper[5094]: I0220 08:42:44.023318 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-80d8-account-create-update-2lhxz" Feb 20 08:42:44 crc kubenswrapper[5094]: I0220 08:42:44.419878 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-bgw44"] Feb 20 08:42:44 crc kubenswrapper[5094]: I0220 08:42:44.524432 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-80d8-account-create-update-2lhxz"] Feb 20 08:42:44 crc kubenswrapper[5094]: W0220 08:42:44.526825 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0fbd49a_25e7_44de_a81d_f324feba0dff.slice/crio-fb729c857f4440a56e2335a339bd8e9975febc05c613c62aa627adb7ec34420d WatchSource:0}: Error finding container fb729c857f4440a56e2335a339bd8e9975febc05c613c62aa627adb7ec34420d: Status 404 returned error can't find the container with id fb729c857f4440a56e2335a339bd8e9975febc05c613c62aa627adb7ec34420d Feb 20 08:42:44 crc kubenswrapper[5094]: I0220 08:42:44.940186 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-bgw44" event={"ID":"5d39890b-bbcb-4fcb-9f5e-6f74782fc661","Type":"ContainerStarted","Data":"26fb92ef592af820040ca30c6c02e7ed550cd4c5268296895eb9386dd13d2c0a"} Feb 20 08:42:44 crc kubenswrapper[5094]: I0220 08:42:44.940524 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-bgw44" event={"ID":"5d39890b-bbcb-4fcb-9f5e-6f74782fc661","Type":"ContainerStarted","Data":"e27064341d4ad31ef21d53047d3bf23db0f06583a1eb5340c5cc014514f5cf27"} Feb 20 08:42:44 crc kubenswrapper[5094]: I0220 08:42:44.941802 5094 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-80d8-account-create-update-2lhxz" event={"ID":"f0fbd49a-25e7-44de-a81d-f324feba0dff","Type":"ContainerStarted","Data":"0517d39a1199d93e907c142d0fd23dddc068f1fcc10b13d41993d7946b0ef46a"} Feb 20 08:42:44 crc kubenswrapper[5094]: I0220 08:42:44.941845 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-80d8-account-create-update-2lhxz" event={"ID":"f0fbd49a-25e7-44de-a81d-f324feba0dff","Type":"ContainerStarted","Data":"fb729c857f4440a56e2335a339bd8e9975febc05c613c62aa627adb7ec34420d"} Feb 20 08:42:44 crc kubenswrapper[5094]: I0220 08:42:44.959148 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-bgw44" podStartSLOduration=1.9591320749999999 podStartE2EDuration="1.959132075s" podCreationTimestamp="2026-02-20 08:42:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:42:44.957149057 +0000 UTC m=+6979.829775768" watchObservedRunningTime="2026-02-20 08:42:44.959132075 +0000 UTC m=+6979.831758786" Feb 20 08:42:44 crc kubenswrapper[5094]: I0220 08:42:44.979380 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-80d8-account-create-update-2lhxz" podStartSLOduration=1.979358112 podStartE2EDuration="1.979358112s" podCreationTimestamp="2026-02-20 08:42:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:42:44.972746932 +0000 UTC m=+6979.845373653" watchObservedRunningTime="2026-02-20 08:42:44.979358112 +0000 UTC m=+6979.851984823" Feb 20 08:42:45 crc kubenswrapper[5094]: I0220 08:42:45.952454 5094 generic.go:334] "Generic (PLEG): container finished" podID="5d39890b-bbcb-4fcb-9f5e-6f74782fc661" containerID="26fb92ef592af820040ca30c6c02e7ed550cd4c5268296895eb9386dd13d2c0a" exitCode=0 Feb 20 
08:42:45 crc kubenswrapper[5094]: I0220 08:42:45.952558 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-bgw44" event={"ID":"5d39890b-bbcb-4fcb-9f5e-6f74782fc661","Type":"ContainerDied","Data":"26fb92ef592af820040ca30c6c02e7ed550cd4c5268296895eb9386dd13d2c0a"} Feb 20 08:42:45 crc kubenswrapper[5094]: I0220 08:42:45.953695 5094 generic.go:334] "Generic (PLEG): container finished" podID="f0fbd49a-25e7-44de-a81d-f324feba0dff" containerID="0517d39a1199d93e907c142d0fd23dddc068f1fcc10b13d41993d7946b0ef46a" exitCode=0 Feb 20 08:42:45 crc kubenswrapper[5094]: I0220 08:42:45.953751 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-80d8-account-create-update-2lhxz" event={"ID":"f0fbd49a-25e7-44de-a81d-f324feba0dff","Type":"ContainerDied","Data":"0517d39a1199d93e907c142d0fd23dddc068f1fcc10b13d41993d7946b0ef46a"} Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.300765 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-bgw44" Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.307853 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-80d8-account-create-update-2lhxz" Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.425949 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m5xv\" (UniqueName: \"kubernetes.io/projected/5d39890b-bbcb-4fcb-9f5e-6f74782fc661-kube-api-access-5m5xv\") pod \"5d39890b-bbcb-4fcb-9f5e-6f74782fc661\" (UID: \"5d39890b-bbcb-4fcb-9f5e-6f74782fc661\") " Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.426024 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0fbd49a-25e7-44de-a81d-f324feba0dff-operator-scripts\") pod \"f0fbd49a-25e7-44de-a81d-f324feba0dff\" (UID: \"f0fbd49a-25e7-44de-a81d-f324feba0dff\") " Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.426083 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d39890b-bbcb-4fcb-9f5e-6f74782fc661-operator-scripts\") pod \"5d39890b-bbcb-4fcb-9f5e-6f74782fc661\" (UID: \"5d39890b-bbcb-4fcb-9f5e-6f74782fc661\") " Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.426228 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbr6b\" (UniqueName: \"kubernetes.io/projected/f0fbd49a-25e7-44de-a81d-f324feba0dff-kube-api-access-pbr6b\") pod \"f0fbd49a-25e7-44de-a81d-f324feba0dff\" (UID: \"f0fbd49a-25e7-44de-a81d-f324feba0dff\") " Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.428157 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0fbd49a-25e7-44de-a81d-f324feba0dff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f0fbd49a-25e7-44de-a81d-f324feba0dff" (UID: "f0fbd49a-25e7-44de-a81d-f324feba0dff"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.428269 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d39890b-bbcb-4fcb-9f5e-6f74782fc661-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5d39890b-bbcb-4fcb-9f5e-6f74782fc661" (UID: "5d39890b-bbcb-4fcb-9f5e-6f74782fc661"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.432583 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0fbd49a-25e7-44de-a81d-f324feba0dff-kube-api-access-pbr6b" (OuterVolumeSpecName: "kube-api-access-pbr6b") pod "f0fbd49a-25e7-44de-a81d-f324feba0dff" (UID: "f0fbd49a-25e7-44de-a81d-f324feba0dff"). InnerVolumeSpecName "kube-api-access-pbr6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.433215 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d39890b-bbcb-4fcb-9f5e-6f74782fc661-kube-api-access-5m5xv" (OuterVolumeSpecName: "kube-api-access-5m5xv") pod "5d39890b-bbcb-4fcb-9f5e-6f74782fc661" (UID: "5d39890b-bbcb-4fcb-9f5e-6f74782fc661"). InnerVolumeSpecName "kube-api-access-5m5xv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.528222 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbr6b\" (UniqueName: \"kubernetes.io/projected/f0fbd49a-25e7-44de-a81d-f324feba0dff-kube-api-access-pbr6b\") on node \"crc\" DevicePath \"\"" Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.528255 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m5xv\" (UniqueName: \"kubernetes.io/projected/5d39890b-bbcb-4fcb-9f5e-6f74782fc661-kube-api-access-5m5xv\") on node \"crc\" DevicePath \"\"" Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.528267 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f0fbd49a-25e7-44de-a81d-f324feba0dff-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.528276 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d39890b-bbcb-4fcb-9f5e-6f74782fc661-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.967785 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-80d8-account-create-update-2lhxz" event={"ID":"f0fbd49a-25e7-44de-a81d-f324feba0dff","Type":"ContainerDied","Data":"fb729c857f4440a56e2335a339bd8e9975febc05c613c62aa627adb7ec34420d"} Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.967825 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb729c857f4440a56e2335a339bd8e9975febc05c613c62aa627adb7ec34420d" Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.967880 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-80d8-account-create-update-2lhxz" Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.970545 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-bgw44" event={"ID":"5d39890b-bbcb-4fcb-9f5e-6f74782fc661","Type":"ContainerDied","Data":"e27064341d4ad31ef21d53047d3bf23db0f06583a1eb5340c5cc014514f5cf27"} Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.970574 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e27064341d4ad31ef21d53047d3bf23db0f06583a1eb5340c5cc014514f5cf27" Feb 20 08:42:47 crc kubenswrapper[5094]: I0220 08:42:47.970616 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-bgw44" Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.039771 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-2fzgg"] Feb 20 08:42:49 crc kubenswrapper[5094]: E0220 08:42:49.040394 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d39890b-bbcb-4fcb-9f5e-6f74782fc661" containerName="mariadb-database-create" Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.040407 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d39890b-bbcb-4fcb-9f5e-6f74782fc661" containerName="mariadb-database-create" Feb 20 08:42:49 crc kubenswrapper[5094]: E0220 08:42:49.040438 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0fbd49a-25e7-44de-a81d-f324feba0dff" containerName="mariadb-account-create-update" Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.040443 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0fbd49a-25e7-44de-a81d-f324feba0dff" containerName="mariadb-account-create-update" Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.040591 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d39890b-bbcb-4fcb-9f5e-6f74782fc661" containerName="mariadb-database-create" Feb 20 
08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.040606 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0fbd49a-25e7-44de-a81d-f324feba0dff" containerName="mariadb-account-create-update" Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.041153 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-2fzgg" Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.048100 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-7xqwp" Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.048324 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.065773 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-2fzgg"] Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.159610 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc4dw\" (UniqueName: \"kubernetes.io/projected/b382ec69-4b87-43f5-b964-eba4282bcc42-kube-api-access-nc4dw\") pod \"glance-db-sync-2fzgg\" (UID: \"b382ec69-4b87-43f5-b964-eba4282bcc42\") " pod="openstack/glance-db-sync-2fzgg" Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.159684 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-combined-ca-bundle\") pod \"glance-db-sync-2fzgg\" (UID: \"b382ec69-4b87-43f5-b964-eba4282bcc42\") " pod="openstack/glance-db-sync-2fzgg" Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.159840 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-db-sync-config-data\") pod \"glance-db-sync-2fzgg\" (UID: 
\"b382ec69-4b87-43f5-b964-eba4282bcc42\") " pod="openstack/glance-db-sync-2fzgg" Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.160023 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-config-data\") pod \"glance-db-sync-2fzgg\" (UID: \"b382ec69-4b87-43f5-b964-eba4282bcc42\") " pod="openstack/glance-db-sync-2fzgg" Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.261683 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-combined-ca-bundle\") pod \"glance-db-sync-2fzgg\" (UID: \"b382ec69-4b87-43f5-b964-eba4282bcc42\") " pod="openstack/glance-db-sync-2fzgg" Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.261789 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-db-sync-config-data\") pod \"glance-db-sync-2fzgg\" (UID: \"b382ec69-4b87-43f5-b964-eba4282bcc42\") " pod="openstack/glance-db-sync-2fzgg" Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.261850 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-config-data\") pod \"glance-db-sync-2fzgg\" (UID: \"b382ec69-4b87-43f5-b964-eba4282bcc42\") " pod="openstack/glance-db-sync-2fzgg" Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.261941 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc4dw\" (UniqueName: \"kubernetes.io/projected/b382ec69-4b87-43f5-b964-eba4282bcc42-kube-api-access-nc4dw\") pod \"glance-db-sync-2fzgg\" (UID: \"b382ec69-4b87-43f5-b964-eba4282bcc42\") " pod="openstack/glance-db-sync-2fzgg" Feb 20 08:42:49 crc 
kubenswrapper[5094]: I0220 08:42:49.266807 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-config-data\") pod \"glance-db-sync-2fzgg\" (UID: \"b382ec69-4b87-43f5-b964-eba4282bcc42\") " pod="openstack/glance-db-sync-2fzgg" Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.269255 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-combined-ca-bundle\") pod \"glance-db-sync-2fzgg\" (UID: \"b382ec69-4b87-43f5-b964-eba4282bcc42\") " pod="openstack/glance-db-sync-2fzgg" Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.279954 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-db-sync-config-data\") pod \"glance-db-sync-2fzgg\" (UID: \"b382ec69-4b87-43f5-b964-eba4282bcc42\") " pod="openstack/glance-db-sync-2fzgg" Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.301884 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc4dw\" (UniqueName: \"kubernetes.io/projected/b382ec69-4b87-43f5-b964-eba4282bcc42-kube-api-access-nc4dw\") pod \"glance-db-sync-2fzgg\" (UID: \"b382ec69-4b87-43f5-b964-eba4282bcc42\") " pod="openstack/glance-db-sync-2fzgg" Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.366381 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-2fzgg" Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.858263 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-2fzgg"] Feb 20 08:42:49 crc kubenswrapper[5094]: I0220 08:42:49.863274 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 08:42:50 crc kubenswrapper[5094]: I0220 08:42:50.026592 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-2fzgg" event={"ID":"b382ec69-4b87-43f5-b964-eba4282bcc42","Type":"ContainerStarted","Data":"32f4328b6fac64f8733dba9143d7b674a26899af5e5493def4751935ca89de98"} Feb 20 08:43:04 crc kubenswrapper[5094]: I0220 08:43:04.107127 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:43:04 crc kubenswrapper[5094]: I0220 08:43:04.107747 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:43:15 crc kubenswrapper[5094]: I0220 08:43:15.239166 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-2fzgg" event={"ID":"b382ec69-4b87-43f5-b964-eba4282bcc42","Type":"ContainerStarted","Data":"d77d5b604322e5a963ae828151741fdab64a97bfd2a29b72e3a01f5ffe6ac7d2"} Feb 20 08:43:15 crc kubenswrapper[5094]: I0220 08:43:15.257754 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-2fzgg" podStartSLOduration=1.667360481 podStartE2EDuration="26.257736725s" 
podCreationTimestamp="2026-02-20 08:42:49 +0000 UTC" firstStartedPulling="2026-02-20 08:42:49.863044053 +0000 UTC m=+6984.735670764" lastFinishedPulling="2026-02-20 08:43:14.453420307 +0000 UTC m=+7009.326047008" observedRunningTime="2026-02-20 08:43:15.254798554 +0000 UTC m=+7010.127425265" watchObservedRunningTime="2026-02-20 08:43:15.257736725 +0000 UTC m=+7010.130363436" Feb 20 08:43:18 crc kubenswrapper[5094]: I0220 08:43:18.282784 5094 generic.go:334] "Generic (PLEG): container finished" podID="b382ec69-4b87-43f5-b964-eba4282bcc42" containerID="d77d5b604322e5a963ae828151741fdab64a97bfd2a29b72e3a01f5ffe6ac7d2" exitCode=0 Feb 20 08:43:18 crc kubenswrapper[5094]: I0220 08:43:18.282874 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-2fzgg" event={"ID":"b382ec69-4b87-43f5-b964-eba4282bcc42","Type":"ContainerDied","Data":"d77d5b604322e5a963ae828151741fdab64a97bfd2a29b72e3a01f5ffe6ac7d2"} Feb 20 08:43:19 crc kubenswrapper[5094]: I0220 08:43:19.790197 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-2fzgg" Feb 20 08:43:19 crc kubenswrapper[5094]: I0220 08:43:19.943407 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-combined-ca-bundle\") pod \"b382ec69-4b87-43f5-b964-eba4282bcc42\" (UID: \"b382ec69-4b87-43f5-b964-eba4282bcc42\") " Feb 20 08:43:19 crc kubenswrapper[5094]: I0220 08:43:19.943498 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-db-sync-config-data\") pod \"b382ec69-4b87-43f5-b964-eba4282bcc42\" (UID: \"b382ec69-4b87-43f5-b964-eba4282bcc42\") " Feb 20 08:43:19 crc kubenswrapper[5094]: I0220 08:43:19.943532 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nc4dw\" (UniqueName: \"kubernetes.io/projected/b382ec69-4b87-43f5-b964-eba4282bcc42-kube-api-access-nc4dw\") pod \"b382ec69-4b87-43f5-b964-eba4282bcc42\" (UID: \"b382ec69-4b87-43f5-b964-eba4282bcc42\") " Feb 20 08:43:19 crc kubenswrapper[5094]: I0220 08:43:19.943655 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-config-data\") pod \"b382ec69-4b87-43f5-b964-eba4282bcc42\" (UID: \"b382ec69-4b87-43f5-b964-eba4282bcc42\") " Feb 20 08:43:19 crc kubenswrapper[5094]: I0220 08:43:19.948796 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b382ec69-4b87-43f5-b964-eba4282bcc42-kube-api-access-nc4dw" (OuterVolumeSpecName: "kube-api-access-nc4dw") pod "b382ec69-4b87-43f5-b964-eba4282bcc42" (UID: "b382ec69-4b87-43f5-b964-eba4282bcc42"). InnerVolumeSpecName "kube-api-access-nc4dw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:43:19 crc kubenswrapper[5094]: I0220 08:43:19.955366 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b382ec69-4b87-43f5-b964-eba4282bcc42" (UID: "b382ec69-4b87-43f5-b964-eba4282bcc42"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:43:19 crc kubenswrapper[5094]: I0220 08:43:19.966665 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b382ec69-4b87-43f5-b964-eba4282bcc42" (UID: "b382ec69-4b87-43f5-b964-eba4282bcc42"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:43:19 crc kubenswrapper[5094]: I0220 08:43:19.988270 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-config-data" (OuterVolumeSpecName: "config-data") pod "b382ec69-4b87-43f5-b964-eba4282bcc42" (UID: "b382ec69-4b87-43f5-b964-eba4282bcc42"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.046206 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.046247 5094 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.046260 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nc4dw\" (UniqueName: \"kubernetes.io/projected/b382ec69-4b87-43f5-b964-eba4282bcc42-kube-api-access-nc4dw\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.046273 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b382ec69-4b87-43f5-b964-eba4282bcc42-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.305234 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-2fzgg" event={"ID":"b382ec69-4b87-43f5-b964-eba4282bcc42","Type":"ContainerDied","Data":"32f4328b6fac64f8733dba9143d7b674a26899af5e5493def4751935ca89de98"} Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.305272 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32f4328b6fac64f8733dba9143d7b674a26899af5e5493def4751935ca89de98" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.305302 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-2fzgg" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.683813 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 08:43:20 crc kubenswrapper[5094]: E0220 08:43:20.684213 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b382ec69-4b87-43f5-b964-eba4282bcc42" containerName="glance-db-sync" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.684230 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b382ec69-4b87-43f5-b964-eba4282bcc42" containerName="glance-db-sync" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.684392 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b382ec69-4b87-43f5-b964-eba4282bcc42" containerName="glance-db-sync" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.685294 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.688204 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-7xqwp" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.688630 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.688796 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.688890 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.702046 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b495df7c5-7nbzn"] Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.703677 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.716127 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b495df7c5-7nbzn"] Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.732590 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.759374 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fbabc907-d404-4942-a4d8-470ca14f2727-ceph\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.759422 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbabc907-d404-4942-a4d8-470ca14f2727-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.759444 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6cfg\" (UniqueName: \"kubernetes.io/projected/fbabc907-d404-4942-a4d8-470ca14f2727-kube-api-access-g6cfg\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.759471 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-config-data\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " 
pod="openstack/glance-default-external-api-0" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.759557 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-scripts\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.759589 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.759816 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbabc907-d404-4942-a4d8-470ca14f2727-logs\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.844928 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.846382 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.857081 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.858467 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.861154 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbabc907-d404-4942-a4d8-470ca14f2727-logs\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.861185 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-ovsdbserver-nb\") pod \"dnsmasq-dns-5b495df7c5-7nbzn\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.861251 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4kpp\" (UniqueName: \"kubernetes.io/projected/84d7a949-86f8-4325-8494-1e37848e76ec-kube-api-access-b4kpp\") pod \"dnsmasq-dns-5b495df7c5-7nbzn\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.861285 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fbabc907-d404-4942-a4d8-470ca14f2727-ceph\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0" Feb 20 
08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.861305 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbabc907-d404-4942-a4d8-470ca14f2727-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.861324 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6cfg\" (UniqueName: \"kubernetes.io/projected/fbabc907-d404-4942-a4d8-470ca14f2727-kube-api-access-g6cfg\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.861341 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-ovsdbserver-sb\") pod \"dnsmasq-dns-5b495df7c5-7nbzn\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.861364 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-config-data\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.861387 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-scripts\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 
08:43:20.861402 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.861431 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-dns-svc\") pod \"dnsmasq-dns-5b495df7c5-7nbzn\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.861450 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-config\") pod \"dnsmasq-dns-5b495df7c5-7nbzn\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.861865 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbabc907-d404-4942-a4d8-470ca14f2727-logs\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.862291 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbabc907-d404-4942-a4d8-470ca14f2727-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.866473 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fbabc907-d404-4942-a4d8-470ca14f2727-ceph\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.866881 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.867477 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-config-data\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.871683 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-scripts\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.903354 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6cfg\" (UniqueName: \"kubernetes.io/projected/fbabc907-d404-4942-a4d8-470ca14f2727-kube-api-access-g6cfg\") pod \"glance-default-external-api-0\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.963976 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.964032 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-dns-svc\") pod \"dnsmasq-dns-5b495df7c5-7nbzn\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.964054 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a6187d03-61b1-472a-815f-ca42b191010f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.964074 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-config\") pod \"dnsmasq-dns-5b495df7c5-7nbzn\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.964098 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6187d03-61b1-472a-815f-ca42b191010f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.964128 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6187d03-61b1-472a-815f-ca42b191010f-logs\") 
pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.964155 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.964170 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-ovsdbserver-nb\") pod \"dnsmasq-dns-5b495df7c5-7nbzn\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.964218 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4kpp\" (UniqueName: \"kubernetes.io/projected/84d7a949-86f8-4325-8494-1e37848e76ec-kube-api-access-b4kpp\") pod \"dnsmasq-dns-5b495df7c5-7nbzn\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.964234 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.964260 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzklh\" (UniqueName: 
\"kubernetes.io/projected/a6187d03-61b1-472a-815f-ca42b191010f-kube-api-access-zzklh\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.964829 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-ovsdbserver-sb\") pod \"dnsmasq-dns-5b495df7c5-7nbzn\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.965116 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-dns-svc\") pod \"dnsmasq-dns-5b495df7c5-7nbzn\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.965510 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-ovsdbserver-nb\") pod \"dnsmasq-dns-5b495df7c5-7nbzn\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.969313 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-config\") pod \"dnsmasq-dns-5b495df7c5-7nbzn\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.970333 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-ovsdbserver-sb\") pod 
\"dnsmasq-dns-5b495df7c5-7nbzn\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" Feb 20 08:43:20 crc kubenswrapper[5094]: I0220 08:43:20.981307 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4kpp\" (UniqueName: \"kubernetes.io/projected/84d7a949-86f8-4325-8494-1e37848e76ec-kube-api-access-b4kpp\") pod \"dnsmasq-dns-5b495df7c5-7nbzn\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.004153 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.019218 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.066170 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.066666 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a6187d03-61b1-472a-815f-ca42b191010f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.066716 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6187d03-61b1-472a-815f-ca42b191010f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " 
pod="openstack/glance-default-internal-api-0" Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.066747 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6187d03-61b1-472a-815f-ca42b191010f-logs\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.066766 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.066818 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.066865 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzklh\" (UniqueName: \"kubernetes.io/projected/a6187d03-61b1-472a-815f-ca42b191010f-kube-api-access-zzklh\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.067230 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6187d03-61b1-472a-815f-ca42b191010f-logs\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:21 crc 
kubenswrapper[5094]: I0220 08:43:21.067533 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6187d03-61b1-472a-815f-ca42b191010f-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.070845 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.071452 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.072411 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a6187d03-61b1-472a-815f-ca42b191010f-ceph\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.073120 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.085429 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zzklh\" (UniqueName: \"kubernetes.io/projected/a6187d03-61b1-472a-815f-ca42b191010f-kube-api-access-zzklh\") pod \"glance-default-internal-api-0\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.161318 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 08:43:21 crc kubenswrapper[5094]: W0220 08:43:21.588394 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbabc907_d404_4942_a4d8_470ca14f2727.slice/crio-cce65189e72bfca27537c0863492d5e96935f34cfab3ea3bf5ad6c46b32d05c4 WatchSource:0}: Error finding container cce65189e72bfca27537c0863492d5e96935f34cfab3ea3bf5ad6c46b32d05c4: Status 404 returned error can't find the container with id cce65189e72bfca27537c0863492d5e96935f34cfab3ea3bf5ad6c46b32d05c4 Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.589059 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.606929 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b495df7c5-7nbzn"] Feb 20 08:43:21 crc kubenswrapper[5094]: W0220 08:43:21.621529 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84d7a949_86f8_4325_8494_1e37848e76ec.slice/crio-9b5aa36d351d8e72474f8ef575d86e027a54c562714317f08d7c4cf38067fb2a WatchSource:0}: Error finding container 9b5aa36d351d8e72474f8ef575d86e027a54c562714317f08d7c4cf38067fb2a: Status 404 returned error can't find the container with id 9b5aa36d351d8e72474f8ef575d86e027a54c562714317f08d7c4cf38067fb2a Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.664517 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-external-api-0"] Feb 20 08:43:21 crc kubenswrapper[5094]: W0220 08:43:21.797922 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6187d03_61b1_472a_815f_ca42b191010f.slice/crio-49f143470e06e51ff04f2ae879e5f3d29275722915277f2f9c70aa490bec1d6b WatchSource:0}: Error finding container 49f143470e06e51ff04f2ae879e5f3d29275722915277f2f9c70aa490bec1d6b: Status 404 returned error can't find the container with id 49f143470e06e51ff04f2ae879e5f3d29275722915277f2f9c70aa490bec1d6b Feb 20 08:43:21 crc kubenswrapper[5094]: I0220 08:43:21.798041 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 08:43:22 crc kubenswrapper[5094]: I0220 08:43:22.336793 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a6187d03-61b1-472a-815f-ca42b191010f","Type":"ContainerStarted","Data":"49f143470e06e51ff04f2ae879e5f3d29275722915277f2f9c70aa490bec1d6b"} Feb 20 08:43:22 crc kubenswrapper[5094]: I0220 08:43:22.340340 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fbabc907-d404-4942-a4d8-470ca14f2727","Type":"ContainerStarted","Data":"85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0"} Feb 20 08:43:22 crc kubenswrapper[5094]: I0220 08:43:22.340388 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fbabc907-d404-4942-a4d8-470ca14f2727","Type":"ContainerStarted","Data":"cce65189e72bfca27537c0863492d5e96935f34cfab3ea3bf5ad6c46b32d05c4"} Feb 20 08:43:22 crc kubenswrapper[5094]: I0220 08:43:22.343783 5094 generic.go:334] "Generic (PLEG): container finished" podID="84d7a949-86f8-4325-8494-1e37848e76ec" containerID="208d3211ae79c8d4cda59830631f10fba8107fe5fff0ad3a2d8a3afdcb25b100" exitCode=0 Feb 20 08:43:22 crc kubenswrapper[5094]: 
I0220 08:43:22.343848 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" event={"ID":"84d7a949-86f8-4325-8494-1e37848e76ec","Type":"ContainerDied","Data":"208d3211ae79c8d4cda59830631f10fba8107fe5fff0ad3a2d8a3afdcb25b100"} Feb 20 08:43:22 crc kubenswrapper[5094]: I0220 08:43:22.343873 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" event={"ID":"84d7a949-86f8-4325-8494-1e37848e76ec","Type":"ContainerStarted","Data":"9b5aa36d351d8e72474f8ef575d86e027a54c562714317f08d7c4cf38067fb2a"} Feb 20 08:43:23 crc kubenswrapper[5094]: I0220 08:43:23.352903 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a6187d03-61b1-472a-815f-ca42b191010f","Type":"ContainerStarted","Data":"0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce"} Feb 20 08:43:23 crc kubenswrapper[5094]: I0220 08:43:23.353727 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a6187d03-61b1-472a-815f-ca42b191010f","Type":"ContainerStarted","Data":"c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a"} Feb 20 08:43:23 crc kubenswrapper[5094]: I0220 08:43:23.354311 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fbabc907-d404-4942-a4d8-470ca14f2727","Type":"ContainerStarted","Data":"d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851"} Feb 20 08:43:23 crc kubenswrapper[5094]: I0220 08:43:23.354368 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fbabc907-d404-4942-a4d8-470ca14f2727" containerName="glance-log" containerID="cri-o://85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0" gracePeriod=30 Feb 20 08:43:23 crc kubenswrapper[5094]: I0220 08:43:23.354425 5094 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fbabc907-d404-4942-a4d8-470ca14f2727" containerName="glance-httpd" containerID="cri-o://d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851" gracePeriod=30 Feb 20 08:43:23 crc kubenswrapper[5094]: I0220 08:43:23.356141 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" event={"ID":"84d7a949-86f8-4325-8494-1e37848e76ec","Type":"ContainerStarted","Data":"913c46cb53746a67114f39304fb82db0087e728a7ec5b6b477343f5f416a8b8f"} Feb 20 08:43:23 crc kubenswrapper[5094]: I0220 08:43:23.356294 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" Feb 20 08:43:23 crc kubenswrapper[5094]: I0220 08:43:23.379324 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.379306246 podStartE2EDuration="3.379306246s" podCreationTimestamp="2026-02-20 08:43:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:43:23.374006138 +0000 UTC m=+7018.246632849" watchObservedRunningTime="2026-02-20 08:43:23.379306246 +0000 UTC m=+7018.251932947" Feb 20 08:43:23 crc kubenswrapper[5094]: I0220 08:43:23.403136 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.403118499 podStartE2EDuration="3.403118499s" podCreationTimestamp="2026-02-20 08:43:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:43:23.396189002 +0000 UTC m=+7018.268815713" watchObservedRunningTime="2026-02-20 08:43:23.403118499 +0000 UTC m=+7018.275745210" Feb 20 08:43:23 crc kubenswrapper[5094]: I0220 08:43:23.842643 5094 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" podStartSLOduration=3.842624417 podStartE2EDuration="3.842624417s" podCreationTimestamp="2026-02-20 08:43:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:43:23.419510714 +0000 UTC m=+7018.292137425" watchObservedRunningTime="2026-02-20 08:43:23.842624417 +0000 UTC m=+7018.715251128" Feb 20 08:43:23 crc kubenswrapper[5094]: I0220 08:43:23.856801 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 08:43:23 crc kubenswrapper[5094]: I0220 08:43:23.995870 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.133619 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbabc907-d404-4942-a4d8-470ca14f2727-httpd-run\") pod \"fbabc907-d404-4942-a4d8-470ca14f2727\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.133756 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbabc907-d404-4942-a4d8-470ca14f2727-logs\") pod \"fbabc907-d404-4942-a4d8-470ca14f2727\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.133814 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-combined-ca-bundle\") pod \"fbabc907-d404-4942-a4d8-470ca14f2727\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.133831 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-scripts\") pod \"fbabc907-d404-4942-a4d8-470ca14f2727\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.133858 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6cfg\" (UniqueName: \"kubernetes.io/projected/fbabc907-d404-4942-a4d8-470ca14f2727-kube-api-access-g6cfg\") pod \"fbabc907-d404-4942-a4d8-470ca14f2727\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.133875 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fbabc907-d404-4942-a4d8-470ca14f2727-ceph\") pod \"fbabc907-d404-4942-a4d8-470ca14f2727\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.133947 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-config-data\") pod \"fbabc907-d404-4942-a4d8-470ca14f2727\" (UID: \"fbabc907-d404-4942-a4d8-470ca14f2727\") " Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.134111 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbabc907-d404-4942-a4d8-470ca14f2727-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fbabc907-d404-4942-a4d8-470ca14f2727" (UID: "fbabc907-d404-4942-a4d8-470ca14f2727"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.134325 5094 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fbabc907-d404-4942-a4d8-470ca14f2727-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.134640 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbabc907-d404-4942-a4d8-470ca14f2727-logs" (OuterVolumeSpecName: "logs") pod "fbabc907-d404-4942-a4d8-470ca14f2727" (UID: "fbabc907-d404-4942-a4d8-470ca14f2727"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.139076 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-scripts" (OuterVolumeSpecName: "scripts") pod "fbabc907-d404-4942-a4d8-470ca14f2727" (UID: "fbabc907-d404-4942-a4d8-470ca14f2727"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.139714 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbabc907-d404-4942-a4d8-470ca14f2727-ceph" (OuterVolumeSpecName: "ceph") pod "fbabc907-d404-4942-a4d8-470ca14f2727" (UID: "fbabc907-d404-4942-a4d8-470ca14f2727"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.144060 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbabc907-d404-4942-a4d8-470ca14f2727-kube-api-access-g6cfg" (OuterVolumeSpecName: "kube-api-access-g6cfg") pod "fbabc907-d404-4942-a4d8-470ca14f2727" (UID: "fbabc907-d404-4942-a4d8-470ca14f2727"). InnerVolumeSpecName "kube-api-access-g6cfg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.160791 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbabc907-d404-4942-a4d8-470ca14f2727" (UID: "fbabc907-d404-4942-a4d8-470ca14f2727"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.192222 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-config-data" (OuterVolumeSpecName: "config-data") pod "fbabc907-d404-4942-a4d8-470ca14f2727" (UID: "fbabc907-d404-4942-a4d8-470ca14f2727"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.236267 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbabc907-d404-4942-a4d8-470ca14f2727-logs\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.236302 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.236312 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.236320 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6cfg\" (UniqueName: \"kubernetes.io/projected/fbabc907-d404-4942-a4d8-470ca14f2727-kube-api-access-g6cfg\") on node \"crc\" DevicePath \"\"" Feb 20 
08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.236331 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/fbabc907-d404-4942-a4d8-470ca14f2727-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.236339 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbabc907-d404-4942-a4d8-470ca14f2727-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.364722 5094 generic.go:334] "Generic (PLEG): container finished" podID="fbabc907-d404-4942-a4d8-470ca14f2727" containerID="d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851" exitCode=0 Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.364753 5094 generic.go:334] "Generic (PLEG): container finished" podID="fbabc907-d404-4942-a4d8-470ca14f2727" containerID="85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0" exitCode=143 Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.364805 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.364863 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fbabc907-d404-4942-a4d8-470ca14f2727","Type":"ContainerDied","Data":"d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851"} Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.364898 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fbabc907-d404-4942-a4d8-470ca14f2727","Type":"ContainerDied","Data":"85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0"} Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.364915 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fbabc907-d404-4942-a4d8-470ca14f2727","Type":"ContainerDied","Data":"cce65189e72bfca27537c0863492d5e96935f34cfab3ea3bf5ad6c46b32d05c4"} Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.364934 5094 scope.go:117] "RemoveContainer" containerID="d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.387227 5094 scope.go:117] "RemoveContainer" containerID="85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.398252 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.406484 5094 scope.go:117] "RemoveContainer" containerID="d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.406504 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 08:43:24 crc kubenswrapper[5094]: E0220 08:43:24.407197 5094 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851\": container with ID starting with d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851 not found: ID does not exist" containerID="d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.407230 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851"} err="failed to get container status \"d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851\": rpc error: code = NotFound desc = could not find container \"d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851\": container with ID starting with d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851 not found: ID does not exist" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.407251 5094 scope.go:117] "RemoveContainer" containerID="85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0" Feb 20 08:43:24 crc kubenswrapper[5094]: E0220 08:43:24.407503 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0\": container with ID starting with 85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0 not found: ID does not exist" containerID="85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.407526 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0"} err="failed to get container status \"85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0\": rpc error: code = NotFound desc = could not find container 
\"85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0\": container with ID starting with 85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0 not found: ID does not exist" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.407543 5094 scope.go:117] "RemoveContainer" containerID="d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.407790 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851"} err="failed to get container status \"d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851\": rpc error: code = NotFound desc = could not find container \"d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851\": container with ID starting with d67a9d8be447ccddc3de8736b58ddc2f8417e242d7a06e93f5ed96e6442f7851 not found: ID does not exist" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.407817 5094 scope.go:117] "RemoveContainer" containerID="85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.408120 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0"} err="failed to get container status \"85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0\": rpc error: code = NotFound desc = could not find container \"85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0\": container with ID starting with 85d5fe70592a4a1df32f836a05e0e11e2adfc67968c3e13d29d0f6bd7acfabd0 not found: ID does not exist" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.426237 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 08:43:24 crc kubenswrapper[5094]: E0220 08:43:24.426736 5094 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="fbabc907-d404-4942-a4d8-470ca14f2727" containerName="glance-log" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.426761 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbabc907-d404-4942-a4d8-470ca14f2727" containerName="glance-log" Feb 20 08:43:24 crc kubenswrapper[5094]: E0220 08:43:24.426779 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbabc907-d404-4942-a4d8-470ca14f2727" containerName="glance-httpd" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.426790 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbabc907-d404-4942-a4d8-470ca14f2727" containerName="glance-httpd" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.427022 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbabc907-d404-4942-a4d8-470ca14f2727" containerName="glance-log" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.427059 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbabc907-d404-4942-a4d8-470ca14f2727" containerName="glance-httpd" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.428105 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.430112 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.440912 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.543675 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0babde66-7106-44f9-8108-dc7123e64645-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.543923 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-scripts\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.544038 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0babde66-7106-44f9-8108-dc7123e64645-ceph\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.544149 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0babde66-7106-44f9-8108-dc7123e64645-logs\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 
08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.544290 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-config-data\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.544381 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.544413 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcxvn\" (UniqueName: \"kubernetes.io/projected/0babde66-7106-44f9-8108-dc7123e64645-kube-api-access-pcxvn\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.646094 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0babde66-7106-44f9-8108-dc7123e64645-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.646160 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-scripts\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 
08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.646199 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0babde66-7106-44f9-8108-dc7123e64645-ceph\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.646256 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0babde66-7106-44f9-8108-dc7123e64645-logs\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.646284 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-config-data\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.646319 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.646342 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcxvn\" (UniqueName: \"kubernetes.io/projected/0babde66-7106-44f9-8108-dc7123e64645-kube-api-access-pcxvn\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.646585 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0babde66-7106-44f9-8108-dc7123e64645-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.646854 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0babde66-7106-44f9-8108-dc7123e64645-logs\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.651095 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-scripts\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.651752 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-config-data\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.652450 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0babde66-7106-44f9-8108-dc7123e64645-ceph\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.656074 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.663663 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcxvn\" (UniqueName: \"kubernetes.io/projected/0babde66-7106-44f9-8108-dc7123e64645-kube-api-access-pcxvn\") pod \"glance-default-external-api-0\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " pod="openstack/glance-default-external-api-0" Feb 20 08:43:24 crc kubenswrapper[5094]: I0220 08:43:24.797551 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.349465 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 08:43:25 crc kubenswrapper[5094]: W0220 08:43:25.362158 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0babde66_7106_44f9_8108_dc7123e64645.slice/crio-8d5bad0851a2161ffc5adb1e5293851f312f7324fa1146ff757f90b48ad072a5 WatchSource:0}: Error finding container 8d5bad0851a2161ffc5adb1e5293851f312f7324fa1146ff757f90b48ad072a5: Status 404 returned error can't find the container with id 8d5bad0851a2161ffc5adb1e5293851f312f7324fa1146ff757f90b48ad072a5 Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.375596 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0babde66-7106-44f9-8108-dc7123e64645","Type":"ContainerStarted","Data":"8d5bad0851a2161ffc5adb1e5293851f312f7324fa1146ff757f90b48ad072a5"} Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.375758 5094 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-internal-api-0" podUID="a6187d03-61b1-472a-815f-ca42b191010f" containerName="glance-log" containerID="cri-o://0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce" gracePeriod=30 Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.375810 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a6187d03-61b1-472a-815f-ca42b191010f" containerName="glance-httpd" containerID="cri-o://c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a" gracePeriod=30 Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.851814 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbabc907-d404-4942-a4d8-470ca14f2727" path="/var/lib/kubelet/pods/fbabc907-d404-4942-a4d8-470ca14f2727/volumes" Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.882874 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.968630 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a6187d03-61b1-472a-815f-ca42b191010f-ceph\") pod \"a6187d03-61b1-472a-815f-ca42b191010f\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.968680 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-config-data\") pod \"a6187d03-61b1-472a-815f-ca42b191010f\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.968714 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-scripts\") pod \"a6187d03-61b1-472a-815f-ca42b191010f\" (UID: 
\"a6187d03-61b1-472a-815f-ca42b191010f\") " Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.968782 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6187d03-61b1-472a-815f-ca42b191010f-httpd-run\") pod \"a6187d03-61b1-472a-815f-ca42b191010f\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.968817 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzklh\" (UniqueName: \"kubernetes.io/projected/a6187d03-61b1-472a-815f-ca42b191010f-kube-api-access-zzklh\") pod \"a6187d03-61b1-472a-815f-ca42b191010f\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.968843 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-combined-ca-bundle\") pod \"a6187d03-61b1-472a-815f-ca42b191010f\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.968880 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6187d03-61b1-472a-815f-ca42b191010f-logs\") pod \"a6187d03-61b1-472a-815f-ca42b191010f\" (UID: \"a6187d03-61b1-472a-815f-ca42b191010f\") " Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.969718 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6187d03-61b1-472a-815f-ca42b191010f-logs" (OuterVolumeSpecName: "logs") pod "a6187d03-61b1-472a-815f-ca42b191010f" (UID: "a6187d03-61b1-472a-815f-ca42b191010f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.969944 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6187d03-61b1-472a-815f-ca42b191010f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a6187d03-61b1-472a-815f-ca42b191010f" (UID: "a6187d03-61b1-472a-815f-ca42b191010f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.975634 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6187d03-61b1-472a-815f-ca42b191010f-ceph" (OuterVolumeSpecName: "ceph") pod "a6187d03-61b1-472a-815f-ca42b191010f" (UID: "a6187d03-61b1-472a-815f-ca42b191010f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.975922 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-scripts" (OuterVolumeSpecName: "scripts") pod "a6187d03-61b1-472a-815f-ca42b191010f" (UID: "a6187d03-61b1-472a-815f-ca42b191010f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:43:25 crc kubenswrapper[5094]: I0220 08:43:25.976931 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6187d03-61b1-472a-815f-ca42b191010f-kube-api-access-zzklh" (OuterVolumeSpecName: "kube-api-access-zzklh") pod "a6187d03-61b1-472a-815f-ca42b191010f" (UID: "a6187d03-61b1-472a-815f-ca42b191010f"). InnerVolumeSpecName "kube-api-access-zzklh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.011838 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6187d03-61b1-472a-815f-ca42b191010f" (UID: "a6187d03-61b1-472a-815f-ca42b191010f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.033433 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-config-data" (OuterVolumeSpecName: "config-data") pod "a6187d03-61b1-472a-815f-ca42b191010f" (UID: "a6187d03-61b1-472a-815f-ca42b191010f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.070475 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzklh\" (UniqueName: \"kubernetes.io/projected/a6187d03-61b1-472a-815f-ca42b191010f-kube-api-access-zzklh\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.070523 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.070537 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6187d03-61b1-472a-815f-ca42b191010f-logs\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.070554 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a6187d03-61b1-472a-815f-ca42b191010f-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:26 
crc kubenswrapper[5094]: I0220 08:43:26.070566 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.070588 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6187d03-61b1-472a-815f-ca42b191010f-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.070600 5094 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a6187d03-61b1-472a-815f-ca42b191010f-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.386465 5094 generic.go:334] "Generic (PLEG): container finished" podID="a6187d03-61b1-472a-815f-ca42b191010f" containerID="c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a" exitCode=0 Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.386497 5094 generic.go:334] "Generic (PLEG): container finished" podID="a6187d03-61b1-472a-815f-ca42b191010f" containerID="0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce" exitCode=143 Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.386531 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a6187d03-61b1-472a-815f-ca42b191010f","Type":"ContainerDied","Data":"c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a"} Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.386558 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a6187d03-61b1-472a-815f-ca42b191010f","Type":"ContainerDied","Data":"0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce"} Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.386567 5094 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a6187d03-61b1-472a-815f-ca42b191010f","Type":"ContainerDied","Data":"49f143470e06e51ff04f2ae879e5f3d29275722915277f2f9c70aa490bec1d6b"} Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.386582 5094 scope.go:117] "RemoveContainer" containerID="c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.386695 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.393396 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0babde66-7106-44f9-8108-dc7123e64645","Type":"ContainerStarted","Data":"ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673"} Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.447609 5094 scope.go:117] "RemoveContainer" containerID="0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.450599 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.465881 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.475410 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 08:43:26 crc kubenswrapper[5094]: E0220 08:43:26.475833 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6187d03-61b1-472a-815f-ca42b191010f" containerName="glance-log" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.475850 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6187d03-61b1-472a-815f-ca42b191010f" containerName="glance-log" Feb 20 08:43:26 crc kubenswrapper[5094]: 
E0220 08:43:26.475874 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6187d03-61b1-472a-815f-ca42b191010f" containerName="glance-httpd" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.475882 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6187d03-61b1-472a-815f-ca42b191010f" containerName="glance-httpd" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.476030 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6187d03-61b1-472a-815f-ca42b191010f" containerName="glance-log" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.476050 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6187d03-61b1-472a-815f-ca42b191010f" containerName="glance-httpd" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.476974 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.480283 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.481116 5094 scope.go:117] "RemoveContainer" containerID="c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a" Feb 20 08:43:26 crc kubenswrapper[5094]: E0220 08:43:26.482471 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a\": container with ID starting with c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a not found: ID does not exist" containerID="c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.482633 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a"} err="failed 
to get container status \"c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a\": rpc error: code = NotFound desc = could not find container \"c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a\": container with ID starting with c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a not found: ID does not exist" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.482823 5094 scope.go:117] "RemoveContainer" containerID="0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.485114 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 08:43:26 crc kubenswrapper[5094]: E0220 08:43:26.499382 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce\": container with ID starting with 0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce not found: ID does not exist" containerID="0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.499423 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce"} err="failed to get container status \"0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce\": rpc error: code = NotFound desc = could not find container \"0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce\": container with ID starting with 0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce not found: ID does not exist" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.499448 5094 scope.go:117] "RemoveContainer" containerID="c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.505045 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a"} err="failed to get container status \"c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a\": rpc error: code = NotFound desc = could not find container \"c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a\": container with ID starting with c0d24e589a8fc684f6e5218f814b6c685f1c3cb8ca1162ecb22168768023271a not found: ID does not exist" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.505079 5094 scope.go:117] "RemoveContainer" containerID="0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.505388 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce"} err="failed to get container status \"0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce\": rpc error: code = NotFound desc = could not find container \"0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce\": container with ID starting with 0bd7f852eb5be8f60078b5b9762ed67423f20da424c1bd21376a16d31832d6ce not found: ID does not exist" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.581960 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9ba9e313-83db-4e08-a308-376d5fdf5820-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.582055 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ba9e313-83db-4e08-a308-376d5fdf5820-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.582099 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9ba9e313-83db-4e08-a308-376d5fdf5820-ceph\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.582225 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.582432 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.582493 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shfm9\" (UniqueName: \"kubernetes.io/projected/9ba9e313-83db-4e08-a308-376d5fdf5820-kube-api-access-shfm9\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.582657 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-combined-ca-bundle\") pod 
\"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.684192 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.684524 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shfm9\" (UniqueName: \"kubernetes.io/projected/9ba9e313-83db-4e08-a308-376d5fdf5820-kube-api-access-shfm9\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.684686 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.684844 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9ba9e313-83db-4e08-a308-376d5fdf5820-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.685238 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ba9e313-83db-4e08-a308-376d5fdf5820-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.685350 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9ba9e313-83db-4e08-a308-376d5fdf5820-ceph\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.685476 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ba9e313-83db-4e08-a308-376d5fdf5820-logs\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.685281 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9ba9e313-83db-4e08-a308-376d5fdf5820-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.685688 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.692336 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.693474 
5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.693820 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.694177 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9ba9e313-83db-4e08-a308-376d5fdf5820-ceph\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.702205 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shfm9\" (UniqueName: \"kubernetes.io/projected/9ba9e313-83db-4e08-a308-376d5fdf5820-kube-api-access-shfm9\") pod \"glance-default-internal-api-0\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " pod="openstack/glance-default-internal-api-0" Feb 20 08:43:26 crc kubenswrapper[5094]: I0220 08:43:26.800396 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 08:43:27 crc kubenswrapper[5094]: I0220 08:43:27.359968 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 08:43:27 crc kubenswrapper[5094]: I0220 08:43:27.406675 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0babde66-7106-44f9-8108-dc7123e64645","Type":"ContainerStarted","Data":"d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54"} Feb 20 08:43:27 crc kubenswrapper[5094]: I0220 08:43:27.408199 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9ba9e313-83db-4e08-a308-376d5fdf5820","Type":"ContainerStarted","Data":"b0e75d749acef441fc02419393d76ceab32d56e244c55548218d0246a2690c4a"} Feb 20 08:43:27 crc kubenswrapper[5094]: I0220 08:43:27.443223 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.443204896 podStartE2EDuration="3.443204896s" podCreationTimestamp="2026-02-20 08:43:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:43:27.442006448 +0000 UTC m=+7022.314633179" watchObservedRunningTime="2026-02-20 08:43:27.443204896 +0000 UTC m=+7022.315831607" Feb 20 08:43:27 crc kubenswrapper[5094]: I0220 08:43:27.855998 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6187d03-61b1-472a-815f-ca42b191010f" path="/var/lib/kubelet/pods/a6187d03-61b1-472a-815f-ca42b191010f/volumes" Feb 20 08:43:28 crc kubenswrapper[5094]: I0220 08:43:28.419875 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"9ba9e313-83db-4e08-a308-376d5fdf5820","Type":"ContainerStarted","Data":"7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb"} Feb 20 08:43:28 crc kubenswrapper[5094]: I0220 08:43:28.420224 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9ba9e313-83db-4e08-a308-376d5fdf5820","Type":"ContainerStarted","Data":"280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1"} Feb 20 08:43:28 crc kubenswrapper[5094]: I0220 08:43:28.447602 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.44758329 podStartE2EDuration="2.44758329s" podCreationTimestamp="2026-02-20 08:43:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:43:28.443114903 +0000 UTC m=+7023.315741664" watchObservedRunningTime="2026-02-20 08:43:28.44758329 +0000 UTC m=+7023.320209991" Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.021732 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.087622 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-549654cbff-7zg62"] Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.087996 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-549654cbff-7zg62" podUID="14b1e378-3ae1-4707-8c14-3ee7ad292a55" containerName="dnsmasq-dns" containerID="cri-o://09b2ae2e83651f531f10fff7937847937cdbc6ce024c4c5e0334ed9056460e09" gracePeriod=10 Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.131028 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-549654cbff-7zg62" podUID="14b1e378-3ae1-4707-8c14-3ee7ad292a55" containerName="dnsmasq-dns" 
probeResult="failure" output="dial tcp 10.217.1.38:5353: connect: connection refused" Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.461083 5094 generic.go:334] "Generic (PLEG): container finished" podID="14b1e378-3ae1-4707-8c14-3ee7ad292a55" containerID="09b2ae2e83651f531f10fff7937847937cdbc6ce024c4c5e0334ed9056460e09" exitCode=0 Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.461359 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-549654cbff-7zg62" event={"ID":"14b1e378-3ae1-4707-8c14-3ee7ad292a55","Type":"ContainerDied","Data":"09b2ae2e83651f531f10fff7937847937cdbc6ce024c4c5e0334ed9056460e09"} Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.571307 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-549654cbff-7zg62" Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.677781 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-dns-svc\") pod \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.677903 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-ovsdbserver-sb\") pod \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.677933 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-config\") pod \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.677978 5094 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-x9dbc\" (UniqueName: \"kubernetes.io/projected/14b1e378-3ae1-4707-8c14-3ee7ad292a55-kube-api-access-x9dbc\") pod \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.678012 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-ovsdbserver-nb\") pod \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\" (UID: \"14b1e378-3ae1-4707-8c14-3ee7ad292a55\") " Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.691032 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14b1e378-3ae1-4707-8c14-3ee7ad292a55-kube-api-access-x9dbc" (OuterVolumeSpecName: "kube-api-access-x9dbc") pod "14b1e378-3ae1-4707-8c14-3ee7ad292a55" (UID: "14b1e378-3ae1-4707-8c14-3ee7ad292a55"). InnerVolumeSpecName "kube-api-access-x9dbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.722909 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "14b1e378-3ae1-4707-8c14-3ee7ad292a55" (UID: "14b1e378-3ae1-4707-8c14-3ee7ad292a55"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.722982 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-config" (OuterVolumeSpecName: "config") pod "14b1e378-3ae1-4707-8c14-3ee7ad292a55" (UID: "14b1e378-3ae1-4707-8c14-3ee7ad292a55"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.725346 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "14b1e378-3ae1-4707-8c14-3ee7ad292a55" (UID: "14b1e378-3ae1-4707-8c14-3ee7ad292a55"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.726435 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "14b1e378-3ae1-4707-8c14-3ee7ad292a55" (UID: "14b1e378-3ae1-4707-8c14-3ee7ad292a55"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.779614 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.779650 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.779660 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-config\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:31 crc kubenswrapper[5094]: I0220 08:43:31.779671 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9dbc\" (UniqueName: \"kubernetes.io/projected/14b1e378-3ae1-4707-8c14-3ee7ad292a55-kube-api-access-x9dbc\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:31 crc 
kubenswrapper[5094]: I0220 08:43:31.779680 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/14b1e378-3ae1-4707-8c14-3ee7ad292a55-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:32 crc kubenswrapper[5094]: I0220 08:43:32.486526 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-549654cbff-7zg62" event={"ID":"14b1e378-3ae1-4707-8c14-3ee7ad292a55","Type":"ContainerDied","Data":"4b216a605a8b79d7c7ddf65eb3de811aab7c907beb4645b0cfc8890ec094ca61"} Feb 20 08:43:32 crc kubenswrapper[5094]: I0220 08:43:32.486597 5094 scope.go:117] "RemoveContainer" containerID="09b2ae2e83651f531f10fff7937847937cdbc6ce024c4c5e0334ed9056460e09" Feb 20 08:43:32 crc kubenswrapper[5094]: I0220 08:43:32.486850 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-549654cbff-7zg62" Feb 20 08:43:32 crc kubenswrapper[5094]: I0220 08:43:32.537667 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-549654cbff-7zg62"] Feb 20 08:43:32 crc kubenswrapper[5094]: I0220 08:43:32.541115 5094 scope.go:117] "RemoveContainer" containerID="a8b2fd7d2d6131025e57a52935b313eca72cf89c2a15b9432a7f1f536f05a672" Feb 20 08:43:32 crc kubenswrapper[5094]: I0220 08:43:32.548167 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-549654cbff-7zg62"] Feb 20 08:43:33 crc kubenswrapper[5094]: I0220 08:43:33.858900 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14b1e378-3ae1-4707-8c14-3ee7ad292a55" path="/var/lib/kubelet/pods/14b1e378-3ae1-4707-8c14-3ee7ad292a55/volumes" Feb 20 08:43:34 crc kubenswrapper[5094]: I0220 08:43:34.107394 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Feb 20 08:43:34 crc kubenswrapper[5094]: I0220 08:43:34.107986 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:43:34 crc kubenswrapper[5094]: I0220 08:43:34.108220 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 08:43:34 crc kubenswrapper[5094]: I0220 08:43:34.109778 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 08:43:34 crc kubenswrapper[5094]: I0220 08:43:34.110067 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" gracePeriod=600 Feb 20 08:43:34 crc kubenswrapper[5094]: E0220 08:43:34.243575 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:43:34 crc 
kubenswrapper[5094]: I0220 08:43:34.507987 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" exitCode=0 Feb 20 08:43:34 crc kubenswrapper[5094]: I0220 08:43:34.508077 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec"} Feb 20 08:43:34 crc kubenswrapper[5094]: I0220 08:43:34.508410 5094 scope.go:117] "RemoveContainer" containerID="4eb6a1e784b78bde35c2b1ca39f3fd3b16c3f1975e9f121306145f15615f4afb" Feb 20 08:43:34 crc kubenswrapper[5094]: I0220 08:43:34.509010 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:43:34 crc kubenswrapper[5094]: E0220 08:43:34.509298 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:43:34 crc kubenswrapper[5094]: I0220 08:43:34.798658 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 20 08:43:34 crc kubenswrapper[5094]: I0220 08:43:34.798737 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 20 08:43:34 crc kubenswrapper[5094]: I0220 08:43:34.824206 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 20 08:43:34 crc 
kubenswrapper[5094]: I0220 08:43:34.835252 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 20 08:43:35 crc kubenswrapper[5094]: I0220 08:43:35.525140 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 20 08:43:35 crc kubenswrapper[5094]: I0220 08:43:35.525178 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 20 08:43:36 crc kubenswrapper[5094]: I0220 08:43:36.801218 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 20 08:43:36 crc kubenswrapper[5094]: I0220 08:43:36.801487 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 20 08:43:36 crc kubenswrapper[5094]: I0220 08:43:36.829188 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 20 08:43:36 crc kubenswrapper[5094]: I0220 08:43:36.836781 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 20 08:43:37 crc kubenswrapper[5094]: I0220 08:43:37.093399 5094 scope.go:117] "RemoveContainer" containerID="51c5d24049c628fe72ec29c7d6aad6b1b26637f9d1812b5f27f768d7d83239ed" Feb 20 08:43:37 crc kubenswrapper[5094]: I0220 08:43:37.401794 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 20 08:43:37 crc kubenswrapper[5094]: I0220 08:43:37.418659 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 20 08:43:37 crc kubenswrapper[5094]: I0220 08:43:37.540553 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 20 08:43:37 crc 
kubenswrapper[5094]: I0220 08:43:37.540604 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 20 08:43:39 crc kubenswrapper[5094]: I0220 08:43:39.450219 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 20 08:43:39 crc kubenswrapper[5094]: I0220 08:43:39.457650 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 20 08:43:46 crc kubenswrapper[5094]: I0220 08:43:46.840436 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:43:46 crc kubenswrapper[5094]: E0220 08:43:46.841090 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.088418 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-r5qjd"] Feb 20 08:43:47 crc kubenswrapper[5094]: E0220 08:43:47.088912 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14b1e378-3ae1-4707-8c14-3ee7ad292a55" containerName="dnsmasq-dns" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.088932 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="14b1e378-3ae1-4707-8c14-3ee7ad292a55" containerName="dnsmasq-dns" Feb 20 08:43:47 crc kubenswrapper[5094]: E0220 08:43:47.088967 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14b1e378-3ae1-4707-8c14-3ee7ad292a55" containerName="init" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.088977 
5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="14b1e378-3ae1-4707-8c14-3ee7ad292a55" containerName="init" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.089212 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="14b1e378-3ae1-4707-8c14-3ee7ad292a55" containerName="dnsmasq-dns" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.089936 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-r5qjd" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.097761 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-r5qjd"] Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.198577 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e67bd4c-454a-4166-9e28-49c348795b29-operator-scripts\") pod \"placement-db-create-r5qjd\" (UID: \"0e67bd4c-454a-4166-9e28-49c348795b29\") " pod="openstack/placement-db-create-r5qjd" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.198818 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5pgk\" (UniqueName: \"kubernetes.io/projected/0e67bd4c-454a-4166-9e28-49c348795b29-kube-api-access-p5pgk\") pod \"placement-db-create-r5qjd\" (UID: \"0e67bd4c-454a-4166-9e28-49c348795b29\") " pod="openstack/placement-db-create-r5qjd" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.199898 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-63f8-account-create-update-szqmc"] Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.201851 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-63f8-account-create-update-szqmc" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.203800 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.209152 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-63f8-account-create-update-szqmc"] Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.299850 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e67bd4c-454a-4166-9e28-49c348795b29-operator-scripts\") pod \"placement-db-create-r5qjd\" (UID: \"0e67bd4c-454a-4166-9e28-49c348795b29\") " pod="openstack/placement-db-create-r5qjd" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.299931 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5bcef59-b989-4157-8233-6482f9f3abab-operator-scripts\") pod \"placement-63f8-account-create-update-szqmc\" (UID: \"d5bcef59-b989-4157-8233-6482f9f3abab\") " pod="openstack/placement-63f8-account-create-update-szqmc" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.299958 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5pgk\" (UniqueName: \"kubernetes.io/projected/0e67bd4c-454a-4166-9e28-49c348795b29-kube-api-access-p5pgk\") pod \"placement-db-create-r5qjd\" (UID: \"0e67bd4c-454a-4166-9e28-49c348795b29\") " pod="openstack/placement-db-create-r5qjd" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.300064 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvlf8\" (UniqueName: \"kubernetes.io/projected/d5bcef59-b989-4157-8233-6482f9f3abab-kube-api-access-rvlf8\") pod \"placement-63f8-account-create-update-szqmc\" (UID: 
\"d5bcef59-b989-4157-8233-6482f9f3abab\") " pod="openstack/placement-63f8-account-create-update-szqmc" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.300569 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e67bd4c-454a-4166-9e28-49c348795b29-operator-scripts\") pod \"placement-db-create-r5qjd\" (UID: \"0e67bd4c-454a-4166-9e28-49c348795b29\") " pod="openstack/placement-db-create-r5qjd" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.319366 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5pgk\" (UniqueName: \"kubernetes.io/projected/0e67bd4c-454a-4166-9e28-49c348795b29-kube-api-access-p5pgk\") pod \"placement-db-create-r5qjd\" (UID: \"0e67bd4c-454a-4166-9e28-49c348795b29\") " pod="openstack/placement-db-create-r5qjd" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.401157 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvlf8\" (UniqueName: \"kubernetes.io/projected/d5bcef59-b989-4157-8233-6482f9f3abab-kube-api-access-rvlf8\") pod \"placement-63f8-account-create-update-szqmc\" (UID: \"d5bcef59-b989-4157-8233-6482f9f3abab\") " pod="openstack/placement-63f8-account-create-update-szqmc" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.401553 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5bcef59-b989-4157-8233-6482f9f3abab-operator-scripts\") pod \"placement-63f8-account-create-update-szqmc\" (UID: \"d5bcef59-b989-4157-8233-6482f9f3abab\") " pod="openstack/placement-63f8-account-create-update-szqmc" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.402382 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5bcef59-b989-4157-8233-6482f9f3abab-operator-scripts\") pod 
\"placement-63f8-account-create-update-szqmc\" (UID: \"d5bcef59-b989-4157-8233-6482f9f3abab\") " pod="openstack/placement-63f8-account-create-update-szqmc" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.411751 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-r5qjd" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.417422 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvlf8\" (UniqueName: \"kubernetes.io/projected/d5bcef59-b989-4157-8233-6482f9f3abab-kube-api-access-rvlf8\") pod \"placement-63f8-account-create-update-szqmc\" (UID: \"d5bcef59-b989-4157-8233-6482f9f3abab\") " pod="openstack/placement-63f8-account-create-update-szqmc" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.545840 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-63f8-account-create-update-szqmc" Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.875797 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-63f8-account-create-update-szqmc"] Feb 20 08:43:47 crc kubenswrapper[5094]: I0220 08:43:47.883805 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-r5qjd"] Feb 20 08:43:48 crc kubenswrapper[5094]: I0220 08:43:48.629420 5094 generic.go:334] "Generic (PLEG): container finished" podID="0e67bd4c-454a-4166-9e28-49c348795b29" containerID="9b130573754c208a821c2a5aa00744abfcde1ec2f224d985ae00e81ebcaa218e" exitCode=0 Feb 20 08:43:48 crc kubenswrapper[5094]: I0220 08:43:48.629478 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-r5qjd" event={"ID":"0e67bd4c-454a-4166-9e28-49c348795b29","Type":"ContainerDied","Data":"9b130573754c208a821c2a5aa00744abfcde1ec2f224d985ae00e81ebcaa218e"} Feb 20 08:43:48 crc kubenswrapper[5094]: I0220 08:43:48.629864 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-db-create-r5qjd" event={"ID":"0e67bd4c-454a-4166-9e28-49c348795b29","Type":"ContainerStarted","Data":"b7b0ed74d0876d26c24341bb21762e9ffb513b9e5d55e63aee8133e9c96863bc"} Feb 20 08:43:48 crc kubenswrapper[5094]: I0220 08:43:48.631997 5094 generic.go:334] "Generic (PLEG): container finished" podID="d5bcef59-b989-4157-8233-6482f9f3abab" containerID="8dfc18891e7f2cecc2e704cc07266d7a47a98f1dcf9f167194c7d37d347b850e" exitCode=0 Feb 20 08:43:48 crc kubenswrapper[5094]: I0220 08:43:48.632049 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-63f8-account-create-update-szqmc" event={"ID":"d5bcef59-b989-4157-8233-6482f9f3abab","Type":"ContainerDied","Data":"8dfc18891e7f2cecc2e704cc07266d7a47a98f1dcf9f167194c7d37d347b850e"} Feb 20 08:43:48 crc kubenswrapper[5094]: I0220 08:43:48.632080 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-63f8-account-create-update-szqmc" event={"ID":"d5bcef59-b989-4157-8233-6482f9f3abab","Type":"ContainerStarted","Data":"e4b2fb567d00953b04fddae3ab249dedcb5e7873615fe6ecf6147543376c3a09"} Feb 20 08:43:49 crc kubenswrapper[5094]: I0220 08:43:49.944104 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-63f8-account-create-update-szqmc" Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.034408 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-r5qjd" Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.047564 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5bcef59-b989-4157-8233-6482f9f3abab-operator-scripts\") pod \"d5bcef59-b989-4157-8233-6482f9f3abab\" (UID: \"d5bcef59-b989-4157-8233-6482f9f3abab\") " Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.047656 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvlf8\" (UniqueName: \"kubernetes.io/projected/d5bcef59-b989-4157-8233-6482f9f3abab-kube-api-access-rvlf8\") pod \"d5bcef59-b989-4157-8233-6482f9f3abab\" (UID: \"d5bcef59-b989-4157-8233-6482f9f3abab\") " Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.048307 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5bcef59-b989-4157-8233-6482f9f3abab-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d5bcef59-b989-4157-8233-6482f9f3abab" (UID: "d5bcef59-b989-4157-8233-6482f9f3abab"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.052758 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5bcef59-b989-4157-8233-6482f9f3abab-kube-api-access-rvlf8" (OuterVolumeSpecName: "kube-api-access-rvlf8") pod "d5bcef59-b989-4157-8233-6482f9f3abab" (UID: "d5bcef59-b989-4157-8233-6482f9f3abab"). InnerVolumeSpecName "kube-api-access-rvlf8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.149066 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5pgk\" (UniqueName: \"kubernetes.io/projected/0e67bd4c-454a-4166-9e28-49c348795b29-kube-api-access-p5pgk\") pod \"0e67bd4c-454a-4166-9e28-49c348795b29\" (UID: \"0e67bd4c-454a-4166-9e28-49c348795b29\") " Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.149184 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e67bd4c-454a-4166-9e28-49c348795b29-operator-scripts\") pod \"0e67bd4c-454a-4166-9e28-49c348795b29\" (UID: \"0e67bd4c-454a-4166-9e28-49c348795b29\") " Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.149636 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5bcef59-b989-4157-8233-6482f9f3abab-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.149656 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvlf8\" (UniqueName: \"kubernetes.io/projected/d5bcef59-b989-4157-8233-6482f9f3abab-kube-api-access-rvlf8\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.149827 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e67bd4c-454a-4166-9e28-49c348795b29-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0e67bd4c-454a-4166-9e28-49c348795b29" (UID: "0e67bd4c-454a-4166-9e28-49c348795b29"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.153301 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e67bd4c-454a-4166-9e28-49c348795b29-kube-api-access-p5pgk" (OuterVolumeSpecName: "kube-api-access-p5pgk") pod "0e67bd4c-454a-4166-9e28-49c348795b29" (UID: "0e67bd4c-454a-4166-9e28-49c348795b29"). InnerVolumeSpecName "kube-api-access-p5pgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.251421 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5pgk\" (UniqueName: \"kubernetes.io/projected/0e67bd4c-454a-4166-9e28-49c348795b29-kube-api-access-p5pgk\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.251505 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e67bd4c-454a-4166-9e28-49c348795b29-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.653335 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-r5qjd" Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.653336 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-r5qjd" event={"ID":"0e67bd4c-454a-4166-9e28-49c348795b29","Type":"ContainerDied","Data":"b7b0ed74d0876d26c24341bb21762e9ffb513b9e5d55e63aee8133e9c96863bc"} Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.653748 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7b0ed74d0876d26c24341bb21762e9ffb513b9e5d55e63aee8133e9c96863bc" Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.655850 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-63f8-account-create-update-szqmc" event={"ID":"d5bcef59-b989-4157-8233-6482f9f3abab","Type":"ContainerDied","Data":"e4b2fb567d00953b04fddae3ab249dedcb5e7873615fe6ecf6147543376c3a09"} Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.655930 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4b2fb567d00953b04fddae3ab249dedcb5e7873615fe6ecf6147543376c3a09" Feb 20 08:43:50 crc kubenswrapper[5094]: I0220 08:43:50.655976 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-63f8-account-create-update-szqmc" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.497738 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-567db69c47-cctzv"] Feb 20 08:43:52 crc kubenswrapper[5094]: E0220 08:43:52.498517 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5bcef59-b989-4157-8233-6482f9f3abab" containerName="mariadb-account-create-update" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.498531 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5bcef59-b989-4157-8233-6482f9f3abab" containerName="mariadb-account-create-update" Feb 20 08:43:52 crc kubenswrapper[5094]: E0220 08:43:52.498546 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e67bd4c-454a-4166-9e28-49c348795b29" containerName="mariadb-database-create" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.498552 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e67bd4c-454a-4166-9e28-49c348795b29" containerName="mariadb-database-create" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.498791 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5bcef59-b989-4157-8233-6482f9f3abab" containerName="mariadb-account-create-update" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.498816 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e67bd4c-454a-4166-9e28-49c348795b29" containerName="mariadb-database-create" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.500840 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-567db69c47-cctzv" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.510470 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-567db69c47-cctzv"] Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.534038 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-smd54"] Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.535249 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-smd54" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.538146 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.538398 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-p4pxm" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.538552 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.571893 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-smd54"] Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.595782 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-config-data\") pod \"placement-db-sync-smd54\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " pod="openstack/placement-db-sync-smd54" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.595873 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-config\") pod \"dnsmasq-dns-567db69c47-cctzv\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " 
pod="openstack/dnsmasq-dns-567db69c47-cctzv" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.595914 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-scripts\") pod \"placement-db-sync-smd54\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " pod="openstack/placement-db-sync-smd54" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.595953 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvwps\" (UniqueName: \"kubernetes.io/projected/b63f3e88-3e2a-43db-88de-8cf778187671-kube-api-access-mvwps\") pod \"placement-db-sync-smd54\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " pod="openstack/placement-db-sync-smd54" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.595985 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-ovsdbserver-nb\") pod \"dnsmasq-dns-567db69c47-cctzv\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " pod="openstack/dnsmasq-dns-567db69c47-cctzv" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.596058 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b63f3e88-3e2a-43db-88de-8cf778187671-logs\") pod \"placement-db-sync-smd54\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " pod="openstack/placement-db-sync-smd54" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.596127 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-dns-svc\") pod \"dnsmasq-dns-567db69c47-cctzv\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " 
pod="openstack/dnsmasq-dns-567db69c47-cctzv" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.596160 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvlvt\" (UniqueName: \"kubernetes.io/projected/c8a1891c-fffd-4032-9384-bef764ca9f57-kube-api-access-dvlvt\") pod \"dnsmasq-dns-567db69c47-cctzv\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " pod="openstack/dnsmasq-dns-567db69c47-cctzv" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.596237 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-combined-ca-bundle\") pod \"placement-db-sync-smd54\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " pod="openstack/placement-db-sync-smd54" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.596289 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-ovsdbserver-sb\") pod \"dnsmasq-dns-567db69c47-cctzv\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " pod="openstack/dnsmasq-dns-567db69c47-cctzv" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.698065 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-combined-ca-bundle\") pod \"placement-db-sync-smd54\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " pod="openstack/placement-db-sync-smd54" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.698136 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-ovsdbserver-sb\") pod \"dnsmasq-dns-567db69c47-cctzv\" (UID: 
\"c8a1891c-fffd-4032-9384-bef764ca9f57\") " pod="openstack/dnsmasq-dns-567db69c47-cctzv" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.698169 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-config-data\") pod \"placement-db-sync-smd54\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " pod="openstack/placement-db-sync-smd54" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.698188 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-config\") pod \"dnsmasq-dns-567db69c47-cctzv\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " pod="openstack/dnsmasq-dns-567db69c47-cctzv" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.698211 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-scripts\") pod \"placement-db-sync-smd54\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " pod="openstack/placement-db-sync-smd54" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.698243 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvwps\" (UniqueName: \"kubernetes.io/projected/b63f3e88-3e2a-43db-88de-8cf778187671-kube-api-access-mvwps\") pod \"placement-db-sync-smd54\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " pod="openstack/placement-db-sync-smd54" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.698261 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-ovsdbserver-nb\") pod \"dnsmasq-dns-567db69c47-cctzv\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " pod="openstack/dnsmasq-dns-567db69c47-cctzv" Feb 20 08:43:52 crc 
kubenswrapper[5094]: I0220 08:43:52.698306 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b63f3e88-3e2a-43db-88de-8cf778187671-logs\") pod \"placement-db-sync-smd54\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " pod="openstack/placement-db-sync-smd54" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.698346 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-dns-svc\") pod \"dnsmasq-dns-567db69c47-cctzv\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " pod="openstack/dnsmasq-dns-567db69c47-cctzv" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.698368 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvlvt\" (UniqueName: \"kubernetes.io/projected/c8a1891c-fffd-4032-9384-bef764ca9f57-kube-api-access-dvlvt\") pod \"dnsmasq-dns-567db69c47-cctzv\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " pod="openstack/dnsmasq-dns-567db69c47-cctzv" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.703364 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b63f3e88-3e2a-43db-88de-8cf778187671-logs\") pod \"placement-db-sync-smd54\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " pod="openstack/placement-db-sync-smd54" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.703768 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-config\") pod \"dnsmasq-dns-567db69c47-cctzv\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " pod="openstack/dnsmasq-dns-567db69c47-cctzv" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.703835 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-dns-svc\") pod \"dnsmasq-dns-567db69c47-cctzv\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " pod="openstack/dnsmasq-dns-567db69c47-cctzv" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.703913 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-ovsdbserver-sb\") pod \"dnsmasq-dns-567db69c47-cctzv\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " pod="openstack/dnsmasq-dns-567db69c47-cctzv" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.703930 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-ovsdbserver-nb\") pod \"dnsmasq-dns-567db69c47-cctzv\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " pod="openstack/dnsmasq-dns-567db69c47-cctzv" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.719006 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-combined-ca-bundle\") pod \"placement-db-sync-smd54\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " pod="openstack/placement-db-sync-smd54" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.720675 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-scripts\") pod \"placement-db-sync-smd54\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " pod="openstack/placement-db-sync-smd54" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.724556 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-config-data\") pod \"placement-db-sync-smd54\" (UID: 
\"b63f3e88-3e2a-43db-88de-8cf778187671\") " pod="openstack/placement-db-sync-smd54" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.742173 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvwps\" (UniqueName: \"kubernetes.io/projected/b63f3e88-3e2a-43db-88de-8cf778187671-kube-api-access-mvwps\") pod \"placement-db-sync-smd54\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " pod="openstack/placement-db-sync-smd54" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.744257 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvlvt\" (UniqueName: \"kubernetes.io/projected/c8a1891c-fffd-4032-9384-bef764ca9f57-kube-api-access-dvlvt\") pod \"dnsmasq-dns-567db69c47-cctzv\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " pod="openstack/dnsmasq-dns-567db69c47-cctzv" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.821323 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-567db69c47-cctzv" Feb 20 08:43:52 crc kubenswrapper[5094]: I0220 08:43:52.862128 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-smd54" Feb 20 08:43:53 crc kubenswrapper[5094]: I0220 08:43:53.306930 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-567db69c47-cctzv"] Feb 20 08:43:53 crc kubenswrapper[5094]: I0220 08:43:53.357030 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-smd54"] Feb 20 08:43:53 crc kubenswrapper[5094]: I0220 08:43:53.686576 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-smd54" event={"ID":"b63f3e88-3e2a-43db-88de-8cf778187671","Type":"ContainerStarted","Data":"fcdd175ed55b2d24a7ffffadd8ffe886b1a529efda710241b94c420889488a0d"} Feb 20 08:43:53 crc kubenswrapper[5094]: I0220 08:43:53.689197 5094 generic.go:334] "Generic (PLEG): container finished" podID="c8a1891c-fffd-4032-9384-bef764ca9f57" containerID="e5df72384fdffb4deabf4c6cbb6c43678269bc4fec968e5b22a2e15228a210f5" exitCode=0 Feb 20 08:43:53 crc kubenswrapper[5094]: I0220 08:43:53.689255 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-567db69c47-cctzv" event={"ID":"c8a1891c-fffd-4032-9384-bef764ca9f57","Type":"ContainerDied","Data":"e5df72384fdffb4deabf4c6cbb6c43678269bc4fec968e5b22a2e15228a210f5"} Feb 20 08:43:53 crc kubenswrapper[5094]: I0220 08:43:53.689279 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-567db69c47-cctzv" event={"ID":"c8a1891c-fffd-4032-9384-bef764ca9f57","Type":"ContainerStarted","Data":"ac86d266f2e6539516ad0f20e1cd64fe7ffd8192e43ce128347406606139c997"} Feb 20 08:43:54 crc kubenswrapper[5094]: I0220 08:43:54.712400 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-567db69c47-cctzv" event={"ID":"c8a1891c-fffd-4032-9384-bef764ca9f57","Type":"ContainerStarted","Data":"20b70321388fb9ccdce0f5d0ab13ac7703a9ad371987c42a5b72759e55aa7f72"} Feb 20 08:43:54 crc kubenswrapper[5094]: I0220 08:43:54.712907 5094 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/dnsmasq-dns-567db69c47-cctzv" Feb 20 08:43:54 crc kubenswrapper[5094]: I0220 08:43:54.735406 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-567db69c47-cctzv" podStartSLOduration=2.7353900810000003 podStartE2EDuration="2.735390081s" podCreationTimestamp="2026-02-20 08:43:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:43:54.729779506 +0000 UTC m=+7049.602406247" watchObservedRunningTime="2026-02-20 08:43:54.735390081 +0000 UTC m=+7049.608016792" Feb 20 08:43:57 crc kubenswrapper[5094]: I0220 08:43:57.738896 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-smd54" event={"ID":"b63f3e88-3e2a-43db-88de-8cf778187671","Type":"ContainerStarted","Data":"3996b425ffaed37f9d76c6d868da173f9ea735a39c30061d9efa5a940e6f7333"} Feb 20 08:43:57 crc kubenswrapper[5094]: I0220 08:43:57.769856 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-smd54" podStartSLOduration=2.61271562 podStartE2EDuration="5.769833854s" podCreationTimestamp="2026-02-20 08:43:52 +0000 UTC" firstStartedPulling="2026-02-20 08:43:53.362141372 +0000 UTC m=+7048.234768073" lastFinishedPulling="2026-02-20 08:43:56.519259596 +0000 UTC m=+7051.391886307" observedRunningTime="2026-02-20 08:43:57.760397077 +0000 UTC m=+7052.633023788" watchObservedRunningTime="2026-02-20 08:43:57.769833854 +0000 UTC m=+7052.642460565" Feb 20 08:43:58 crc kubenswrapper[5094]: I0220 08:43:58.751617 5094 generic.go:334] "Generic (PLEG): container finished" podID="b63f3e88-3e2a-43db-88de-8cf778187671" containerID="3996b425ffaed37f9d76c6d868da173f9ea735a39c30061d9efa5a940e6f7333" exitCode=0 Feb 20 08:43:58 crc kubenswrapper[5094]: I0220 08:43:58.751726 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-smd54" 
event={"ID":"b63f3e88-3e2a-43db-88de-8cf778187671","Type":"ContainerDied","Data":"3996b425ffaed37f9d76c6d868da173f9ea735a39c30061d9efa5a940e6f7333"} Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.190586 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-smd54" Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.246208 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-combined-ca-bundle\") pod \"b63f3e88-3e2a-43db-88de-8cf778187671\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.246302 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvwps\" (UniqueName: \"kubernetes.io/projected/b63f3e88-3e2a-43db-88de-8cf778187671-kube-api-access-mvwps\") pod \"b63f3e88-3e2a-43db-88de-8cf778187671\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.246428 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-config-data\") pod \"b63f3e88-3e2a-43db-88de-8cf778187671\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.246509 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-scripts\") pod \"b63f3e88-3e2a-43db-88de-8cf778187671\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.246609 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b63f3e88-3e2a-43db-88de-8cf778187671-logs\") pod 
\"b63f3e88-3e2a-43db-88de-8cf778187671\" (UID: \"b63f3e88-3e2a-43db-88de-8cf778187671\") " Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.247278 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b63f3e88-3e2a-43db-88de-8cf778187671-logs" (OuterVolumeSpecName: "logs") pod "b63f3e88-3e2a-43db-88de-8cf778187671" (UID: "b63f3e88-3e2a-43db-88de-8cf778187671"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.251885 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-scripts" (OuterVolumeSpecName: "scripts") pod "b63f3e88-3e2a-43db-88de-8cf778187671" (UID: "b63f3e88-3e2a-43db-88de-8cf778187671"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.251975 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b63f3e88-3e2a-43db-88de-8cf778187671-kube-api-access-mvwps" (OuterVolumeSpecName: "kube-api-access-mvwps") pod "b63f3e88-3e2a-43db-88de-8cf778187671" (UID: "b63f3e88-3e2a-43db-88de-8cf778187671"). InnerVolumeSpecName "kube-api-access-mvwps". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.272597 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b63f3e88-3e2a-43db-88de-8cf778187671" (UID: "b63f3e88-3e2a-43db-88de-8cf778187671"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.275022 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-config-data" (OuterVolumeSpecName: "config-data") pod "b63f3e88-3e2a-43db-88de-8cf778187671" (UID: "b63f3e88-3e2a-43db-88de-8cf778187671"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.348341 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b63f3e88-3e2a-43db-88de-8cf778187671-logs\") on node \"crc\" DevicePath \"\"" Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.348375 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.348385 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvwps\" (UniqueName: \"kubernetes.io/projected/b63f3e88-3e2a-43db-88de-8cf778187671-kube-api-access-mvwps\") on node \"crc\" DevicePath \"\"" Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.348396 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.348404 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b63f3e88-3e2a-43db-88de-8cf778187671-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.776660 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-smd54" 
event={"ID":"b63f3e88-3e2a-43db-88de-8cf778187671","Type":"ContainerDied","Data":"fcdd175ed55b2d24a7ffffadd8ffe886b1a529efda710241b94c420889488a0d"} Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.776697 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcdd175ed55b2d24a7ffffadd8ffe886b1a529efda710241b94c420889488a0d" Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.776696 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-smd54" Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.850737 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-64d8d4f69d-shjqs"] Feb 20 08:44:00 crc kubenswrapper[5094]: E0220 08:44:00.851461 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b63f3e88-3e2a-43db-88de-8cf778187671" containerName="placement-db-sync" Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.851482 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b63f3e88-3e2a-43db-88de-8cf778187671" containerName="placement-db-sync" Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.851677 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b63f3e88-3e2a-43db-88de-8cf778187671" containerName="placement-db-sync" Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.852745 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-64d8d4f69d-shjqs" Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.854952 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-p4pxm" Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.855387 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.856374 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.898650 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-64d8d4f69d-shjqs"] Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.958359 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc482d5b-0b27-4293-b02b-7b02007cf790-config-data\") pod \"placement-64d8d4f69d-shjqs\" (UID: \"cc482d5b-0b27-4293-b02b-7b02007cf790\") " pod="openstack/placement-64d8d4f69d-shjqs" Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.958640 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc482d5b-0b27-4293-b02b-7b02007cf790-combined-ca-bundle\") pod \"placement-64d8d4f69d-shjqs\" (UID: \"cc482d5b-0b27-4293-b02b-7b02007cf790\") " pod="openstack/placement-64d8d4f69d-shjqs" Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.958698 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc482d5b-0b27-4293-b02b-7b02007cf790-scripts\") pod \"placement-64d8d4f69d-shjqs\" (UID: \"cc482d5b-0b27-4293-b02b-7b02007cf790\") " pod="openstack/placement-64d8d4f69d-shjqs" Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.958819 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfwwk\" (UniqueName: \"kubernetes.io/projected/cc482d5b-0b27-4293-b02b-7b02007cf790-kube-api-access-hfwwk\") pod \"placement-64d8d4f69d-shjqs\" (UID: \"cc482d5b-0b27-4293-b02b-7b02007cf790\") " pod="openstack/placement-64d8d4f69d-shjqs" Feb 20 08:44:00 crc kubenswrapper[5094]: I0220 08:44:00.958932 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc482d5b-0b27-4293-b02b-7b02007cf790-logs\") pod \"placement-64d8d4f69d-shjqs\" (UID: \"cc482d5b-0b27-4293-b02b-7b02007cf790\") " pod="openstack/placement-64d8d4f69d-shjqs" Feb 20 08:44:01 crc kubenswrapper[5094]: I0220 08:44:01.059948 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc482d5b-0b27-4293-b02b-7b02007cf790-logs\") pod \"placement-64d8d4f69d-shjqs\" (UID: \"cc482d5b-0b27-4293-b02b-7b02007cf790\") " pod="openstack/placement-64d8d4f69d-shjqs" Feb 20 08:44:01 crc kubenswrapper[5094]: I0220 08:44:01.060022 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc482d5b-0b27-4293-b02b-7b02007cf790-config-data\") pod \"placement-64d8d4f69d-shjqs\" (UID: \"cc482d5b-0b27-4293-b02b-7b02007cf790\") " pod="openstack/placement-64d8d4f69d-shjqs" Feb 20 08:44:01 crc kubenswrapper[5094]: I0220 08:44:01.060084 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc482d5b-0b27-4293-b02b-7b02007cf790-combined-ca-bundle\") pod \"placement-64d8d4f69d-shjqs\" (UID: \"cc482d5b-0b27-4293-b02b-7b02007cf790\") " pod="openstack/placement-64d8d4f69d-shjqs" Feb 20 08:44:01 crc kubenswrapper[5094]: I0220 08:44:01.060104 5094 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc482d5b-0b27-4293-b02b-7b02007cf790-scripts\") pod \"placement-64d8d4f69d-shjqs\" (UID: \"cc482d5b-0b27-4293-b02b-7b02007cf790\") " pod="openstack/placement-64d8d4f69d-shjqs" Feb 20 08:44:01 crc kubenswrapper[5094]: I0220 08:44:01.060138 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfwwk\" (UniqueName: \"kubernetes.io/projected/cc482d5b-0b27-4293-b02b-7b02007cf790-kube-api-access-hfwwk\") pod \"placement-64d8d4f69d-shjqs\" (UID: \"cc482d5b-0b27-4293-b02b-7b02007cf790\") " pod="openstack/placement-64d8d4f69d-shjqs" Feb 20 08:44:01 crc kubenswrapper[5094]: I0220 08:44:01.060417 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc482d5b-0b27-4293-b02b-7b02007cf790-logs\") pod \"placement-64d8d4f69d-shjqs\" (UID: \"cc482d5b-0b27-4293-b02b-7b02007cf790\") " pod="openstack/placement-64d8d4f69d-shjqs" Feb 20 08:44:01 crc kubenswrapper[5094]: I0220 08:44:01.063620 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc482d5b-0b27-4293-b02b-7b02007cf790-scripts\") pod \"placement-64d8d4f69d-shjqs\" (UID: \"cc482d5b-0b27-4293-b02b-7b02007cf790\") " pod="openstack/placement-64d8d4f69d-shjqs" Feb 20 08:44:01 crc kubenswrapper[5094]: I0220 08:44:01.064208 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc482d5b-0b27-4293-b02b-7b02007cf790-config-data\") pod \"placement-64d8d4f69d-shjqs\" (UID: \"cc482d5b-0b27-4293-b02b-7b02007cf790\") " pod="openstack/placement-64d8d4f69d-shjqs" Feb 20 08:44:01 crc kubenswrapper[5094]: I0220 08:44:01.074756 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc482d5b-0b27-4293-b02b-7b02007cf790-combined-ca-bundle\") pod 
\"placement-64d8d4f69d-shjqs\" (UID: \"cc482d5b-0b27-4293-b02b-7b02007cf790\") " pod="openstack/placement-64d8d4f69d-shjqs" Feb 20 08:44:01 crc kubenswrapper[5094]: I0220 08:44:01.078188 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfwwk\" (UniqueName: \"kubernetes.io/projected/cc482d5b-0b27-4293-b02b-7b02007cf790-kube-api-access-hfwwk\") pod \"placement-64d8d4f69d-shjqs\" (UID: \"cc482d5b-0b27-4293-b02b-7b02007cf790\") " pod="openstack/placement-64d8d4f69d-shjqs" Feb 20 08:44:01 crc kubenswrapper[5094]: I0220 08:44:01.195309 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-64d8d4f69d-shjqs" Feb 20 08:44:01 crc kubenswrapper[5094]: I0220 08:44:01.637814 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-64d8d4f69d-shjqs"] Feb 20 08:44:01 crc kubenswrapper[5094]: W0220 08:44:01.644984 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc482d5b_0b27_4293_b02b_7b02007cf790.slice/crio-bb1cd92eceed647f6ac1a213eabeececf52d940c826cde33a913bf1a19950dba WatchSource:0}: Error finding container bb1cd92eceed647f6ac1a213eabeececf52d940c826cde33a913bf1a19950dba: Status 404 returned error can't find the container with id bb1cd92eceed647f6ac1a213eabeececf52d940c826cde33a913bf1a19950dba Feb 20 08:44:01 crc kubenswrapper[5094]: I0220 08:44:01.785354 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-64d8d4f69d-shjqs" event={"ID":"cc482d5b-0b27-4293-b02b-7b02007cf790","Type":"ContainerStarted","Data":"bb1cd92eceed647f6ac1a213eabeececf52d940c826cde33a913bf1a19950dba"} Feb 20 08:44:01 crc kubenswrapper[5094]: I0220 08:44:01.839842 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:44:01 crc kubenswrapper[5094]: E0220 08:44:01.840150 5094 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:44:02 crc kubenswrapper[5094]: I0220 08:44:02.811389 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-64d8d4f69d-shjqs" event={"ID":"cc482d5b-0b27-4293-b02b-7b02007cf790","Type":"ContainerStarted","Data":"87e44785505a7c4d57850817ba0c257b31f219096005591405831fb50885168d"} Feb 20 08:44:02 crc kubenswrapper[5094]: I0220 08:44:02.812890 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-64d8d4f69d-shjqs" Feb 20 08:44:02 crc kubenswrapper[5094]: I0220 08:44:02.812994 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-64d8d4f69d-shjqs" Feb 20 08:44:02 crc kubenswrapper[5094]: I0220 08:44:02.813074 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-64d8d4f69d-shjqs" event={"ID":"cc482d5b-0b27-4293-b02b-7b02007cf790","Type":"ContainerStarted","Data":"ec1b8e7fa8db3b20e94f7aa0892926f5a97eee09b9575ed017cd8f1b46da622a"} Feb 20 08:44:02 crc kubenswrapper[5094]: I0220 08:44:02.822900 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-567db69c47-cctzv" Feb 20 08:44:02 crc kubenswrapper[5094]: I0220 08:44:02.842476 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-64d8d4f69d-shjqs" podStartSLOduration=2.8424556 podStartE2EDuration="2.8424556s" podCreationTimestamp="2026-02-20 08:44:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:44:02.832112182 
+0000 UTC m=+7057.704738923" watchObservedRunningTime="2026-02-20 08:44:02.8424556 +0000 UTC m=+7057.715082311" Feb 20 08:44:02 crc kubenswrapper[5094]: I0220 08:44:02.916028 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b495df7c5-7nbzn"] Feb 20 08:44:02 crc kubenswrapper[5094]: I0220 08:44:02.916563 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" podUID="84d7a949-86f8-4325-8494-1e37848e76ec" containerName="dnsmasq-dns" containerID="cri-o://913c46cb53746a67114f39304fb82db0087e728a7ec5b6b477343f5f416a8b8f" gracePeriod=10 Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.394806 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.506932 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-ovsdbserver-sb\") pod \"84d7a949-86f8-4325-8494-1e37848e76ec\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.506995 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4kpp\" (UniqueName: \"kubernetes.io/projected/84d7a949-86f8-4325-8494-1e37848e76ec-kube-api-access-b4kpp\") pod \"84d7a949-86f8-4325-8494-1e37848e76ec\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.507030 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-dns-svc\") pod \"84d7a949-86f8-4325-8494-1e37848e76ec\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.507056 5094 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-config\") pod \"84d7a949-86f8-4325-8494-1e37848e76ec\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.507085 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-ovsdbserver-nb\") pod \"84d7a949-86f8-4325-8494-1e37848e76ec\" (UID: \"84d7a949-86f8-4325-8494-1e37848e76ec\") " Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.523777 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84d7a949-86f8-4325-8494-1e37848e76ec-kube-api-access-b4kpp" (OuterVolumeSpecName: "kube-api-access-b4kpp") pod "84d7a949-86f8-4325-8494-1e37848e76ec" (UID: "84d7a949-86f8-4325-8494-1e37848e76ec"). InnerVolumeSpecName "kube-api-access-b4kpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.549289 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "84d7a949-86f8-4325-8494-1e37848e76ec" (UID: "84d7a949-86f8-4325-8494-1e37848e76ec"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.549352 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "84d7a949-86f8-4325-8494-1e37848e76ec" (UID: "84d7a949-86f8-4325-8494-1e37848e76ec"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.561650 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-config" (OuterVolumeSpecName: "config") pod "84d7a949-86f8-4325-8494-1e37848e76ec" (UID: "84d7a949-86f8-4325-8494-1e37848e76ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.562216 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "84d7a949-86f8-4325-8494-1e37848e76ec" (UID: "84d7a949-86f8-4325-8494-1e37848e76ec"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.608423 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4kpp\" (UniqueName: \"kubernetes.io/projected/84d7a949-86f8-4325-8494-1e37848e76ec-kube-api-access-b4kpp\") on node \"crc\" DevicePath \"\"" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.608448 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.608457 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-config\") on node \"crc\" DevicePath \"\"" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.608466 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 
08:44:03.608473 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84d7a949-86f8-4325-8494-1e37848e76ec-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.821957 5094 generic.go:334] "Generic (PLEG): container finished" podID="84d7a949-86f8-4325-8494-1e37848e76ec" containerID="913c46cb53746a67114f39304fb82db0087e728a7ec5b6b477343f5f416a8b8f" exitCode=0 Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.822822 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.825870 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" event={"ID":"84d7a949-86f8-4325-8494-1e37848e76ec","Type":"ContainerDied","Data":"913c46cb53746a67114f39304fb82db0087e728a7ec5b6b477343f5f416a8b8f"} Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.825912 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b495df7c5-7nbzn" event={"ID":"84d7a949-86f8-4325-8494-1e37848e76ec","Type":"ContainerDied","Data":"9b5aa36d351d8e72474f8ef575d86e027a54c562714317f08d7c4cf38067fb2a"} Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.825941 5094 scope.go:117] "RemoveContainer" containerID="913c46cb53746a67114f39304fb82db0087e728a7ec5b6b477343f5f416a8b8f" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.857949 5094 scope.go:117] "RemoveContainer" containerID="208d3211ae79c8d4cda59830631f10fba8107fe5fff0ad3a2d8a3afdcb25b100" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.864789 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b495df7c5-7nbzn"] Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.875831 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b495df7c5-7nbzn"] Feb 20 08:44:03 crc 
kubenswrapper[5094]: I0220 08:44:03.894295 5094 scope.go:117] "RemoveContainer" containerID="913c46cb53746a67114f39304fb82db0087e728a7ec5b6b477343f5f416a8b8f" Feb 20 08:44:03 crc kubenswrapper[5094]: E0220 08:44:03.894614 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"913c46cb53746a67114f39304fb82db0087e728a7ec5b6b477343f5f416a8b8f\": container with ID starting with 913c46cb53746a67114f39304fb82db0087e728a7ec5b6b477343f5f416a8b8f not found: ID does not exist" containerID="913c46cb53746a67114f39304fb82db0087e728a7ec5b6b477343f5f416a8b8f" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.894650 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"913c46cb53746a67114f39304fb82db0087e728a7ec5b6b477343f5f416a8b8f"} err="failed to get container status \"913c46cb53746a67114f39304fb82db0087e728a7ec5b6b477343f5f416a8b8f\": rpc error: code = NotFound desc = could not find container \"913c46cb53746a67114f39304fb82db0087e728a7ec5b6b477343f5f416a8b8f\": container with ID starting with 913c46cb53746a67114f39304fb82db0087e728a7ec5b6b477343f5f416a8b8f not found: ID does not exist" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.894669 5094 scope.go:117] "RemoveContainer" containerID="208d3211ae79c8d4cda59830631f10fba8107fe5fff0ad3a2d8a3afdcb25b100" Feb 20 08:44:03 crc kubenswrapper[5094]: E0220 08:44:03.895008 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"208d3211ae79c8d4cda59830631f10fba8107fe5fff0ad3a2d8a3afdcb25b100\": container with ID starting with 208d3211ae79c8d4cda59830631f10fba8107fe5fff0ad3a2d8a3afdcb25b100 not found: ID does not exist" containerID="208d3211ae79c8d4cda59830631f10fba8107fe5fff0ad3a2d8a3afdcb25b100" Feb 20 08:44:03 crc kubenswrapper[5094]: I0220 08:44:03.895028 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"208d3211ae79c8d4cda59830631f10fba8107fe5fff0ad3a2d8a3afdcb25b100"} err="failed to get container status \"208d3211ae79c8d4cda59830631f10fba8107fe5fff0ad3a2d8a3afdcb25b100\": rpc error: code = NotFound desc = could not find container \"208d3211ae79c8d4cda59830631f10fba8107fe5fff0ad3a2d8a3afdcb25b100\": container with ID starting with 208d3211ae79c8d4cda59830631f10fba8107fe5fff0ad3a2d8a3afdcb25b100 not found: ID does not exist" Feb 20 08:44:05 crc kubenswrapper[5094]: I0220 08:44:05.865785 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84d7a949-86f8-4325-8494-1e37848e76ec" path="/var/lib/kubelet/pods/84d7a949-86f8-4325-8494-1e37848e76ec/volumes" Feb 20 08:44:13 crc kubenswrapper[5094]: I0220 08:44:13.840415 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:44:13 crc kubenswrapper[5094]: E0220 08:44:13.841314 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:44:28 crc kubenswrapper[5094]: I0220 08:44:28.840104 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:44:28 crc kubenswrapper[5094]: E0220 08:44:28.840801 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:44:32 crc kubenswrapper[5094]: I0220 08:44:32.226935 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-64d8d4f69d-shjqs" Feb 20 08:44:32 crc kubenswrapper[5094]: I0220 08:44:32.229768 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-64d8d4f69d-shjqs" Feb 20 08:44:37 crc kubenswrapper[5094]: I0220 08:44:37.199303 5094 scope.go:117] "RemoveContainer" containerID="63ebc0843e1f18b14a59e973034594be94e63f23e4b14fd38a292a888d5971cd" Feb 20 08:44:37 crc kubenswrapper[5094]: I0220 08:44:37.229188 5094 scope.go:117] "RemoveContainer" containerID="c629842f4857fc824451a2868ac4ca4ec36a7daeeb169573db6f9b5405d05a43" Feb 20 08:44:39 crc kubenswrapper[5094]: I0220 08:44:39.839817 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:44:39 crc kubenswrapper[5094]: E0220 08:44:39.840382 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:44:53 crc kubenswrapper[5094]: I0220 08:44:53.843483 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:44:53 crc kubenswrapper[5094]: E0220 08:44:53.844363 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.046652 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-q8cvq"] Feb 20 08:44:56 crc kubenswrapper[5094]: E0220 08:44:56.047547 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d7a949-86f8-4325-8494-1e37848e76ec" containerName="dnsmasq-dns" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.047565 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d7a949-86f8-4325-8494-1e37848e76ec" containerName="dnsmasq-dns" Feb 20 08:44:56 crc kubenswrapper[5094]: E0220 08:44:56.047589 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d7a949-86f8-4325-8494-1e37848e76ec" containerName="init" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.047596 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d7a949-86f8-4325-8494-1e37848e76ec" containerName="init" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.047774 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d7a949-86f8-4325-8494-1e37848e76ec" containerName="dnsmasq-dns" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.048346 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-q8cvq" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.053257 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-q8cvq"] Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.084852 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ab2f8a8-e11c-4b13-a12f-7006756e4d56-operator-scripts\") pod \"nova-api-db-create-q8cvq\" (UID: \"2ab2f8a8-e11c-4b13-a12f-7006756e4d56\") " pod="openstack/nova-api-db-create-q8cvq" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.084930 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r857\" (UniqueName: \"kubernetes.io/projected/2ab2f8a8-e11c-4b13-a12f-7006756e4d56-kube-api-access-7r857\") pod \"nova-api-db-create-q8cvq\" (UID: \"2ab2f8a8-e11c-4b13-a12f-7006756e4d56\") " pod="openstack/nova-api-db-create-q8cvq" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.143170 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-qwnhp"] Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.145638 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-qwnhp" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.152229 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qwnhp"] Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.191334 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkjz6\" (UniqueName: \"kubernetes.io/projected/62afc590-4a32-45a1-b7e9-bde09c7f0b6a-kube-api-access-jkjz6\") pod \"nova-cell0-db-create-qwnhp\" (UID: \"62afc590-4a32-45a1-b7e9-bde09c7f0b6a\") " pod="openstack/nova-cell0-db-create-qwnhp" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.191418 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62afc590-4a32-45a1-b7e9-bde09c7f0b6a-operator-scripts\") pod \"nova-cell0-db-create-qwnhp\" (UID: \"62afc590-4a32-45a1-b7e9-bde09c7f0b6a\") " pod="openstack/nova-cell0-db-create-qwnhp" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.191478 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ab2f8a8-e11c-4b13-a12f-7006756e4d56-operator-scripts\") pod \"nova-api-db-create-q8cvq\" (UID: \"2ab2f8a8-e11c-4b13-a12f-7006756e4d56\") " pod="openstack/nova-api-db-create-q8cvq" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.191508 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r857\" (UniqueName: \"kubernetes.io/projected/2ab2f8a8-e11c-4b13-a12f-7006756e4d56-kube-api-access-7r857\") pod \"nova-api-db-create-q8cvq\" (UID: \"2ab2f8a8-e11c-4b13-a12f-7006756e4d56\") " pod="openstack/nova-api-db-create-q8cvq" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.197724 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/2ab2f8a8-e11c-4b13-a12f-7006756e4d56-operator-scripts\") pod \"nova-api-db-create-q8cvq\" (UID: \"2ab2f8a8-e11c-4b13-a12f-7006756e4d56\") " pod="openstack/nova-api-db-create-q8cvq" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.221074 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r857\" (UniqueName: \"kubernetes.io/projected/2ab2f8a8-e11c-4b13-a12f-7006756e4d56-kube-api-access-7r857\") pod \"nova-api-db-create-q8cvq\" (UID: \"2ab2f8a8-e11c-4b13-a12f-7006756e4d56\") " pod="openstack/nova-api-db-create-q8cvq" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.264237 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-1c22-account-create-update-wl44h"] Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.265692 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1c22-account-create-update-wl44h" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.267676 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.292136 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1c22-account-create-update-wl44h"] Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.293724 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62afc590-4a32-45a1-b7e9-bde09c7f0b6a-operator-scripts\") pod \"nova-cell0-db-create-qwnhp\" (UID: \"62afc590-4a32-45a1-b7e9-bde09c7f0b6a\") " pod="openstack/nova-cell0-db-create-qwnhp" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.293872 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkjz6\" (UniqueName: \"kubernetes.io/projected/62afc590-4a32-45a1-b7e9-bde09c7f0b6a-kube-api-access-jkjz6\") pod 
\"nova-cell0-db-create-qwnhp\" (UID: \"62afc590-4a32-45a1-b7e9-bde09c7f0b6a\") " pod="openstack/nova-cell0-db-create-qwnhp" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.294587 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62afc590-4a32-45a1-b7e9-bde09c7f0b6a-operator-scripts\") pod \"nova-cell0-db-create-qwnhp\" (UID: \"62afc590-4a32-45a1-b7e9-bde09c7f0b6a\") " pod="openstack/nova-cell0-db-create-qwnhp" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.314453 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkjz6\" (UniqueName: \"kubernetes.io/projected/62afc590-4a32-45a1-b7e9-bde09c7f0b6a-kube-api-access-jkjz6\") pod \"nova-cell0-db-create-qwnhp\" (UID: \"62afc590-4a32-45a1-b7e9-bde09c7f0b6a\") " pod="openstack/nova-cell0-db-create-qwnhp" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.369072 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-vjfql"] Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.372365 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-q8cvq" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.397749 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmkdm\" (UniqueName: \"kubernetes.io/projected/95274e98-2b48-4b4d-b0c5-5dedafedc43f-kube-api-access-mmkdm\") pod \"nova-api-1c22-account-create-update-wl44h\" (UID: \"95274e98-2b48-4b4d-b0c5-5dedafedc43f\") " pod="openstack/nova-api-1c22-account-create-update-wl44h" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.397863 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95274e98-2b48-4b4d-b0c5-5dedafedc43f-operator-scripts\") pod \"nova-api-1c22-account-create-update-wl44h\" (UID: \"95274e98-2b48-4b4d-b0c5-5dedafedc43f\") " pod="openstack/nova-api-1c22-account-create-update-wl44h" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.404018 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-vjfql"] Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.404114 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-vjfql" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.462163 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qwnhp" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.474689 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-9129-account-create-update-kcqsj"] Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.476111 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-9129-account-create-update-kcqsj" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.483114 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.490351 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9129-account-create-update-kcqsj"] Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.499571 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmkdm\" (UniqueName: \"kubernetes.io/projected/95274e98-2b48-4b4d-b0c5-5dedafedc43f-kube-api-access-mmkdm\") pod \"nova-api-1c22-account-create-update-wl44h\" (UID: \"95274e98-2b48-4b4d-b0c5-5dedafedc43f\") " pod="openstack/nova-api-1c22-account-create-update-wl44h" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.499619 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e24ca1b9-7440-432c-a0eb-58a17f83a8ee-operator-scripts\") pod \"nova-cell1-db-create-vjfql\" (UID: \"e24ca1b9-7440-432c-a0eb-58a17f83a8ee\") " pod="openstack/nova-cell1-db-create-vjfql" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.499653 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsmld\" (UniqueName: \"kubernetes.io/projected/e24ca1b9-7440-432c-a0eb-58a17f83a8ee-kube-api-access-hsmld\") pod \"nova-cell1-db-create-vjfql\" (UID: \"e24ca1b9-7440-432c-a0eb-58a17f83a8ee\") " pod="openstack/nova-cell1-db-create-vjfql" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.499675 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95274e98-2b48-4b4d-b0c5-5dedafedc43f-operator-scripts\") pod \"nova-api-1c22-account-create-update-wl44h\" 
(UID: \"95274e98-2b48-4b4d-b0c5-5dedafedc43f\") " pod="openstack/nova-api-1c22-account-create-update-wl44h" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.500613 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95274e98-2b48-4b4d-b0c5-5dedafedc43f-operator-scripts\") pod \"nova-api-1c22-account-create-update-wl44h\" (UID: \"95274e98-2b48-4b4d-b0c5-5dedafedc43f\") " pod="openstack/nova-api-1c22-account-create-update-wl44h" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.566969 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmkdm\" (UniqueName: \"kubernetes.io/projected/95274e98-2b48-4b4d-b0c5-5dedafedc43f-kube-api-access-mmkdm\") pod \"nova-api-1c22-account-create-update-wl44h\" (UID: \"95274e98-2b48-4b4d-b0c5-5dedafedc43f\") " pod="openstack/nova-api-1c22-account-create-update-wl44h" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.586067 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-1c22-account-create-update-wl44h" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.600662 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jstfk\" (UniqueName: \"kubernetes.io/projected/90396e9c-2602-41dd-92c3-da38bb5f7be7-kube-api-access-jstfk\") pod \"nova-cell0-9129-account-create-update-kcqsj\" (UID: \"90396e9c-2602-41dd-92c3-da38bb5f7be7\") " pod="openstack/nova-cell0-9129-account-create-update-kcqsj" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.600728 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e24ca1b9-7440-432c-a0eb-58a17f83a8ee-operator-scripts\") pod \"nova-cell1-db-create-vjfql\" (UID: \"e24ca1b9-7440-432c-a0eb-58a17f83a8ee\") " pod="openstack/nova-cell1-db-create-vjfql" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.600771 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsmld\" (UniqueName: \"kubernetes.io/projected/e24ca1b9-7440-432c-a0eb-58a17f83a8ee-kube-api-access-hsmld\") pod \"nova-cell1-db-create-vjfql\" (UID: \"e24ca1b9-7440-432c-a0eb-58a17f83a8ee\") " pod="openstack/nova-cell1-db-create-vjfql" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.600789 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90396e9c-2602-41dd-92c3-da38bb5f7be7-operator-scripts\") pod \"nova-cell0-9129-account-create-update-kcqsj\" (UID: \"90396e9c-2602-41dd-92c3-da38bb5f7be7\") " pod="openstack/nova-cell0-9129-account-create-update-kcqsj" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.601419 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e24ca1b9-7440-432c-a0eb-58a17f83a8ee-operator-scripts\") pod \"nova-cell1-db-create-vjfql\" (UID: \"e24ca1b9-7440-432c-a0eb-58a17f83a8ee\") " pod="openstack/nova-cell1-db-create-vjfql" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.622844 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsmld\" (UniqueName: \"kubernetes.io/projected/e24ca1b9-7440-432c-a0eb-58a17f83a8ee-kube-api-access-hsmld\") pod \"nova-cell1-db-create-vjfql\" (UID: \"e24ca1b9-7440-432c-a0eb-58a17f83a8ee\") " pod="openstack/nova-cell1-db-create-vjfql" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.657157 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-437f-account-create-update-cc6d4"] Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.658509 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-437f-account-create-update-cc6d4" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.673308 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.674321 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-437f-account-create-update-cc6d4"] Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.702174 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jstfk\" (UniqueName: \"kubernetes.io/projected/90396e9c-2602-41dd-92c3-da38bb5f7be7-kube-api-access-jstfk\") pod \"nova-cell0-9129-account-create-update-kcqsj\" (UID: \"90396e9c-2602-41dd-92c3-da38bb5f7be7\") " pod="openstack/nova-cell0-9129-account-create-update-kcqsj" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.702232 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/33e893dc-597d-4b0d-b59d-04c636d58ce4-operator-scripts\") pod \"nova-cell1-437f-account-create-update-cc6d4\" (UID: \"33e893dc-597d-4b0d-b59d-04c636d58ce4\") " pod="openstack/nova-cell1-437f-account-create-update-cc6d4" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.702259 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4b79\" (UniqueName: \"kubernetes.io/projected/33e893dc-597d-4b0d-b59d-04c636d58ce4-kube-api-access-w4b79\") pod \"nova-cell1-437f-account-create-update-cc6d4\" (UID: \"33e893dc-597d-4b0d-b59d-04c636d58ce4\") " pod="openstack/nova-cell1-437f-account-create-update-cc6d4" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.702283 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90396e9c-2602-41dd-92c3-da38bb5f7be7-operator-scripts\") pod \"nova-cell0-9129-account-create-update-kcqsj\" (UID: \"90396e9c-2602-41dd-92c3-da38bb5f7be7\") " pod="openstack/nova-cell0-9129-account-create-update-kcqsj" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.703178 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90396e9c-2602-41dd-92c3-da38bb5f7be7-operator-scripts\") pod \"nova-cell0-9129-account-create-update-kcqsj\" (UID: \"90396e9c-2602-41dd-92c3-da38bb5f7be7\") " pod="openstack/nova-cell0-9129-account-create-update-kcqsj" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.719016 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jstfk\" (UniqueName: \"kubernetes.io/projected/90396e9c-2602-41dd-92c3-da38bb5f7be7-kube-api-access-jstfk\") pod \"nova-cell0-9129-account-create-update-kcqsj\" (UID: \"90396e9c-2602-41dd-92c3-da38bb5f7be7\") " pod="openstack/nova-cell0-9129-account-create-update-kcqsj" Feb 20 08:44:56 crc 
kubenswrapper[5094]: I0220 08:44:56.804014 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33e893dc-597d-4b0d-b59d-04c636d58ce4-operator-scripts\") pod \"nova-cell1-437f-account-create-update-cc6d4\" (UID: \"33e893dc-597d-4b0d-b59d-04c636d58ce4\") " pod="openstack/nova-cell1-437f-account-create-update-cc6d4" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.804082 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4b79\" (UniqueName: \"kubernetes.io/projected/33e893dc-597d-4b0d-b59d-04c636d58ce4-kube-api-access-w4b79\") pod \"nova-cell1-437f-account-create-update-cc6d4\" (UID: \"33e893dc-597d-4b0d-b59d-04c636d58ce4\") " pod="openstack/nova-cell1-437f-account-create-update-cc6d4" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.804700 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33e893dc-597d-4b0d-b59d-04c636d58ce4-operator-scripts\") pod \"nova-cell1-437f-account-create-update-cc6d4\" (UID: \"33e893dc-597d-4b0d-b59d-04c636d58ce4\") " pod="openstack/nova-cell1-437f-account-create-update-cc6d4" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.813688 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-vjfql" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.841194 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4b79\" (UniqueName: \"kubernetes.io/projected/33e893dc-597d-4b0d-b59d-04c636d58ce4-kube-api-access-w4b79\") pod \"nova-cell1-437f-account-create-update-cc6d4\" (UID: \"33e893dc-597d-4b0d-b59d-04c636d58ce4\") " pod="openstack/nova-cell1-437f-account-create-update-cc6d4" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.890329 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-9129-account-create-update-kcqsj" Feb 20 08:44:56 crc kubenswrapper[5094]: I0220 08:44:56.989179 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-437f-account-create-update-cc6d4" Feb 20 08:44:57 crc kubenswrapper[5094]: I0220 08:44:57.033873 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-q8cvq"] Feb 20 08:44:57 crc kubenswrapper[5094]: W0220 08:44:57.064025 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ab2f8a8_e11c_4b13_a12f_7006756e4d56.slice/crio-71d06dd390ead65459118c430e3e92cf62decfb5df76c8b5366da5fbbcb16c71 WatchSource:0}: Error finding container 71d06dd390ead65459118c430e3e92cf62decfb5df76c8b5366da5fbbcb16c71: Status 404 returned error can't find the container with id 71d06dd390ead65459118c430e3e92cf62decfb5df76c8b5366da5fbbcb16c71 Feb 20 08:44:57 crc kubenswrapper[5094]: I0220 08:44:57.150469 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-qwnhp"] Feb 20 08:44:57 crc kubenswrapper[5094]: I0220 08:44:57.205716 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1c22-account-create-update-wl44h"] Feb 20 08:44:57 crc kubenswrapper[5094]: W0220 08:44:57.217920 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95274e98_2b48_4b4d_b0c5_5dedafedc43f.slice/crio-1dbf96c9c2fb9c9c59994ebb1568bc2fa160b27d8989791e349374e858cc8c96 WatchSource:0}: Error finding container 1dbf96c9c2fb9c9c59994ebb1568bc2fa160b27d8989791e349374e858cc8c96: Status 404 returned error can't find the container with id 1dbf96c9c2fb9c9c59994ebb1568bc2fa160b27d8989791e349374e858cc8c96 Feb 20 08:44:57 crc kubenswrapper[5094]: I0220 08:44:57.295795 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-db-create-qwnhp" event={"ID":"62afc590-4a32-45a1-b7e9-bde09c7f0b6a","Type":"ContainerStarted","Data":"3402431af5c0b829ac8fa80c190f9199ba60a6730114334bae45f899a11cb0fe"} Feb 20 08:44:57 crc kubenswrapper[5094]: I0220 08:44:57.300125 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-q8cvq" event={"ID":"2ab2f8a8-e11c-4b13-a12f-7006756e4d56","Type":"ContainerStarted","Data":"18a0fc5e2df223fc2c55d1f16cc18e7c5d24a21c6534f46e5e010adfb875a921"} Feb 20 08:44:57 crc kubenswrapper[5094]: I0220 08:44:57.300168 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-q8cvq" event={"ID":"2ab2f8a8-e11c-4b13-a12f-7006756e4d56","Type":"ContainerStarted","Data":"71d06dd390ead65459118c430e3e92cf62decfb5df76c8b5366da5fbbcb16c71"} Feb 20 08:44:57 crc kubenswrapper[5094]: I0220 08:44:57.305350 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1c22-account-create-update-wl44h" event={"ID":"95274e98-2b48-4b4d-b0c5-5dedafedc43f","Type":"ContainerStarted","Data":"1dbf96c9c2fb9c9c59994ebb1568bc2fa160b27d8989791e349374e858cc8c96"} Feb 20 08:44:57 crc kubenswrapper[5094]: I0220 08:44:57.319791 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-q8cvq" podStartSLOduration=1.319770209 podStartE2EDuration="1.319770209s" podCreationTimestamp="2026-02-20 08:44:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:44:57.316301355 +0000 UTC m=+7112.188928056" watchObservedRunningTime="2026-02-20 08:44:57.319770209 +0000 UTC m=+7112.192396920" Feb 20 08:44:57 crc kubenswrapper[5094]: W0220 08:44:57.342325 5094 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode24ca1b9_7440_432c_a0eb_58a17f83a8ee.slice/crio-a618dc7847d0c1e997e5258ecd34df71b91c9ca9712f96fe9bce654865ed51d2 WatchSource:0}: Error finding container a618dc7847d0c1e997e5258ecd34df71b91c9ca9712f96fe9bce654865ed51d2: Status 404 returned error can't find the container with id a618dc7847d0c1e997e5258ecd34df71b91c9ca9712f96fe9bce654865ed51d2 Feb 20 08:44:57 crc kubenswrapper[5094]: I0220 08:44:57.373221 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-vjfql"] Feb 20 08:44:57 crc kubenswrapper[5094]: I0220 08:44:57.506626 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-9129-account-create-update-kcqsj"] Feb 20 08:44:57 crc kubenswrapper[5094]: W0220 08:44:57.549926 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90396e9c_2602_41dd_92c3_da38bb5f7be7.slice/crio-e4db54b07b51e441746c30da24b69b2b5a330b4c4da10e88c7e0cad568ec5b9c WatchSource:0}: Error finding container e4db54b07b51e441746c30da24b69b2b5a330b4c4da10e88c7e0cad568ec5b9c: Status 404 returned error can't find the container with id e4db54b07b51e441746c30da24b69b2b5a330b4c4da10e88c7e0cad568ec5b9c Feb 20 08:44:57 crc kubenswrapper[5094]: I0220 08:44:57.624651 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-437f-account-create-update-cc6d4"] Feb 20 08:44:57 crc kubenswrapper[5094]: W0220 08:44:57.635121 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33e893dc_597d_4b0d_b59d_04c636d58ce4.slice/crio-9dec53993c58a49e10cba3dda4698b5b9362b895ed9d44eabdf14e9fd3345aaf WatchSource:0}: Error finding container 9dec53993c58a49e10cba3dda4698b5b9362b895ed9d44eabdf14e9fd3345aaf: Status 404 returned error can't find the container with id 
9dec53993c58a49e10cba3dda4698b5b9362b895ed9d44eabdf14e9fd3345aaf Feb 20 08:44:58 crc kubenswrapper[5094]: I0220 08:44:58.314395 5094 generic.go:334] "Generic (PLEG): container finished" podID="33e893dc-597d-4b0d-b59d-04c636d58ce4" containerID="3f997facacb6313e0f115f2a2227ee22f54c84973cc33a1b4f4cc4cd0e2df3df" exitCode=0 Feb 20 08:44:58 crc kubenswrapper[5094]: I0220 08:44:58.314501 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-437f-account-create-update-cc6d4" event={"ID":"33e893dc-597d-4b0d-b59d-04c636d58ce4","Type":"ContainerDied","Data":"3f997facacb6313e0f115f2a2227ee22f54c84973cc33a1b4f4cc4cd0e2df3df"} Feb 20 08:44:58 crc kubenswrapper[5094]: I0220 08:44:58.314567 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-437f-account-create-update-cc6d4" event={"ID":"33e893dc-597d-4b0d-b59d-04c636d58ce4","Type":"ContainerStarted","Data":"9dec53993c58a49e10cba3dda4698b5b9362b895ed9d44eabdf14e9fd3345aaf"} Feb 20 08:44:58 crc kubenswrapper[5094]: I0220 08:44:58.316242 5094 generic.go:334] "Generic (PLEG): container finished" podID="2ab2f8a8-e11c-4b13-a12f-7006756e4d56" containerID="18a0fc5e2df223fc2c55d1f16cc18e7c5d24a21c6534f46e5e010adfb875a921" exitCode=0 Feb 20 08:44:58 crc kubenswrapper[5094]: I0220 08:44:58.316279 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-q8cvq" event={"ID":"2ab2f8a8-e11c-4b13-a12f-7006756e4d56","Type":"ContainerDied","Data":"18a0fc5e2df223fc2c55d1f16cc18e7c5d24a21c6534f46e5e010adfb875a921"} Feb 20 08:44:58 crc kubenswrapper[5094]: I0220 08:44:58.317996 5094 generic.go:334] "Generic (PLEG): container finished" podID="95274e98-2b48-4b4d-b0c5-5dedafedc43f" containerID="0d114a7c88828f83388e2e035f175ac9a3e4b92dd7429d32fee56582784e51b6" exitCode=0 Feb 20 08:44:58 crc kubenswrapper[5094]: I0220 08:44:58.318049 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1c22-account-create-update-wl44h" 
event={"ID":"95274e98-2b48-4b4d-b0c5-5dedafedc43f","Type":"ContainerDied","Data":"0d114a7c88828f83388e2e035f175ac9a3e4b92dd7429d32fee56582784e51b6"} Feb 20 08:44:58 crc kubenswrapper[5094]: I0220 08:44:58.321365 5094 generic.go:334] "Generic (PLEG): container finished" podID="90396e9c-2602-41dd-92c3-da38bb5f7be7" containerID="814c099e47197bd5868d74e553deb48d652a97c496a27496d27f367ca0750674" exitCode=0 Feb 20 08:44:58 crc kubenswrapper[5094]: I0220 08:44:58.321461 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9129-account-create-update-kcqsj" event={"ID":"90396e9c-2602-41dd-92c3-da38bb5f7be7","Type":"ContainerDied","Data":"814c099e47197bd5868d74e553deb48d652a97c496a27496d27f367ca0750674"} Feb 20 08:44:58 crc kubenswrapper[5094]: I0220 08:44:58.321498 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9129-account-create-update-kcqsj" event={"ID":"90396e9c-2602-41dd-92c3-da38bb5f7be7","Type":"ContainerStarted","Data":"e4db54b07b51e441746c30da24b69b2b5a330b4c4da10e88c7e0cad568ec5b9c"} Feb 20 08:44:58 crc kubenswrapper[5094]: I0220 08:44:58.324964 5094 generic.go:334] "Generic (PLEG): container finished" podID="e24ca1b9-7440-432c-a0eb-58a17f83a8ee" containerID="7d035de1d36dadcbc2b1699a2d04fbaf8dc66a5156f2934f3122a703497829c7" exitCode=0 Feb 20 08:44:58 crc kubenswrapper[5094]: I0220 08:44:58.325081 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-vjfql" event={"ID":"e24ca1b9-7440-432c-a0eb-58a17f83a8ee","Type":"ContainerDied","Data":"7d035de1d36dadcbc2b1699a2d04fbaf8dc66a5156f2934f3122a703497829c7"} Feb 20 08:44:58 crc kubenswrapper[5094]: I0220 08:44:58.325137 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-vjfql" event={"ID":"e24ca1b9-7440-432c-a0eb-58a17f83a8ee","Type":"ContainerStarted","Data":"a618dc7847d0c1e997e5258ecd34df71b91c9ca9712f96fe9bce654865ed51d2"} Feb 20 08:44:58 crc kubenswrapper[5094]: I0220 
08:44:58.327209 5094 generic.go:334] "Generic (PLEG): container finished" podID="62afc590-4a32-45a1-b7e9-bde09c7f0b6a" containerID="b22a4c98fab8bd430cea1082edfc23c911f8d32bd3adc55526aec0a42c5684bd" exitCode=0 Feb 20 08:44:58 crc kubenswrapper[5094]: I0220 08:44:58.327274 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qwnhp" event={"ID":"62afc590-4a32-45a1-b7e9-bde09c7f0b6a","Type":"ContainerDied","Data":"b22a4c98fab8bd430cea1082edfc23c911f8d32bd3adc55526aec0a42c5684bd"} Feb 20 08:44:59 crc kubenswrapper[5094]: I0220 08:44:59.771108 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-437f-account-create-update-cc6d4" Feb 20 08:44:59 crc kubenswrapper[5094]: I0220 08:44:59.861412 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4b79\" (UniqueName: \"kubernetes.io/projected/33e893dc-597d-4b0d-b59d-04c636d58ce4-kube-api-access-w4b79\") pod \"33e893dc-597d-4b0d-b59d-04c636d58ce4\" (UID: \"33e893dc-597d-4b0d-b59d-04c636d58ce4\") " Feb 20 08:44:59 crc kubenswrapper[5094]: I0220 08:44:59.861652 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33e893dc-597d-4b0d-b59d-04c636d58ce4-operator-scripts\") pod \"33e893dc-597d-4b0d-b59d-04c636d58ce4\" (UID: \"33e893dc-597d-4b0d-b59d-04c636d58ce4\") " Feb 20 08:44:59 crc kubenswrapper[5094]: I0220 08:44:59.862122 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33e893dc-597d-4b0d-b59d-04c636d58ce4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "33e893dc-597d-4b0d-b59d-04c636d58ce4" (UID: "33e893dc-597d-4b0d-b59d-04c636d58ce4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:44:59 crc kubenswrapper[5094]: I0220 08:44:59.866882 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33e893dc-597d-4b0d-b59d-04c636d58ce4-kube-api-access-w4b79" (OuterVolumeSpecName: "kube-api-access-w4b79") pod "33e893dc-597d-4b0d-b59d-04c636d58ce4" (UID: "33e893dc-597d-4b0d-b59d-04c636d58ce4"). InnerVolumeSpecName "kube-api-access-w4b79". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:44:59 crc kubenswrapper[5094]: I0220 08:44:59.942850 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-vjfql" Feb 20 08:44:59 crc kubenswrapper[5094]: I0220 08:44:59.949228 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-qwnhp" Feb 20 08:44:59 crc kubenswrapper[5094]: I0220 08:44:59.958603 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-9129-account-create-update-kcqsj" Feb 20 08:44:59 crc kubenswrapper[5094]: I0220 08:44:59.963539 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33e893dc-597d-4b0d-b59d-04c636d58ce4-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:44:59 crc kubenswrapper[5094]: I0220 08:44:59.963681 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4b79\" (UniqueName: \"kubernetes.io/projected/33e893dc-597d-4b0d-b59d-04c636d58ce4-kube-api-access-w4b79\") on node \"crc\" DevicePath \"\"" Feb 20 08:44:59 crc kubenswrapper[5094]: I0220 08:44:59.964828 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1c22-account-create-update-wl44h" Feb 20 08:44:59 crc kubenswrapper[5094]: I0220 08:44:59.979987 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-q8cvq" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.065297 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95274e98-2b48-4b4d-b0c5-5dedafedc43f-operator-scripts\") pod \"95274e98-2b48-4b4d-b0c5-5dedafedc43f\" (UID: \"95274e98-2b48-4b4d-b0c5-5dedafedc43f\") " Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.065359 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62afc590-4a32-45a1-b7e9-bde09c7f0b6a-operator-scripts\") pod \"62afc590-4a32-45a1-b7e9-bde09c7f0b6a\" (UID: \"62afc590-4a32-45a1-b7e9-bde09c7f0b6a\") " Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.065399 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ab2f8a8-e11c-4b13-a12f-7006756e4d56-operator-scripts\") pod \"2ab2f8a8-e11c-4b13-a12f-7006756e4d56\" (UID: \"2ab2f8a8-e11c-4b13-a12f-7006756e4d56\") " Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.065425 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e24ca1b9-7440-432c-a0eb-58a17f83a8ee-operator-scripts\") pod \"e24ca1b9-7440-432c-a0eb-58a17f83a8ee\" (UID: \"e24ca1b9-7440-432c-a0eb-58a17f83a8ee\") " Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.065461 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90396e9c-2602-41dd-92c3-da38bb5f7be7-operator-scripts\") pod \"90396e9c-2602-41dd-92c3-da38bb5f7be7\" (UID: \"90396e9c-2602-41dd-92c3-da38bb5f7be7\") " Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.065489 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-jstfk\" (UniqueName: \"kubernetes.io/projected/90396e9c-2602-41dd-92c3-da38bb5f7be7-kube-api-access-jstfk\") pod \"90396e9c-2602-41dd-92c3-da38bb5f7be7\" (UID: \"90396e9c-2602-41dd-92c3-da38bb5f7be7\") " Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.065546 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7r857\" (UniqueName: \"kubernetes.io/projected/2ab2f8a8-e11c-4b13-a12f-7006756e4d56-kube-api-access-7r857\") pod \"2ab2f8a8-e11c-4b13-a12f-7006756e4d56\" (UID: \"2ab2f8a8-e11c-4b13-a12f-7006756e4d56\") " Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.065588 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmkdm\" (UniqueName: \"kubernetes.io/projected/95274e98-2b48-4b4d-b0c5-5dedafedc43f-kube-api-access-mmkdm\") pod \"95274e98-2b48-4b4d-b0c5-5dedafedc43f\" (UID: \"95274e98-2b48-4b4d-b0c5-5dedafedc43f\") " Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.065648 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsmld\" (UniqueName: \"kubernetes.io/projected/e24ca1b9-7440-432c-a0eb-58a17f83a8ee-kube-api-access-hsmld\") pod \"e24ca1b9-7440-432c-a0eb-58a17f83a8ee\" (UID: \"e24ca1b9-7440-432c-a0eb-58a17f83a8ee\") " Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.065735 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkjz6\" (UniqueName: \"kubernetes.io/projected/62afc590-4a32-45a1-b7e9-bde09c7f0b6a-kube-api-access-jkjz6\") pod \"62afc590-4a32-45a1-b7e9-bde09c7f0b6a\" (UID: \"62afc590-4a32-45a1-b7e9-bde09c7f0b6a\") " Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.065897 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62afc590-4a32-45a1-b7e9-bde09c7f0b6a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"62afc590-4a32-45a1-b7e9-bde09c7f0b6a" (UID: "62afc590-4a32-45a1-b7e9-bde09c7f0b6a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.065905 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95274e98-2b48-4b4d-b0c5-5dedafedc43f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "95274e98-2b48-4b4d-b0c5-5dedafedc43f" (UID: "95274e98-2b48-4b4d-b0c5-5dedafedc43f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.066172 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95274e98-2b48-4b4d-b0c5-5dedafedc43f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.066192 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62afc590-4a32-45a1-b7e9-bde09c7f0b6a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.066269 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ab2f8a8-e11c-4b13-a12f-7006756e4d56-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2ab2f8a8-e11c-4b13-a12f-7006756e4d56" (UID: "2ab2f8a8-e11c-4b13-a12f-7006756e4d56"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.067865 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90396e9c-2602-41dd-92c3-da38bb5f7be7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "90396e9c-2602-41dd-92c3-da38bb5f7be7" (UID: "90396e9c-2602-41dd-92c3-da38bb5f7be7"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.069252 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e24ca1b9-7440-432c-a0eb-58a17f83a8ee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e24ca1b9-7440-432c-a0eb-58a17f83a8ee" (UID: "e24ca1b9-7440-432c-a0eb-58a17f83a8ee"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.070046 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ab2f8a8-e11c-4b13-a12f-7006756e4d56-kube-api-access-7r857" (OuterVolumeSpecName: "kube-api-access-7r857") pod "2ab2f8a8-e11c-4b13-a12f-7006756e4d56" (UID: "2ab2f8a8-e11c-4b13-a12f-7006756e4d56"). InnerVolumeSpecName "kube-api-access-7r857". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.070129 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e24ca1b9-7440-432c-a0eb-58a17f83a8ee-kube-api-access-hsmld" (OuterVolumeSpecName: "kube-api-access-hsmld") pod "e24ca1b9-7440-432c-a0eb-58a17f83a8ee" (UID: "e24ca1b9-7440-432c-a0eb-58a17f83a8ee"). InnerVolumeSpecName "kube-api-access-hsmld". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.070242 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90396e9c-2602-41dd-92c3-da38bb5f7be7-kube-api-access-jstfk" (OuterVolumeSpecName: "kube-api-access-jstfk") pod "90396e9c-2602-41dd-92c3-da38bb5f7be7" (UID: "90396e9c-2602-41dd-92c3-da38bb5f7be7"). InnerVolumeSpecName "kube-api-access-jstfk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.070425 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62afc590-4a32-45a1-b7e9-bde09c7f0b6a-kube-api-access-jkjz6" (OuterVolumeSpecName: "kube-api-access-jkjz6") pod "62afc590-4a32-45a1-b7e9-bde09c7f0b6a" (UID: "62afc590-4a32-45a1-b7e9-bde09c7f0b6a"). InnerVolumeSpecName "kube-api-access-jkjz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.070868 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95274e98-2b48-4b4d-b0c5-5dedafedc43f-kube-api-access-mmkdm" (OuterVolumeSpecName: "kube-api-access-mmkdm") pod "95274e98-2b48-4b4d-b0c5-5dedafedc43f" (UID: "95274e98-2b48-4b4d-b0c5-5dedafedc43f"). InnerVolumeSpecName "kube-api-access-mmkdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.134426 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4"] Feb 20 08:45:00 crc kubenswrapper[5094]: E0220 08:45:00.134912 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62afc590-4a32-45a1-b7e9-bde09c7f0b6a" containerName="mariadb-database-create" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.134930 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="62afc590-4a32-45a1-b7e9-bde09c7f0b6a" containerName="mariadb-database-create" Feb 20 08:45:00 crc kubenswrapper[5094]: E0220 08:45:00.134948 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33e893dc-597d-4b0d-b59d-04c636d58ce4" containerName="mariadb-account-create-update" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.134955 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="33e893dc-597d-4b0d-b59d-04c636d58ce4" 
containerName="mariadb-account-create-update" Feb 20 08:45:00 crc kubenswrapper[5094]: E0220 08:45:00.134978 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e24ca1b9-7440-432c-a0eb-58a17f83a8ee" containerName="mariadb-database-create" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.134984 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="e24ca1b9-7440-432c-a0eb-58a17f83a8ee" containerName="mariadb-database-create" Feb 20 08:45:00 crc kubenswrapper[5094]: E0220 08:45:00.134998 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95274e98-2b48-4b4d-b0c5-5dedafedc43f" containerName="mariadb-account-create-update" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.135003 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="95274e98-2b48-4b4d-b0c5-5dedafedc43f" containerName="mariadb-account-create-update" Feb 20 08:45:00 crc kubenswrapper[5094]: E0220 08:45:00.135017 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90396e9c-2602-41dd-92c3-da38bb5f7be7" containerName="mariadb-account-create-update" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.135023 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="90396e9c-2602-41dd-92c3-da38bb5f7be7" containerName="mariadb-account-create-update" Feb 20 08:45:00 crc kubenswrapper[5094]: E0220 08:45:00.135034 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab2f8a8-e11c-4b13-a12f-7006756e4d56" containerName="mariadb-database-create" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.135040 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab2f8a8-e11c-4b13-a12f-7006756e4d56" containerName="mariadb-database-create" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.135206 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="90396e9c-2602-41dd-92c3-da38bb5f7be7" containerName="mariadb-account-create-update" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 
08:45:00.135215 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab2f8a8-e11c-4b13-a12f-7006756e4d56" containerName="mariadb-database-create" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.135225 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="62afc590-4a32-45a1-b7e9-bde09c7f0b6a" containerName="mariadb-database-create" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.135235 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="95274e98-2b48-4b4d-b0c5-5dedafedc43f" containerName="mariadb-account-create-update" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.135248 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="e24ca1b9-7440-432c-a0eb-58a17f83a8ee" containerName="mariadb-database-create" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.135259 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="33e893dc-597d-4b0d-b59d-04c636d58ce4" containerName="mariadb-account-create-update" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.135893 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.139767 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.140952 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.141980 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4"] Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.167722 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4271712d-7fb9-4862-bc38-e3cfbcced425-config-volume\") pod \"collect-profiles-29526285-265c4\" (UID: \"4271712d-7fb9-4862-bc38-e3cfbcced425\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.167820 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c9mx\" (UniqueName: \"kubernetes.io/projected/4271712d-7fb9-4862-bc38-e3cfbcced425-kube-api-access-5c9mx\") pod \"collect-profiles-29526285-265c4\" (UID: \"4271712d-7fb9-4862-bc38-e3cfbcced425\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.167894 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4271712d-7fb9-4862-bc38-e3cfbcced425-secret-volume\") pod \"collect-profiles-29526285-265c4\" (UID: \"4271712d-7fb9-4862-bc38-e3cfbcced425\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.167996 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ab2f8a8-e11c-4b13-a12f-7006756e4d56-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.168009 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e24ca1b9-7440-432c-a0eb-58a17f83a8ee-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.168019 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90396e9c-2602-41dd-92c3-da38bb5f7be7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.168028 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jstfk\" (UniqueName: \"kubernetes.io/projected/90396e9c-2602-41dd-92c3-da38bb5f7be7-kube-api-access-jstfk\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.168038 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7r857\" (UniqueName: \"kubernetes.io/projected/2ab2f8a8-e11c-4b13-a12f-7006756e4d56-kube-api-access-7r857\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.168047 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmkdm\" (UniqueName: \"kubernetes.io/projected/95274e98-2b48-4b4d-b0c5-5dedafedc43f-kube-api-access-mmkdm\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.168056 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsmld\" (UniqueName: \"kubernetes.io/projected/e24ca1b9-7440-432c-a0eb-58a17f83a8ee-kube-api-access-hsmld\") on node 
\"crc\" DevicePath \"\"" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.168094 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkjz6\" (UniqueName: \"kubernetes.io/projected/62afc590-4a32-45a1-b7e9-bde09c7f0b6a-kube-api-access-jkjz6\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.269359 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4271712d-7fb9-4862-bc38-e3cfbcced425-secret-volume\") pod \"collect-profiles-29526285-265c4\" (UID: \"4271712d-7fb9-4862-bc38-e3cfbcced425\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.269631 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4271712d-7fb9-4862-bc38-e3cfbcced425-config-volume\") pod \"collect-profiles-29526285-265c4\" (UID: \"4271712d-7fb9-4862-bc38-e3cfbcced425\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.269790 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c9mx\" (UniqueName: \"kubernetes.io/projected/4271712d-7fb9-4862-bc38-e3cfbcced425-kube-api-access-5c9mx\") pod \"collect-profiles-29526285-265c4\" (UID: \"4271712d-7fb9-4862-bc38-e3cfbcced425\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.270558 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4271712d-7fb9-4862-bc38-e3cfbcced425-config-volume\") pod \"collect-profiles-29526285-265c4\" (UID: \"4271712d-7fb9-4862-bc38-e3cfbcced425\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" Feb 20 
08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.272636 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4271712d-7fb9-4862-bc38-e3cfbcced425-secret-volume\") pod \"collect-profiles-29526285-265c4\" (UID: \"4271712d-7fb9-4862-bc38-e3cfbcced425\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.285434 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c9mx\" (UniqueName: \"kubernetes.io/projected/4271712d-7fb9-4862-bc38-e3cfbcced425-kube-api-access-5c9mx\") pod \"collect-profiles-29526285-265c4\" (UID: \"4271712d-7fb9-4862-bc38-e3cfbcced425\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.359064 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1c22-account-create-update-wl44h" event={"ID":"95274e98-2b48-4b4d-b0c5-5dedafedc43f","Type":"ContainerDied","Data":"1dbf96c9c2fb9c9c59994ebb1568bc2fa160b27d8989791e349374e858cc8c96"} Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.359116 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dbf96c9c2fb9c9c59994ebb1568bc2fa160b27d8989791e349374e858cc8c96" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.359092 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1c22-account-create-update-wl44h" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.360631 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-9129-account-create-update-kcqsj" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.360675 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-9129-account-create-update-kcqsj" event={"ID":"90396e9c-2602-41dd-92c3-da38bb5f7be7","Type":"ContainerDied","Data":"e4db54b07b51e441746c30da24b69b2b5a330b4c4da10e88c7e0cad568ec5b9c"} Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.360794 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4db54b07b51e441746c30da24b69b2b5a330b4c4da10e88c7e0cad568ec5b9c" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.362600 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-vjfql" event={"ID":"e24ca1b9-7440-432c-a0eb-58a17f83a8ee","Type":"ContainerDied","Data":"a618dc7847d0c1e997e5258ecd34df71b91c9ca9712f96fe9bce654865ed51d2"} Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.362858 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a618dc7847d0c1e997e5258ecd34df71b91c9ca9712f96fe9bce654865ed51d2" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.362618 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-vjfql" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.365039 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-qwnhp" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.365043 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-qwnhp" event={"ID":"62afc590-4a32-45a1-b7e9-bde09c7f0b6a","Type":"ContainerDied","Data":"3402431af5c0b829ac8fa80c190f9199ba60a6730114334bae45f899a11cb0fe"} Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.365154 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3402431af5c0b829ac8fa80c190f9199ba60a6730114334bae45f899a11cb0fe" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.367172 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-437f-account-create-update-cc6d4" event={"ID":"33e893dc-597d-4b0d-b59d-04c636d58ce4","Type":"ContainerDied","Data":"9dec53993c58a49e10cba3dda4698b5b9362b895ed9d44eabdf14e9fd3345aaf"} Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.367200 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9dec53993c58a49e10cba3dda4698b5b9362b895ed9d44eabdf14e9fd3345aaf" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.367269 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-437f-account-create-update-cc6d4" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.369047 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-q8cvq" event={"ID":"2ab2f8a8-e11c-4b13-a12f-7006756e4d56","Type":"ContainerDied","Data":"71d06dd390ead65459118c430e3e92cf62decfb5df76c8b5366da5fbbcb16c71"} Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.369068 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71d06dd390ead65459118c430e3e92cf62decfb5df76c8b5366da5fbbcb16c71" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.369110 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-q8cvq" Feb 20 08:45:00 crc kubenswrapper[5094]: I0220 08:45:00.457465 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.041995 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4"] Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.380194 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" event={"ID":"4271712d-7fb9-4862-bc38-e3cfbcced425","Type":"ContainerStarted","Data":"8db4ee156703861a72d9f8f5a2380b086eb9a7f4aadd08037037563019ebbb48"} Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.380524 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" event={"ID":"4271712d-7fb9-4862-bc38-e3cfbcced425","Type":"ContainerStarted","Data":"05446f0e59e5eb589aee3a0f185fa7c010261fa51c4c3bcc0680f6fef958ea2c"} Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.406140 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" podStartSLOduration=1.406119283 podStartE2EDuration="1.406119283s" podCreationTimestamp="2026-02-20 08:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:45:01.405658992 +0000 UTC m=+7116.278285703" watchObservedRunningTime="2026-02-20 08:45:01.406119283 +0000 UTC m=+7116.278745994" Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.852943 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mkwlr"] Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.853882 5094 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mkwlr"] Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.853962 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.856231 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-2cnhr" Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.856451 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.856735 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.895695 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mkwlr\" (UID: \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.895805 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-scripts\") pod \"nova-cell0-conductor-db-sync-mkwlr\" (UID: \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.895830 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb425\" (UniqueName: \"kubernetes.io/projected/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-kube-api-access-xb425\") pod \"nova-cell0-conductor-db-sync-mkwlr\" (UID: 
\"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.895869 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-config-data\") pod \"nova-cell0-conductor-db-sync-mkwlr\" (UID: \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.997419 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mkwlr\" (UID: \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.997541 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-scripts\") pod \"nova-cell0-conductor-db-sync-mkwlr\" (UID: \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.997575 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb425\" (UniqueName: \"kubernetes.io/projected/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-kube-api-access-xb425\") pod \"nova-cell0-conductor-db-sync-mkwlr\" (UID: \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:01 crc kubenswrapper[5094]: I0220 08:45:01.997636 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-config-data\") pod 
\"nova-cell0-conductor-db-sync-mkwlr\" (UID: \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:02 crc kubenswrapper[5094]: I0220 08:45:02.004995 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-scripts\") pod \"nova-cell0-conductor-db-sync-mkwlr\" (UID: \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:02 crc kubenswrapper[5094]: I0220 08:45:02.005332 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mkwlr\" (UID: \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:02 crc kubenswrapper[5094]: I0220 08:45:02.005773 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-config-data\") pod \"nova-cell0-conductor-db-sync-mkwlr\" (UID: \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:02 crc kubenswrapper[5094]: I0220 08:45:02.029544 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb425\" (UniqueName: \"kubernetes.io/projected/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-kube-api-access-xb425\") pod \"nova-cell0-conductor-db-sync-mkwlr\" (UID: \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:02 crc kubenswrapper[5094]: I0220 08:45:02.181108 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:02 crc kubenswrapper[5094]: I0220 08:45:02.391817 5094 generic.go:334] "Generic (PLEG): container finished" podID="4271712d-7fb9-4862-bc38-e3cfbcced425" containerID="8db4ee156703861a72d9f8f5a2380b086eb9a7f4aadd08037037563019ebbb48" exitCode=0 Feb 20 08:45:02 crc kubenswrapper[5094]: I0220 08:45:02.391869 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" event={"ID":"4271712d-7fb9-4862-bc38-e3cfbcced425","Type":"ContainerDied","Data":"8db4ee156703861a72d9f8f5a2380b086eb9a7f4aadd08037037563019ebbb48"} Feb 20 08:45:02 crc kubenswrapper[5094]: I0220 08:45:02.640503 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mkwlr"] Feb 20 08:45:02 crc kubenswrapper[5094]: W0220 08:45:02.641073 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8cc89fb_1ef5_4f62_afbc_a06a1d75fa91.slice/crio-3514502e9d5373371a4ad5a129f3f7468d51d10eae032c22c932337ebad3e532 WatchSource:0}: Error finding container 3514502e9d5373371a4ad5a129f3f7468d51d10eae032c22c932337ebad3e532: Status 404 returned error can't find the container with id 3514502e9d5373371a4ad5a129f3f7468d51d10eae032c22c932337ebad3e532 Feb 20 08:45:03 crc kubenswrapper[5094]: I0220 08:45:03.399354 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mkwlr" event={"ID":"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91","Type":"ContainerStarted","Data":"3514502e9d5373371a4ad5a129f3f7468d51d10eae032c22c932337ebad3e532"} Feb 20 08:45:03 crc kubenswrapper[5094]: I0220 08:45:03.734050 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" Feb 20 08:45:03 crc kubenswrapper[5094]: I0220 08:45:03.828512 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4271712d-7fb9-4862-bc38-e3cfbcced425-config-volume\") pod \"4271712d-7fb9-4862-bc38-e3cfbcced425\" (UID: \"4271712d-7fb9-4862-bc38-e3cfbcced425\") " Feb 20 08:45:03 crc kubenswrapper[5094]: I0220 08:45:03.828878 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4271712d-7fb9-4862-bc38-e3cfbcced425-secret-volume\") pod \"4271712d-7fb9-4862-bc38-e3cfbcced425\" (UID: \"4271712d-7fb9-4862-bc38-e3cfbcced425\") " Feb 20 08:45:03 crc kubenswrapper[5094]: I0220 08:45:03.828945 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5c9mx\" (UniqueName: \"kubernetes.io/projected/4271712d-7fb9-4862-bc38-e3cfbcced425-kube-api-access-5c9mx\") pod \"4271712d-7fb9-4862-bc38-e3cfbcced425\" (UID: \"4271712d-7fb9-4862-bc38-e3cfbcced425\") " Feb 20 08:45:03 crc kubenswrapper[5094]: I0220 08:45:03.829497 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4271712d-7fb9-4862-bc38-e3cfbcced425-config-volume" (OuterVolumeSpecName: "config-volume") pod "4271712d-7fb9-4862-bc38-e3cfbcced425" (UID: "4271712d-7fb9-4862-bc38-e3cfbcced425"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:45:03 crc kubenswrapper[5094]: I0220 08:45:03.834229 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4271712d-7fb9-4862-bc38-e3cfbcced425-kube-api-access-5c9mx" (OuterVolumeSpecName: "kube-api-access-5c9mx") pod "4271712d-7fb9-4862-bc38-e3cfbcced425" (UID: "4271712d-7fb9-4862-bc38-e3cfbcced425"). 
InnerVolumeSpecName "kube-api-access-5c9mx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:45:03 crc kubenswrapper[5094]: I0220 08:45:03.838860 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4271712d-7fb9-4862-bc38-e3cfbcced425-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4271712d-7fb9-4862-bc38-e3cfbcced425" (UID: "4271712d-7fb9-4862-bc38-e3cfbcced425"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:03 crc kubenswrapper[5094]: I0220 08:45:03.930208 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4271712d-7fb9-4862-bc38-e3cfbcced425-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:03 crc kubenswrapper[5094]: I0220 08:45:03.930238 5094 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4271712d-7fb9-4862-bc38-e3cfbcced425-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:03 crc kubenswrapper[5094]: I0220 08:45:03.930250 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5c9mx\" (UniqueName: \"kubernetes.io/projected/4271712d-7fb9-4862-bc38-e3cfbcced425-kube-api-access-5c9mx\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:04 crc kubenswrapper[5094]: I0220 08:45:04.414363 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" event={"ID":"4271712d-7fb9-4862-bc38-e3cfbcced425","Type":"ContainerDied","Data":"05446f0e59e5eb589aee3a0f185fa7c010261fa51c4c3bcc0680f6fef958ea2c"} Feb 20 08:45:04 crc kubenswrapper[5094]: I0220 08:45:04.414412 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05446f0e59e5eb589aee3a0f185fa7c010261fa51c4c3bcc0680f6fef958ea2c" Feb 20 08:45:04 crc kubenswrapper[5094]: I0220 08:45:04.414477 5094 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4" Feb 20 08:45:04 crc kubenswrapper[5094]: I0220 08:45:04.496682 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td"] Feb 20 08:45:04 crc kubenswrapper[5094]: I0220 08:45:04.504141 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526240-s96td"] Feb 20 08:45:05 crc kubenswrapper[5094]: I0220 08:45:05.853181 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a036c1c3-0425-4a2e-a42d-2abfcdc49620" path="/var/lib/kubelet/pods/a036c1c3-0425-4a2e-a42d-2abfcdc49620/volumes" Feb 20 08:45:07 crc kubenswrapper[5094]: I0220 08:45:07.841039 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:45:07 crc kubenswrapper[5094]: E0220 08:45:07.841300 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:45:11 crc kubenswrapper[5094]: I0220 08:45:11.466357 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mkwlr" event={"ID":"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91","Type":"ContainerStarted","Data":"cc26980783bd1f2aa717556354f0e4a6d1d7792f5dd61a8146cad63d1f649ba5"} Feb 20 08:45:11 crc kubenswrapper[5094]: I0220 08:45:11.486198 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-mkwlr" podStartSLOduration=2.327103886 
podStartE2EDuration="10.486180277s" podCreationTimestamp="2026-02-20 08:45:01 +0000 UTC" firstStartedPulling="2026-02-20 08:45:02.643002502 +0000 UTC m=+7117.515629223" lastFinishedPulling="2026-02-20 08:45:10.802078903 +0000 UTC m=+7125.674705614" observedRunningTime="2026-02-20 08:45:11.484565858 +0000 UTC m=+7126.357192569" watchObservedRunningTime="2026-02-20 08:45:11.486180277 +0000 UTC m=+7126.358806988" Feb 20 08:45:16 crc kubenswrapper[5094]: I0220 08:45:16.508730 5094 generic.go:334] "Generic (PLEG): container finished" podID="c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91" containerID="cc26980783bd1f2aa717556354f0e4a6d1d7792f5dd61a8146cad63d1f649ba5" exitCode=0 Feb 20 08:45:16 crc kubenswrapper[5094]: I0220 08:45:16.508767 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mkwlr" event={"ID":"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91","Type":"ContainerDied","Data":"cc26980783bd1f2aa717556354f0e4a6d1d7792f5dd61a8146cad63d1f649ba5"} Feb 20 08:45:17 crc kubenswrapper[5094]: I0220 08:45:17.800159 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:17 crc kubenswrapper[5094]: I0220 08:45:17.959139 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-combined-ca-bundle\") pod \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\" (UID: \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " Feb 20 08:45:17 crc kubenswrapper[5094]: I0220 08:45:17.959255 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xb425\" (UniqueName: \"kubernetes.io/projected/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-kube-api-access-xb425\") pod \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\" (UID: \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " Feb 20 08:45:17 crc kubenswrapper[5094]: I0220 08:45:17.959316 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-scripts\") pod \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\" (UID: \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " Feb 20 08:45:17 crc kubenswrapper[5094]: I0220 08:45:17.959363 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-config-data\") pod \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\" (UID: \"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91\") " Feb 20 08:45:17 crc kubenswrapper[5094]: I0220 08:45:17.964718 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-scripts" (OuterVolumeSpecName: "scripts") pod "c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91" (UID: "c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:17 crc kubenswrapper[5094]: I0220 08:45:17.967976 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-kube-api-access-xb425" (OuterVolumeSpecName: "kube-api-access-xb425") pod "c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91" (UID: "c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91"). InnerVolumeSpecName "kube-api-access-xb425". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:45:17 crc kubenswrapper[5094]: I0220 08:45:17.987692 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-config-data" (OuterVolumeSpecName: "config-data") pod "c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91" (UID: "c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.001753 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91" (UID: "c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.061196 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.061236 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xb425\" (UniqueName: \"kubernetes.io/projected/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-kube-api-access-xb425\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.061249 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.061257 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.525083 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mkwlr" event={"ID":"c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91","Type":"ContainerDied","Data":"3514502e9d5373371a4ad5a129f3f7468d51d10eae032c22c932337ebad3e532"} Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.525131 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mkwlr" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.525131 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3514502e9d5373371a4ad5a129f3f7468d51d10eae032c22c932337ebad3e532" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.645596 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 08:45:18 crc kubenswrapper[5094]: E0220 08:45:18.646014 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91" containerName="nova-cell0-conductor-db-sync" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.646032 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91" containerName="nova-cell0-conductor-db-sync" Feb 20 08:45:18 crc kubenswrapper[5094]: E0220 08:45:18.646043 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4271712d-7fb9-4862-bc38-e3cfbcced425" containerName="collect-profiles" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.646052 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4271712d-7fb9-4862-bc38-e3cfbcced425" containerName="collect-profiles" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.646237 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4271712d-7fb9-4862-bc38-e3cfbcced425" containerName="collect-profiles" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.646255 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91" containerName="nova-cell0-conductor-db-sync" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.649441 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.652034 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-2cnhr" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.652222 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.656899 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.775168 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.775612 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c49n4\" (UniqueName: \"kubernetes.io/projected/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-kube-api-access-c49n4\") pod \"nova-cell0-conductor-0\" (UID: \"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.775775 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.877117 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c49n4\" (UniqueName: 
\"kubernetes.io/projected/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-kube-api-access-c49n4\") pod \"nova-cell0-conductor-0\" (UID: \"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.877213 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.877316 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.882119 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.882287 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.900547 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c49n4\" (UniqueName: \"kubernetes.io/projected/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-kube-api-access-c49n4\") pod \"nova-cell0-conductor-0\" (UID: 
\"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:45:18 crc kubenswrapper[5094]: I0220 08:45:18.970965 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 08:45:19 crc kubenswrapper[5094]: I0220 08:45:19.423679 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 08:45:19 crc kubenswrapper[5094]: I0220 08:45:19.540827 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca","Type":"ContainerStarted","Data":"ed4718ea392d6b6de8bbaf21e72aac2f90b8c262b455dfd4b484e4049f29e229"} Feb 20 08:45:19 crc kubenswrapper[5094]: I0220 08:45:19.839685 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:45:19 crc kubenswrapper[5094]: E0220 08:45:19.839935 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:45:20 crc kubenswrapper[5094]: I0220 08:45:20.551979 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca","Type":"ContainerStarted","Data":"e554f01421ff77c0561f5176cdeba2c223172b1b97e15c34de574fa303143b58"} Feb 20 08:45:20 crc kubenswrapper[5094]: I0220 08:45:20.553171 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 20 08:45:20 crc kubenswrapper[5094]: I0220 08:45:20.577170 5094 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.577156372 podStartE2EDuration="2.577156372s" podCreationTimestamp="2026-02-20 08:45:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:45:20.569805506 +0000 UTC m=+7135.442432217" watchObservedRunningTime="2026-02-20 08:45:20.577156372 +0000 UTC m=+7135.449783083" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.004422 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.451680 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-62kch"] Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.453748 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.460998 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-62kch"] Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.461979 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.462042 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.572606 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tkl4\" (UniqueName: \"kubernetes.io/projected/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-kube-api-access-4tkl4\") pod \"nova-cell0-cell-mapping-62kch\" (UID: \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.572725 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-config-data\") pod \"nova-cell0-cell-mapping-62kch\" (UID: \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.572783 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-scripts\") pod \"nova-cell0-cell-mapping-62kch\" (UID: \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.572812 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-62kch\" (UID: \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.603656 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.605225 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.607151 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.614952 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.616941 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.620084 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.637253 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.653386 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.676955 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tkl4\" (UniqueName: \"kubernetes.io/projected/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-kube-api-access-4tkl4\") pod \"nova-cell0-cell-mapping-62kch\" (UID: \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.677025 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-config-data\") pod \"nova-cell0-cell-mapping-62kch\" (UID: \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.677063 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-scripts\") pod \"nova-cell0-cell-mapping-62kch\" (UID: \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.677084 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-62kch\" (UID: 
\"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.690269 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-scripts\") pod \"nova-cell0-cell-mapping-62kch\" (UID: \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.690754 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-62kch\" (UID: \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.694309 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-config-data\") pod \"nova-cell0-cell-mapping-62kch\" (UID: \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.709754 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tkl4\" (UniqueName: \"kubernetes.io/projected/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-kube-api-access-4tkl4\") pod \"nova-cell0-cell-mapping-62kch\" (UID: \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.721021 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.722660 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.728266 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.746851 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.780436 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-config-data\") pod \"nova-api-0\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " pod="openstack/nova-api-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.780949 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b2421e1-8243-473f-8dd5-86bc130d251f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b2421e1-8243-473f-8dd5-86bc130d251f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.781223 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh7fd\" (UniqueName: \"kubernetes.io/projected/1b2421e1-8243-473f-8dd5-86bc130d251f-kube-api-access-kh7fd\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b2421e1-8243-473f-8dd5-86bc130d251f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.781372 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b2421e1-8243-473f-8dd5-86bc130d251f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b2421e1-8243-473f-8dd5-86bc130d251f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 
08:45:29.781469 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " pod="openstack/nova-api-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.781587 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z897\" (UniqueName: \"kubernetes.io/projected/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-kube-api-access-8z897\") pod \"nova-api-0\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " pod="openstack/nova-api-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.781777 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-logs\") pod \"nova-api-0\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " pod="openstack/nova-api-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.798032 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.806226 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.807961 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.810975 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.839943 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.887817 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z897\" (UniqueName: \"kubernetes.io/projected/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-kube-api-access-8z897\") pod \"nova-api-0\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " pod="openstack/nova-api-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.888170 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecd0e105-6bf4-436e-9d70-1b42f662e67f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ecd0e105-6bf4-436e-9d70-1b42f662e67f\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.888210 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecd0e105-6bf4-436e-9d70-1b42f662e67f-config-data\") pod \"nova-scheduler-0\" (UID: \"ecd0e105-6bf4-436e-9d70-1b42f662e67f\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.888273 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-logs\") pod \"nova-api-0\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " pod="openstack/nova-api-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.888332 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-config-data\") pod \"nova-api-0\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " pod="openstack/nova-api-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.888370 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b2421e1-8243-473f-8dd5-86bc130d251f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b2421e1-8243-473f-8dd5-86bc130d251f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.888444 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ljr7\" (UniqueName: \"kubernetes.io/projected/ecd0e105-6bf4-436e-9d70-1b42f662e67f-kube-api-access-2ljr7\") pod \"nova-scheduler-0\" (UID: \"ecd0e105-6bf4-436e-9d70-1b42f662e67f\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.888614 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh7fd\" (UniqueName: \"kubernetes.io/projected/1b2421e1-8243-473f-8dd5-86bc130d251f-kube-api-access-kh7fd\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b2421e1-8243-473f-8dd5-86bc130d251f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.888640 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b2421e1-8243-473f-8dd5-86bc130d251f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b2421e1-8243-473f-8dd5-86bc130d251f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.888661 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " pod="openstack/nova-api-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.902552 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-config-data\") pod \"nova-api-0\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " pod="openstack/nova-api-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.903844 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b2421e1-8243-473f-8dd5-86bc130d251f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b2421e1-8243-473f-8dd5-86bc130d251f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.904634 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-logs\") pod \"nova-api-0\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " pod="openstack/nova-api-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.918743 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " pod="openstack/nova-api-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.929576 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b2421e1-8243-473f-8dd5-86bc130d251f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b2421e1-8243-473f-8dd5-86bc130d251f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.931482 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z897\" (UniqueName: \"kubernetes.io/projected/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-kube-api-access-8z897\") pod \"nova-api-0\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " pod="openstack/nova-api-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.935056 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh7fd\" (UniqueName: \"kubernetes.io/projected/1b2421e1-8243-473f-8dd5-86bc130d251f-kube-api-access-kh7fd\") pod \"nova-cell1-novncproxy-0\" (UID: \"1b2421e1-8243-473f-8dd5-86bc130d251f\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.940746 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.964775 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9dcb44685-h54hc"] Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.970100 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9dcb44685-h54hc" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.976910 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9dcb44685-h54hc"] Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.990487 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf10215b-d08a-452a-a7de-e7c828922d47-logs\") pod \"nova-metadata-0\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " pod="openstack/nova-metadata-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.990538 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf10215b-d08a-452a-a7de-e7c828922d47-config-data\") pod \"nova-metadata-0\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " pod="openstack/nova-metadata-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.990563 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59tqv\" (UniqueName: \"kubernetes.io/projected/cf10215b-d08a-452a-a7de-e7c828922d47-kube-api-access-59tqv\") pod \"nova-metadata-0\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " pod="openstack/nova-metadata-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.990592 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ljr7\" (UniqueName: \"kubernetes.io/projected/ecd0e105-6bf4-436e-9d70-1b42f662e67f-kube-api-access-2ljr7\") pod \"nova-scheduler-0\" (UID: \"ecd0e105-6bf4-436e-9d70-1b42f662e67f\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.990672 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cf10215b-d08a-452a-a7de-e7c828922d47-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " pod="openstack/nova-metadata-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.990697 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecd0e105-6bf4-436e-9d70-1b42f662e67f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ecd0e105-6bf4-436e-9d70-1b42f662e67f\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.991273 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecd0e105-6bf4-436e-9d70-1b42f662e67f-config-data\") pod \"nova-scheduler-0\" (UID: \"ecd0e105-6bf4-436e-9d70-1b42f662e67f\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:29 crc kubenswrapper[5094]: I0220 08:45:29.996391 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecd0e105-6bf4-436e-9d70-1b42f662e67f-config-data\") pod \"nova-scheduler-0\" (UID: \"ecd0e105-6bf4-436e-9d70-1b42f662e67f\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.001893 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecd0e105-6bf4-436e-9d70-1b42f662e67f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ecd0e105-6bf4-436e-9d70-1b42f662e67f\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.014847 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ljr7\" (UniqueName: \"kubernetes.io/projected/ecd0e105-6bf4-436e-9d70-1b42f662e67f-kube-api-access-2ljr7\") pod \"nova-scheduler-0\" (UID: \"ecd0e105-6bf4-436e-9d70-1b42f662e67f\") " pod="openstack/nova-scheduler-0" Feb 
20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.092113 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.096066 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-ovsdbserver-sb\") pod \"dnsmasq-dns-9dcb44685-h54hc\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.096152 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-ovsdbserver-nb\") pod \"dnsmasq-dns-9dcb44685-h54hc\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.096200 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf10215b-d08a-452a-a7de-e7c828922d47-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.096240 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-config\") pod \"dnsmasq-dns-9dcb44685-h54hc\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.096271 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbwbn\" (UniqueName: \"kubernetes.io/projected/67ea7a9d-b48a-4ea9-be81-50d152a57e58-kube-api-access-jbwbn\") pod \"dnsmasq-dns-9dcb44685-h54hc\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.096304 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-dns-svc\") pod \"dnsmasq-dns-9dcb44685-h54hc\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.096338 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf10215b-d08a-452a-a7de-e7c828922d47-logs\") pod \"nova-metadata-0\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.096365 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf10215b-d08a-452a-a7de-e7c828922d47-config-data\") pod \"nova-metadata-0\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.096411 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59tqv\" (UniqueName: \"kubernetes.io/projected/cf10215b-d08a-452a-a7de-e7c828922d47-kube-api-access-59tqv\") pod \"nova-metadata-0\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.098827 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf10215b-d08a-452a-a7de-e7c828922d47-logs\") pod \"nova-metadata-0\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.103018 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf10215b-d08a-452a-a7de-e7c828922d47-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.115359 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf10215b-d08a-452a-a7de-e7c828922d47-config-data\") pod \"nova-metadata-0\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.129832 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59tqv\" (UniqueName: \"kubernetes.io/projected/cf10215b-d08a-452a-a7de-e7c828922d47-kube-api-access-59tqv\") pod \"nova-metadata-0\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " pod="openstack/nova-metadata-0"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.176920 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.198855 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-ovsdbserver-sb\") pod \"dnsmasq-dns-9dcb44685-h54hc\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.198918 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-ovsdbserver-nb\") pod \"dnsmasq-dns-9dcb44685-h54hc\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.198986 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-config\") pod \"dnsmasq-dns-9dcb44685-h54hc\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.199031 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbwbn\" (UniqueName: \"kubernetes.io/projected/67ea7a9d-b48a-4ea9-be81-50d152a57e58-kube-api-access-jbwbn\") pod \"dnsmasq-dns-9dcb44685-h54hc\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.199066 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-dns-svc\") pod \"dnsmasq-dns-9dcb44685-h54hc\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.199926 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-ovsdbserver-sb\") pod \"dnsmasq-dns-9dcb44685-h54hc\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.200532 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-dns-svc\") pod \"dnsmasq-dns-9dcb44685-h54hc\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.200857 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-config\") pod \"dnsmasq-dns-9dcb44685-h54hc\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.201118 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-ovsdbserver-nb\") pod \"dnsmasq-dns-9dcb44685-h54hc\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.219744 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbwbn\" (UniqueName: \"kubernetes.io/projected/67ea7a9d-b48a-4ea9-be81-50d152a57e58-kube-api-access-jbwbn\") pod \"dnsmasq-dns-9dcb44685-h54hc\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.225721 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.298315 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.426667 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-62kch"]
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.510689 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cwpxs"]
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.512475 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.518643 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.518856 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.525413 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cwpxs"]
Feb 20 08:45:30 crc kubenswrapper[5094]: W0220 08:45:30.543849 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59f04fe3_56d8_4fcb_a1bf_35b730bd7d89.slice/crio-9fcd6b0d8fc435c109aac005bdc71da0aba586fad59d4980840ac748c251b23e WatchSource:0}: Error finding container 9fcd6b0d8fc435c109aac005bdc71da0aba586fad59d4980840ac748c251b23e: Status 404 returned error can't find the container with id 9fcd6b0d8fc435c109aac005bdc71da0aba586fad59d4980840ac748c251b23e
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.554861 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.607772 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-config-data\") pod \"nova-cell1-conductor-db-sync-cwpxs\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") " pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.607821 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-cwpxs\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") " pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.607863 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcnmh\" (UniqueName: \"kubernetes.io/projected/077dc649-6898-4f04-837d-b694decf612b-kube-api-access-vcnmh\") pod \"nova-cell1-conductor-db-sync-cwpxs\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") " pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.607886 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-scripts\") pod \"nova-cell1-conductor-db-sync-cwpxs\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") " pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.652762 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89","Type":"ContainerStarted","Data":"9fcd6b0d8fc435c109aac005bdc71da0aba586fad59d4980840ac748c251b23e"}
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.658138 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-62kch" event={"ID":"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3","Type":"ContainerStarted","Data":"a81d372de4e3c666bafe4f07d24f6f7cc4bee3b83fd66c5127214476a5ecb65e"}
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.658169 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-62kch" event={"ID":"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3","Type":"ContainerStarted","Data":"4451dcbad7b501ecea1158230334701023d79a8cdb69845458a3f3fcb82a0bfe"}
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.674074 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 20 08:45:30 crc kubenswrapper[5094]: W0220 08:45:30.675778 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecd0e105_6bf4_436e_9d70_1b42f662e67f.slice/crio-984caf900fe47312a9e7a1b27d29fca7525debf96ff707a7c96772dcdf30d5cf WatchSource:0}: Error finding container 984caf900fe47312a9e7a1b27d29fca7525debf96ff707a7c96772dcdf30d5cf: Status 404 returned error can't find the container with id 984caf900fe47312a9e7a1b27d29fca7525debf96ff707a7c96772dcdf30d5cf
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.679760 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-62kch" podStartSLOduration=1.679733987 podStartE2EDuration="1.679733987s" podCreationTimestamp="2026-02-20 08:45:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:45:30.672217975 +0000 UTC m=+7145.544844686" watchObservedRunningTime="2026-02-20 08:45:30.679733987 +0000 UTC m=+7145.552360718"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.709929 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-config-data\") pod \"nova-cell1-conductor-db-sync-cwpxs\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") " pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.709983 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-cwpxs\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") " pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.710054 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcnmh\" (UniqueName: \"kubernetes.io/projected/077dc649-6898-4f04-837d-b694decf612b-kube-api-access-vcnmh\") pod \"nova-cell1-conductor-db-sync-cwpxs\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") " pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.710080 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-scripts\") pod \"nova-cell1-conductor-db-sync-cwpxs\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") " pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.719089 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-scripts\") pod \"nova-cell1-conductor-db-sync-cwpxs\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") " pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.721552 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-cwpxs\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") " pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.731024 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-config-data\") pod \"nova-cell1-conductor-db-sync-cwpxs\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") " pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.732589 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcnmh\" (UniqueName: \"kubernetes.io/projected/077dc649-6898-4f04-837d-b694decf612b-kube-api-access-vcnmh\") pod \"nova-cell1-conductor-db-sync-cwpxs\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") " pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.739524 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.843151 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.892055 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 20 08:45:30 crc kubenswrapper[5094]: I0220 08:45:30.900935 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9dcb44685-h54hc"]
Feb 20 08:45:30 crc kubenswrapper[5094]: W0220 08:45:30.924296 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67ea7a9d_b48a_4ea9_be81_50d152a57e58.slice/crio-aecfbb6ad113706e1f13505e743a3ff12b31470b6a099b5823c226a7d30ce55c WatchSource:0}: Error finding container aecfbb6ad113706e1f13505e743a3ff12b31470b6a099b5823c226a7d30ce55c: Status 404 returned error can't find the container with id aecfbb6ad113706e1f13505e743a3ff12b31470b6a099b5823c226a7d30ce55c
Feb 20 08:45:31 crc kubenswrapper[5094]: I0220 08:45:31.355959 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cwpxs"]
Feb 20 08:45:31 crc kubenswrapper[5094]: W0220 08:45:31.368988 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod077dc649_6898_4f04_837d_b694decf612b.slice/crio-726988865e66909f2e9134f226f72595647874d8b06885775a28a8d3e68af441 WatchSource:0}: Error finding container 726988865e66909f2e9134f226f72595647874d8b06885775a28a8d3e68af441: Status 404 returned error can't find the container with id 726988865e66909f2e9134f226f72595647874d8b06885775a28a8d3e68af441
Feb 20 08:45:31 crc kubenswrapper[5094]: I0220 08:45:31.694881 5094 generic.go:334] "Generic (PLEG): container finished" podID="67ea7a9d-b48a-4ea9-be81-50d152a57e58" containerID="c16a7fdf9fc7f05c88a3c26ca7db3574a5f1cb1c1c0097c7a0361cae6e703d9a" exitCode=0
Feb 20 08:45:31 crc kubenswrapper[5094]: I0220 08:45:31.695012 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dcb44685-h54hc" event={"ID":"67ea7a9d-b48a-4ea9-be81-50d152a57e58","Type":"ContainerDied","Data":"c16a7fdf9fc7f05c88a3c26ca7db3574a5f1cb1c1c0097c7a0361cae6e703d9a"}
Feb 20 08:45:31 crc kubenswrapper[5094]: I0220 08:45:31.695041 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dcb44685-h54hc" event={"ID":"67ea7a9d-b48a-4ea9-be81-50d152a57e58","Type":"ContainerStarted","Data":"aecfbb6ad113706e1f13505e743a3ff12b31470b6a099b5823c226a7d30ce55c"}
Feb 20 08:45:31 crc kubenswrapper[5094]: I0220 08:45:31.697721 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ecd0e105-6bf4-436e-9d70-1b42f662e67f","Type":"ContainerStarted","Data":"984caf900fe47312a9e7a1b27d29fca7525debf96ff707a7c96772dcdf30d5cf"}
Feb 20 08:45:31 crc kubenswrapper[5094]: I0220 08:45:31.699752 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-cwpxs" event={"ID":"077dc649-6898-4f04-837d-b694decf612b","Type":"ContainerStarted","Data":"20cb7da637ab28cd5dddb0edb00930b7379bd84765eae45228fa5efc43d1c866"}
Feb 20 08:45:31 crc kubenswrapper[5094]: I0220 08:45:31.699778 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-cwpxs" event={"ID":"077dc649-6898-4f04-837d-b694decf612b","Type":"ContainerStarted","Data":"726988865e66909f2e9134f226f72595647874d8b06885775a28a8d3e68af441"}
Feb 20 08:45:31 crc kubenswrapper[5094]: I0220 08:45:31.702034 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1b2421e1-8243-473f-8dd5-86bc130d251f","Type":"ContainerStarted","Data":"3235e0f9a591cd9fc84d38dd78cd295e9068409f0e216b0a40d76217b8e522fa"}
Feb 20 08:45:31 crc kubenswrapper[5094]: I0220 08:45:31.704662 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cf10215b-d08a-452a-a7de-e7c828922d47","Type":"ContainerStarted","Data":"2c8f9dde56772efe4f4ead50b7fa81668866d757af9f32b2557ab9828d4e4e5e"}
Feb 20 08:45:31 crc kubenswrapper[5094]: I0220 08:45:31.747290 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-cwpxs" podStartSLOduration=1.747266283 podStartE2EDuration="1.747266283s" podCreationTimestamp="2026-02-20 08:45:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:45:31.731187016 +0000 UTC m=+7146.603813727" watchObservedRunningTime="2026-02-20 08:45:31.747266283 +0000 UTC m=+7146.619892994"
Feb 20 08:45:33 crc kubenswrapper[5094]: I0220 08:45:33.728315 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1b2421e1-8243-473f-8dd5-86bc130d251f","Type":"ContainerStarted","Data":"df33dc2485d8aea35d1d004dfc1ba0cd8d779ec59ff0b6fd5dcd53a2dfe85422"}
Feb 20 08:45:33 crc kubenswrapper[5094]: I0220 08:45:33.730985 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cf10215b-d08a-452a-a7de-e7c828922d47","Type":"ContainerStarted","Data":"a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2"}
Feb 20 08:45:33 crc kubenswrapper[5094]: I0220 08:45:33.731024 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cf10215b-d08a-452a-a7de-e7c828922d47","Type":"ContainerStarted","Data":"4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866"}
Feb 20 08:45:33 crc kubenswrapper[5094]: I0220 08:45:33.733278 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dcb44685-h54hc" event={"ID":"67ea7a9d-b48a-4ea9-be81-50d152a57e58","Type":"ContainerStarted","Data":"362e6ae91ccecf694cdc7738cb8ce59162ad28f767e767b41fae829ceaf54420"}
Feb 20 08:45:33 crc kubenswrapper[5094]: I0220 08:45:33.733729 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9dcb44685-h54hc"
Feb 20 08:45:33 crc kubenswrapper[5094]: I0220 08:45:33.741447 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ecd0e105-6bf4-436e-9d70-1b42f662e67f","Type":"ContainerStarted","Data":"eff10a7e15bd494bb0f1c45b333f9c99c72283453691bb43a221edbe3d81d589"}
Feb 20 08:45:33 crc kubenswrapper[5094]: I0220 08:45:33.761408 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.6816731369999998 podStartE2EDuration="4.761385547s" podCreationTimestamp="2026-02-20 08:45:29 +0000 UTC" firstStartedPulling="2026-02-20 08:45:30.900780594 +0000 UTC m=+7145.773407305" lastFinishedPulling="2026-02-20 08:45:32.980492964 +0000 UTC m=+7147.853119715" observedRunningTime="2026-02-20 08:45:33.75115255 +0000 UTC m=+7148.623779261" watchObservedRunningTime="2026-02-20 08:45:33.761385547 +0000 UTC m=+7148.634012278"
Feb 20 08:45:33 crc kubenswrapper[5094]: I0220 08:45:33.763639 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89","Type":"ContainerStarted","Data":"471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55"}
Feb 20 08:45:33 crc kubenswrapper[5094]: I0220 08:45:33.763686 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89","Type":"ContainerStarted","Data":"bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51"}
Feb 20 08:45:33 crc kubenswrapper[5094]: I0220 08:45:33.786172 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9dcb44685-h54hc" podStartSLOduration=4.786151762 podStartE2EDuration="4.786151762s" podCreationTimestamp="2026-02-20 08:45:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:45:33.774135463 +0000 UTC m=+7148.646762174" watchObservedRunningTime="2026-02-20 08:45:33.786151762 +0000 UTC m=+7148.658778473"
Feb 20 08:45:33 crc kubenswrapper[5094]: I0220 08:45:33.812838 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.573476443 podStartE2EDuration="4.812819303s" podCreationTimestamp="2026-02-20 08:45:29 +0000 UTC" firstStartedPulling="2026-02-20 08:45:30.741484852 +0000 UTC m=+7145.614111563" lastFinishedPulling="2026-02-20 08:45:32.980827712 +0000 UTC m=+7147.853454423" observedRunningTime="2026-02-20 08:45:33.798051778 +0000 UTC m=+7148.670678489" watchObservedRunningTime="2026-02-20 08:45:33.812819303 +0000 UTC m=+7148.685446014"
Feb 20 08:45:33 crc kubenswrapper[5094]: I0220 08:45:33.819439 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.5219750039999997 podStartE2EDuration="4.819422192s" podCreationTimestamp="2026-02-20 08:45:29 +0000 UTC" firstStartedPulling="2026-02-20 08:45:30.67904759 +0000 UTC m=+7145.551674301" lastFinishedPulling="2026-02-20 08:45:32.976494778 +0000 UTC m=+7147.849121489" observedRunningTime="2026-02-20 08:45:33.814050143 +0000 UTC m=+7148.686676864" watchObservedRunningTime="2026-02-20 08:45:33.819422192 +0000 UTC m=+7148.692048903"
Feb 20 08:45:33 crc kubenswrapper[5094]: I0220 08:45:33.839012 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.406714001 podStartE2EDuration="4.838994573s" podCreationTimestamp="2026-02-20 08:45:29 +0000 UTC" firstStartedPulling="2026-02-20 08:45:30.546313587 +0000 UTC m=+7145.418940288" lastFinishedPulling="2026-02-20 08:45:32.978594149 +0000 UTC m=+7147.851220860" observedRunningTime="2026-02-20 08:45:33.837081647 +0000 UTC m=+7148.709708368" watchObservedRunningTime="2026-02-20 08:45:33.838994573 +0000 UTC m=+7148.711621284"
Feb 20 08:45:33 crc kubenswrapper[5094]: I0220 08:45:33.840172 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec"
Feb 20 08:45:33 crc kubenswrapper[5094]: E0220 08:45:33.840443 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:45:34 crc kubenswrapper[5094]: I0220 08:45:34.774043 5094 generic.go:334] "Generic (PLEG): container finished" podID="077dc649-6898-4f04-837d-b694decf612b" containerID="20cb7da637ab28cd5dddb0edb00930b7379bd84765eae45228fa5efc43d1c866" exitCode=0
Feb 20 08:45:34 crc kubenswrapper[5094]: I0220 08:45:34.774875 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-cwpxs" event={"ID":"077dc649-6898-4f04-837d-b694decf612b","Type":"ContainerDied","Data":"20cb7da637ab28cd5dddb0edb00930b7379bd84765eae45228fa5efc43d1c866"}
Feb 20 08:45:35 crc kubenswrapper[5094]: I0220 08:45:35.092533 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 20 08:45:35 crc kubenswrapper[5094]: I0220 08:45:35.178069 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 20 08:45:35 crc kubenswrapper[5094]: I0220 08:45:35.178118 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 20 08:45:35 crc kubenswrapper[5094]: I0220 08:45:35.225891 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Feb 20 08:45:35 crc kubenswrapper[5094]: I0220 08:45:35.782999 5094 generic.go:334] "Generic (PLEG): container finished" podID="2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3" containerID="a81d372de4e3c666bafe4f07d24f6f7cc4bee3b83fd66c5127214476a5ecb65e" exitCode=0
Feb 20 08:45:35 crc kubenswrapper[5094]: I0220 08:45:35.783110 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-62kch" event={"ID":"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3","Type":"ContainerDied","Data":"a81d372de4e3c666bafe4f07d24f6f7cc4bee3b83fd66c5127214476a5ecb65e"}
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.127653 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.216730 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-combined-ca-bundle\") pod \"077dc649-6898-4f04-837d-b694decf612b\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") "
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.217117 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-config-data\") pod \"077dc649-6898-4f04-837d-b694decf612b\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") "
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.217212 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-scripts\") pod \"077dc649-6898-4f04-837d-b694decf612b\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") "
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.217235 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcnmh\" (UniqueName: \"kubernetes.io/projected/077dc649-6898-4f04-837d-b694decf612b-kube-api-access-vcnmh\") pod \"077dc649-6898-4f04-837d-b694decf612b\" (UID: \"077dc649-6898-4f04-837d-b694decf612b\") "
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.222694 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-scripts" (OuterVolumeSpecName: "scripts") pod "077dc649-6898-4f04-837d-b694decf612b" (UID: "077dc649-6898-4f04-837d-b694decf612b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.227123 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/077dc649-6898-4f04-837d-b694decf612b-kube-api-access-vcnmh" (OuterVolumeSpecName: "kube-api-access-vcnmh") pod "077dc649-6898-4f04-837d-b694decf612b" (UID: "077dc649-6898-4f04-837d-b694decf612b"). InnerVolumeSpecName "kube-api-access-vcnmh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.271024 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-config-data" (OuterVolumeSpecName: "config-data") pod "077dc649-6898-4f04-837d-b694decf612b" (UID: "077dc649-6898-4f04-837d-b694decf612b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.274967 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "077dc649-6898-4f04-837d-b694decf612b" (UID: "077dc649-6898-4f04-837d-b694decf612b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.318732 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.318755 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.318766 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/077dc649-6898-4f04-837d-b694decf612b-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.318775 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcnmh\" (UniqueName: \"kubernetes.io/projected/077dc649-6898-4f04-837d-b694decf612b-kube-api-access-vcnmh\") on node \"crc\" DevicePath \"\""
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.797568 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-cwpxs" event={"ID":"077dc649-6898-4f04-837d-b694decf612b","Type":"ContainerDied","Data":"726988865e66909f2e9134f226f72595647874d8b06885775a28a8d3e68af441"}
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.797641 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-cwpxs"
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.797662 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="726988865e66909f2e9134f226f72595647874d8b06885775a28a8d3e68af441"
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.892611 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 20 08:45:36 crc kubenswrapper[5094]: E0220 08:45:36.893293 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077dc649-6898-4f04-837d-b694decf612b" containerName="nova-cell1-conductor-db-sync"
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.893324 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="077dc649-6898-4f04-837d-b694decf612b" containerName="nova-cell1-conductor-db-sync"
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.893646 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="077dc649-6898-4f04-837d-b694decf612b" containerName="nova-cell1-conductor-db-sync"
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.894636 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.898692 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.903540 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.931764 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd3b441-92b9-4fd4-8451-dec1c354915e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4bd3b441-92b9-4fd4-8451-dec1c354915e\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.931850 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mcnc\" (UniqueName: \"kubernetes.io/projected/4bd3b441-92b9-4fd4-8451-dec1c354915e-kube-api-access-7mcnc\") pod \"nova-cell1-conductor-0\" (UID: \"4bd3b441-92b9-4fd4-8451-dec1c354915e\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 08:45:36 crc kubenswrapper[5094]: I0220 08:45:36.931909 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd3b441-92b9-4fd4-8451-dec1c354915e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4bd3b441-92b9-4fd4-8451-dec1c354915e\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.033815 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd3b441-92b9-4fd4-8451-dec1c354915e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4bd3b441-92b9-4fd4-8451-dec1c354915e\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 08:45:37 crc 
kubenswrapper[5094]: I0220 08:45:37.033871 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mcnc\" (UniqueName: \"kubernetes.io/projected/4bd3b441-92b9-4fd4-8451-dec1c354915e-kube-api-access-7mcnc\") pod \"nova-cell1-conductor-0\" (UID: \"4bd3b441-92b9-4fd4-8451-dec1c354915e\") " pod="openstack/nova-cell1-conductor-0" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.033906 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd3b441-92b9-4fd4-8451-dec1c354915e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4bd3b441-92b9-4fd4-8451-dec1c354915e\") " pod="openstack/nova-cell1-conductor-0" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.053815 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd3b441-92b9-4fd4-8451-dec1c354915e-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"4bd3b441-92b9-4fd4-8451-dec1c354915e\") " pod="openstack/nova-cell1-conductor-0" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.054397 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd3b441-92b9-4fd4-8451-dec1c354915e-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"4bd3b441-92b9-4fd4-8451-dec1c354915e\") " pod="openstack/nova-cell1-conductor-0" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.055050 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mcnc\" (UniqueName: \"kubernetes.io/projected/4bd3b441-92b9-4fd4-8451-dec1c354915e-kube-api-access-7mcnc\") pod \"nova-cell1-conductor-0\" (UID: \"4bd3b441-92b9-4fd4-8451-dec1c354915e\") " pod="openstack/nova-cell1-conductor-0" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.203232 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.229060 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.238230 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-config-data\") pod \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\" (UID: \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.238368 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-combined-ca-bundle\") pod \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\" (UID: \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.238398 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-scripts\") pod \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\" (UID: \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.238491 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tkl4\" (UniqueName: \"kubernetes.io/projected/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-kube-api-access-4tkl4\") pod \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\" (UID: \"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3\") " Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.243853 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-scripts" (OuterVolumeSpecName: "scripts") pod "2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3" (UID: "2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.250092 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-kube-api-access-4tkl4" (OuterVolumeSpecName: "kube-api-access-4tkl4") pod "2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3" (UID: "2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3"). InnerVolumeSpecName "kube-api-access-4tkl4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.277120 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-config-data" (OuterVolumeSpecName: "config-data") pod "2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3" (UID: "2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.279783 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3" (UID: "2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.332969 5094 scope.go:117] "RemoveContainer" containerID="3bce3a1b265a86956611d3f0e735c171240869a005a9b1236b337c2fd156e6f9" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.342891 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.342944 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.342963 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tkl4\" (UniqueName: \"kubernetes.io/projected/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-kube-api-access-4tkl4\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.342981 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.386314 5094 scope.go:117] "RemoveContainer" containerID="a7f01ab3dfebce16c461640e15ab5cb83ed76e8a8bf4b49d9de590c4cb6aacd4" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.681394 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 08:45:37 crc kubenswrapper[5094]: W0220 08:45:37.688880 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bd3b441_92b9_4fd4_8451_dec1c354915e.slice/crio-05c39de3bac79bc7ca3ed4fb07b709e2d6f8ab1a2eb157f4f36bbc1df541309c WatchSource:0}: Error finding container 
05c39de3bac79bc7ca3ed4fb07b709e2d6f8ab1a2eb157f4f36bbc1df541309c: Status 404 returned error can't find the container with id 05c39de3bac79bc7ca3ed4fb07b709e2d6f8ab1a2eb157f4f36bbc1df541309c Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.808649 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-62kch" event={"ID":"2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3","Type":"ContainerDied","Data":"4451dcbad7b501ecea1158230334701023d79a8cdb69845458a3f3fcb82a0bfe"} Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.808681 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-62kch" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.808692 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4451dcbad7b501ecea1158230334701023d79a8cdb69845458a3f3fcb82a0bfe" Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.813699 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4bd3b441-92b9-4fd4-8451-dec1c354915e","Type":"ContainerStarted","Data":"05c39de3bac79bc7ca3ed4fb07b709e2d6f8ab1a2eb157f4f36bbc1df541309c"} Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.990216 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.990780 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="59f04fe3-56d8-4fcb-a1bf-35b730bd7d89" containerName="nova-api-log" containerID="cri-o://bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51" gracePeriod=30 Feb 20 08:45:37 crc kubenswrapper[5094]: I0220 08:45:37.990944 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="59f04fe3-56d8-4fcb-a1bf-35b730bd7d89" containerName="nova-api-api" 
containerID="cri-o://471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55" gracePeriod=30 Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.008493 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.008771 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ecd0e105-6bf4-436e-9d70-1b42f662e67f" containerName="nova-scheduler-scheduler" containerID="cri-o://eff10a7e15bd494bb0f1c45b333f9c99c72283453691bb43a221edbe3d81d589" gracePeriod=30 Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.020381 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.020574 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cf10215b-d08a-452a-a7de-e7c828922d47" containerName="nova-metadata-log" containerID="cri-o://4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866" gracePeriod=30 Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.020721 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cf10215b-d08a-452a-a7de-e7c828922d47" containerName="nova-metadata-metadata" containerID="cri-o://a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2" gracePeriod=30 Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.565082 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.573865 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.667269 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-config-data\") pod \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.667396 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf10215b-d08a-452a-a7de-e7c828922d47-logs\") pod \"cf10215b-d08a-452a-a7de-e7c828922d47\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.667426 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-combined-ca-bundle\") pod \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.667467 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z897\" (UniqueName: \"kubernetes.io/projected/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-kube-api-access-8z897\") pod \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.667492 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59tqv\" (UniqueName: \"kubernetes.io/projected/cf10215b-d08a-452a-a7de-e7c828922d47-kube-api-access-59tqv\") pod \"cf10215b-d08a-452a-a7de-e7c828922d47\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.667544 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-logs\") pod \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\" (UID: \"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89\") " Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.667585 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf10215b-d08a-452a-a7de-e7c828922d47-combined-ca-bundle\") pod \"cf10215b-d08a-452a-a7de-e7c828922d47\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.667657 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf10215b-d08a-452a-a7de-e7c828922d47-config-data\") pod \"cf10215b-d08a-452a-a7de-e7c828922d47\" (UID: \"cf10215b-d08a-452a-a7de-e7c828922d47\") " Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.670540 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-logs" (OuterVolumeSpecName: "logs") pod "59f04fe3-56d8-4fcb-a1bf-35b730bd7d89" (UID: "59f04fe3-56d8-4fcb-a1bf-35b730bd7d89"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.679767 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf10215b-d08a-452a-a7de-e7c828922d47-kube-api-access-59tqv" (OuterVolumeSpecName: "kube-api-access-59tqv") pod "cf10215b-d08a-452a-a7de-e7c828922d47" (UID: "cf10215b-d08a-452a-a7de-e7c828922d47"). InnerVolumeSpecName "kube-api-access-59tqv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.680095 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf10215b-d08a-452a-a7de-e7c828922d47-logs" (OuterVolumeSpecName: "logs") pod "cf10215b-d08a-452a-a7de-e7c828922d47" (UID: "cf10215b-d08a-452a-a7de-e7c828922d47"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.707054 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-kube-api-access-8z897" (OuterVolumeSpecName: "kube-api-access-8z897") pod "59f04fe3-56d8-4fcb-a1bf-35b730bd7d89" (UID: "59f04fe3-56d8-4fcb-a1bf-35b730bd7d89"). InnerVolumeSpecName "kube-api-access-8z897". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.710884 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf10215b-d08a-452a-a7de-e7c828922d47-config-data" (OuterVolumeSpecName: "config-data") pod "cf10215b-d08a-452a-a7de-e7c828922d47" (UID: "cf10215b-d08a-452a-a7de-e7c828922d47"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.734974 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59f04fe3-56d8-4fcb-a1bf-35b730bd7d89" (UID: "59f04fe3-56d8-4fcb-a1bf-35b730bd7d89"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.751908 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf10215b-d08a-452a-a7de-e7c828922d47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf10215b-d08a-452a-a7de-e7c828922d47" (UID: "cf10215b-d08a-452a-a7de-e7c828922d47"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.770263 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cf10215b-d08a-452a-a7de-e7c828922d47-logs\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.770295 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.770306 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z897\" (UniqueName: \"kubernetes.io/projected/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-kube-api-access-8z897\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.770316 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59tqv\" (UniqueName: \"kubernetes.io/projected/cf10215b-d08a-452a-a7de-e7c828922d47-kube-api-access-59tqv\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.770325 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-logs\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.770333 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cf10215b-d08a-452a-a7de-e7c828922d47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.770346 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf10215b-d08a-452a-a7de-e7c828922d47-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.770923 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-config-data" (OuterVolumeSpecName: "config-data") pod "59f04fe3-56d8-4fcb-a1bf-35b730bd7d89" (UID: "59f04fe3-56d8-4fcb-a1bf-35b730bd7d89"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.823386 5094 generic.go:334] "Generic (PLEG): container finished" podID="59f04fe3-56d8-4fcb-a1bf-35b730bd7d89" containerID="471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55" exitCode=0 Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.823419 5094 generic.go:334] "Generic (PLEG): container finished" podID="59f04fe3-56d8-4fcb-a1bf-35b730bd7d89" containerID="bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51" exitCode=143 Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.823440 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89","Type":"ContainerDied","Data":"471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55"} Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.823467 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.823494 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89","Type":"ContainerDied","Data":"bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51"} Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.823532 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"59f04fe3-56d8-4fcb-a1bf-35b730bd7d89","Type":"ContainerDied","Data":"9fcd6b0d8fc435c109aac005bdc71da0aba586fad59d4980840ac748c251b23e"} Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.823543 5094 scope.go:117] "RemoveContainer" containerID="471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.827935 5094 generic.go:334] "Generic (PLEG): container finished" podID="cf10215b-d08a-452a-a7de-e7c828922d47" containerID="a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2" exitCode=0 Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.827955 5094 generic.go:334] "Generic (PLEG): container finished" podID="cf10215b-d08a-452a-a7de-e7c828922d47" containerID="4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866" exitCode=143 Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.827992 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cf10215b-d08a-452a-a7de-e7c828922d47","Type":"ContainerDied","Data":"a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2"} Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.828014 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cf10215b-d08a-452a-a7de-e7c828922d47","Type":"ContainerDied","Data":"4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866"} Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.828023 5094 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cf10215b-d08a-452a-a7de-e7c828922d47","Type":"ContainerDied","Data":"2c8f9dde56772efe4f4ead50b7fa81668866d757af9f32b2557ab9828d4e4e5e"} Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.828070 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.835824 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4bd3b441-92b9-4fd4-8451-dec1c354915e","Type":"ContainerStarted","Data":"43a32182f09efed60387dd69fdb95acc9d172d5439401fc1ab4dd9e997f9483d"} Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.835969 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.844536 5094 scope.go:117] "RemoveContainer" containerID="bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.865001 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.864982898 podStartE2EDuration="2.864982898s" podCreationTimestamp="2026-02-20 08:45:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:45:38.851417641 +0000 UTC m=+7153.724044352" watchObservedRunningTime="2026-02-20 08:45:38.864982898 +0000 UTC m=+7153.737609609" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.866873 5094 scope.go:117] "RemoveContainer" containerID="471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55" Feb 20 08:45:38 crc kubenswrapper[5094]: E0220 08:45:38.867549 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55\": container with ID starting with 471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55 not found: ID does not exist" containerID="471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.867585 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55"} err="failed to get container status \"471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55\": rpc error: code = NotFound desc = could not find container \"471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55\": container with ID starting with 471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55 not found: ID does not exist" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.867608 5094 scope.go:117] "RemoveContainer" containerID="bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51" Feb 20 08:45:38 crc kubenswrapper[5094]: E0220 08:45:38.868086 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51\": container with ID starting with bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51 not found: ID does not exist" containerID="bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.868140 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51"} err="failed to get container status \"bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51\": rpc error: code = NotFound desc = could not find container \"bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51\": container with ID 
starting with bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51 not found: ID does not exist" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.868174 5094 scope.go:117] "RemoveContainer" containerID="471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.874905 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55"} err="failed to get container status \"471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55\": rpc error: code = NotFound desc = could not find container \"471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55\": container with ID starting with 471654796cf417a469ee83c6512c59b3c64f1df0ced3934b130c14d9b9dd5c55 not found: ID does not exist" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.874948 5094 scope.go:117] "RemoveContainer" containerID="bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.877042 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.878023 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51"} err="failed to get container status \"bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51\": rpc error: code = NotFound desc = could not find container \"bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51\": container with ID starting with bf20f4dbe73503c384b1544e33e39a6bb08d136080eebd131394471604cc9e51 not found: ID does not exist" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.878053 5094 scope.go:117] 
"RemoveContainer" containerID="a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.889853 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.896043 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.899571 5094 scope.go:117] "RemoveContainer" containerID="4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.920921 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.922981 5094 scope.go:117] "RemoveContainer" containerID="a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2" Feb 20 08:45:38 crc kubenswrapper[5094]: E0220 08:45:38.923384 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2\": container with ID starting with a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2 not found: ID does not exist" containerID="a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.923416 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2"} err="failed to get container status \"a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2\": rpc error: code = NotFound desc = could not find container \"a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2\": container with ID starting with a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2 not found: ID does not exist" Feb 20 08:45:38 crc kubenswrapper[5094]: 
I0220 08:45:38.923441 5094 scope.go:117] "RemoveContainer" containerID="4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866" Feb 20 08:45:38 crc kubenswrapper[5094]: E0220 08:45:38.925532 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866\": container with ID starting with 4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866 not found: ID does not exist" containerID="4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.925552 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866"} err="failed to get container status \"4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866\": rpc error: code = NotFound desc = could not find container \"4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866\": container with ID starting with 4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866 not found: ID does not exist" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.925566 5094 scope.go:117] "RemoveContainer" containerID="a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.925901 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2"} err="failed to get container status \"a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2\": rpc error: code = NotFound desc = could not find container \"a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2\": container with ID starting with a241309b3485f187831ed61fc5e3a230de31fe6db9bced27f0a39725b95a95d2 not found: ID does not exist" Feb 20 08:45:38 crc 
kubenswrapper[5094]: I0220 08:45:38.925923 5094 scope.go:117] "RemoveContainer" containerID="4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.927567 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866"} err="failed to get container status \"4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866\": rpc error: code = NotFound desc = could not find container \"4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866\": container with ID starting with 4f0e70c76685e40e5120064da256ec54bd4c4fc1c1e520517acd6d017aeab866 not found: ID does not exist" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.928479 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 20 08:45:38 crc kubenswrapper[5094]: E0220 08:45:38.928883 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf10215b-d08a-452a-a7de-e7c828922d47" containerName="nova-metadata-metadata" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.928901 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf10215b-d08a-452a-a7de-e7c828922d47" containerName="nova-metadata-metadata" Feb 20 08:45:38 crc kubenswrapper[5094]: E0220 08:45:38.928930 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59f04fe3-56d8-4fcb-a1bf-35b730bd7d89" containerName="nova-api-api" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.928936 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f04fe3-56d8-4fcb-a1bf-35b730bd7d89" containerName="nova-api-api" Feb 20 08:45:38 crc kubenswrapper[5094]: E0220 08:45:38.928947 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf10215b-d08a-452a-a7de-e7c828922d47" containerName="nova-metadata-log" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.928953 5094 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="cf10215b-d08a-452a-a7de-e7c828922d47" containerName="nova-metadata-log" Feb 20 08:45:38 crc kubenswrapper[5094]: E0220 08:45:38.928959 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59f04fe3-56d8-4fcb-a1bf-35b730bd7d89" containerName="nova-api-log" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.928965 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f04fe3-56d8-4fcb-a1bf-35b730bd7d89" containerName="nova-api-log" Feb 20 08:45:38 crc kubenswrapper[5094]: E0220 08:45:38.928986 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3" containerName="nova-manage" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.928992 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3" containerName="nova-manage" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.929135 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3" containerName="nova-manage" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.929147 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="59f04fe3-56d8-4fcb-a1bf-35b730bd7d89" containerName="nova-api-log" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.929155 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf10215b-d08a-452a-a7de-e7c828922d47" containerName="nova-metadata-metadata" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.929168 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="59f04fe3-56d8-4fcb-a1bf-35b730bd7d89" containerName="nova-api-api" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.929174 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf10215b-d08a-452a-a7de-e7c828922d47" containerName="nova-metadata-log" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.930201 5094 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.935815 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.958486 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.968070 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.974750 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.976363 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.980066 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-logs\") pod \"nova-api-0\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") " pod="openstack/nova-api-0" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.980241 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mknbw\" (UniqueName: \"kubernetes.io/projected/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-kube-api-access-mknbw\") pod \"nova-api-0\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") " pod="openstack/nova-api-0" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.980365 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") " pod="openstack/nova-api-0" Feb 20 08:45:38 crc kubenswrapper[5094]: 
I0220 08:45:38.980461 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-config-data\") pod \"nova-api-0\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") " pod="openstack/nova-api-0" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.981071 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 20 08:45:38 crc kubenswrapper[5094]: I0220 08:45:38.991911 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.081653 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03adc315-89a1-44e2-b06f-f2279bd0805f-config-data\") pod \"nova-metadata-0\" (UID: \"03adc315-89a1-44e2-b06f-f2279bd0805f\") " pod="openstack/nova-metadata-0" Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.081734 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlhhr\" (UniqueName: \"kubernetes.io/projected/03adc315-89a1-44e2-b06f-f2279bd0805f-kube-api-access-zlhhr\") pod \"nova-metadata-0\" (UID: \"03adc315-89a1-44e2-b06f-f2279bd0805f\") " pod="openstack/nova-metadata-0" Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.081768 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mknbw\" (UniqueName: \"kubernetes.io/projected/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-kube-api-access-mknbw\") pod \"nova-api-0\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") " pod="openstack/nova-api-0" Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.081812 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") " pod="openstack/nova-api-0" Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.081848 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-config-data\") pod \"nova-api-0\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") " pod="openstack/nova-api-0" Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.081891 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03adc315-89a1-44e2-b06f-f2279bd0805f-logs\") pod \"nova-metadata-0\" (UID: \"03adc315-89a1-44e2-b06f-f2279bd0805f\") " pod="openstack/nova-metadata-0" Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.081937 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-logs\") pod \"nova-api-0\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") " pod="openstack/nova-api-0" Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.081960 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03adc315-89a1-44e2-b06f-f2279bd0805f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"03adc315-89a1-44e2-b06f-f2279bd0805f\") " pod="openstack/nova-metadata-0" Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.083289 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-logs\") pod \"nova-api-0\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") " pod="openstack/nova-api-0" Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.085636 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") " pod="openstack/nova-api-0" Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.088829 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-config-data\") pod \"nova-api-0\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") " pod="openstack/nova-api-0" Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.100444 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mknbw\" (UniqueName: \"kubernetes.io/projected/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-kube-api-access-mknbw\") pod \"nova-api-0\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") " pod="openstack/nova-api-0" Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.183308 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03adc315-89a1-44e2-b06f-f2279bd0805f-logs\") pod \"nova-metadata-0\" (UID: \"03adc315-89a1-44e2-b06f-f2279bd0805f\") " pod="openstack/nova-metadata-0" Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.183397 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03adc315-89a1-44e2-b06f-f2279bd0805f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"03adc315-89a1-44e2-b06f-f2279bd0805f\") " pod="openstack/nova-metadata-0" Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.183441 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03adc315-89a1-44e2-b06f-f2279bd0805f-config-data\") pod \"nova-metadata-0\" (UID: 
\"03adc315-89a1-44e2-b06f-f2279bd0805f\") " pod="openstack/nova-metadata-0" Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.183466 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlhhr\" (UniqueName: \"kubernetes.io/projected/03adc315-89a1-44e2-b06f-f2279bd0805f-kube-api-access-zlhhr\") pod \"nova-metadata-0\" (UID: \"03adc315-89a1-44e2-b06f-f2279bd0805f\") " pod="openstack/nova-metadata-0" Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.184272 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03adc315-89a1-44e2-b06f-f2279bd0805f-logs\") pod \"nova-metadata-0\" (UID: \"03adc315-89a1-44e2-b06f-f2279bd0805f\") " pod="openstack/nova-metadata-0" Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.190323 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03adc315-89a1-44e2-b06f-f2279bd0805f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"03adc315-89a1-44e2-b06f-f2279bd0805f\") " pod="openstack/nova-metadata-0" Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.191410 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03adc315-89a1-44e2-b06f-f2279bd0805f-config-data\") pod \"nova-metadata-0\" (UID: \"03adc315-89a1-44e2-b06f-f2279bd0805f\") " pod="openstack/nova-metadata-0" Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.199514 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlhhr\" (UniqueName: \"kubernetes.io/projected/03adc315-89a1-44e2-b06f-f2279bd0805f-kube-api-access-zlhhr\") pod \"nova-metadata-0\" (UID: \"03adc315-89a1-44e2-b06f-f2279bd0805f\") " pod="openstack/nova-metadata-0" Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.252844 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.301839 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.701426 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 08:45:39 crc kubenswrapper[5094]: W0220 08:45:39.707121 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod481ed5ff_4180_4ff6_8d5f_b7876b484fb2.slice/crio-f3b21367601b2d8d9c4f3af3cf1e049f73d472a734810328e7d1183321ab8d37 WatchSource:0}: Error finding container f3b21367601b2d8d9c4f3af3cf1e049f73d472a734810328e7d1183321ab8d37: Status 404 returned error can't find the container with id f3b21367601b2d8d9c4f3af3cf1e049f73d472a734810328e7d1183321ab8d37 Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.780297 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.851180 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59f04fe3-56d8-4fcb-a1bf-35b730bd7d89" path="/var/lib/kubelet/pods/59f04fe3-56d8-4fcb-a1bf-35b730bd7d89/volumes" Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.853732 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf10215b-d08a-452a-a7de-e7c828922d47" path="/var/lib/kubelet/pods/cf10215b-d08a-452a-a7de-e7c828922d47/volumes" Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.855098 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"03adc315-89a1-44e2-b06f-f2279bd0805f","Type":"ContainerStarted","Data":"0e6070237df8767b14a2b477f51aaced221d1a725981607f33d88c8bcb05cbb9"} Feb 20 08:45:39 crc kubenswrapper[5094]: I0220 08:45:39.855138 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"481ed5ff-4180-4ff6-8d5f-b7876b484fb2","Type":"ContainerStarted","Data":"f3b21367601b2d8d9c4f3af3cf1e049f73d472a734810328e7d1183321ab8d37"} Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.226941 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.239240 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.299867 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9dcb44685-h54hc" Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.366644 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-567db69c47-cctzv"] Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.366924 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-567db69c47-cctzv" podUID="c8a1891c-fffd-4032-9384-bef764ca9f57" containerName="dnsmasq-dns" containerID="cri-o://20b70321388fb9ccdce0f5d0ab13ac7703a9ad371987c42a5b72759e55aa7f72" gracePeriod=10 Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.867743 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-567db69c47-cctzv" event={"ID":"c8a1891c-fffd-4032-9384-bef764ca9f57","Type":"ContainerDied","Data":"20b70321388fb9ccdce0f5d0ab13ac7703a9ad371987c42a5b72759e55aa7f72"} Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.867689 5094 generic.go:334] "Generic (PLEG): container finished" podID="c8a1891c-fffd-4032-9384-bef764ca9f57" containerID="20b70321388fb9ccdce0f5d0ab13ac7703a9ad371987c42a5b72759e55aa7f72" exitCode=0 Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.870043 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"03adc315-89a1-44e2-b06f-f2279bd0805f","Type":"ContainerStarted","Data":"5ca1c7e15cb6d9d823eb80f708a2c7294008bcec7b44ac10b83d1221ebe02341"} Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.870313 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"03adc315-89a1-44e2-b06f-f2279bd0805f","Type":"ContainerStarted","Data":"e7475258d7537ed8a05219d178f13988fa9c6b085bc9ec476ed3d839bed3afbd"} Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.879864 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"481ed5ff-4180-4ff6-8d5f-b7876b484fb2","Type":"ContainerStarted","Data":"33cdd137dbb0b138ccaa238a59c498915d20088d8ecfd36abd345e1ae43df087"} Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.879907 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"481ed5ff-4180-4ff6-8d5f-b7876b484fb2","Type":"ContainerStarted","Data":"a6fc48bcb490e8f80490f7e2f6378f1e5b84f106073ec34ad5101cf96c2294bf"} Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.881346 5094 generic.go:334] "Generic (PLEG): container finished" podID="ecd0e105-6bf4-436e-9d70-1b42f662e67f" containerID="eff10a7e15bd494bb0f1c45b333f9c99c72283453691bb43a221edbe3d81d589" exitCode=0 Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.881414 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ecd0e105-6bf4-436e-9d70-1b42f662e67f","Type":"ContainerDied","Data":"eff10a7e15bd494bb0f1c45b333f9c99c72283453691bb43a221edbe3d81d589"} Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.906087 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.90606666 podStartE2EDuration="2.90606666s" podCreationTimestamp="2026-02-20 08:45:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-20 08:45:40.89199855 +0000 UTC m=+7155.764625261" watchObservedRunningTime="2026-02-20 08:45:40.90606666 +0000 UTC m=+7155.778693371" Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.907377 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:45:40 crc kubenswrapper[5094]: I0220 08:45:40.919767 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.919751728 podStartE2EDuration="2.919751728s" podCreationTimestamp="2026-02-20 08:45:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:45:40.915100597 +0000 UTC m=+7155.787727308" watchObservedRunningTime="2026-02-20 08:45:40.919751728 +0000 UTC m=+7155.792378439" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.210721 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.301404 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-567db69c47-cctzv" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.334831 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecd0e105-6bf4-436e-9d70-1b42f662e67f-combined-ca-bundle\") pod \"ecd0e105-6bf4-436e-9d70-1b42f662e67f\" (UID: \"ecd0e105-6bf4-436e-9d70-1b42f662e67f\") " Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.334891 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecd0e105-6bf4-436e-9d70-1b42f662e67f-config-data\") pod \"ecd0e105-6bf4-436e-9d70-1b42f662e67f\" (UID: \"ecd0e105-6bf4-436e-9d70-1b42f662e67f\") " Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.335312 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ljr7\" (UniqueName: \"kubernetes.io/projected/ecd0e105-6bf4-436e-9d70-1b42f662e67f-kube-api-access-2ljr7\") pod \"ecd0e105-6bf4-436e-9d70-1b42f662e67f\" (UID: \"ecd0e105-6bf4-436e-9d70-1b42f662e67f\") " Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.340035 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecd0e105-6bf4-436e-9d70-1b42f662e67f-kube-api-access-2ljr7" (OuterVolumeSpecName: "kube-api-access-2ljr7") pod "ecd0e105-6bf4-436e-9d70-1b42f662e67f" (UID: "ecd0e105-6bf4-436e-9d70-1b42f662e67f"). InnerVolumeSpecName "kube-api-access-2ljr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.361431 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecd0e105-6bf4-436e-9d70-1b42f662e67f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ecd0e105-6bf4-436e-9d70-1b42f662e67f" (UID: "ecd0e105-6bf4-436e-9d70-1b42f662e67f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.364154 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecd0e105-6bf4-436e-9d70-1b42f662e67f-config-data" (OuterVolumeSpecName: "config-data") pod "ecd0e105-6bf4-436e-9d70-1b42f662e67f" (UID: "ecd0e105-6bf4-436e-9d70-1b42f662e67f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.437338 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-ovsdbserver-nb\") pod \"c8a1891c-fffd-4032-9384-bef764ca9f57\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.437444 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvlvt\" (UniqueName: \"kubernetes.io/projected/c8a1891c-fffd-4032-9384-bef764ca9f57-kube-api-access-dvlvt\") pod \"c8a1891c-fffd-4032-9384-bef764ca9f57\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.437541 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-config\") pod \"c8a1891c-fffd-4032-9384-bef764ca9f57\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.437572 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-dns-svc\") pod \"c8a1891c-fffd-4032-9384-bef764ca9f57\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.437625 5094 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-ovsdbserver-sb\") pod \"c8a1891c-fffd-4032-9384-bef764ca9f57\" (UID: \"c8a1891c-fffd-4032-9384-bef764ca9f57\") " Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.438070 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecd0e105-6bf4-436e-9d70-1b42f662e67f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.438096 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecd0e105-6bf4-436e-9d70-1b42f662e67f-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.438568 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ljr7\" (UniqueName: \"kubernetes.io/projected/ecd0e105-6bf4-436e-9d70-1b42f662e67f-kube-api-access-2ljr7\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.440477 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8a1891c-fffd-4032-9384-bef764ca9f57-kube-api-access-dvlvt" (OuterVolumeSpecName: "kube-api-access-dvlvt") pod "c8a1891c-fffd-4032-9384-bef764ca9f57" (UID: "c8a1891c-fffd-4032-9384-bef764ca9f57"). InnerVolumeSpecName "kube-api-access-dvlvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.476454 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c8a1891c-fffd-4032-9384-bef764ca9f57" (UID: "c8a1891c-fffd-4032-9384-bef764ca9f57"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.476467 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c8a1891c-fffd-4032-9384-bef764ca9f57" (UID: "c8a1891c-fffd-4032-9384-bef764ca9f57"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.479049 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-config" (OuterVolumeSpecName: "config") pod "c8a1891c-fffd-4032-9384-bef764ca9f57" (UID: "c8a1891c-fffd-4032-9384-bef764ca9f57"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.486016 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c8a1891c-fffd-4032-9384-bef764ca9f57" (UID: "c8a1891c-fffd-4032-9384-bef764ca9f57"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.539658 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-config\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.539693 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.539726 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.539747 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8a1891c-fffd-4032-9384-bef764ca9f57-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.539760 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvlvt\" (UniqueName: \"kubernetes.io/projected/c8a1891c-fffd-4032-9384-bef764ca9f57-kube-api-access-dvlvt\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.890778 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.890807 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ecd0e105-6bf4-436e-9d70-1b42f662e67f","Type":"ContainerDied","Data":"984caf900fe47312a9e7a1b27d29fca7525debf96ff707a7c96772dcdf30d5cf"} Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.891175 5094 scope.go:117] "RemoveContainer" containerID="eff10a7e15bd494bb0f1c45b333f9c99c72283453691bb43a221edbe3d81d589" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.896247 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-567db69c47-cctzv" event={"ID":"c8a1891c-fffd-4032-9384-bef764ca9f57","Type":"ContainerDied","Data":"ac86d266f2e6539516ad0f20e1cd64fe7ffd8192e43ce128347406606139c997"} Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.897245 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-567db69c47-cctzv" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.928225 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.931835 5094 scope.go:117] "RemoveContainer" containerID="20b70321388fb9ccdce0f5d0ab13ac7703a9ad371987c42a5b72759e55aa7f72" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.942263 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.955260 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 08:45:41 crc kubenswrapper[5094]: E0220 08:45:41.955717 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecd0e105-6bf4-436e-9d70-1b42f662e67f" containerName="nova-scheduler-scheduler" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.955734 5094 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ecd0e105-6bf4-436e-9d70-1b42f662e67f" containerName="nova-scheduler-scheduler" Feb 20 08:45:41 crc kubenswrapper[5094]: E0220 08:45:41.955754 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8a1891c-fffd-4032-9384-bef764ca9f57" containerName="init" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.955760 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8a1891c-fffd-4032-9384-bef764ca9f57" containerName="init" Feb 20 08:45:41 crc kubenswrapper[5094]: E0220 08:45:41.955786 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8a1891c-fffd-4032-9384-bef764ca9f57" containerName="dnsmasq-dns" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.955792 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8a1891c-fffd-4032-9384-bef764ca9f57" containerName="dnsmasq-dns" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.955959 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecd0e105-6bf4-436e-9d70-1b42f662e67f" containerName="nova-scheduler-scheduler" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.955973 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8a1891c-fffd-4032-9384-bef764ca9f57" containerName="dnsmasq-dns" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.956646 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.958824 5094 scope.go:117] "RemoveContainer" containerID="e5df72384fdffb4deabf4c6cbb6c43678269bc4fec968e5b22a2e15228a210f5" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.966529 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.967863 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-567db69c47-cctzv"] Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.978678 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 08:45:41 crc kubenswrapper[5094]: I0220 08:45:41.987149 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-567db69c47-cctzv"] Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.049679 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e15e686-66dc-4bb3-989f-d1f84b318cf7-config-data\") pod \"nova-scheduler-0\" (UID: \"2e15e686-66dc-4bb3-989f-d1f84b318cf7\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.050070 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e15e686-66dc-4bb3-989f-d1f84b318cf7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2e15e686-66dc-4bb3-989f-d1f84b318cf7\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.050159 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn7mf\" (UniqueName: \"kubernetes.io/projected/2e15e686-66dc-4bb3-989f-d1f84b318cf7-kube-api-access-bn7mf\") pod \"nova-scheduler-0\" (UID: 
\"2e15e686-66dc-4bb3-989f-d1f84b318cf7\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.151749 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e15e686-66dc-4bb3-989f-d1f84b318cf7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2e15e686-66dc-4bb3-989f-d1f84b318cf7\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.151841 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn7mf\" (UniqueName: \"kubernetes.io/projected/2e15e686-66dc-4bb3-989f-d1f84b318cf7-kube-api-access-bn7mf\") pod \"nova-scheduler-0\" (UID: \"2e15e686-66dc-4bb3-989f-d1f84b318cf7\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.151901 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e15e686-66dc-4bb3-989f-d1f84b318cf7-config-data\") pod \"nova-scheduler-0\" (UID: \"2e15e686-66dc-4bb3-989f-d1f84b318cf7\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.155553 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e15e686-66dc-4bb3-989f-d1f84b318cf7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2e15e686-66dc-4bb3-989f-d1f84b318cf7\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.156278 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e15e686-66dc-4bb3-989f-d1f84b318cf7-config-data\") pod \"nova-scheduler-0\" (UID: \"2e15e686-66dc-4bb3-989f-d1f84b318cf7\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.171163 5094 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bn7mf\" (UniqueName: \"kubernetes.io/projected/2e15e686-66dc-4bb3-989f-d1f84b318cf7-kube-api-access-bn7mf\") pod \"nova-scheduler-0\" (UID: \"2e15e686-66dc-4bb3-989f-d1f84b318cf7\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.269202 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.272427 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.707262 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-5cthc"] Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.708684 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.711951 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.718285 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.720319 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-5cthc"] Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.758265 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 08:45:42 crc kubenswrapper[5094]: W0220 08:45:42.761981 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e15e686_66dc_4bb3_989f_d1f84b318cf7.slice/crio-f4ce19ef2565a3b6eabf7f55a97f70c4b62b04d5648dcef78143beeffc69d496 WatchSource:0}: Error finding container 
f4ce19ef2565a3b6eabf7f55a97f70c4b62b04d5648dcef78143beeffc69d496: Status 404 returned error can't find the container with id f4ce19ef2565a3b6eabf7f55a97f70c4b62b04d5648dcef78143beeffc69d496 Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.764634 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-scripts\") pod \"nova-cell1-cell-mapping-5cthc\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.764799 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6n5d\" (UniqueName: \"kubernetes.io/projected/47f4c643-fb8b-41d6-97b5-fa0c0928f370-kube-api-access-q6n5d\") pod \"nova-cell1-cell-mapping-5cthc\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.764914 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-config-data\") pod \"nova-cell1-cell-mapping-5cthc\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.765028 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5cthc\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.866986 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5cthc\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.867069 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-scripts\") pod \"nova-cell1-cell-mapping-5cthc\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.867130 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6n5d\" (UniqueName: \"kubernetes.io/projected/47f4c643-fb8b-41d6-97b5-fa0c0928f370-kube-api-access-q6n5d\") pod \"nova-cell1-cell-mapping-5cthc\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.867168 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-config-data\") pod \"nova-cell1-cell-mapping-5cthc\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.870898 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-scripts\") pod \"nova-cell1-cell-mapping-5cthc\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.871200 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-config-data\") pod 
\"nova-cell1-cell-mapping-5cthc\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.872047 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-5cthc\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.885144 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6n5d\" (UniqueName: \"kubernetes.io/projected/47f4c643-fb8b-41d6-97b5-fa0c0928f370-kube-api-access-q6n5d\") pod \"nova-cell1-cell-mapping-5cthc\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:42 crc kubenswrapper[5094]: I0220 08:45:42.905350 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2e15e686-66dc-4bb3-989f-d1f84b318cf7","Type":"ContainerStarted","Data":"f4ce19ef2565a3b6eabf7f55a97f70c4b62b04d5648dcef78143beeffc69d496"} Feb 20 08:45:43 crc kubenswrapper[5094]: I0220 08:45:43.024125 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:43 crc kubenswrapper[5094]: I0220 08:45:43.475801 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-5cthc"] Feb 20 08:45:43 crc kubenswrapper[5094]: I0220 08:45:43.854056 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8a1891c-fffd-4032-9384-bef764ca9f57" path="/var/lib/kubelet/pods/c8a1891c-fffd-4032-9384-bef764ca9f57/volumes" Feb 20 08:45:43 crc kubenswrapper[5094]: I0220 08:45:43.856552 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecd0e105-6bf4-436e-9d70-1b42f662e67f" path="/var/lib/kubelet/pods/ecd0e105-6bf4-436e-9d70-1b42f662e67f/volumes" Feb 20 08:45:43 crc kubenswrapper[5094]: I0220 08:45:43.916955 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5cthc" event={"ID":"47f4c643-fb8b-41d6-97b5-fa0c0928f370","Type":"ContainerStarted","Data":"8c8e45f1f20160f7c96239278e797286c527baed5911fbde0292c56051da6b16"} Feb 20 08:45:43 crc kubenswrapper[5094]: I0220 08:45:43.918025 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5cthc" event={"ID":"47f4c643-fb8b-41d6-97b5-fa0c0928f370","Type":"ContainerStarted","Data":"243c552ada916fbd8c98356e79d33549118c50125ea4dea29e072301c1c2979e"} Feb 20 08:45:43 crc kubenswrapper[5094]: I0220 08:45:43.925165 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2e15e686-66dc-4bb3-989f-d1f84b318cf7","Type":"ContainerStarted","Data":"ba3fb8433e297295f6890e9d15da15d3e69001dff8980a0f56301d98c2a70d28"} Feb 20 08:45:43 crc kubenswrapper[5094]: I0220 08:45:43.935553 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-5cthc" podStartSLOduration=1.935534154 podStartE2EDuration="1.935534154s" podCreationTimestamp="2026-02-20 08:45:42 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:45:43.933379922 +0000 UTC m=+7158.806006643" watchObservedRunningTime="2026-02-20 08:45:43.935534154 +0000 UTC m=+7158.808160865" Feb 20 08:45:43 crc kubenswrapper[5094]: I0220 08:45:43.961323 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.961300804 podStartE2EDuration="2.961300804s" podCreationTimestamp="2026-02-20 08:45:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:45:43.954399598 +0000 UTC m=+7158.827026309" watchObservedRunningTime="2026-02-20 08:45:43.961300804 +0000 UTC m=+7158.833927515" Feb 20 08:45:44 crc kubenswrapper[5094]: I0220 08:45:44.302951 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 08:45:44 crc kubenswrapper[5094]: I0220 08:45:44.304801 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 08:45:47 crc kubenswrapper[5094]: I0220 08:45:47.273524 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 20 08:45:48 crc kubenswrapper[5094]: I0220 08:45:48.840833 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:45:48 crc kubenswrapper[5094]: E0220 08:45:48.841155 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 
08:45:48 crc kubenswrapper[5094]: I0220 08:45:48.984887 5094 generic.go:334] "Generic (PLEG): container finished" podID="47f4c643-fb8b-41d6-97b5-fa0c0928f370" containerID="8c8e45f1f20160f7c96239278e797286c527baed5911fbde0292c56051da6b16" exitCode=0 Feb 20 08:45:48 crc kubenswrapper[5094]: I0220 08:45:48.984992 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5cthc" event={"ID":"47f4c643-fb8b-41d6-97b5-fa0c0928f370","Type":"ContainerDied","Data":"8c8e45f1f20160f7c96239278e797286c527baed5911fbde0292c56051da6b16"} Feb 20 08:45:49 crc kubenswrapper[5094]: I0220 08:45:49.254050 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 08:45:49 crc kubenswrapper[5094]: I0220 08:45:49.254112 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 08:45:49 crc kubenswrapper[5094]: I0220 08:45:49.303024 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 20 08:45:49 crc kubenswrapper[5094]: I0220 08:45:49.303329 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.337027 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="481ed5ff-4180-4ff6-8d5f-b7876b484fb2" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.70:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.337161 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="481ed5ff-4180-4ff6-8d5f-b7876b484fb2" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.70:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 08:45:50 crc kubenswrapper[5094]: 
I0220 08:45:50.417990 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.421193 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="03adc315-89a1-44e2-b06f-f2279bd0805f" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.71:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.421191 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="03adc315-89a1-44e2-b06f-f2279bd0805f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.71:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.525332 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-scripts\") pod \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.525386 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-combined-ca-bundle\") pod \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.525581 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-config-data\") pod \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.525868 
5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6n5d\" (UniqueName: \"kubernetes.io/projected/47f4c643-fb8b-41d6-97b5-fa0c0928f370-kube-api-access-q6n5d\") pod \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\" (UID: \"47f4c643-fb8b-41d6-97b5-fa0c0928f370\") " Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.531590 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47f4c643-fb8b-41d6-97b5-fa0c0928f370-kube-api-access-q6n5d" (OuterVolumeSpecName: "kube-api-access-q6n5d") pod "47f4c643-fb8b-41d6-97b5-fa0c0928f370" (UID: "47f4c643-fb8b-41d6-97b5-fa0c0928f370"). InnerVolumeSpecName "kube-api-access-q6n5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.532482 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-scripts" (OuterVolumeSpecName: "scripts") pod "47f4c643-fb8b-41d6-97b5-fa0c0928f370" (UID: "47f4c643-fb8b-41d6-97b5-fa0c0928f370"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.559839 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47f4c643-fb8b-41d6-97b5-fa0c0928f370" (UID: "47f4c643-fb8b-41d6-97b5-fa0c0928f370"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.573824 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-config-data" (OuterVolumeSpecName: "config-data") pod "47f4c643-fb8b-41d6-97b5-fa0c0928f370" (UID: "47f4c643-fb8b-41d6-97b5-fa0c0928f370"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.628499 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.628585 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6n5d\" (UniqueName: \"kubernetes.io/projected/47f4c643-fb8b-41d6-97b5-fa0c0928f370-kube-api-access-q6n5d\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.628606 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:50 crc kubenswrapper[5094]: I0220 08:45:50.628624 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47f4c643-fb8b-41d6-97b5-fa0c0928f370-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:51 crc kubenswrapper[5094]: I0220 08:45:51.023131 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-5cthc" event={"ID":"47f4c643-fb8b-41d6-97b5-fa0c0928f370","Type":"ContainerDied","Data":"243c552ada916fbd8c98356e79d33549118c50125ea4dea29e072301c1c2979e"} Feb 20 08:45:51 crc kubenswrapper[5094]: I0220 08:45:51.023188 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="243c552ada916fbd8c98356e79d33549118c50125ea4dea29e072301c1c2979e" Feb 20 08:45:51 crc kubenswrapper[5094]: I0220 08:45:51.023225 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-5cthc" Feb 20 08:45:51 crc kubenswrapper[5094]: I0220 08:45:51.205948 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 08:45:51 crc kubenswrapper[5094]: I0220 08:45:51.206303 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="481ed5ff-4180-4ff6-8d5f-b7876b484fb2" containerName="nova-api-log" containerID="cri-o://a6fc48bcb490e8f80490f7e2f6378f1e5b84f106073ec34ad5101cf96c2294bf" gracePeriod=30 Feb 20 08:45:51 crc kubenswrapper[5094]: I0220 08:45:51.206357 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="481ed5ff-4180-4ff6-8d5f-b7876b484fb2" containerName="nova-api-api" containerID="cri-o://33cdd137dbb0b138ccaa238a59c498915d20088d8ecfd36abd345e1ae43df087" gracePeriod=30 Feb 20 08:45:51 crc kubenswrapper[5094]: I0220 08:45:51.230331 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 08:45:51 crc kubenswrapper[5094]: I0220 08:45:51.230675 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2e15e686-66dc-4bb3-989f-d1f84b318cf7" containerName="nova-scheduler-scheduler" containerID="cri-o://ba3fb8433e297295f6890e9d15da15d3e69001dff8980a0f56301d98c2a70d28" gracePeriod=30 Feb 20 08:45:51 crc kubenswrapper[5094]: I0220 08:45:51.301879 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 08:45:51 crc kubenswrapper[5094]: I0220 08:45:51.302409 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="03adc315-89a1-44e2-b06f-f2279bd0805f" containerName="nova-metadata-log" containerID="cri-o://e7475258d7537ed8a05219d178f13988fa9c6b085bc9ec476ed3d839bed3afbd" gracePeriod=30 Feb 20 08:45:51 crc kubenswrapper[5094]: I0220 08:45:51.302586 5094 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="03adc315-89a1-44e2-b06f-f2279bd0805f" containerName="nova-metadata-metadata" containerID="cri-o://5ca1c7e15cb6d9d823eb80f708a2c7294008bcec7b44ac10b83d1221ebe02341" gracePeriod=30 Feb 20 08:45:52 crc kubenswrapper[5094]: I0220 08:45:52.045960 5094 generic.go:334] "Generic (PLEG): container finished" podID="03adc315-89a1-44e2-b06f-f2279bd0805f" containerID="e7475258d7537ed8a05219d178f13988fa9c6b085bc9ec476ed3d839bed3afbd" exitCode=143 Feb 20 08:45:52 crc kubenswrapper[5094]: I0220 08:45:52.046143 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"03adc315-89a1-44e2-b06f-f2279bd0805f","Type":"ContainerDied","Data":"e7475258d7537ed8a05219d178f13988fa9c6b085bc9ec476ed3d839bed3afbd"} Feb 20 08:45:52 crc kubenswrapper[5094]: I0220 08:45:52.061674 5094 generic.go:334] "Generic (PLEG): container finished" podID="481ed5ff-4180-4ff6-8d5f-b7876b484fb2" containerID="a6fc48bcb490e8f80490f7e2f6378f1e5b84f106073ec34ad5101cf96c2294bf" exitCode=143 Feb 20 08:45:52 crc kubenswrapper[5094]: I0220 08:45:52.061793 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"481ed5ff-4180-4ff6-8d5f-b7876b484fb2","Type":"ContainerDied","Data":"a6fc48bcb490e8f80490f7e2f6378f1e5b84f106073ec34ad5101cf96c2294bf"} Feb 20 08:45:54 crc kubenswrapper[5094]: I0220 08:45:54.084740 5094 generic.go:334] "Generic (PLEG): container finished" podID="2e15e686-66dc-4bb3-989f-d1f84b318cf7" containerID="ba3fb8433e297295f6890e9d15da15d3e69001dff8980a0f56301d98c2a70d28" exitCode=0 Feb 20 08:45:54 crc kubenswrapper[5094]: I0220 08:45:54.084816 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2e15e686-66dc-4bb3-989f-d1f84b318cf7","Type":"ContainerDied","Data":"ba3fb8433e297295f6890e9d15da15d3e69001dff8980a0f56301d98c2a70d28"} Feb 20 08:45:54 crc kubenswrapper[5094]: 
I0220 08:45:54.344979 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 08:45:54 crc kubenswrapper[5094]: I0220 08:45:54.399632 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e15e686-66dc-4bb3-989f-d1f84b318cf7-config-data\") pod \"2e15e686-66dc-4bb3-989f-d1f84b318cf7\" (UID: \"2e15e686-66dc-4bb3-989f-d1f84b318cf7\") " Feb 20 08:45:54 crc kubenswrapper[5094]: I0220 08:45:54.399896 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e15e686-66dc-4bb3-989f-d1f84b318cf7-combined-ca-bundle\") pod \"2e15e686-66dc-4bb3-989f-d1f84b318cf7\" (UID: \"2e15e686-66dc-4bb3-989f-d1f84b318cf7\") " Feb 20 08:45:54 crc kubenswrapper[5094]: I0220 08:45:54.399935 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn7mf\" (UniqueName: \"kubernetes.io/projected/2e15e686-66dc-4bb3-989f-d1f84b318cf7-kube-api-access-bn7mf\") pod \"2e15e686-66dc-4bb3-989f-d1f84b318cf7\" (UID: \"2e15e686-66dc-4bb3-989f-d1f84b318cf7\") " Feb 20 08:45:54 crc kubenswrapper[5094]: I0220 08:45:54.407627 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e15e686-66dc-4bb3-989f-d1f84b318cf7-kube-api-access-bn7mf" (OuterVolumeSpecName: "kube-api-access-bn7mf") pod "2e15e686-66dc-4bb3-989f-d1f84b318cf7" (UID: "2e15e686-66dc-4bb3-989f-d1f84b318cf7"). InnerVolumeSpecName "kube-api-access-bn7mf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:45:54 crc kubenswrapper[5094]: I0220 08:45:54.427194 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e15e686-66dc-4bb3-989f-d1f84b318cf7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e15e686-66dc-4bb3-989f-d1f84b318cf7" (UID: "2e15e686-66dc-4bb3-989f-d1f84b318cf7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:54 crc kubenswrapper[5094]: I0220 08:45:54.428608 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e15e686-66dc-4bb3-989f-d1f84b318cf7-config-data" (OuterVolumeSpecName: "config-data") pod "2e15e686-66dc-4bb3-989f-d1f84b318cf7" (UID: "2e15e686-66dc-4bb3-989f-d1f84b318cf7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:54 crc kubenswrapper[5094]: I0220 08:45:54.502096 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e15e686-66dc-4bb3-989f-d1f84b318cf7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:54 crc kubenswrapper[5094]: I0220 08:45:54.502138 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn7mf\" (UniqueName: \"kubernetes.io/projected/2e15e686-66dc-4bb3-989f-d1f84b318cf7-kube-api-access-bn7mf\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:54 crc kubenswrapper[5094]: I0220 08:45:54.502149 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e15e686-66dc-4bb3-989f-d1f84b318cf7-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:54 crc kubenswrapper[5094]: I0220 08:45:54.871823 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.011539 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03adc315-89a1-44e2-b06f-f2279bd0805f-combined-ca-bundle\") pod \"03adc315-89a1-44e2-b06f-f2279bd0805f\" (UID: \"03adc315-89a1-44e2-b06f-f2279bd0805f\") " Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.011600 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03adc315-89a1-44e2-b06f-f2279bd0805f-logs\") pod \"03adc315-89a1-44e2-b06f-f2279bd0805f\" (UID: \"03adc315-89a1-44e2-b06f-f2279bd0805f\") " Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.011806 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlhhr\" (UniqueName: \"kubernetes.io/projected/03adc315-89a1-44e2-b06f-f2279bd0805f-kube-api-access-zlhhr\") pod \"03adc315-89a1-44e2-b06f-f2279bd0805f\" (UID: \"03adc315-89a1-44e2-b06f-f2279bd0805f\") " Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.012191 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03adc315-89a1-44e2-b06f-f2279bd0805f-logs" (OuterVolumeSpecName: "logs") pod "03adc315-89a1-44e2-b06f-f2279bd0805f" (UID: "03adc315-89a1-44e2-b06f-f2279bd0805f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.012383 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03adc315-89a1-44e2-b06f-f2279bd0805f-config-data\") pod \"03adc315-89a1-44e2-b06f-f2279bd0805f\" (UID: \"03adc315-89a1-44e2-b06f-f2279bd0805f\") " Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.013098 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03adc315-89a1-44e2-b06f-f2279bd0805f-logs\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.016397 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03adc315-89a1-44e2-b06f-f2279bd0805f-kube-api-access-zlhhr" (OuterVolumeSpecName: "kube-api-access-zlhhr") pod "03adc315-89a1-44e2-b06f-f2279bd0805f" (UID: "03adc315-89a1-44e2-b06f-f2279bd0805f"). InnerVolumeSpecName "kube-api-access-zlhhr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.038161 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03adc315-89a1-44e2-b06f-f2279bd0805f-config-data" (OuterVolumeSpecName: "config-data") pod "03adc315-89a1-44e2-b06f-f2279bd0805f" (UID: "03adc315-89a1-44e2-b06f-f2279bd0805f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.058045 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03adc315-89a1-44e2-b06f-f2279bd0805f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03adc315-89a1-44e2-b06f-f2279bd0805f" (UID: "03adc315-89a1-44e2-b06f-f2279bd0805f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.095244 5094 generic.go:334] "Generic (PLEG): container finished" podID="03adc315-89a1-44e2-b06f-f2279bd0805f" containerID="5ca1c7e15cb6d9d823eb80f708a2c7294008bcec7b44ac10b83d1221ebe02341" exitCode=0 Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.095344 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"03adc315-89a1-44e2-b06f-f2279bd0805f","Type":"ContainerDied","Data":"5ca1c7e15cb6d9d823eb80f708a2c7294008bcec7b44ac10b83d1221ebe02341"} Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.095376 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"03adc315-89a1-44e2-b06f-f2279bd0805f","Type":"ContainerDied","Data":"0e6070237df8767b14a2b477f51aaced221d1a725981607f33d88c8bcb05cbb9"} Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.095399 5094 scope.go:117] "RemoveContainer" containerID="5ca1c7e15cb6d9d823eb80f708a2c7294008bcec7b44ac10b83d1221ebe02341" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.095541 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.097827 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2e15e686-66dc-4bb3-989f-d1f84b318cf7","Type":"ContainerDied","Data":"f4ce19ef2565a3b6eabf7f55a97f70c4b62b04d5648dcef78143beeffc69d496"} Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.097916 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.115650 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlhhr\" (UniqueName: \"kubernetes.io/projected/03adc315-89a1-44e2-b06f-f2279bd0805f-kube-api-access-zlhhr\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.115683 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03adc315-89a1-44e2-b06f-f2279bd0805f-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.115694 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03adc315-89a1-44e2-b06f-f2279bd0805f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.129939 5094 scope.go:117] "RemoveContainer" containerID="e7475258d7537ed8a05219d178f13988fa9c6b085bc9ec476ed3d839bed3afbd" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.154892 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.173100 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.181948 5094 scope.go:117] "RemoveContainer" containerID="5ca1c7e15cb6d9d823eb80f708a2c7294008bcec7b44ac10b83d1221ebe02341" Feb 20 08:45:55 crc kubenswrapper[5094]: E0220 08:45:55.184590 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ca1c7e15cb6d9d823eb80f708a2c7294008bcec7b44ac10b83d1221ebe02341\": container with ID starting with 5ca1c7e15cb6d9d823eb80f708a2c7294008bcec7b44ac10b83d1221ebe02341 not found: ID does not exist" containerID="5ca1c7e15cb6d9d823eb80f708a2c7294008bcec7b44ac10b83d1221ebe02341" 
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.184620 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ca1c7e15cb6d9d823eb80f708a2c7294008bcec7b44ac10b83d1221ebe02341"} err="failed to get container status \"5ca1c7e15cb6d9d823eb80f708a2c7294008bcec7b44ac10b83d1221ebe02341\": rpc error: code = NotFound desc = could not find container \"5ca1c7e15cb6d9d823eb80f708a2c7294008bcec7b44ac10b83d1221ebe02341\": container with ID starting with 5ca1c7e15cb6d9d823eb80f708a2c7294008bcec7b44ac10b83d1221ebe02341 not found: ID does not exist" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.184639 5094 scope.go:117] "RemoveContainer" containerID="e7475258d7537ed8a05219d178f13988fa9c6b085bc9ec476ed3d839bed3afbd" Feb 20 08:45:55 crc kubenswrapper[5094]: E0220 08:45:55.185818 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7475258d7537ed8a05219d178f13988fa9c6b085bc9ec476ed3d839bed3afbd\": container with ID starting with e7475258d7537ed8a05219d178f13988fa9c6b085bc9ec476ed3d839bed3afbd not found: ID does not exist" containerID="e7475258d7537ed8a05219d178f13988fa9c6b085bc9ec476ed3d839bed3afbd" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.185837 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7475258d7537ed8a05219d178f13988fa9c6b085bc9ec476ed3d839bed3afbd"} err="failed to get container status \"e7475258d7537ed8a05219d178f13988fa9c6b085bc9ec476ed3d839bed3afbd\": rpc error: code = NotFound desc = could not find container \"e7475258d7537ed8a05219d178f13988fa9c6b085bc9ec476ed3d839bed3afbd\": container with ID starting with e7475258d7537ed8a05219d178f13988fa9c6b085bc9ec476ed3d839bed3afbd not found: ID does not exist" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.185850 5094 scope.go:117] "RemoveContainer" 
containerID="ba3fb8433e297295f6890e9d15da15d3e69001dff8980a0f56301d98c2a70d28" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.200991 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.212643 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.221292 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 20 08:45:55 crc kubenswrapper[5094]: E0220 08:45:55.221630 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47f4c643-fb8b-41d6-97b5-fa0c0928f370" containerName="nova-manage" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.221641 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="47f4c643-fb8b-41d6-97b5-fa0c0928f370" containerName="nova-manage" Feb 20 08:45:55 crc kubenswrapper[5094]: E0220 08:45:55.221652 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03adc315-89a1-44e2-b06f-f2279bd0805f" containerName="nova-metadata-metadata" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.221658 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="03adc315-89a1-44e2-b06f-f2279bd0805f" containerName="nova-metadata-metadata" Feb 20 08:45:55 crc kubenswrapper[5094]: E0220 08:45:55.221674 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e15e686-66dc-4bb3-989f-d1f84b318cf7" containerName="nova-scheduler-scheduler" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.221680 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e15e686-66dc-4bb3-989f-d1f84b318cf7" containerName="nova-scheduler-scheduler" Feb 20 08:45:55 crc kubenswrapper[5094]: E0220 08:45:55.221687 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03adc315-89a1-44e2-b06f-f2279bd0805f" containerName="nova-metadata-log" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 
08:45:55.221693 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="03adc315-89a1-44e2-b06f-f2279bd0805f" containerName="nova-metadata-log" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.221882 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="47f4c643-fb8b-41d6-97b5-fa0c0928f370" containerName="nova-manage" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.221903 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="03adc315-89a1-44e2-b06f-f2279bd0805f" containerName="nova-metadata-metadata" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.221918 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e15e686-66dc-4bb3-989f-d1f84b318cf7" containerName="nova-scheduler-scheduler" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.221929 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="03adc315-89a1-44e2-b06f-f2279bd0805f" containerName="nova-metadata-log" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.222859 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.224948 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.231651 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.244913 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.246426 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.248685 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.263562 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.317955 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d7zv\" (UniqueName: \"kubernetes.io/projected/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-kube-api-access-8d7zv\") pod \"nova-metadata-0\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " pod="openstack/nova-metadata-0" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.318017 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/008eca6a-12d6-40dd-96bb-391428bd27c5-config-data\") pod \"nova-scheduler-0\" (UID: \"008eca6a-12d6-40dd-96bb-391428bd27c5\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.318230 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svnb9\" (UniqueName: \"kubernetes.io/projected/008eca6a-12d6-40dd-96bb-391428bd27c5-kube-api-access-svnb9\") pod \"nova-scheduler-0\" (UID: \"008eca6a-12d6-40dd-96bb-391428bd27c5\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.318430 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-logs\") pod \"nova-metadata-0\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " pod="openstack/nova-metadata-0" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.318518 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " pod="openstack/nova-metadata-0" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.318675 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-config-data\") pod \"nova-metadata-0\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " pod="openstack/nova-metadata-0" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.318742 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/008eca6a-12d6-40dd-96bb-391428bd27c5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"008eca6a-12d6-40dd-96bb-391428bd27c5\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.420565 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-logs\") pod \"nova-metadata-0\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " pod="openstack/nova-metadata-0" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.420625 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " pod="openstack/nova-metadata-0" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.420689 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-config-data\") pod \"nova-metadata-0\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " pod="openstack/nova-metadata-0" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.420733 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/008eca6a-12d6-40dd-96bb-391428bd27c5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"008eca6a-12d6-40dd-96bb-391428bd27c5\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.420758 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d7zv\" (UniqueName: \"kubernetes.io/projected/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-kube-api-access-8d7zv\") pod \"nova-metadata-0\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " pod="openstack/nova-metadata-0" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.420788 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/008eca6a-12d6-40dd-96bb-391428bd27c5-config-data\") pod \"nova-scheduler-0\" (UID: \"008eca6a-12d6-40dd-96bb-391428bd27c5\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.420823 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svnb9\" (UniqueName: \"kubernetes.io/projected/008eca6a-12d6-40dd-96bb-391428bd27c5-kube-api-access-svnb9\") pod \"nova-scheduler-0\" (UID: \"008eca6a-12d6-40dd-96bb-391428bd27c5\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.421160 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-logs\") pod \"nova-metadata-0\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " pod="openstack/nova-metadata-0" 
Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.426229 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-config-data\") pod \"nova-metadata-0\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " pod="openstack/nova-metadata-0" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.429185 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/008eca6a-12d6-40dd-96bb-391428bd27c5-config-data\") pod \"nova-scheduler-0\" (UID: \"008eca6a-12d6-40dd-96bb-391428bd27c5\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.431367 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/008eca6a-12d6-40dd-96bb-391428bd27c5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"008eca6a-12d6-40dd-96bb-391428bd27c5\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.431413 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " pod="openstack/nova-metadata-0" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.437039 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svnb9\" (UniqueName: \"kubernetes.io/projected/008eca6a-12d6-40dd-96bb-391428bd27c5-kube-api-access-svnb9\") pod \"nova-scheduler-0\" (UID: \"008eca6a-12d6-40dd-96bb-391428bd27c5\") " pod="openstack/nova-scheduler-0" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.441331 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d7zv\" (UniqueName: 
\"kubernetes.io/projected/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-kube-api-access-8d7zv\") pod \"nova-metadata-0\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " pod="openstack/nova-metadata-0" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.544114 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.566619 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.867642 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03adc315-89a1-44e2-b06f-f2279bd0805f" path="/var/lib/kubelet/pods/03adc315-89a1-44e2-b06f-f2279bd0805f/volumes" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.869067 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e15e686-66dc-4bb3-989f-d1f84b318cf7" path="/var/lib/kubelet/pods/2e15e686-66dc-4bb3-989f-d1f84b318cf7/volumes" Feb 20 08:45:55 crc kubenswrapper[5094]: I0220 08:45:55.949289 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 08:45:56 crc kubenswrapper[5094]: W0220 08:45:56.017460 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0bce5a1d_755b_4c0a_b9aa_1fce8cbfc8b8.slice/crio-1d94c5e4163bc96c9ddb5b16f82044a027cfbc06ea952da5d0ad20ac89c00a85 WatchSource:0}: Error finding container 1d94c5e4163bc96c9ddb5b16f82044a027cfbc06ea952da5d0ad20ac89c00a85: Status 404 returned error can't find the container with id 1d94c5e4163bc96c9ddb5b16f82044a027cfbc06ea952da5d0ad20ac89c00a85 Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.018984 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.031022 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-combined-ca-bundle\") pod \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") " Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.031157 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mknbw\" (UniqueName: \"kubernetes.io/projected/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-kube-api-access-mknbw\") pod \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") " Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.031268 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-config-data\") pod \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") " Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.031315 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-logs\") pod \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\" (UID: \"481ed5ff-4180-4ff6-8d5f-b7876b484fb2\") " Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.032396 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-logs" (OuterVolumeSpecName: "logs") pod "481ed5ff-4180-4ff6-8d5f-b7876b484fb2" (UID: "481ed5ff-4180-4ff6-8d5f-b7876b484fb2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.036641 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-kube-api-access-mknbw" (OuterVolumeSpecName: "kube-api-access-mknbw") pod "481ed5ff-4180-4ff6-8d5f-b7876b484fb2" (UID: "481ed5ff-4180-4ff6-8d5f-b7876b484fb2"). InnerVolumeSpecName "kube-api-access-mknbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.054596 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "481ed5ff-4180-4ff6-8d5f-b7876b484fb2" (UID: "481ed5ff-4180-4ff6-8d5f-b7876b484fb2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.061547 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-config-data" (OuterVolumeSpecName: "config-data") pod "481ed5ff-4180-4ff6-8d5f-b7876b484fb2" (UID: "481ed5ff-4180-4ff6-8d5f-b7876b484fb2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.118291 5094 generic.go:334] "Generic (PLEG): container finished" podID="481ed5ff-4180-4ff6-8d5f-b7876b484fb2" containerID="33cdd137dbb0b138ccaa238a59c498915d20088d8ecfd36abd345e1ae43df087" exitCode=0 Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.118646 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"481ed5ff-4180-4ff6-8d5f-b7876b484fb2","Type":"ContainerDied","Data":"33cdd137dbb0b138ccaa238a59c498915d20088d8ecfd36abd345e1ae43df087"} Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.118676 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"481ed5ff-4180-4ff6-8d5f-b7876b484fb2","Type":"ContainerDied","Data":"f3b21367601b2d8d9c4f3af3cf1e049f73d472a734810328e7d1183321ab8d37"} Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.118696 5094 scope.go:117] "RemoveContainer" containerID="33cdd137dbb0b138ccaa238a59c498915d20088d8ecfd36abd345e1ae43df087" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.118866 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.121849 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.122733 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8","Type":"ContainerStarted","Data":"1d94c5e4163bc96c9ddb5b16f82044a027cfbc06ea952da5d0ad20ac89c00a85"} Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.132777 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.132796 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-logs\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.132806 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.132818 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mknbw\" (UniqueName: \"kubernetes.io/projected/481ed5ff-4180-4ff6-8d5f-b7876b484fb2-kube-api-access-mknbw\") on node \"crc\" DevicePath \"\"" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.270139 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.273580 5094 scope.go:117] "RemoveContainer" containerID="a6fc48bcb490e8f80490f7e2f6378f1e5b84f106073ec34ad5101cf96c2294bf" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.287010 5094 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-api-0"] Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.299398 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 20 08:45:56 crc kubenswrapper[5094]: E0220 08:45:56.301035 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod481ed5ff_4180_4ff6_8d5f_b7876b484fb2.slice\": RecentStats: unable to find data in memory cache]" Feb 20 08:45:56 crc kubenswrapper[5094]: E0220 08:45:56.303284 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="481ed5ff-4180-4ff6-8d5f-b7876b484fb2" containerName="nova-api-log" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.303308 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="481ed5ff-4180-4ff6-8d5f-b7876b484fb2" containerName="nova-api-log" Feb 20 08:45:56 crc kubenswrapper[5094]: E0220 08:45:56.303334 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="481ed5ff-4180-4ff6-8d5f-b7876b484fb2" containerName="nova-api-api" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.303342 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="481ed5ff-4180-4ff6-8d5f-b7876b484fb2" containerName="nova-api-api" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.303547 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="481ed5ff-4180-4ff6-8d5f-b7876b484fb2" containerName="nova-api-api" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.303561 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="481ed5ff-4180-4ff6-8d5f-b7876b484fb2" containerName="nova-api-log" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.304482 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.307037 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.318386 5094 scope.go:117] "RemoveContainer" containerID="33cdd137dbb0b138ccaa238a59c498915d20088d8ecfd36abd345e1ae43df087" Feb 20 08:45:56 crc kubenswrapper[5094]: E0220 08:45:56.319509 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33cdd137dbb0b138ccaa238a59c498915d20088d8ecfd36abd345e1ae43df087\": container with ID starting with 33cdd137dbb0b138ccaa238a59c498915d20088d8ecfd36abd345e1ae43df087 not found: ID does not exist" containerID="33cdd137dbb0b138ccaa238a59c498915d20088d8ecfd36abd345e1ae43df087" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.322454 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33cdd137dbb0b138ccaa238a59c498915d20088d8ecfd36abd345e1ae43df087"} err="failed to get container status \"33cdd137dbb0b138ccaa238a59c498915d20088d8ecfd36abd345e1ae43df087\": rpc error: code = NotFound desc = could not find container \"33cdd137dbb0b138ccaa238a59c498915d20088d8ecfd36abd345e1ae43df087\": container with ID starting with 33cdd137dbb0b138ccaa238a59c498915d20088d8ecfd36abd345e1ae43df087 not found: ID does not exist" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.322499 5094 scope.go:117] "RemoveContainer" containerID="a6fc48bcb490e8f80490f7e2f6378f1e5b84f106073ec34ad5101cf96c2294bf" Feb 20 08:45:56 crc kubenswrapper[5094]: E0220 08:45:56.323514 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6fc48bcb490e8f80490f7e2f6378f1e5b84f106073ec34ad5101cf96c2294bf\": container with ID starting with a6fc48bcb490e8f80490f7e2f6378f1e5b84f106073ec34ad5101cf96c2294bf not found: 
ID does not exist" containerID="a6fc48bcb490e8f80490f7e2f6378f1e5b84f106073ec34ad5101cf96c2294bf" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.323558 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6fc48bcb490e8f80490f7e2f6378f1e5b84f106073ec34ad5101cf96c2294bf"} err="failed to get container status \"a6fc48bcb490e8f80490f7e2f6378f1e5b84f106073ec34ad5101cf96c2294bf\": rpc error: code = NotFound desc = could not find container \"a6fc48bcb490e8f80490f7e2f6378f1e5b84f106073ec34ad5101cf96c2294bf\": container with ID starting with a6fc48bcb490e8f80490f7e2f6378f1e5b84f106073ec34ad5101cf96c2294bf not found: ID does not exist" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.323604 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.436792 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-logs\") pod \"nova-api-0\" (UID: \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " pod="openstack/nova-api-0" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.437061 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " pod="openstack/nova-api-0" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.437115 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttjnw\" (UniqueName: \"kubernetes.io/projected/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-kube-api-access-ttjnw\") pod \"nova-api-0\" (UID: \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " pod="openstack/nova-api-0" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 
08:45:56.437319 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-config-data\") pod \"nova-api-0\" (UID: \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " pod="openstack/nova-api-0" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.538658 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-logs\") pod \"nova-api-0\" (UID: \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " pod="openstack/nova-api-0" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.538713 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " pod="openstack/nova-api-0" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.538745 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttjnw\" (UniqueName: \"kubernetes.io/projected/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-kube-api-access-ttjnw\") pod \"nova-api-0\" (UID: \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " pod="openstack/nova-api-0" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.538823 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-config-data\") pod \"nova-api-0\" (UID: \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " pod="openstack/nova-api-0" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.539110 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-logs\") pod \"nova-api-0\" (UID: 
\"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " pod="openstack/nova-api-0" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.542853 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-config-data\") pod \"nova-api-0\" (UID: \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " pod="openstack/nova-api-0" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.543694 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " pod="openstack/nova-api-0" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.555457 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttjnw\" (UniqueName: \"kubernetes.io/projected/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-kube-api-access-ttjnw\") pod \"nova-api-0\" (UID: \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " pod="openstack/nova-api-0" Feb 20 08:45:56 crc kubenswrapper[5094]: I0220 08:45:56.622540 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 08:45:57 crc kubenswrapper[5094]: I0220 08:45:57.041868 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 08:45:57 crc kubenswrapper[5094]: I0220 08:45:57.144119 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd","Type":"ContainerStarted","Data":"33ea9e800121c631e21658cce4961167b1aecb67f072d8888e0fe3827be668d0"} Feb 20 08:45:57 crc kubenswrapper[5094]: I0220 08:45:57.146027 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"008eca6a-12d6-40dd-96bb-391428bd27c5","Type":"ContainerStarted","Data":"a9a170791955d3710a5e3621cf8bb57829fa932fb69e487f09a56fe32269f824"} Feb 20 08:45:57 crc kubenswrapper[5094]: I0220 08:45:57.146070 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"008eca6a-12d6-40dd-96bb-391428bd27c5","Type":"ContainerStarted","Data":"0e61e77c80dfdbd6b142c3a986f85177ae881def4dc50810f9959c9b8afee96d"} Feb 20 08:45:57 crc kubenswrapper[5094]: I0220 08:45:57.154577 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8","Type":"ContainerStarted","Data":"ba11d9d7cdbecb4617edeb253e7344dc95bff86a042703cc75a176732533812d"} Feb 20 08:45:57 crc kubenswrapper[5094]: I0220 08:45:57.154629 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8","Type":"ContainerStarted","Data":"98e3c706b02a59b45b5bb55556cb058774353d206688f9292aefcfb6aeee1c49"} Feb 20 08:45:57 crc kubenswrapper[5094]: I0220 08:45:57.175684 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.175666153 podStartE2EDuration="2.175666153s" podCreationTimestamp="2026-02-20 08:45:55 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:45:57.164253449 +0000 UTC m=+7172.036880230" watchObservedRunningTime="2026-02-20 08:45:57.175666153 +0000 UTC m=+7172.048292864" Feb 20 08:45:57 crc kubenswrapper[5094]: I0220 08:45:57.183806 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.183788859 podStartE2EDuration="2.183788859s" podCreationTimestamp="2026-02-20 08:45:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:45:57.179647449 +0000 UTC m=+7172.052274170" watchObservedRunningTime="2026-02-20 08:45:57.183788859 +0000 UTC m=+7172.056415570" Feb 20 08:45:57 crc kubenswrapper[5094]: I0220 08:45:57.850213 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="481ed5ff-4180-4ff6-8d5f-b7876b484fb2" path="/var/lib/kubelet/pods/481ed5ff-4180-4ff6-8d5f-b7876b484fb2/volumes" Feb 20 08:45:58 crc kubenswrapper[5094]: I0220 08:45:58.164248 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd","Type":"ContainerStarted","Data":"c133aaff7af6bc08d5607f7f1072c13d9db525ce16f2e919cdf821ceb99af810"} Feb 20 08:45:58 crc kubenswrapper[5094]: I0220 08:45:58.164285 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd","Type":"ContainerStarted","Data":"ebc900f8a66f90ec54e3f832c463c60f80a44df3cab9a5a642e1198f63755d0f"} Feb 20 08:45:58 crc kubenswrapper[5094]: I0220 08:45:58.201726 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.201675581 podStartE2EDuration="2.201675581s" podCreationTimestamp="2026-02-20 08:45:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:45:58.185557784 +0000 UTC m=+7173.058184515" watchObservedRunningTime="2026-02-20 08:45:58.201675581 +0000 UTC m=+7173.074302332" Feb 20 08:46:00 crc kubenswrapper[5094]: I0220 08:46:00.545117 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 08:46:00 crc kubenswrapper[5094]: I0220 08:46:00.545171 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 08:46:00 crc kubenswrapper[5094]: I0220 08:46:00.567779 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 20 08:46:00 crc kubenswrapper[5094]: I0220 08:46:00.840644 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:46:00 crc kubenswrapper[5094]: E0220 08:46:00.840912 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:46:05 crc kubenswrapper[5094]: I0220 08:46:05.545233 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 20 08:46:05 crc kubenswrapper[5094]: I0220 08:46:05.545773 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 20 08:46:05 crc kubenswrapper[5094]: I0220 08:46:05.567785 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 20 08:46:05 crc kubenswrapper[5094]: I0220 08:46:05.611196 5094 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 20 08:46:06 crc kubenswrapper[5094]: I0220 08:46:06.287180 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 20 08:46:06 crc kubenswrapper[5094]: I0220 08:46:06.622867 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 08:46:06 crc kubenswrapper[5094]: I0220 08:46:06.622925 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 08:46:06 crc kubenswrapper[5094]: I0220 08:46:06.627856 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.74:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 08:46:06 crc kubenswrapper[5094]: I0220 08:46:06.627870 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.74:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 08:46:07 crc kubenswrapper[5094]: I0220 08:46:07.705911 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.76:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 08:46:07 crc kubenswrapper[5094]: I0220 08:46:07.706200 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.76:8774/\": context deadline exceeded 
(Client.Timeout exceeded while awaiting headers)" Feb 20 08:46:12 crc kubenswrapper[5094]: I0220 08:46:12.840052 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:46:12 crc kubenswrapper[5094]: E0220 08:46:12.840629 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:46:15 crc kubenswrapper[5094]: I0220 08:46:15.546816 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 20 08:46:15 crc kubenswrapper[5094]: I0220 08:46:15.547102 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 20 08:46:15 crc kubenswrapper[5094]: I0220 08:46:15.548548 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 20 08:46:15 crc kubenswrapper[5094]: I0220 08:46:15.548954 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 20 08:46:16 crc kubenswrapper[5094]: I0220 08:46:16.627790 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 20 08:46:16 crc kubenswrapper[5094]: I0220 08:46:16.627874 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 20 08:46:16 crc kubenswrapper[5094]: I0220 08:46:16.628653 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 20 08:46:16 crc kubenswrapper[5094]: I0220 08:46:16.628685 5094 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-api-0" Feb 20 08:46:16 crc kubenswrapper[5094]: I0220 08:46:16.632096 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 20 08:46:16 crc kubenswrapper[5094]: I0220 08:46:16.632345 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 20 08:46:16 crc kubenswrapper[5094]: I0220 08:46:16.868291 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84fd48bb75-4njkr"] Feb 20 08:46:16 crc kubenswrapper[5094]: I0220 08:46:16.873110 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:16 crc kubenswrapper[5094]: I0220 08:46:16.907109 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84fd48bb75-4njkr"] Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.015663 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-config\") pod \"dnsmasq-dns-84fd48bb75-4njkr\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.015952 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-dns-svc\") pod \"dnsmasq-dns-84fd48bb75-4njkr\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.016049 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp8v7\" (UniqueName: \"kubernetes.io/projected/4b88f14c-752a-4565-848b-8fb7820295db-kube-api-access-lp8v7\") pod 
\"dnsmasq-dns-84fd48bb75-4njkr\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.016184 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-ovsdbserver-nb\") pod \"dnsmasq-dns-84fd48bb75-4njkr\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.016255 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-ovsdbserver-sb\") pod \"dnsmasq-dns-84fd48bb75-4njkr\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.117311 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp8v7\" (UniqueName: \"kubernetes.io/projected/4b88f14c-752a-4565-848b-8fb7820295db-kube-api-access-lp8v7\") pod \"dnsmasq-dns-84fd48bb75-4njkr\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.117627 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-ovsdbserver-nb\") pod \"dnsmasq-dns-84fd48bb75-4njkr\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.117731 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-ovsdbserver-sb\") pod 
\"dnsmasq-dns-84fd48bb75-4njkr\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.117901 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-config\") pod \"dnsmasq-dns-84fd48bb75-4njkr\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.117986 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-dns-svc\") pod \"dnsmasq-dns-84fd48bb75-4njkr\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.118517 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-ovsdbserver-nb\") pod \"dnsmasq-dns-84fd48bb75-4njkr\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.118622 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-ovsdbserver-sb\") pod \"dnsmasq-dns-84fd48bb75-4njkr\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.118906 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-dns-svc\") pod \"dnsmasq-dns-84fd48bb75-4njkr\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " 
pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.119063 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-config\") pod \"dnsmasq-dns-84fd48bb75-4njkr\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.134419 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp8v7\" (UniqueName: \"kubernetes.io/projected/4b88f14c-752a-4565-848b-8fb7820295db-kube-api-access-lp8v7\") pod \"dnsmasq-dns-84fd48bb75-4njkr\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.210207 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:17 crc kubenswrapper[5094]: I0220 08:46:17.641894 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84fd48bb75-4njkr"] Feb 20 08:46:18 crc kubenswrapper[5094]: I0220 08:46:18.371193 5094 generic.go:334] "Generic (PLEG): container finished" podID="4b88f14c-752a-4565-848b-8fb7820295db" containerID="1841a2e465ce1712f2969881d7f130fad139643616a8d74914bfde959d8674dd" exitCode=0 Feb 20 08:46:18 crc kubenswrapper[5094]: I0220 08:46:18.371278 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" event={"ID":"4b88f14c-752a-4565-848b-8fb7820295db","Type":"ContainerDied","Data":"1841a2e465ce1712f2969881d7f130fad139643616a8d74914bfde959d8674dd"} Feb 20 08:46:18 crc kubenswrapper[5094]: I0220 08:46:18.371673 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" 
event={"ID":"4b88f14c-752a-4565-848b-8fb7820295db","Type":"ContainerStarted","Data":"1c3f648da5272f28b64986994942baf398d516fa729e7f58160c062950c2a99e"} Feb 20 08:46:19 crc kubenswrapper[5094]: I0220 08:46:19.381025 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" event={"ID":"4b88f14c-752a-4565-848b-8fb7820295db","Type":"ContainerStarted","Data":"d12039d2f80d89c47082ff12d98c2303384b5a9706d174263440ba803b4d8bd8"} Feb 20 08:46:19 crc kubenswrapper[5094]: I0220 08:46:19.381372 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:19 crc kubenswrapper[5094]: I0220 08:46:19.402470 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" podStartSLOduration=3.402450276 podStartE2EDuration="3.402450276s" podCreationTimestamp="2026-02-20 08:46:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:46:19.400673134 +0000 UTC m=+7194.273299845" watchObservedRunningTime="2026-02-20 08:46:19.402450276 +0000 UTC m=+7194.275077007" Feb 20 08:46:25 crc kubenswrapper[5094]: I0220 08:46:25.845719 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:46:25 crc kubenswrapper[5094]: E0220 08:46:25.846642 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.212154 5094 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.295267 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9dcb44685-h54hc"] Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.296209 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9dcb44685-h54hc" podUID="67ea7a9d-b48a-4ea9-be81-50d152a57e58" containerName="dnsmasq-dns" containerID="cri-o://362e6ae91ccecf694cdc7738cb8ce59162ad28f767e767b41fae829ceaf54420" gracePeriod=10 Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.457299 5094 generic.go:334] "Generic (PLEG): container finished" podID="67ea7a9d-b48a-4ea9-be81-50d152a57e58" containerID="362e6ae91ccecf694cdc7738cb8ce59162ad28f767e767b41fae829ceaf54420" exitCode=0 Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.457341 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dcb44685-h54hc" event={"ID":"67ea7a9d-b48a-4ea9-be81-50d152a57e58","Type":"ContainerDied","Data":"362e6ae91ccecf694cdc7738cb8ce59162ad28f767e767b41fae829ceaf54420"} Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.778339 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9dcb44685-h54hc" Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.913239 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-config\") pod \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.913358 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-dns-svc\") pod \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.913421 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-ovsdbserver-nb\") pod \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.913465 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbwbn\" (UniqueName: \"kubernetes.io/projected/67ea7a9d-b48a-4ea9-be81-50d152a57e58-kube-api-access-jbwbn\") pod \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.913497 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-ovsdbserver-sb\") pod \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\" (UID: \"67ea7a9d-b48a-4ea9-be81-50d152a57e58\") " Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.920916 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/67ea7a9d-b48a-4ea9-be81-50d152a57e58-kube-api-access-jbwbn" (OuterVolumeSpecName: "kube-api-access-jbwbn") pod "67ea7a9d-b48a-4ea9-be81-50d152a57e58" (UID: "67ea7a9d-b48a-4ea9-be81-50d152a57e58"). InnerVolumeSpecName "kube-api-access-jbwbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.955199 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-config" (OuterVolumeSpecName: "config") pod "67ea7a9d-b48a-4ea9-be81-50d152a57e58" (UID: "67ea7a9d-b48a-4ea9-be81-50d152a57e58"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.961436 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "67ea7a9d-b48a-4ea9-be81-50d152a57e58" (UID: "67ea7a9d-b48a-4ea9-be81-50d152a57e58"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.987380 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "67ea7a9d-b48a-4ea9-be81-50d152a57e58" (UID: "67ea7a9d-b48a-4ea9-be81-50d152a57e58"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:46:27 crc kubenswrapper[5094]: I0220 08:46:27.989376 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "67ea7a9d-b48a-4ea9-be81-50d152a57e58" (UID: "67ea7a9d-b48a-4ea9-be81-50d152a57e58"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:46:28 crc kubenswrapper[5094]: I0220 08:46:28.016063 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-config\") on node \"crc\" DevicePath \"\"" Feb 20 08:46:28 crc kubenswrapper[5094]: I0220 08:46:28.016111 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 08:46:28 crc kubenswrapper[5094]: I0220 08:46:28.016129 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 08:46:28 crc kubenswrapper[5094]: I0220 08:46:28.016148 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbwbn\" (UniqueName: \"kubernetes.io/projected/67ea7a9d-b48a-4ea9-be81-50d152a57e58-kube-api-access-jbwbn\") on node \"crc\" DevicePath \"\"" Feb 20 08:46:28 crc kubenswrapper[5094]: I0220 08:46:28.016166 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67ea7a9d-b48a-4ea9-be81-50d152a57e58-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 08:46:28 crc kubenswrapper[5094]: I0220 08:46:28.467319 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9dcb44685-h54hc" event={"ID":"67ea7a9d-b48a-4ea9-be81-50d152a57e58","Type":"ContainerDied","Data":"aecfbb6ad113706e1f13505e743a3ff12b31470b6a099b5823c226a7d30ce55c"} Feb 20 08:46:28 crc kubenswrapper[5094]: I0220 08:46:28.467380 5094 scope.go:117] "RemoveContainer" containerID="362e6ae91ccecf694cdc7738cb8ce59162ad28f767e767b41fae829ceaf54420" Feb 20 08:46:28 crc kubenswrapper[5094]: I0220 08:46:28.467376 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9dcb44685-h54hc" Feb 20 08:46:28 crc kubenswrapper[5094]: I0220 08:46:28.502485 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9dcb44685-h54hc"] Feb 20 08:46:28 crc kubenswrapper[5094]: I0220 08:46:28.504101 5094 scope.go:117] "RemoveContainer" containerID="c16a7fdf9fc7f05c88a3c26ca7db3574a5f1cb1c1c0097c7a0361cae6e703d9a" Feb 20 08:46:28 crc kubenswrapper[5094]: I0220 08:46:28.510873 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9dcb44685-h54hc"] Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.542140 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-4ggkd"] Feb 20 08:46:29 crc kubenswrapper[5094]: E0220 08:46:29.542498 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67ea7a9d-b48a-4ea9-be81-50d152a57e58" containerName="init" Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.542510 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="67ea7a9d-b48a-4ea9-be81-50d152a57e58" containerName="init" Feb 20 08:46:29 crc kubenswrapper[5094]: E0220 08:46:29.542533 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67ea7a9d-b48a-4ea9-be81-50d152a57e58" containerName="dnsmasq-dns" Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.542539 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="67ea7a9d-b48a-4ea9-be81-50d152a57e58" containerName="dnsmasq-dns" Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.542724 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="67ea7a9d-b48a-4ea9-be81-50d152a57e58" containerName="dnsmasq-dns" Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.543361 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-4ggkd" Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.552016 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-4ggkd"] Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.649049 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv6j7\" (UniqueName: \"kubernetes.io/projected/3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991-kube-api-access-bv6j7\") pod \"cinder-db-create-4ggkd\" (UID: \"3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991\") " pod="openstack/cinder-db-create-4ggkd" Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.649299 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991-operator-scripts\") pod \"cinder-db-create-4ggkd\" (UID: \"3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991\") " pod="openstack/cinder-db-create-4ggkd" Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.653899 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-dc9b-account-create-update-6hd8k"] Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.655055 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-dc9b-account-create-update-6hd8k" Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.659821 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.668253 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-dc9b-account-create-update-6hd8k"] Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.751289 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991-operator-scripts\") pod \"cinder-db-create-4ggkd\" (UID: \"3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991\") " pod="openstack/cinder-db-create-4ggkd" Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.751372 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjnr8\" (UniqueName: \"kubernetes.io/projected/7475a056-ad82-42aa-85ee-4b5d6834434a-kube-api-access-wjnr8\") pod \"cinder-dc9b-account-create-update-6hd8k\" (UID: \"7475a056-ad82-42aa-85ee-4b5d6834434a\") " pod="openstack/cinder-dc9b-account-create-update-6hd8k" Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.751470 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7475a056-ad82-42aa-85ee-4b5d6834434a-operator-scripts\") pod \"cinder-dc9b-account-create-update-6hd8k\" (UID: \"7475a056-ad82-42aa-85ee-4b5d6834434a\") " pod="openstack/cinder-dc9b-account-create-update-6hd8k" Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.751514 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv6j7\" (UniqueName: \"kubernetes.io/projected/3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991-kube-api-access-bv6j7\") pod \"cinder-db-create-4ggkd\" (UID: 
\"3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991\") " pod="openstack/cinder-db-create-4ggkd" Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.752929 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991-operator-scripts\") pod \"cinder-db-create-4ggkd\" (UID: \"3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991\") " pod="openstack/cinder-db-create-4ggkd" Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.779476 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv6j7\" (UniqueName: \"kubernetes.io/projected/3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991-kube-api-access-bv6j7\") pod \"cinder-db-create-4ggkd\" (UID: \"3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991\") " pod="openstack/cinder-db-create-4ggkd" Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.848202 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67ea7a9d-b48a-4ea9-be81-50d152a57e58" path="/var/lib/kubelet/pods/67ea7a9d-b48a-4ea9-be81-50d152a57e58/volumes" Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.852795 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7475a056-ad82-42aa-85ee-4b5d6834434a-operator-scripts\") pod \"cinder-dc9b-account-create-update-6hd8k\" (UID: \"7475a056-ad82-42aa-85ee-4b5d6834434a\") " pod="openstack/cinder-dc9b-account-create-update-6hd8k" Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.852909 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjnr8\" (UniqueName: \"kubernetes.io/projected/7475a056-ad82-42aa-85ee-4b5d6834434a-kube-api-access-wjnr8\") pod \"cinder-dc9b-account-create-update-6hd8k\" (UID: \"7475a056-ad82-42aa-85ee-4b5d6834434a\") " pod="openstack/cinder-dc9b-account-create-update-6hd8k" Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.853492 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7475a056-ad82-42aa-85ee-4b5d6834434a-operator-scripts\") pod \"cinder-dc9b-account-create-update-6hd8k\" (UID: \"7475a056-ad82-42aa-85ee-4b5d6834434a\") " pod="openstack/cinder-dc9b-account-create-update-6hd8k" Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.858337 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-4ggkd" Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.868003 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjnr8\" (UniqueName: \"kubernetes.io/projected/7475a056-ad82-42aa-85ee-4b5d6834434a-kube-api-access-wjnr8\") pod \"cinder-dc9b-account-create-update-6hd8k\" (UID: \"7475a056-ad82-42aa-85ee-4b5d6834434a\") " pod="openstack/cinder-dc9b-account-create-update-6hd8k" Feb 20 08:46:29 crc kubenswrapper[5094]: I0220 08:46:29.980031 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-dc9b-account-create-update-6hd8k" Feb 20 08:46:30 crc kubenswrapper[5094]: I0220 08:46:30.291177 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-4ggkd"] Feb 20 08:46:30 crc kubenswrapper[5094]: W0220 08:46:30.292599 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d9bbb80_f9cc_40ef_b3d1_c5a5cea72991.slice/crio-19b3baf919195ae5c3215c55724596ff0d01bd3eb70aab93ba302d7ed3b42b1d WatchSource:0}: Error finding container 19b3baf919195ae5c3215c55724596ff0d01bd3eb70aab93ba302d7ed3b42b1d: Status 404 returned error can't find the container with id 19b3baf919195ae5c3215c55724596ff0d01bd3eb70aab93ba302d7ed3b42b1d Feb 20 08:46:30 crc kubenswrapper[5094]: W0220 08:46:30.391398 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7475a056_ad82_42aa_85ee_4b5d6834434a.slice/crio-8e11e82accca1d45cfec13e5fcee565daa011139d2f502ad3af9c6cf2dfb2576 WatchSource:0}: Error finding container 8e11e82accca1d45cfec13e5fcee565daa011139d2f502ad3af9c6cf2dfb2576: Status 404 returned error can't find the container with id 8e11e82accca1d45cfec13e5fcee565daa011139d2f502ad3af9c6cf2dfb2576 Feb 20 08:46:30 crc kubenswrapper[5094]: I0220 08:46:30.391748 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-dc9b-account-create-update-6hd8k"] Feb 20 08:46:30 crc kubenswrapper[5094]: I0220 08:46:30.483785 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4ggkd" event={"ID":"3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991","Type":"ContainerStarted","Data":"304a415638ed4b15154b7c41ed2540ace3b8c9ba6f1ff38016fbc2160491bb9c"} Feb 20 08:46:30 crc kubenswrapper[5094]: I0220 08:46:30.483869 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4ggkd" 
event={"ID":"3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991","Type":"ContainerStarted","Data":"19b3baf919195ae5c3215c55724596ff0d01bd3eb70aab93ba302d7ed3b42b1d"} Feb 20 08:46:30 crc kubenswrapper[5094]: I0220 08:46:30.485185 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-dc9b-account-create-update-6hd8k" event={"ID":"7475a056-ad82-42aa-85ee-4b5d6834434a","Type":"ContainerStarted","Data":"8e11e82accca1d45cfec13e5fcee565daa011139d2f502ad3af9c6cf2dfb2576"} Feb 20 08:46:30 crc kubenswrapper[5094]: I0220 08:46:30.499823 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-4ggkd" podStartSLOduration=1.499780718 podStartE2EDuration="1.499780718s" podCreationTimestamp="2026-02-20 08:46:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:46:30.49614508 +0000 UTC m=+7205.368771791" watchObservedRunningTime="2026-02-20 08:46:30.499780718 +0000 UTC m=+7205.372407449" Feb 20 08:46:31 crc kubenswrapper[5094]: I0220 08:46:31.495489 5094 generic.go:334] "Generic (PLEG): container finished" podID="3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991" containerID="304a415638ed4b15154b7c41ed2540ace3b8c9ba6f1ff38016fbc2160491bb9c" exitCode=0 Feb 20 08:46:31 crc kubenswrapper[5094]: I0220 08:46:31.495578 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4ggkd" event={"ID":"3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991","Type":"ContainerDied","Data":"304a415638ed4b15154b7c41ed2540ace3b8c9ba6f1ff38016fbc2160491bb9c"} Feb 20 08:46:31 crc kubenswrapper[5094]: I0220 08:46:31.500774 5094 generic.go:334] "Generic (PLEG): container finished" podID="7475a056-ad82-42aa-85ee-4b5d6834434a" containerID="e06fc5fd3620d2019a01f12c26721ba58935bf528cffce9cee66802b4ab5054a" exitCode=0 Feb 20 08:46:31 crc kubenswrapper[5094]: I0220 08:46:31.500819 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-dc9b-account-create-update-6hd8k" event={"ID":"7475a056-ad82-42aa-85ee-4b5d6834434a","Type":"ContainerDied","Data":"e06fc5fd3620d2019a01f12c26721ba58935bf528cffce9cee66802b4ab5054a"} Feb 20 08:46:32 crc kubenswrapper[5094]: I0220 08:46:32.877639 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-4ggkd" Feb 20 08:46:32 crc kubenswrapper[5094]: I0220 08:46:32.976891 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-dc9b-account-create-update-6hd8k" Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.016542 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv6j7\" (UniqueName: \"kubernetes.io/projected/3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991-kube-api-access-bv6j7\") pod \"3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991\" (UID: \"3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991\") " Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.016671 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991-operator-scripts\") pod \"3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991\" (UID: \"3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991\") " Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.017425 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991" (UID: "3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.022656 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991-kube-api-access-bv6j7" (OuterVolumeSpecName: "kube-api-access-bv6j7") pod "3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991" (UID: "3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991"). InnerVolumeSpecName "kube-api-access-bv6j7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.118806 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjnr8\" (UniqueName: \"kubernetes.io/projected/7475a056-ad82-42aa-85ee-4b5d6834434a-kube-api-access-wjnr8\") pod \"7475a056-ad82-42aa-85ee-4b5d6834434a\" (UID: \"7475a056-ad82-42aa-85ee-4b5d6834434a\") " Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.118891 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7475a056-ad82-42aa-85ee-4b5d6834434a-operator-scripts\") pod \"7475a056-ad82-42aa-85ee-4b5d6834434a\" (UID: \"7475a056-ad82-42aa-85ee-4b5d6834434a\") " Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.119197 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bv6j7\" (UniqueName: \"kubernetes.io/projected/3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991-kube-api-access-bv6j7\") on node \"crc\" DevicePath \"\"" Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.119212 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.119492 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/7475a056-ad82-42aa-85ee-4b5d6834434a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7475a056-ad82-42aa-85ee-4b5d6834434a" (UID: "7475a056-ad82-42aa-85ee-4b5d6834434a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.121344 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7475a056-ad82-42aa-85ee-4b5d6834434a-kube-api-access-wjnr8" (OuterVolumeSpecName: "kube-api-access-wjnr8") pod "7475a056-ad82-42aa-85ee-4b5d6834434a" (UID: "7475a056-ad82-42aa-85ee-4b5d6834434a"). InnerVolumeSpecName "kube-api-access-wjnr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.220768 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjnr8\" (UniqueName: \"kubernetes.io/projected/7475a056-ad82-42aa-85ee-4b5d6834434a-kube-api-access-wjnr8\") on node \"crc\" DevicePath \"\"" Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.220841 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7475a056-ad82-42aa-85ee-4b5d6834434a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.524004 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-4ggkd" event={"ID":"3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991","Type":"ContainerDied","Data":"19b3baf919195ae5c3215c55724596ff0d01bd3eb70aab93ba302d7ed3b42b1d"} Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.524042 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19b3baf919195ae5c3215c55724596ff0d01bd3eb70aab93ba302d7ed3b42b1d" Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.524103 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-4ggkd" Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.527387 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-dc9b-account-create-update-6hd8k" event={"ID":"7475a056-ad82-42aa-85ee-4b5d6834434a","Type":"ContainerDied","Data":"8e11e82accca1d45cfec13e5fcee565daa011139d2f502ad3af9c6cf2dfb2576"} Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.527431 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e11e82accca1d45cfec13e5fcee565daa011139d2f502ad3af9c6cf2dfb2576" Feb 20 08:46:33 crc kubenswrapper[5094]: I0220 08:46:33.527499 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-dc9b-account-create-update-6hd8k" Feb 20 08:46:34 crc kubenswrapper[5094]: I0220 08:46:34.893433 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-n78qt"] Feb 20 08:46:34 crc kubenswrapper[5094]: E0220 08:46:34.894002 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991" containerName="mariadb-database-create" Feb 20 08:46:34 crc kubenswrapper[5094]: I0220 08:46:34.894025 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991" containerName="mariadb-database-create" Feb 20 08:46:34 crc kubenswrapper[5094]: E0220 08:46:34.894045 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7475a056-ad82-42aa-85ee-4b5d6834434a" containerName="mariadb-account-create-update" Feb 20 08:46:34 crc kubenswrapper[5094]: I0220 08:46:34.894057 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7475a056-ad82-42aa-85ee-4b5d6834434a" containerName="mariadb-account-create-update" Feb 20 08:46:34 crc kubenswrapper[5094]: I0220 08:46:34.894370 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="7475a056-ad82-42aa-85ee-4b5d6834434a" 
containerName="mariadb-account-create-update" Feb 20 08:46:34 crc kubenswrapper[5094]: I0220 08:46:34.894402 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991" containerName="mariadb-database-create" Feb 20 08:46:34 crc kubenswrapper[5094]: I0220 08:46:34.895226 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-n78qt" Feb 20 08:46:34 crc kubenswrapper[5094]: I0220 08:46:34.903329 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 20 08:46:34 crc kubenswrapper[5094]: I0220 08:46:34.903517 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 20 08:46:34 crc kubenswrapper[5094]: I0220 08:46:34.904800 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xwpqq" Feb 20 08:46:34 crc kubenswrapper[5094]: I0220 08:46:34.906467 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-n78qt"] Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.053187 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlwfl\" (UniqueName: \"kubernetes.io/projected/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-kube-api-access-wlwfl\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt" Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.053231 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-db-sync-config-data\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt" Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.053255 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-etc-machine-id\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt" Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.053310 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-config-data\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt" Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.053365 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-combined-ca-bundle\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt" Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.053384 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-scripts\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt" Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.155113 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-scripts\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt" Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.155206 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlwfl\" (UniqueName: 
\"kubernetes.io/projected/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-kube-api-access-wlwfl\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt" Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.155229 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-db-sync-config-data\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt" Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.155253 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-etc-machine-id\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt" Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.155302 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-config-data\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt" Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.155371 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-combined-ca-bundle\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt" Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.155417 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-etc-machine-id\") pod \"cinder-db-sync-n78qt\" (UID: 
\"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt" Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.161195 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-db-sync-config-data\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt" Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.161195 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-scripts\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt" Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.161302 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-combined-ca-bundle\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt" Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.162317 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-config-data\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt" Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.173266 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlwfl\" (UniqueName: \"kubernetes.io/projected/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-kube-api-access-wlwfl\") pod \"cinder-db-sync-n78qt\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " pod="openstack/cinder-db-sync-n78qt" Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.217225 5094 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-n78qt" Feb 20 08:46:35 crc kubenswrapper[5094]: I0220 08:46:35.663841 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-n78qt"] Feb 20 08:46:35 crc kubenswrapper[5094]: W0220 08:46:35.667139 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cbb5a80_aef6_405d_bf92_d0d9cc872c78.slice/crio-4f2f28f26a575d3a3ffbf892a30f944a346d706f099f73887583161f5992901d WatchSource:0}: Error finding container 4f2f28f26a575d3a3ffbf892a30f944a346d706f099f73887583161f5992901d: Status 404 returned error can't find the container with id 4f2f28f26a575d3a3ffbf892a30f944a346d706f099f73887583161f5992901d Feb 20 08:46:36 crc kubenswrapper[5094]: I0220 08:46:36.572695 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-n78qt" event={"ID":"0cbb5a80-aef6-405d-bf92-d0d9cc872c78","Type":"ContainerStarted","Data":"4f2f28f26a575d3a3ffbf892a30f944a346d706f099f73887583161f5992901d"} Feb 20 08:46:39 crc kubenswrapper[5094]: I0220 08:46:39.840174 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:46:39 crc kubenswrapper[5094]: E0220 08:46:39.840733 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:46:51 crc kubenswrapper[5094]: I0220 08:46:51.844168 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:46:51 crc kubenswrapper[5094]: E0220 
08:46:51.844852 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:46:54 crc kubenswrapper[5094]: I0220 08:46:54.738352 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-n78qt" event={"ID":"0cbb5a80-aef6-405d-bf92-d0d9cc872c78","Type":"ContainerStarted","Data":"08a5be203668aa394a410644bd2e295ff1d095189ea60c660ca9c1800575b1fd"} Feb 20 08:46:54 crc kubenswrapper[5094]: I0220 08:46:54.766832 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-n78qt" podStartSLOduration=2.7379194460000003 podStartE2EDuration="20.766817205s" podCreationTimestamp="2026-02-20 08:46:34 +0000 UTC" firstStartedPulling="2026-02-20 08:46:35.669614732 +0000 UTC m=+7210.542241443" lastFinishedPulling="2026-02-20 08:46:53.698512491 +0000 UTC m=+7228.571139202" observedRunningTime="2026-02-20 08:46:54.759017767 +0000 UTC m=+7229.631644528" watchObservedRunningTime="2026-02-20 08:46:54.766817205 +0000 UTC m=+7229.639443916" Feb 20 08:46:56 crc kubenswrapper[5094]: I0220 08:46:56.753983 5094 generic.go:334] "Generic (PLEG): container finished" podID="0cbb5a80-aef6-405d-bf92-d0d9cc872c78" containerID="08a5be203668aa394a410644bd2e295ff1d095189ea60c660ca9c1800575b1fd" exitCode=0 Feb 20 08:46:56 crc kubenswrapper[5094]: I0220 08:46:56.754268 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-n78qt" event={"ID":"0cbb5a80-aef6-405d-bf92-d0d9cc872c78","Type":"ContainerDied","Data":"08a5be203668aa394a410644bd2e295ff1d095189ea60c660ca9c1800575b1fd"} Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 
08:46:58.067534 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-n78qt" Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.227771 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-config-data\") pod \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.227844 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-db-sync-config-data\") pod \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.227871 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlwfl\" (UniqueName: \"kubernetes.io/projected/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-kube-api-access-wlwfl\") pod \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.227914 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-etc-machine-id\") pod \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.228021 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-combined-ca-bundle\") pod \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.228043 5094 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-scripts\") pod \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\" (UID: \"0cbb5a80-aef6-405d-bf92-d0d9cc872c78\") " Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.228305 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0cbb5a80-aef6-405d-bf92-d0d9cc872c78" (UID: "0cbb5a80-aef6-405d-bf92-d0d9cc872c78"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.228483 5094 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.232954 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0cbb5a80-aef6-405d-bf92-d0d9cc872c78" (UID: "0cbb5a80-aef6-405d-bf92-d0d9cc872c78"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.233022 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-scripts" (OuterVolumeSpecName: "scripts") pod "0cbb5a80-aef6-405d-bf92-d0d9cc872c78" (UID: "0cbb5a80-aef6-405d-bf92-d0d9cc872c78"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.233345 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-kube-api-access-wlwfl" (OuterVolumeSpecName: "kube-api-access-wlwfl") pod "0cbb5a80-aef6-405d-bf92-d0d9cc872c78" (UID: "0cbb5a80-aef6-405d-bf92-d0d9cc872c78"). InnerVolumeSpecName "kube-api-access-wlwfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.250855 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0cbb5a80-aef6-405d-bf92-d0d9cc872c78" (UID: "0cbb5a80-aef6-405d-bf92-d0d9cc872c78"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.286911 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-config-data" (OuterVolumeSpecName: "config-data") pod "0cbb5a80-aef6-405d-bf92-d0d9cc872c78" (UID: "0cbb5a80-aef6-405d-bf92-d0d9cc872c78"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.330989 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.331034 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.331048 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.331059 5094 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.331072 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlwfl\" (UniqueName: \"kubernetes.io/projected/0cbb5a80-aef6-405d-bf92-d0d9cc872c78-kube-api-access-wlwfl\") on node \"crc\" DevicePath \"\"" Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.774547 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-n78qt" event={"ID":"0cbb5a80-aef6-405d-bf92-d0d9cc872c78","Type":"ContainerDied","Data":"4f2f28f26a575d3a3ffbf892a30f944a346d706f099f73887583161f5992901d"} Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.774594 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f2f28f26a575d3a3ffbf892a30f944a346d706f099f73887583161f5992901d" Feb 20 08:46:58 crc kubenswrapper[5094]: I0220 08:46:58.774658 5094 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-n78qt" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.376747 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-644b874bd7-7xjnn"] Feb 20 08:46:59 crc kubenswrapper[5094]: E0220 08:46:59.377218 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cbb5a80-aef6-405d-bf92-d0d9cc872c78" containerName="cinder-db-sync" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.377235 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cbb5a80-aef6-405d-bf92-d0d9cc872c78" containerName="cinder-db-sync" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.377459 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cbb5a80-aef6-405d-bf92-d0d9cc872c78" containerName="cinder-db-sync" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.378583 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.409001 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-644b874bd7-7xjnn"] Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.550169 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-ovsdbserver-nb\") pod \"dnsmasq-dns-644b874bd7-7xjnn\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.550231 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-dns-svc\") pod \"dnsmasq-dns-644b874bd7-7xjnn\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " 
pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.550276 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-ovsdbserver-sb\") pod \"dnsmasq-dns-644b874bd7-7xjnn\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.550293 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-config\") pod \"dnsmasq-dns-644b874bd7-7xjnn\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.550377 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjv28\" (UniqueName: \"kubernetes.io/projected/4909c4ac-65fa-412c-990d-974868b0f104-kube-api-access-sjv28\") pod \"dnsmasq-dns-644b874bd7-7xjnn\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.552400 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.553903 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.556923 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.557598 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.557841 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.557960 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-xwpqq" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.564937 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.651909 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdwt4\" (UniqueName: \"kubernetes.io/projected/a7799fdb-4c3c-4792-be6d-f988852a6dad-kube-api-access-xdwt4\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.652224 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-scripts\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.652249 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7799fdb-4c3c-4792-be6d-f988852a6dad-logs\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc 
kubenswrapper[5094]: I0220 08:46:59.652268 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7799fdb-4c3c-4792-be6d-f988852a6dad-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.652296 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjv28\" (UniqueName: \"kubernetes.io/projected/4909c4ac-65fa-412c-990d-974868b0f104-kube-api-access-sjv28\") pod \"dnsmasq-dns-644b874bd7-7xjnn\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.652322 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-config-data\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.652355 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-ovsdbserver-nb\") pod \"dnsmasq-dns-644b874bd7-7xjnn\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.652377 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.652400 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-dns-svc\") pod \"dnsmasq-dns-644b874bd7-7xjnn\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.652439 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-ovsdbserver-sb\") pod \"dnsmasq-dns-644b874bd7-7xjnn\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.652458 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-config\") pod \"dnsmasq-dns-644b874bd7-7xjnn\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.652477 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-config-data-custom\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.653468 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-ovsdbserver-nb\") pod \"dnsmasq-dns-644b874bd7-7xjnn\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.654028 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-dns-svc\") pod \"dnsmasq-dns-644b874bd7-7xjnn\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.654388 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-config\") pod \"dnsmasq-dns-644b874bd7-7xjnn\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.654519 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-ovsdbserver-sb\") pod \"dnsmasq-dns-644b874bd7-7xjnn\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.671027 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjv28\" (UniqueName: \"kubernetes.io/projected/4909c4ac-65fa-412c-990d-974868b0f104-kube-api-access-sjv28\") pod \"dnsmasq-dns-644b874bd7-7xjnn\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.697771 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.760155 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-scripts\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.760212 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7799fdb-4c3c-4792-be6d-f988852a6dad-logs\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.760240 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7799fdb-4c3c-4792-be6d-f988852a6dad-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.760280 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-config-data\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.760333 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.760404 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-config-data-custom\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.760436 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdwt4\" (UniqueName: \"kubernetes.io/projected/a7799fdb-4c3c-4792-be6d-f988852a6dad-kube-api-access-xdwt4\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.762489 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7799fdb-4c3c-4792-be6d-f988852a6dad-logs\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.765700 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.765809 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7799fdb-4c3c-4792-be6d-f988852a6dad-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.767611 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-config-data\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.774637 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-config-data-custom\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.777200 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-scripts\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.783728 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdwt4\" (UniqueName: \"kubernetes.io/projected/a7799fdb-4c3c-4792-be6d-f988852a6dad-kube-api-access-xdwt4\") pod \"cinder-api-0\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " pod="openstack/cinder-api-0" Feb 20 08:46:59 crc kubenswrapper[5094]: I0220 08:46:59.869163 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 08:47:00 crc kubenswrapper[5094]: I0220 08:47:00.219441 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-644b874bd7-7xjnn"] Feb 20 08:47:00 crc kubenswrapper[5094]: W0220 08:47:00.392698 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7799fdb_4c3c_4792_be6d_f988852a6dad.slice/crio-0083b149ecde6dd092d89d6192562bc09a0f773314ca1e2250b77701f210c2ad WatchSource:0}: Error finding container 0083b149ecde6dd092d89d6192562bc09a0f773314ca1e2250b77701f210c2ad: Status 404 returned error can't find the container with id 0083b149ecde6dd092d89d6192562bc09a0f773314ca1e2250b77701f210c2ad Feb 20 08:47:00 crc kubenswrapper[5094]: I0220 08:47:00.396868 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 20 08:47:00 crc kubenswrapper[5094]: I0220 08:47:00.818751 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a7799fdb-4c3c-4792-be6d-f988852a6dad","Type":"ContainerStarted","Data":"0083b149ecde6dd092d89d6192562bc09a0f773314ca1e2250b77701f210c2ad"} Feb 20 08:47:00 crc kubenswrapper[5094]: I0220 08:47:00.820585 5094 generic.go:334] "Generic (PLEG): container finished" podID="4909c4ac-65fa-412c-990d-974868b0f104" containerID="f5767cd62a5a9e26fc88ffbe25eb74c9c4932ee6d1de8eb39356b77614dedec0" exitCode=0 Feb 20 08:47:00 crc kubenswrapper[5094]: I0220 08:47:00.820616 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" event={"ID":"4909c4ac-65fa-412c-990d-974868b0f104","Type":"ContainerDied","Data":"f5767cd62a5a9e26fc88ffbe25eb74c9c4932ee6d1de8eb39356b77614dedec0"} Feb 20 08:47:00 crc kubenswrapper[5094]: I0220 08:47:00.820631 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" 
event={"ID":"4909c4ac-65fa-412c-990d-974868b0f104","Type":"ContainerStarted","Data":"8084f28745ebf13a7935e0af610ee153d1789476c3995420facfd289029eaab4"} Feb 20 08:47:01 crc kubenswrapper[5094]: I0220 08:47:01.830799 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" event={"ID":"4909c4ac-65fa-412c-990d-974868b0f104","Type":"ContainerStarted","Data":"08c36e9a1d8ff13d6dc18d5d5c0ee6433ecb5624744ee3a3486155201c35db6a"} Feb 20 08:47:01 crc kubenswrapper[5094]: I0220 08:47:01.831399 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:47:01 crc kubenswrapper[5094]: I0220 08:47:01.834555 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a7799fdb-4c3c-4792-be6d-f988852a6dad","Type":"ContainerStarted","Data":"c6cbdccc1408b1467e78e30844e9e20cc8ecf13baefc42e4d8d177a2ad9ea513"} Feb 20 08:47:01 crc kubenswrapper[5094]: I0220 08:47:01.834607 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a7799fdb-4c3c-4792-be6d-f988852a6dad","Type":"ContainerStarted","Data":"ac9a8a209c41aa38d1afaa5649246b4c179ca53d98b2ce40bde25654f0c08e0a"} Feb 20 08:47:01 crc kubenswrapper[5094]: I0220 08:47:01.834737 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 20 08:47:01 crc kubenswrapper[5094]: I0220 08:47:01.876342 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" podStartSLOduration=2.876318352 podStartE2EDuration="2.876318352s" podCreationTimestamp="2026-02-20 08:46:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:47:01.854278282 +0000 UTC m=+7236.726905003" watchObservedRunningTime="2026-02-20 08:47:01.876318352 +0000 UTC m=+7236.748945073" Feb 20 08:47:01 crc 
kubenswrapper[5094]: I0220 08:47:01.886355 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.886333153 podStartE2EDuration="2.886333153s" podCreationTimestamp="2026-02-20 08:46:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:47:01.871598008 +0000 UTC m=+7236.744224719" watchObservedRunningTime="2026-02-20 08:47:01.886333153 +0000 UTC m=+7236.758959854" Feb 20 08:47:03 crc kubenswrapper[5094]: I0220 08:47:03.839879 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:47:03 crc kubenswrapper[5094]: E0220 08:47:03.840386 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:47:09 crc kubenswrapper[5094]: I0220 08:47:09.700041 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:47:09 crc kubenswrapper[5094]: I0220 08:47:09.803119 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84fd48bb75-4njkr"] Feb 20 08:47:09 crc kubenswrapper[5094]: I0220 08:47:09.803468 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" podUID="4b88f14c-752a-4565-848b-8fb7820295db" containerName="dnsmasq-dns" containerID="cri-o://d12039d2f80d89c47082ff12d98c2303384b5a9706d174263440ba803b4d8bd8" gracePeriod=10 Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.338873 5094 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.368264 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-ovsdbserver-sb\") pod \"4b88f14c-752a-4565-848b-8fb7820295db\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.369310 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp8v7\" (UniqueName: \"kubernetes.io/projected/4b88f14c-752a-4565-848b-8fb7820295db-kube-api-access-lp8v7\") pod \"4b88f14c-752a-4565-848b-8fb7820295db\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.369397 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-dns-svc\") pod \"4b88f14c-752a-4565-848b-8fb7820295db\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.369434 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-config\") pod \"4b88f14c-752a-4565-848b-8fb7820295db\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.369484 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-ovsdbserver-nb\") pod \"4b88f14c-752a-4565-848b-8fb7820295db\" (UID: \"4b88f14c-752a-4565-848b-8fb7820295db\") " Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.374868 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/4b88f14c-752a-4565-848b-8fb7820295db-kube-api-access-lp8v7" (OuterVolumeSpecName: "kube-api-access-lp8v7") pod "4b88f14c-752a-4565-848b-8fb7820295db" (UID: "4b88f14c-752a-4565-848b-8fb7820295db"). InnerVolumeSpecName "kube-api-access-lp8v7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.420971 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-config" (OuterVolumeSpecName: "config") pod "4b88f14c-752a-4565-848b-8fb7820295db" (UID: "4b88f14c-752a-4565-848b-8fb7820295db"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.442469 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4b88f14c-752a-4565-848b-8fb7820295db" (UID: "4b88f14c-752a-4565-848b-8fb7820295db"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.456198 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4b88f14c-752a-4565-848b-8fb7820295db" (UID: "4b88f14c-752a-4565-848b-8fb7820295db"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.459361 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4b88f14c-752a-4565-848b-8fb7820295db" (UID: "4b88f14c-752a-4565-848b-8fb7820295db"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.471544 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.471571 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lp8v7\" (UniqueName: \"kubernetes.io/projected/4b88f14c-752a-4565-848b-8fb7820295db-kube-api-access-lp8v7\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.471582 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.471595 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-config\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.471603 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4b88f14c-752a-4565-848b-8fb7820295db-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.536635 5094 generic.go:334] "Generic (PLEG): container finished" podID="4b88f14c-752a-4565-848b-8fb7820295db" containerID="d12039d2f80d89c47082ff12d98c2303384b5a9706d174263440ba803b4d8bd8" exitCode=0 Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.536673 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" event={"ID":"4b88f14c-752a-4565-848b-8fb7820295db","Type":"ContainerDied","Data":"d12039d2f80d89c47082ff12d98c2303384b5a9706d174263440ba803b4d8bd8"} Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 
08:47:10.536699 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" event={"ID":"4b88f14c-752a-4565-848b-8fb7820295db","Type":"ContainerDied","Data":"1c3f648da5272f28b64986994942baf398d516fa729e7f58160c062950c2a99e"} Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.536730 5094 scope.go:117] "RemoveContainer" containerID="d12039d2f80d89c47082ff12d98c2303384b5a9706d174263440ba803b4d8bd8" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.536837 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84fd48bb75-4njkr" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.570072 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84fd48bb75-4njkr"] Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.573345 5094 scope.go:117] "RemoveContainer" containerID="1841a2e465ce1712f2969881d7f130fad139643616a8d74914bfde959d8674dd" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.583301 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84fd48bb75-4njkr"] Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.617694 5094 scope.go:117] "RemoveContainer" containerID="d12039d2f80d89c47082ff12d98c2303384b5a9706d174263440ba803b4d8bd8" Feb 20 08:47:10 crc kubenswrapper[5094]: E0220 08:47:10.618172 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d12039d2f80d89c47082ff12d98c2303384b5a9706d174263440ba803b4d8bd8\": container with ID starting with d12039d2f80d89c47082ff12d98c2303384b5a9706d174263440ba803b4d8bd8 not found: ID does not exist" containerID="d12039d2f80d89c47082ff12d98c2303384b5a9706d174263440ba803b4d8bd8" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.618223 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d12039d2f80d89c47082ff12d98c2303384b5a9706d174263440ba803b4d8bd8"} err="failed to get container status \"d12039d2f80d89c47082ff12d98c2303384b5a9706d174263440ba803b4d8bd8\": rpc error: code = NotFound desc = could not find container \"d12039d2f80d89c47082ff12d98c2303384b5a9706d174263440ba803b4d8bd8\": container with ID starting with d12039d2f80d89c47082ff12d98c2303384b5a9706d174263440ba803b4d8bd8 not found: ID does not exist" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.618257 5094 scope.go:117] "RemoveContainer" containerID="1841a2e465ce1712f2969881d7f130fad139643616a8d74914bfde959d8674dd" Feb 20 08:47:10 crc kubenswrapper[5094]: E0220 08:47:10.619076 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1841a2e465ce1712f2969881d7f130fad139643616a8d74914bfde959d8674dd\": container with ID starting with 1841a2e465ce1712f2969881d7f130fad139643616a8d74914bfde959d8674dd not found: ID does not exist" containerID="1841a2e465ce1712f2969881d7f130fad139643616a8d74914bfde959d8674dd" Feb 20 08:47:10 crc kubenswrapper[5094]: I0220 08:47:10.619114 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1841a2e465ce1712f2969881d7f130fad139643616a8d74914bfde959d8674dd"} err="failed to get container status \"1841a2e465ce1712f2969881d7f130fad139643616a8d74914bfde959d8674dd\": rpc error: code = NotFound desc = could not find container \"1841a2e465ce1712f2969881d7f130fad139643616a8d74914bfde959d8674dd\": container with ID starting with 1841a2e465ce1712f2969881d7f130fad139643616a8d74914bfde959d8674dd not found: ID does not exist" Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.032004 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.032282 5094 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-scheduler-0" podUID="008eca6a-12d6-40dd-96bb-391428bd27c5" containerName="nova-scheduler-scheduler" containerID="cri-o://a9a170791955d3710a5e3621cf8bb57829fa932fb69e487f09a56fe32269f824" gracePeriod=30 Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.049191 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.049472 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" containerName="nova-metadata-log" containerID="cri-o://98e3c706b02a59b45b5bb55556cb058774353d206688f9292aefcfb6aeee1c49" gracePeriod=30 Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.049569 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" containerName="nova-metadata-metadata" containerID="cri-o://ba11d9d7cdbecb4617edeb253e7344dc95bff86a042703cc75a176732533812d" gracePeriod=30 Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.062492 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.062755 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" containerName="nova-api-log" containerID="cri-o://ebc900f8a66f90ec54e3f832c463c60f80a44df3cab9a5a642e1198f63755d0f" gracePeriod=30 Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.062793 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" containerName="nova-api-api" containerID="cri-o://c133aaff7af6bc08d5607f7f1072c13d9db525ce16f2e919cdf821ceb99af810" gracePeriod=30 Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.076160 5094 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.076392 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="1b2421e1-8243-473f-8dd5-86bc130d251f" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://df33dc2485d8aea35d1d004dfc1ba0cd8d779ec59ff0b6fd5dcd53a2dfe85422" gracePeriod=30 Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.087403 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.087636 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="4bd3b441-92b9-4fd4-8451-dec1c354915e" containerName="nova-cell1-conductor-conductor" containerID="cri-o://43a32182f09efed60387dd69fdb95acc9d172d5439401fc1ab4dd9e997f9483d" gracePeriod=30 Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.122980 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.123170 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="feb6d550-160b-45ff-a4e3-f12bf0e0a4ca" containerName="nova-cell0-conductor-conductor" containerID="cri-o://e554f01421ff77c0561f5176cdeba2c223172b1b97e15c34de574fa303143b58" gracePeriod=30 Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.548515 5094 generic.go:334] "Generic (PLEG): container finished" podID="1b2421e1-8243-473f-8dd5-86bc130d251f" containerID="df33dc2485d8aea35d1d004dfc1ba0cd8d779ec59ff0b6fd5dcd53a2dfe85422" exitCode=0 Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.548564 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"1b2421e1-8243-473f-8dd5-86bc130d251f","Type":"ContainerDied","Data":"df33dc2485d8aea35d1d004dfc1ba0cd8d779ec59ff0b6fd5dcd53a2dfe85422"} Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.551360 5094 generic.go:334] "Generic (PLEG): container finished" podID="ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" containerID="ebc900f8a66f90ec54e3f832c463c60f80a44df3cab9a5a642e1198f63755d0f" exitCode=143 Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.551391 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd","Type":"ContainerDied","Data":"ebc900f8a66f90ec54e3f832c463c60f80a44df3cab9a5a642e1198f63755d0f"} Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.553040 5094 generic.go:334] "Generic (PLEG): container finished" podID="0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" containerID="98e3c706b02a59b45b5bb55556cb058774353d206688f9292aefcfb6aeee1c49" exitCode=143 Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.553103 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8","Type":"ContainerDied","Data":"98e3c706b02a59b45b5bb55556cb058774353d206688f9292aefcfb6aeee1c49"} Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.772194 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.791198 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh7fd\" (UniqueName: \"kubernetes.io/projected/1b2421e1-8243-473f-8dd5-86bc130d251f-kube-api-access-kh7fd\") pod \"1b2421e1-8243-473f-8dd5-86bc130d251f\" (UID: \"1b2421e1-8243-473f-8dd5-86bc130d251f\") " Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.791419 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b2421e1-8243-473f-8dd5-86bc130d251f-combined-ca-bundle\") pod \"1b2421e1-8243-473f-8dd5-86bc130d251f\" (UID: \"1b2421e1-8243-473f-8dd5-86bc130d251f\") " Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.791480 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b2421e1-8243-473f-8dd5-86bc130d251f-config-data\") pod \"1b2421e1-8243-473f-8dd5-86bc130d251f\" (UID: \"1b2421e1-8243-473f-8dd5-86bc130d251f\") " Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.796750 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b2421e1-8243-473f-8dd5-86bc130d251f-kube-api-access-kh7fd" (OuterVolumeSpecName: "kube-api-access-kh7fd") pod "1b2421e1-8243-473f-8dd5-86bc130d251f" (UID: "1b2421e1-8243-473f-8dd5-86bc130d251f"). InnerVolumeSpecName "kube-api-access-kh7fd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.818257 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b2421e1-8243-473f-8dd5-86bc130d251f-config-data" (OuterVolumeSpecName: "config-data") pod "1b2421e1-8243-473f-8dd5-86bc130d251f" (UID: "1b2421e1-8243-473f-8dd5-86bc130d251f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.832213 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b2421e1-8243-473f-8dd5-86bc130d251f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b2421e1-8243-473f-8dd5-86bc130d251f" (UID: "1b2421e1-8243-473f-8dd5-86bc130d251f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.862331 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b88f14c-752a-4565-848b-8fb7820295db" path="/var/lib/kubelet/pods/4b88f14c-752a-4565-848b-8fb7820295db/volumes" Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.893759 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b2421e1-8243-473f-8dd5-86bc130d251f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.893798 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b2421e1-8243-473f-8dd5-86bc130d251f-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:11 crc kubenswrapper[5094]: I0220 08:47:11.893812 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kh7fd\" (UniqueName: \"kubernetes.io/projected/1b2421e1-8243-473f-8dd5-86bc130d251f-kube-api-access-kh7fd\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.010829 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 20 08:47:12 crc kubenswrapper[5094]: E0220 08:47:12.231558 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="43a32182f09efed60387dd69fdb95acc9d172d5439401fc1ab4dd9e997f9483d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 20 08:47:12 crc kubenswrapper[5094]: E0220 08:47:12.232685 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="43a32182f09efed60387dd69fdb95acc9d172d5439401fc1ab4dd9e997f9483d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 20 08:47:12 crc kubenswrapper[5094]: E0220 08:47:12.236411 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="43a32182f09efed60387dd69fdb95acc9d172d5439401fc1ab4dd9e997f9483d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Feb 20 08:47:12 crc kubenswrapper[5094]: E0220 08:47:12.236451 5094 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="4bd3b441-92b9-4fd4-8451-dec1c354915e" containerName="nova-cell1-conductor-conductor" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.565105 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"1b2421e1-8243-473f-8dd5-86bc130d251f","Type":"ContainerDied","Data":"3235e0f9a591cd9fc84d38dd78cd295e9068409f0e216b0a40d76217b8e522fa"} Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.565357 5094 scope.go:117] "RemoveContainer" containerID="df33dc2485d8aea35d1d004dfc1ba0cd8d779ec59ff0b6fd5dcd53a2dfe85422" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.565465 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.573640 5094 generic.go:334] "Generic (PLEG): container finished" podID="feb6d550-160b-45ff-a4e3-f12bf0e0a4ca" containerID="e554f01421ff77c0561f5176cdeba2c223172b1b97e15c34de574fa303143b58" exitCode=0 Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.573874 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca","Type":"ContainerDied","Data":"e554f01421ff77c0561f5176cdeba2c223172b1b97e15c34de574fa303143b58"} Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.610667 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.627692 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.656900 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 08:47:12 crc kubenswrapper[5094]: E0220 08:47:12.657621 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b2421e1-8243-473f-8dd5-86bc130d251f" containerName="nova-cell1-novncproxy-novncproxy" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.657638 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b2421e1-8243-473f-8dd5-86bc130d251f" containerName="nova-cell1-novncproxy-novncproxy" Feb 20 08:47:12 crc kubenswrapper[5094]: E0220 08:47:12.657657 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b88f14c-752a-4565-848b-8fb7820295db" containerName="dnsmasq-dns" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.657665 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b88f14c-752a-4565-848b-8fb7820295db" containerName="dnsmasq-dns" Feb 20 08:47:12 crc kubenswrapper[5094]: E0220 08:47:12.657677 5094 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b88f14c-752a-4565-848b-8fb7820295db" containerName="init" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.657682 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b88f14c-752a-4565-848b-8fb7820295db" containerName="init" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.657963 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b88f14c-752a-4565-848b-8fb7820295db" containerName="dnsmasq-dns" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.657977 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b2421e1-8243-473f-8dd5-86bc130d251f" containerName="nova-cell1-novncproxy-novncproxy" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.658630 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.661669 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.668515 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.708836 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.708886 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hschv\" (UniqueName: \"kubernetes.io/projected/deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb-kube-api-access-hschv\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.708961 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.810839 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.811069 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hschv\" (UniqueName: \"kubernetes.io/projected/deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb-kube-api-access-hschv\") pod \"nova-cell1-novncproxy-0\" (UID: \"deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.811672 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.816469 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:12 
crc kubenswrapper[5094]: I0220 08:47:12.820069 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.835242 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hschv\" (UniqueName: \"kubernetes.io/projected/deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb-kube-api-access-hschv\") pod \"nova-cell1-novncproxy-0\" (UID: \"deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb\") " pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.917186 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 08:47:12 crc kubenswrapper[5094]: I0220 08:47:12.977714 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.015745 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-combined-ca-bundle\") pod \"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca\" (UID: \"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca\") " Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.016221 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c49n4\" (UniqueName: \"kubernetes.io/projected/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-kube-api-access-c49n4\") pod \"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca\" (UID: \"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca\") " Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.016482 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-config-data\") pod \"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca\" (UID: \"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca\") " Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.023117 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-kube-api-access-c49n4" (OuterVolumeSpecName: "kube-api-access-c49n4") pod "feb6d550-160b-45ff-a4e3-f12bf0e0a4ca" (UID: "feb6d550-160b-45ff-a4e3-f12bf0e0a4ca"). InnerVolumeSpecName "kube-api-access-c49n4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.044199 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "feb6d550-160b-45ff-a4e3-f12bf0e0a4ca" (UID: "feb6d550-160b-45ff-a4e3-f12bf0e0a4ca"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.057344 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-config-data" (OuterVolumeSpecName: "config-data") pod "feb6d550-160b-45ff-a4e3-f12bf0e0a4ca" (UID: "feb6d550-160b-45ff-a4e3-f12bf0e0a4ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.119223 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.119253 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.119266 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c49n4\" (UniqueName: \"kubernetes.io/projected/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca-kube-api-access-c49n4\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.229603 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.586116 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.586109 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"feb6d550-160b-45ff-a4e3-f12bf0e0a4ca","Type":"ContainerDied","Data":"ed4718ea392d6b6de8bbaf21e72aac2f90b8c262b455dfd4b484e4049f29e229"} Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.586619 5094 scope.go:117] "RemoveContainer" containerID="e554f01421ff77c0561f5176cdeba2c223172b1b97e15c34de574fa303143b58" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.590018 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb","Type":"ContainerStarted","Data":"c0b59c2a463b548710f6fbe279ff8a8274457e652db869cc9b9e3369bcd64626"} Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.590065 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb","Type":"ContainerStarted","Data":"b0244251df9f7a574b074e23929a9ed957444491d4ff2d0dcfbdab8da31068a8"} Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.613954 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.613931173 podStartE2EDuration="1.613931173s" podCreationTimestamp="2026-02-20 08:47:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:47:13.613101613 +0000 UTC m=+7248.485728314" watchObservedRunningTime="2026-02-20 08:47:13.613931173 +0000 UTC m=+7248.486557894" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.651508 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.661803 5094 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.674635 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 08:47:13 crc kubenswrapper[5094]: E0220 08:47:13.675112 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feb6d550-160b-45ff-a4e3-f12bf0e0a4ca" containerName="nova-cell0-conductor-conductor" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.675132 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="feb6d550-160b-45ff-a4e3-f12bf0e0a4ca" containerName="nova-cell0-conductor-conductor" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.675356 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="feb6d550-160b-45ff-a4e3-f12bf0e0a4ca" containerName="nova-cell0-conductor-conductor" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.676262 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.681348 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.684503 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.731564 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk747\" (UniqueName: \"kubernetes.io/projected/02305b70-64d3-46af-876a-f81d73f83cbf-kube-api-access-wk747\") pod \"nova-cell0-conductor-0\" (UID: \"02305b70-64d3-46af-876a-f81d73f83cbf\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.731767 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/02305b70-64d3-46af-876a-f81d73f83cbf-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"02305b70-64d3-46af-876a-f81d73f83cbf\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.731944 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02305b70-64d3-46af-876a-f81d73f83cbf-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"02305b70-64d3-46af-876a-f81d73f83cbf\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.834848 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02305b70-64d3-46af-876a-f81d73f83cbf-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"02305b70-64d3-46af-876a-f81d73f83cbf\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.835016 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk747\" (UniqueName: \"kubernetes.io/projected/02305b70-64d3-46af-876a-f81d73f83cbf-kube-api-access-wk747\") pod \"nova-cell0-conductor-0\" (UID: \"02305b70-64d3-46af-876a-f81d73f83cbf\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.835138 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02305b70-64d3-46af-876a-f81d73f83cbf-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"02305b70-64d3-46af-876a-f81d73f83cbf\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.840314 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02305b70-64d3-46af-876a-f81d73f83cbf-combined-ca-bundle\") pod 
\"nova-cell0-conductor-0\" (UID: \"02305b70-64d3-46af-876a-f81d73f83cbf\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.841236 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02305b70-64d3-46af-876a-f81d73f83cbf-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"02305b70-64d3-46af-876a-f81d73f83cbf\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.853239 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b2421e1-8243-473f-8dd5-86bc130d251f" path="/var/lib/kubelet/pods/1b2421e1-8243-473f-8dd5-86bc130d251f/volumes" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.853804 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feb6d550-160b-45ff-a4e3-f12bf0e0a4ca" path="/var/lib/kubelet/pods/feb6d550-160b-45ff-a4e3-f12bf0e0a4ca/volumes" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.854795 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk747\" (UniqueName: \"kubernetes.io/projected/02305b70-64d3-46af-876a-f81d73f83cbf-kube-api-access-wk747\") pod \"nova-cell0-conductor-0\" (UID: \"02305b70-64d3-46af-876a-f81d73f83cbf\") " pod="openstack/nova-cell0-conductor-0" Feb 20 08:47:13 crc kubenswrapper[5094]: I0220 08:47:13.998275 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.460973 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.552357 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svnb9\" (UniqueName: \"kubernetes.io/projected/008eca6a-12d6-40dd-96bb-391428bd27c5-kube-api-access-svnb9\") pod \"008eca6a-12d6-40dd-96bb-391428bd27c5\" (UID: \"008eca6a-12d6-40dd-96bb-391428bd27c5\") " Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.552553 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/008eca6a-12d6-40dd-96bb-391428bd27c5-config-data\") pod \"008eca6a-12d6-40dd-96bb-391428bd27c5\" (UID: \"008eca6a-12d6-40dd-96bb-391428bd27c5\") " Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.552658 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/008eca6a-12d6-40dd-96bb-391428bd27c5-combined-ca-bundle\") pod \"008eca6a-12d6-40dd-96bb-391428bd27c5\" (UID: \"008eca6a-12d6-40dd-96bb-391428bd27c5\") " Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.559845 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/008eca6a-12d6-40dd-96bb-391428bd27c5-kube-api-access-svnb9" (OuterVolumeSpecName: "kube-api-access-svnb9") pod "008eca6a-12d6-40dd-96bb-391428bd27c5" (UID: "008eca6a-12d6-40dd-96bb-391428bd27c5"). InnerVolumeSpecName "kube-api-access-svnb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.583425 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/008eca6a-12d6-40dd-96bb-391428bd27c5-config-data" (OuterVolumeSpecName: "config-data") pod "008eca6a-12d6-40dd-96bb-391428bd27c5" (UID: "008eca6a-12d6-40dd-96bb-391428bd27c5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.586803 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/008eca6a-12d6-40dd-96bb-391428bd27c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "008eca6a-12d6-40dd-96bb-391428bd27c5" (UID: "008eca6a-12d6-40dd-96bb-391428bd27c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.604168 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.609787 5094 generic.go:334] "Generic (PLEG): container finished" podID="0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" containerID="ba11d9d7cdbecb4617edeb253e7344dc95bff86a042703cc75a176732533812d" exitCode=0 Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.609956 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8","Type":"ContainerDied","Data":"ba11d9d7cdbecb4617edeb253e7344dc95bff86a042703cc75a176732533812d"} Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.649150 5094 generic.go:334] "Generic (PLEG): container finished" podID="ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" containerID="c133aaff7af6bc08d5607f7f1072c13d9db525ce16f2e919cdf821ceb99af810" exitCode=0 Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.649235 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd","Type":"ContainerDied","Data":"c133aaff7af6bc08d5607f7f1072c13d9db525ce16f2e919cdf821ceb99af810"} Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.651504 5094 generic.go:334] "Generic (PLEG): container finished" podID="008eca6a-12d6-40dd-96bb-391428bd27c5" 
containerID="a9a170791955d3710a5e3621cf8bb57829fa932fb69e487f09a56fe32269f824" exitCode=0 Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.651864 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.652798 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"008eca6a-12d6-40dd-96bb-391428bd27c5","Type":"ContainerDied","Data":"a9a170791955d3710a5e3621cf8bb57829fa932fb69e487f09a56fe32269f824"} Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.652838 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"008eca6a-12d6-40dd-96bb-391428bd27c5","Type":"ContainerDied","Data":"0e61e77c80dfdbd6b142c3a986f85177ae881def4dc50810f9959c9b8afee96d"} Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.652865 5094 scope.go:117] "RemoveContainer" containerID="a9a170791955d3710a5e3621cf8bb57829fa932fb69e487f09a56fe32269f824" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.654986 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/008eca6a-12d6-40dd-96bb-391428bd27c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.655031 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svnb9\" (UniqueName: \"kubernetes.io/projected/008eca6a-12d6-40dd-96bb-391428bd27c5-kube-api-access-svnb9\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.655044 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/008eca6a-12d6-40dd-96bb-391428bd27c5-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.714307 5094 scope.go:117] "RemoveContainer" 
containerID="a9a170791955d3710a5e3621cf8bb57829fa932fb69e487f09a56fe32269f824" Feb 20 08:47:14 crc kubenswrapper[5094]: E0220 08:47:14.715358 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9a170791955d3710a5e3621cf8bb57829fa932fb69e487f09a56fe32269f824\": container with ID starting with a9a170791955d3710a5e3621cf8bb57829fa932fb69e487f09a56fe32269f824 not found: ID does not exist" containerID="a9a170791955d3710a5e3621cf8bb57829fa932fb69e487f09a56fe32269f824" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.715523 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9a170791955d3710a5e3621cf8bb57829fa932fb69e487f09a56fe32269f824"} err="failed to get container status \"a9a170791955d3710a5e3621cf8bb57829fa932fb69e487f09a56fe32269f824\": rpc error: code = NotFound desc = could not find container \"a9a170791955d3710a5e3621cf8bb57829fa932fb69e487f09a56fe32269f824\": container with ID starting with a9a170791955d3710a5e3621cf8bb57829fa932fb69e487f09a56fe32269f824 not found: ID does not exist" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.731917 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.746237 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.753832 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 08:47:14 crc kubenswrapper[5094]: E0220 08:47:14.754359 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="008eca6a-12d6-40dd-96bb-391428bd27c5" containerName="nova-scheduler-scheduler" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.754386 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="008eca6a-12d6-40dd-96bb-391428bd27c5" 
containerName="nova-scheduler-scheduler" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.754748 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="008eca6a-12d6-40dd-96bb-391428bd27c5" containerName="nova-scheduler-scheduler" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.756873 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.761540 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.769787 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.861765 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.873191 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.965773 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-combined-ca-bundle\") pod \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\" (UID: \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.965974 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-logs\") pod \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\" (UID: \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.966047 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttjnw\" (UniqueName: \"kubernetes.io/projected/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-kube-api-access-ttjnw\") pod \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\" (UID: \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.966107 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-config-data\") pod \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\" (UID: \"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd\") " Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.966619 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7kqr\" (UniqueName: \"kubernetes.io/projected/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-kube-api-access-t7kqr\") pod \"nova-scheduler-0\" (UID: \"80d2f807-a13f-4a1d-93d3-293d1afd6e4c\") " pod="openstack/nova-scheduler-0" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.966685 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"80d2f807-a13f-4a1d-93d3-293d1afd6e4c\") " pod="openstack/nova-scheduler-0" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.966771 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-config-data\") pod \"nova-scheduler-0\" (UID: \"80d2f807-a13f-4a1d-93d3-293d1afd6e4c\") " pod="openstack/nova-scheduler-0" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.969000 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-logs" (OuterVolumeSpecName: "logs") pod "ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" (UID: "ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:47:14 crc kubenswrapper[5094]: I0220 08:47:14.975189 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-kube-api-access-ttjnw" (OuterVolumeSpecName: "kube-api-access-ttjnw") pod "ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" (UID: "ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd"). InnerVolumeSpecName "kube-api-access-ttjnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.009654 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-config-data" (OuterVolumeSpecName: "config-data") pod "ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" (UID: "ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.024088 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" (UID: "ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.069338 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-combined-ca-bundle\") pod \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.069434 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-logs\") pod \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.069496 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-config-data\") pod \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.069573 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d7zv\" (UniqueName: \"kubernetes.io/projected/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-kube-api-access-8d7zv\") pod \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\" (UID: \"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8\") " Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.069886 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"80d2f807-a13f-4a1d-93d3-293d1afd6e4c\") " pod="openstack/nova-scheduler-0" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.069962 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-config-data\") pod \"nova-scheduler-0\" (UID: \"80d2f807-a13f-4a1d-93d3-293d1afd6e4c\") " pod="openstack/nova-scheduler-0" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.070084 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7kqr\" (UniqueName: \"kubernetes.io/projected/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-kube-api-access-t7kqr\") pod \"nova-scheduler-0\" (UID: \"80d2f807-a13f-4a1d-93d3-293d1afd6e4c\") " pod="openstack/nova-scheduler-0" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.071421 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.071732 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-logs\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.071756 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttjnw\" (UniqueName: \"kubernetes.io/projected/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-kube-api-access-ttjnw\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.071772 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.075556 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-logs" (OuterVolumeSpecName: "logs") pod "0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" (UID: "0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.078589 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"80d2f807-a13f-4a1d-93d3-293d1afd6e4c\") " pod="openstack/nova-scheduler-0" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.081312 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-kube-api-access-8d7zv" (OuterVolumeSpecName: "kube-api-access-8d7zv") pod "0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" (UID: "0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8"). InnerVolumeSpecName "kube-api-access-8d7zv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.082611 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-config-data\") pod \"nova-scheduler-0\" (UID: \"80d2f807-a13f-4a1d-93d3-293d1afd6e4c\") " pod="openstack/nova-scheduler-0" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.088737 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7kqr\" (UniqueName: \"kubernetes.io/projected/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-kube-api-access-t7kqr\") pod \"nova-scheduler-0\" (UID: \"80d2f807-a13f-4a1d-93d3-293d1afd6e4c\") " pod="openstack/nova-scheduler-0" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.095982 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" (UID: "0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.116934 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-config-data" (OuterVolumeSpecName: "config-data") pod "0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" (UID: "0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.140456 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.174325 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d7zv\" (UniqueName: \"kubernetes.io/projected/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-kube-api-access-8d7zv\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.174365 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.174378 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-logs\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.174390 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:15 crc kubenswrapper[5094]: W0220 08:47:15.642047 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80d2f807_a13f_4a1d_93d3_293d1afd6e4c.slice/crio-364b3455721cb71cf6727cbca688257580fa376c421f75064e6d61cba30218c8 WatchSource:0}: Error finding container 364b3455721cb71cf6727cbca688257580fa376c421f75064e6d61cba30218c8: Status 404 returned error can't find the container with id 364b3455721cb71cf6727cbca688257580fa376c421f75064e6d61cba30218c8 Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.657990 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.677577 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8","Type":"ContainerDied","Data":"1d94c5e4163bc96c9ddb5b16f82044a027cfbc06ea952da5d0ad20ac89c00a85"} Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.677640 5094 scope.go:117] "RemoveContainer" containerID="ba11d9d7cdbecb4617edeb253e7344dc95bff86a042703cc75a176732533812d" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.677823 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.698781 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.700245 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd","Type":"ContainerDied","Data":"33ea9e800121c631e21658cce4961167b1aecb67f072d8888e0fe3827be668d0"} Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.711167 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"02305b70-64d3-46af-876a-f81d73f83cbf","Type":"ContainerStarted","Data":"fec92b2a8da1e2997bd3d8a328f8f1e86063ef10d1fd953772b4ae368e9d524b"} Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.711210 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"02305b70-64d3-46af-876a-f81d73f83cbf","Type":"ContainerStarted","Data":"b27e92493a93d647968058e4cfe443d6348c867ece948ce72e75c01521bbc434"} Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.711538 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.736371 5094 scope.go:117] "RemoveContainer" containerID="98e3c706b02a59b45b5bb55556cb058774353d206688f9292aefcfb6aeee1c49" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 
08:47:15.739321 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"80d2f807-a13f-4a1d-93d3-293d1afd6e4c","Type":"ContainerStarted","Data":"364b3455721cb71cf6727cbca688257580fa376c421f75064e6d61cba30218c8"} Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.754920 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.754891847 podStartE2EDuration="2.754891847s" podCreationTimestamp="2026-02-20 08:47:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:47:15.735953871 +0000 UTC m=+7250.608580582" watchObservedRunningTime="2026-02-20 08:47:15.754891847 +0000 UTC m=+7250.627518568" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.789919 5094 scope.go:117] "RemoveContainer" containerID="c133aaff7af6bc08d5607f7f1072c13d9db525ce16f2e919cdf821ceb99af810" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.794920 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.821475 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.825346 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.838872 5094 scope.go:117] "RemoveContainer" containerID="ebc900f8a66f90ec54e3f832c463c60f80a44df3cab9a5a642e1198f63755d0f" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.839752 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:47:15 crc kubenswrapper[5094]: E0220 08:47:15.839996 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.879982 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="008eca6a-12d6-40dd-96bb-391428bd27c5" path="/var/lib/kubelet/pods/008eca6a-12d6-40dd-96bb-391428bd27c5/volumes" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.881108 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" path="/var/lib/kubelet/pods/ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd/volumes" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.881843 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 20 08:47:15 crc kubenswrapper[5094]: E0220 08:47:15.882219 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" containerName="nova-metadata-metadata" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.882236 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" containerName="nova-metadata-metadata" Feb 20 08:47:15 crc kubenswrapper[5094]: E0220 08:47:15.882260 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" containerName="nova-metadata-log" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.882267 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" containerName="nova-metadata-log" Feb 20 08:47:15 crc kubenswrapper[5094]: E0220 08:47:15.882281 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" containerName="nova-api-log" Feb 20 08:47:15 crc 
kubenswrapper[5094]: I0220 08:47:15.882287 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" containerName="nova-api-log" Feb 20 08:47:15 crc kubenswrapper[5094]: E0220 08:47:15.882316 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" containerName="nova-api-api" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.882322 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" containerName="nova-api-api" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.882510 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" containerName="nova-metadata-log" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.882529 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" containerName="nova-api-log" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.882540 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" containerName="nova-metadata-metadata" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.882550 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec1f6d12-8ffe-4cc2-9f1a-12084c61c7dd" containerName="nova-api-api" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.884217 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.884240 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.884346 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.887894 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.920804 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.922917 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.936951 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 20 08:47:15 crc kubenswrapper[5094]: I0220 08:47:15.939772 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.012739 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c1b5836-3f97-4ae2-a894-e42a72b29729-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") " pod="openstack/nova-api-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.012792 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c1b5836-3f97-4ae2-a894-e42a72b29729-logs\") pod \"nova-api-0\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") " pod="openstack/nova-api-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.012836 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltthv\" (UniqueName: \"kubernetes.io/projected/4c1b5836-3f97-4ae2-a894-e42a72b29729-kube-api-access-ltthv\") pod \"nova-api-0\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") " pod="openstack/nova-api-0" Feb 20 08:47:16 crc 
kubenswrapper[5094]: I0220 08:47:16.012888 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1323ed20-0605-4081-a36d-6fa8c40f26e6-config-data\") pod \"nova-metadata-0\" (UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") " pod="openstack/nova-metadata-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.012907 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c1b5836-3f97-4ae2-a894-e42a72b29729-config-data\") pod \"nova-api-0\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") " pod="openstack/nova-api-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.012969 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1323ed20-0605-4081-a36d-6fa8c40f26e6-logs\") pod \"nova-metadata-0\" (UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") " pod="openstack/nova-metadata-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.012986 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1323ed20-0605-4081-a36d-6fa8c40f26e6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") " pod="openstack/nova-metadata-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.013023 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcbxk\" (UniqueName: \"kubernetes.io/projected/1323ed20-0605-4081-a36d-6fa8c40f26e6-kube-api-access-dcbxk\") pod \"nova-metadata-0\" (UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") " pod="openstack/nova-metadata-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.115222 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1323ed20-0605-4081-a36d-6fa8c40f26e6-logs\") pod \"nova-metadata-0\" (UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") " pod="openstack/nova-metadata-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.115286 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1323ed20-0605-4081-a36d-6fa8c40f26e6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") " pod="openstack/nova-metadata-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.115357 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcbxk\" (UniqueName: \"kubernetes.io/projected/1323ed20-0605-4081-a36d-6fa8c40f26e6-kube-api-access-dcbxk\") pod \"nova-metadata-0\" (UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") " pod="openstack/nova-metadata-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.115408 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c1b5836-3f97-4ae2-a894-e42a72b29729-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") " pod="openstack/nova-api-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.115431 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c1b5836-3f97-4ae2-a894-e42a72b29729-logs\") pod \"nova-api-0\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") " pod="openstack/nova-api-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.115465 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltthv\" (UniqueName: \"kubernetes.io/projected/4c1b5836-3f97-4ae2-a894-e42a72b29729-kube-api-access-ltthv\") pod \"nova-api-0\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") " 
pod="openstack/nova-api-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.115510 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1323ed20-0605-4081-a36d-6fa8c40f26e6-config-data\") pod \"nova-metadata-0\" (UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") " pod="openstack/nova-metadata-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.115533 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c1b5836-3f97-4ae2-a894-e42a72b29729-config-data\") pod \"nova-api-0\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") " pod="openstack/nova-api-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.117295 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c1b5836-3f97-4ae2-a894-e42a72b29729-logs\") pod \"nova-api-0\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") " pod="openstack/nova-api-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.119321 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1323ed20-0605-4081-a36d-6fa8c40f26e6-logs\") pod \"nova-metadata-0\" (UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") " pod="openstack/nova-metadata-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.123341 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c1b5836-3f97-4ae2-a894-e42a72b29729-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") " pod="openstack/nova-api-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.124003 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1323ed20-0605-4081-a36d-6fa8c40f26e6-config-data\") pod \"nova-metadata-0\" 
(UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") " pod="openstack/nova-metadata-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.135678 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c1b5836-3f97-4ae2-a894-e42a72b29729-config-data\") pod \"nova-api-0\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") " pod="openstack/nova-api-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.142466 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1323ed20-0605-4081-a36d-6fa8c40f26e6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") " pod="openstack/nova-metadata-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.150155 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcbxk\" (UniqueName: \"kubernetes.io/projected/1323ed20-0605-4081-a36d-6fa8c40f26e6-kube-api-access-dcbxk\") pod \"nova-metadata-0\" (UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") " pod="openstack/nova-metadata-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.169770 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltthv\" (UniqueName: \"kubernetes.io/projected/4c1b5836-3f97-4ae2-a894-e42a72b29729-kube-api-access-ltthv\") pod \"nova-api-0\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") " pod="openstack/nova-api-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.238379 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.274605 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.782159 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"80d2f807-a13f-4a1d-93d3-293d1afd6e4c","Type":"ContainerStarted","Data":"10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f"} Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.792589 5094 generic.go:334] "Generic (PLEG): container finished" podID="4bd3b441-92b9-4fd4-8451-dec1c354915e" containerID="43a32182f09efed60387dd69fdb95acc9d172d5439401fc1ab4dd9e997f9483d" exitCode=0 Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.792681 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4bd3b441-92b9-4fd4-8451-dec1c354915e","Type":"ContainerDied","Data":"43a32182f09efed60387dd69fdb95acc9d172d5439401fc1ab4dd9e997f9483d"} Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.903370 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.921549 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.921528777 podStartE2EDuration="2.921528777s" podCreationTimestamp="2026-02-20 08:47:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:47:16.816853149 +0000 UTC m=+7251.689479860" watchObservedRunningTime="2026-02-20 08:47:16.921528777 +0000 UTC m=+7251.794155488" Feb 20 08:47:16 crc kubenswrapper[5094]: I0220 08:47:16.931177 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.038947 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mcnc\" (UniqueName: \"kubernetes.io/projected/4bd3b441-92b9-4fd4-8451-dec1c354915e-kube-api-access-7mcnc\") pod \"4bd3b441-92b9-4fd4-8451-dec1c354915e\" (UID: \"4bd3b441-92b9-4fd4-8451-dec1c354915e\") " Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.041175 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd3b441-92b9-4fd4-8451-dec1c354915e-config-data\") pod \"4bd3b441-92b9-4fd4-8451-dec1c354915e\" (UID: \"4bd3b441-92b9-4fd4-8451-dec1c354915e\") " Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.041396 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd3b441-92b9-4fd4-8451-dec1c354915e-combined-ca-bundle\") pod \"4bd3b441-92b9-4fd4-8451-dec1c354915e\" (UID: \"4bd3b441-92b9-4fd4-8451-dec1c354915e\") " Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.061338 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/4bd3b441-92b9-4fd4-8451-dec1c354915e-kube-api-access-7mcnc" (OuterVolumeSpecName: "kube-api-access-7mcnc") pod "4bd3b441-92b9-4fd4-8451-dec1c354915e" (UID: "4bd3b441-92b9-4fd4-8451-dec1c354915e"). InnerVolumeSpecName "kube-api-access-7mcnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.070665 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.080172 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bd3b441-92b9-4fd4-8451-dec1c354915e-config-data" (OuterVolumeSpecName: "config-data") pod "4bd3b441-92b9-4fd4-8451-dec1c354915e" (UID: "4bd3b441-92b9-4fd4-8451-dec1c354915e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.089574 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bd3b441-92b9-4fd4-8451-dec1c354915e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4bd3b441-92b9-4fd4-8451-dec1c354915e" (UID: "4bd3b441-92b9-4fd4-8451-dec1c354915e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.144627 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mcnc\" (UniqueName: \"kubernetes.io/projected/4bd3b441-92b9-4fd4-8451-dec1c354915e-kube-api-access-7mcnc\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.144683 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bd3b441-92b9-4fd4-8451-dec1c354915e-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.144698 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bd3b441-92b9-4fd4-8451-dec1c354915e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.822825 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c1b5836-3f97-4ae2-a894-e42a72b29729","Type":"ContainerStarted","Data":"c876bb36d6cd6351bc15c4941fbdd419137b0dffee9422b0eca06e2602e21509"} Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.823139 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c1b5836-3f97-4ae2-a894-e42a72b29729","Type":"ContainerStarted","Data":"39a13020f662d3e6609d4881367424f2f25adb683ad3729bfa3b75921443ae45"} Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.823149 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c1b5836-3f97-4ae2-a894-e42a72b29729","Type":"ContainerStarted","Data":"867e153f6129e6c09f4b4a68b08d0ac6938b5f39543d1e08a62fd3fdae93737c"} Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.825372 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"1323ed20-0605-4081-a36d-6fa8c40f26e6","Type":"ContainerStarted","Data":"25dc2efe543de2f343b425911821a1eda1c851282daa752a1b28d46a6d470381"} Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.825394 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1323ed20-0605-4081-a36d-6fa8c40f26e6","Type":"ContainerStarted","Data":"c29b5297edcb6838d15c351b78051941a64fc614165717a744eca6ab46c45320"} Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.825402 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1323ed20-0605-4081-a36d-6fa8c40f26e6","Type":"ContainerStarted","Data":"d585671e2fde2c389818c568ec8f701d1f0c341b00acbfaa339458d079916a62"} Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.827612 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.835226 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"4bd3b441-92b9-4fd4-8451-dec1c354915e","Type":"ContainerDied","Data":"05c39de3bac79bc7ca3ed4fb07b709e2d6f8ab1a2eb157f4f36bbc1df541309c"} Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.835334 5094 scope.go:117] "RemoveContainer" containerID="43a32182f09efed60387dd69fdb95acc9d172d5439401fc1ab4dd9e997f9483d" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.850222 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8" path="/var/lib/kubelet/pods/0bce5a1d-755b-4c0a-b9aa-1fce8cbfc8b8/volumes" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.872039 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.872016138 podStartE2EDuration="2.872016138s" podCreationTimestamp="2026-02-20 08:47:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:47:17.864821055 +0000 UTC m=+7252.737447766" watchObservedRunningTime="2026-02-20 08:47:17.872016138 +0000 UTC m=+7252.744642849" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.894100 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.894070819 podStartE2EDuration="2.894070819s" podCreationTimestamp="2026-02-20 08:47:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:47:17.890621696 +0000 UTC m=+7252.763248427" watchObservedRunningTime="2026-02-20 08:47:17.894070819 +0000 UTC m=+7252.766697540" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.925215 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.943691 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.954910 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 08:47:17 crc kubenswrapper[5094]: E0220 08:47:17.955338 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bd3b441-92b9-4fd4-8451-dec1c354915e" containerName="nova-cell1-conductor-conductor" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.955356 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bd3b441-92b9-4fd4-8451-dec1c354915e" containerName="nova-cell1-conductor-conductor" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.955552 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bd3b441-92b9-4fd4-8451-dec1c354915e" containerName="nova-cell1-conductor-conductor" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.956332 5094 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.959499 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.965883 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 08:47:17 crc kubenswrapper[5094]: I0220 08:47:17.978247 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:18 crc kubenswrapper[5094]: I0220 08:47:18.063023 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d2s8\" (UniqueName: \"kubernetes.io/projected/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-kube-api-access-2d2s8\") pod \"nova-cell1-conductor-0\" (UID: \"ec04fa38-0d41-4c78-99fd-56299cd1c5ac\") " pod="openstack/nova-cell1-conductor-0" Feb 20 08:47:18 crc kubenswrapper[5094]: I0220 08:47:18.063071 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ec04fa38-0d41-4c78-99fd-56299cd1c5ac\") " pod="openstack/nova-cell1-conductor-0" Feb 20 08:47:18 crc kubenswrapper[5094]: I0220 08:47:18.063146 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ec04fa38-0d41-4c78-99fd-56299cd1c5ac\") " pod="openstack/nova-cell1-conductor-0" Feb 20 08:47:18 crc kubenswrapper[5094]: I0220 08:47:18.166267 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d2s8\" (UniqueName: 
\"kubernetes.io/projected/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-kube-api-access-2d2s8\") pod \"nova-cell1-conductor-0\" (UID: \"ec04fa38-0d41-4c78-99fd-56299cd1c5ac\") " pod="openstack/nova-cell1-conductor-0" Feb 20 08:47:18 crc kubenswrapper[5094]: I0220 08:47:18.166751 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"ec04fa38-0d41-4c78-99fd-56299cd1c5ac\") " pod="openstack/nova-cell1-conductor-0" Feb 20 08:47:18 crc kubenswrapper[5094]: I0220 08:47:18.166966 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ec04fa38-0d41-4c78-99fd-56299cd1c5ac\") " pod="openstack/nova-cell1-conductor-0" Feb 20 08:47:18 crc kubenswrapper[5094]: I0220 08:47:18.175854 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"ec04fa38-0d41-4c78-99fd-56299cd1c5ac\") " pod="openstack/nova-cell1-conductor-0" Feb 20 08:47:18 crc kubenswrapper[5094]: I0220 08:47:18.185327 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d2s8\" (UniqueName: \"kubernetes.io/projected/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-kube-api-access-2d2s8\") pod \"nova-cell1-conductor-0\" (UID: \"ec04fa38-0d41-4c78-99fd-56299cd1c5ac\") " pod="openstack/nova-cell1-conductor-0" Feb 20 08:47:18 crc kubenswrapper[5094]: I0220 08:47:18.201223 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-config-data\") pod \"nova-cell1-conductor-0\" (UID: 
\"ec04fa38-0d41-4c78-99fd-56299cd1c5ac\") " pod="openstack/nova-cell1-conductor-0" Feb 20 08:47:18 crc kubenswrapper[5094]: I0220 08:47:18.286548 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 20 08:47:18 crc kubenswrapper[5094]: W0220 08:47:18.796940 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec04fa38_0d41_4c78_99fd_56299cd1c5ac.slice/crio-205874c2bd37603e15fe2e158ed7cb1e0b3a0637daa0a0e89a2d6fb0e765f899 WatchSource:0}: Error finding container 205874c2bd37603e15fe2e158ed7cb1e0b3a0637daa0a0e89a2d6fb0e765f899: Status 404 returned error can't find the container with id 205874c2bd37603e15fe2e158ed7cb1e0b3a0637daa0a0e89a2d6fb0e765f899 Feb 20 08:47:18 crc kubenswrapper[5094]: I0220 08:47:18.800692 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 08:47:18 crc kubenswrapper[5094]: I0220 08:47:18.841636 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ec04fa38-0d41-4c78-99fd-56299cd1c5ac","Type":"ContainerStarted","Data":"205874c2bd37603e15fe2e158ed7cb1e0b3a0637daa0a0e89a2d6fb0e765f899"} Feb 20 08:47:19 crc kubenswrapper[5094]: I0220 08:47:19.855855 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bd3b441-92b9-4fd4-8451-dec1c354915e" path="/var/lib/kubelet/pods/4bd3b441-92b9-4fd4-8451-dec1c354915e/volumes" Feb 20 08:47:19 crc kubenswrapper[5094]: I0220 08:47:19.857051 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ec04fa38-0d41-4c78-99fd-56299cd1c5ac","Type":"ContainerStarted","Data":"b0e35cbe8bf849fece9161e859f69976c6d3a8bd0f88046bd05e34f439cfb7f8"} Feb 20 08:47:19 crc kubenswrapper[5094]: I0220 08:47:19.857115 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" 
Feb 20 08:47:19 crc kubenswrapper[5094]: I0220 08:47:19.905579 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.905550778 podStartE2EDuration="2.905550778s" podCreationTimestamp="2026-02-20 08:47:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:47:19.875582138 +0000 UTC m=+7254.748208849" watchObservedRunningTime="2026-02-20 08:47:19.905550778 +0000 UTC m=+7254.778177489" Feb 20 08:47:20 crc kubenswrapper[5094]: I0220 08:47:20.141888 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 20 08:47:21 crc kubenswrapper[5094]: I0220 08:47:21.275489 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 08:47:21 crc kubenswrapper[5094]: I0220 08:47:21.275559 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 08:47:22 crc kubenswrapper[5094]: I0220 08:47:22.979142 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:23 crc kubenswrapper[5094]: I0220 08:47:23.002539 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:23 crc kubenswrapper[5094]: I0220 08:47:23.901162 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 20 08:47:24 crc kubenswrapper[5094]: I0220 08:47:24.035381 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 20 08:47:25 crc kubenswrapper[5094]: I0220 08:47:25.141556 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 20 08:47:25 crc kubenswrapper[5094]: I0220 
08:47:25.178344 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 20 08:47:25 crc kubenswrapper[5094]: I0220 08:47:25.939234 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 20 08:47:26 crc kubenswrapper[5094]: I0220 08:47:26.238992 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 08:47:26 crc kubenswrapper[5094]: I0220 08:47:26.239788 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 08:47:26 crc kubenswrapper[5094]: I0220 08:47:26.275814 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 20 08:47:26 crc kubenswrapper[5094]: I0220 08:47:26.275866 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 20 08:47:27 crc kubenswrapper[5094]: I0220 08:47:27.403079 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4c1b5836-3f97-4ae2-a894-e42a72b29729" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.86:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 08:47:27 crc kubenswrapper[5094]: I0220 08:47:27.403100 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1323ed20-0605-4081-a36d-6fa8c40f26e6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.87:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 08:47:27 crc kubenswrapper[5094]: I0220 08:47:27.403120 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4c1b5836-3f97-4ae2-a894-e42a72b29729" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.86:8774/\": context 
deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 08:47:27 crc kubenswrapper[5094]: I0220 08:47:27.403360 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="1323ed20-0605-4081-a36d-6fa8c40f26e6" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.87:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 08:47:27 crc kubenswrapper[5094]: I0220 08:47:27.840584 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:47:27 crc kubenswrapper[5094]: E0220 08:47:27.840831 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:47:28 crc kubenswrapper[5094]: I0220 08:47:28.325354 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.507719 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.509678 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.511779 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.533496 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.631518 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7cb209d2-d0d5-41b3-a452-ffe3fd846798-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.631560 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.631604 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-config-data\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.631828 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 
08:47:34.631919 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6zwv\" (UniqueName: \"kubernetes.io/projected/7cb209d2-d0d5-41b3-a452-ffe3fd846798-kube-api-access-k6zwv\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.631974 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-scripts\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.733520 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-config-data\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.733620 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.733658 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6zwv\" (UniqueName: \"kubernetes.io/projected/7cb209d2-d0d5-41b3-a452-ffe3fd846798-kube-api-access-k6zwv\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.733682 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-scripts\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.733757 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7cb209d2-d0d5-41b3-a452-ffe3fd846798-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.733963 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7cb209d2-d0d5-41b3-a452-ffe3fd846798-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.734496 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.738895 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.739614 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " 
pod="openstack/cinder-scheduler-0" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.740014 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-config-data\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.742545 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-scripts\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.757786 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6zwv\" (UniqueName: \"kubernetes.io/projected/7cb209d2-d0d5-41b3-a452-ffe3fd846798-kube-api-access-k6zwv\") pod \"cinder-scheduler-0\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:34 crc kubenswrapper[5094]: I0220 08:47:34.828979 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 08:47:35 crc kubenswrapper[5094]: I0220 08:47:35.301349 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 08:47:35 crc kubenswrapper[5094]: I0220 08:47:35.996005 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7cb209d2-d0d5-41b3-a452-ffe3fd846798","Type":"ContainerStarted","Data":"96e6715e90a0b7ca9a4f361deda20ad4a5f2e1b7bb2403e2af62f561294f2544"} Feb 20 08:47:36 crc kubenswrapper[5094]: I0220 08:47:36.248946 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 20 08:47:36 crc kubenswrapper[5094]: I0220 08:47:36.250975 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 20 08:47:36 crc kubenswrapper[5094]: I0220 08:47:36.260073 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 20 08:47:36 crc kubenswrapper[5094]: I0220 08:47:36.277202 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 20 08:47:36 crc kubenswrapper[5094]: I0220 08:47:36.286998 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 20 08:47:36 crc kubenswrapper[5094]: I0220 08:47:36.288089 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 20 08:47:36 crc kubenswrapper[5094]: I0220 08:47:36.296106 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 20 08:47:36 crc kubenswrapper[5094]: I0220 08:47:36.385270 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 20 08:47:36 crc kubenswrapper[5094]: I0220 08:47:36.386019 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" 
podUID="a7799fdb-4c3c-4792-be6d-f988852a6dad" containerName="cinder-api-log" containerID="cri-o://ac9a8a209c41aa38d1afaa5649246b4c179ca53d98b2ce40bde25654f0c08e0a" gracePeriod=30 Feb 20 08:47:36 crc kubenswrapper[5094]: I0220 08:47:36.386118 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a7799fdb-4c3c-4792-be6d-f988852a6dad" containerName="cinder-api" containerID="cri-o://c6cbdccc1408b1467e78e30844e9e20cc8ecf13baefc42e4d8d177a2ad9ea513" gracePeriod=30 Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.006814 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7cb209d2-d0d5-41b3-a452-ffe3fd846798","Type":"ContainerStarted","Data":"6fea4c135338d1a075521aee446680ad85fdc4ca1be3734940b421558747f123"} Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.007139 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7cb209d2-d0d5-41b3-a452-ffe3fd846798","Type":"ContainerStarted","Data":"559602b89f51921b2a84d2e61fb334636cf26303ca35d00060fdb477994544a2"} Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.010366 5094 generic.go:334] "Generic (PLEG): container finished" podID="a7799fdb-4c3c-4792-be6d-f988852a6dad" containerID="ac9a8a209c41aa38d1afaa5649246b4c179ca53d98b2ce40bde25654f0c08e0a" exitCode=143 Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.010452 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a7799fdb-4c3c-4792-be6d-f988852a6dad","Type":"ContainerDied","Data":"ac9a8a209c41aa38d1afaa5649246b4c179ca53d98b2ce40bde25654f0c08e0a"} Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.010825 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.012900 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-metadata-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.022282 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.024322 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.024665 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.026136 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.037909 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.045364 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.7583919789999998 podStartE2EDuration="3.045343801s" podCreationTimestamp="2026-02-20 08:47:34 +0000 UTC" firstStartedPulling="2026-02-20 08:47:35.302395319 +0000 UTC m=+7270.175022030" lastFinishedPulling="2026-02-20 08:47:35.589347141 +0000 UTC m=+7270.461973852" observedRunningTime="2026-02-20 08:47:37.038178269 +0000 UTC m=+7271.910804980" watchObservedRunningTime="2026-02-20 08:47:37.045343801 +0000 UTC m=+7271.917970512" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.085309 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-run\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.085365 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.085385 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-dev\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.085400 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.085418 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.085464 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/571a6098-6e30-438f-a6a9-fb751a79ca27-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.085486 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/571a6098-6e30-438f-a6a9-fb751a79ca27-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.085557 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.085595 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rmjg\" (UniqueName: \"kubernetes.io/projected/571a6098-6e30-438f-a6a9-fb751a79ca27-kube-api-access-7rmjg\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.085687 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/571a6098-6e30-438f-a6a9-fb751a79ca27-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.085742 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/571a6098-6e30-438f-a6a9-fb751a79ca27-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.085768 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/571a6098-6e30-438f-a6a9-fb751a79ca27-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.085796 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.085891 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.085994 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.086021 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-sys\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187567 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/571a6098-6e30-438f-a6a9-fb751a79ca27-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187615 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/571a6098-6e30-438f-a6a9-fb751a79ca27-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187637 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/571a6098-6e30-438f-a6a9-fb751a79ca27-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187660 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187718 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187757 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: 
\"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187775 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-sys\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187805 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-run\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187832 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187852 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-dev\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187869 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187888 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187913 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/571a6098-6e30-438f-a6a9-fb751a79ca27-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187938 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/571a6098-6e30-438f-a6a9-fb751a79ca27-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187959 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.187975 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rmjg\" (UniqueName: \"kubernetes.io/projected/571a6098-6e30-438f-a6a9-fb751a79ca27-kube-api-access-7rmjg\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.188669 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.188888 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.188940 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.189030 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.189077 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-sys\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.189112 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-run\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " 
pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.189146 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.189219 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.189251 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-dev\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.188667 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/571a6098-6e30-438f-a6a9-fb751a79ca27-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.194446 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/571a6098-6e30-438f-a6a9-fb751a79ca27-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.195604 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/571a6098-6e30-438f-a6a9-fb751a79ca27-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.208346 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rmjg\" (UniqueName: \"kubernetes.io/projected/571a6098-6e30-438f-a6a9-fb751a79ca27-kube-api-access-7rmjg\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.208736 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/571a6098-6e30-438f-a6a9-fb751a79ca27-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.208754 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/571a6098-6e30-438f-a6a9-fb751a79ca27-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.223638 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/571a6098-6e30-438f-a6a9-fb751a79ca27-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"571a6098-6e30-438f-a6a9-fb751a79ca27\") " pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.342792 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.695977 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.697940 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.705224 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.721123 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.799883 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7f13f97-3504-4faa-a8cf-8ad4a7973623-config-data-custom\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.799942 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-sys\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.799970 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.799999 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-etc-nvme\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.800021 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.800042 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d7f13f97-3504-4faa-a8cf-8ad4a7973623-ceph\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.800060 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-dev\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.800075 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.800127 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-run\") pod \"cinder-backup-0\" (UID: 
\"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.800181 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.800315 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7f13f97-3504-4faa-a8cf-8ad4a7973623-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.800374 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7f13f97-3504-4faa-a8cf-8ad4a7973623-scripts\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.800471 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7f13f97-3504-4faa-a8cf-8ad4a7973623-config-data\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.800501 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: 
I0220 08:47:37.800557 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-lib-modules\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.800589 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccnws\" (UniqueName: \"kubernetes.io/projected/d7f13f97-3504-4faa-a8cf-8ad4a7973623-kube-api-access-ccnws\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.867115 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.902550 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7f13f97-3504-4faa-a8cf-8ad4a7973623-config-data\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.902602 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.902636 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-lib-modules\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc 
kubenswrapper[5094]: I0220 08:47:37.902660 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccnws\" (UniqueName: \"kubernetes.io/projected/d7f13f97-3504-4faa-a8cf-8ad4a7973623-kube-api-access-ccnws\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.902738 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7f13f97-3504-4faa-a8cf-8ad4a7973623-config-data-custom\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.902763 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.902787 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-sys\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.902807 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-lib-modules\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.903451 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.903457 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-sys\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.903503 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.903783 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-etc-nvme\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.903854 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.903893 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d7f13f97-3504-4faa-a8cf-8ad4a7973623-ceph\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 
08:47:37.903918 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-dev\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.903942 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.903970 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-run\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.903993 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.904077 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7f13f97-3504-4faa-a8cf-8ad4a7973623-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.904124 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7f13f97-3504-4faa-a8cf-8ad4a7973623-scripts\") pod \"cinder-backup-0\" (UID: 
\"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.907528 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-run\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.907575 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-dev\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.907610 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.907635 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.907665 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-etc-nvme\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.907984 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/d7f13f97-3504-4faa-a8cf-8ad4a7973623-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.908533 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/d7f13f97-3504-4faa-a8cf-8ad4a7973623-ceph\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.910335 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7f13f97-3504-4faa-a8cf-8ad4a7973623-config-data\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.910411 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7f13f97-3504-4faa-a8cf-8ad4a7973623-config-data-custom\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.910973 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7f13f97-3504-4faa-a8cf-8ad4a7973623-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.912465 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d7f13f97-3504-4faa-a8cf-8ad4a7973623-scripts\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:37 crc kubenswrapper[5094]: I0220 08:47:37.920809 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccnws\" (UniqueName: \"kubernetes.io/projected/d7f13f97-3504-4faa-a8cf-8ad4a7973623-kube-api-access-ccnws\") pod \"cinder-backup-0\" (UID: \"d7f13f97-3504-4faa-a8cf-8ad4a7973623\") " pod="openstack/cinder-backup-0" Feb 20 08:47:38 crc kubenswrapper[5094]: I0220 08:47:38.021634 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"571a6098-6e30-438f-a6a9-fb751a79ca27","Type":"ContainerStarted","Data":"aeb59c1588eed2cf17b2842288e5805871d6e404b09670ae490547f5c37d6bbf"} Feb 20 08:47:38 crc kubenswrapper[5094]: I0220 08:47:38.039009 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Feb 20 08:47:38 crc kubenswrapper[5094]: I0220 08:47:38.576348 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Feb 20 08:47:39 crc kubenswrapper[5094]: I0220 08:47:39.034910 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"571a6098-6e30-438f-a6a9-fb751a79ca27","Type":"ContainerStarted","Data":"263654ba6b99106d51044c8bcfb147076ab69754ce97751c1bb0923e1c8d7582"} Feb 20 08:47:39 crc kubenswrapper[5094]: I0220 08:47:39.035379 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"571a6098-6e30-438f-a6a9-fb751a79ca27","Type":"ContainerStarted","Data":"5af98cbef9ac7a3e0e5bae1f39cad5d6cabea6e45418f4b771bfc377fb91dd98"} Feb 20 08:47:39 crc kubenswrapper[5094]: I0220 08:47:39.040317 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"d7f13f97-3504-4faa-a8cf-8ad4a7973623","Type":"ContainerStarted","Data":"c189d74fd0a1b93bdf0acee366dbe356f8f5f161057f579d8366a8f622988683"} Feb 20 08:47:39 crc kubenswrapper[5094]: I0220 08:47:39.061937 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.629014001 podStartE2EDuration="3.061912173s" podCreationTimestamp="2026-02-20 08:47:36 +0000 UTC" firstStartedPulling="2026-02-20 08:47:37.882441435 +0000 UTC m=+7272.755068146" lastFinishedPulling="2026-02-20 08:47:38.315339607 +0000 UTC m=+7273.187966318" observedRunningTime="2026-02-20 08:47:39.05430692 +0000 UTC m=+7273.926933651" watchObservedRunningTime="2026-02-20 08:47:39.061912173 +0000 UTC m=+7273.934538884" Feb 20 08:47:39 crc kubenswrapper[5094]: I0220 08:47:39.829913 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.003333 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.051143 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"d7f13f97-3504-4faa-a8cf-8ad4a7973623","Type":"ContainerStarted","Data":"e8645feb6916fb733f295972fe628c533a4264269a03a3fa0d4febf7afa90ed8"} Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.051187 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"d7f13f97-3504-4faa-a8cf-8ad4a7973623","Type":"ContainerStarted","Data":"b150727e7d44034a400e262ea8fb30b0e4ebceb9cca7b3a084a18f2176c6797f"} Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.054206 5094 generic.go:334] "Generic (PLEG): container finished" podID="a7799fdb-4c3c-4792-be6d-f988852a6dad" containerID="c6cbdccc1408b1467e78e30844e9e20cc8ecf13baefc42e4d8d177a2ad9ea513" exitCode=0 Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.054831 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.055000 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a7799fdb-4c3c-4792-be6d-f988852a6dad","Type":"ContainerDied","Data":"c6cbdccc1408b1467e78e30844e9e20cc8ecf13baefc42e4d8d177a2ad9ea513"} Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.055026 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a7799fdb-4c3c-4792-be6d-f988852a6dad","Type":"ContainerDied","Data":"0083b149ecde6dd092d89d6192562bc09a0f773314ca1e2250b77701f210c2ad"} Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.055042 5094 scope.go:117] "RemoveContainer" containerID="c6cbdccc1408b1467e78e30844e9e20cc8ecf13baefc42e4d8d177a2ad9ea513" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.095957 5094 scope.go:117] "RemoveContainer" containerID="ac9a8a209c41aa38d1afaa5649246b4c179ca53d98b2ce40bde25654f0c08e0a" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.099944 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.781029 podStartE2EDuration="3.099924409s" podCreationTimestamp="2026-02-20 08:47:37 +0000 UTC" firstStartedPulling="2026-02-20 08:47:38.588165299 +0000 UTC m=+7273.460792010" lastFinishedPulling="2026-02-20 08:47:38.907060708 +0000 UTC m=+7273.779687419" observedRunningTime="2026-02-20 08:47:40.080148753 +0000 UTC m=+7274.952775464" watchObservedRunningTime="2026-02-20 08:47:40.099924409 +0000 UTC m=+7274.972551120" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.120987 5094 scope.go:117] "RemoveContainer" containerID="c6cbdccc1408b1467e78e30844e9e20cc8ecf13baefc42e4d8d177a2ad9ea513" Feb 20 08:47:40 crc kubenswrapper[5094]: E0220 08:47:40.126345 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c6cbdccc1408b1467e78e30844e9e20cc8ecf13baefc42e4d8d177a2ad9ea513\": container with ID starting with c6cbdccc1408b1467e78e30844e9e20cc8ecf13baefc42e4d8d177a2ad9ea513 not found: ID does not exist" containerID="c6cbdccc1408b1467e78e30844e9e20cc8ecf13baefc42e4d8d177a2ad9ea513" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.126976 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6cbdccc1408b1467e78e30844e9e20cc8ecf13baefc42e4d8d177a2ad9ea513"} err="failed to get container status \"c6cbdccc1408b1467e78e30844e9e20cc8ecf13baefc42e4d8d177a2ad9ea513\": rpc error: code = NotFound desc = could not find container \"c6cbdccc1408b1467e78e30844e9e20cc8ecf13baefc42e4d8d177a2ad9ea513\": container with ID starting with c6cbdccc1408b1467e78e30844e9e20cc8ecf13baefc42e4d8d177a2ad9ea513 not found: ID does not exist" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.127002 5094 scope.go:117] "RemoveContainer" containerID="ac9a8a209c41aa38d1afaa5649246b4c179ca53d98b2ce40bde25654f0c08e0a" Feb 20 08:47:40 crc kubenswrapper[5094]: E0220 08:47:40.127492 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac9a8a209c41aa38d1afaa5649246b4c179ca53d98b2ce40bde25654f0c08e0a\": container with ID starting with ac9a8a209c41aa38d1afaa5649246b4c179ca53d98b2ce40bde25654f0c08e0a not found: ID does not exist" containerID="ac9a8a209c41aa38d1afaa5649246b4c179ca53d98b2ce40bde25654f0c08e0a" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.127533 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac9a8a209c41aa38d1afaa5649246b4c179ca53d98b2ce40bde25654f0c08e0a"} err="failed to get container status \"ac9a8a209c41aa38d1afaa5649246b4c179ca53d98b2ce40bde25654f0c08e0a\": rpc error: code = NotFound desc = could not find container \"ac9a8a209c41aa38d1afaa5649246b4c179ca53d98b2ce40bde25654f0c08e0a\": container with ID 
starting with ac9a8a209c41aa38d1afaa5649246b4c179ca53d98b2ce40bde25654f0c08e0a not found: ID does not exist" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.145243 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-config-data\") pod \"a7799fdb-4c3c-4792-be6d-f988852a6dad\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.145354 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-scripts\") pod \"a7799fdb-4c3c-4792-be6d-f988852a6dad\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.145374 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7799fdb-4c3c-4792-be6d-f988852a6dad-logs\") pod \"a7799fdb-4c3c-4792-be6d-f988852a6dad\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.146434 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-combined-ca-bundle\") pod \"a7799fdb-4c3c-4792-be6d-f988852a6dad\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.146462 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7799fdb-4c3c-4792-be6d-f988852a6dad-etc-machine-id\") pod \"a7799fdb-4c3c-4792-be6d-f988852a6dad\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.146509 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-xdwt4\" (UniqueName: \"kubernetes.io/projected/a7799fdb-4c3c-4792-be6d-f988852a6dad-kube-api-access-xdwt4\") pod \"a7799fdb-4c3c-4792-be6d-f988852a6dad\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.146571 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-config-data-custom\") pod \"a7799fdb-4c3c-4792-be6d-f988852a6dad\" (UID: \"a7799fdb-4c3c-4792-be6d-f988852a6dad\") " Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.146033 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7799fdb-4c3c-4792-be6d-f988852a6dad-logs" (OuterVolumeSpecName: "logs") pod "a7799fdb-4c3c-4792-be6d-f988852a6dad" (UID: "a7799fdb-4c3c-4792-be6d-f988852a6dad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.147629 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7799fdb-4c3c-4792-be6d-f988852a6dad-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a7799fdb-4c3c-4792-be6d-f988852a6dad" (UID: "a7799fdb-4c3c-4792-be6d-f988852a6dad"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.151317 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-scripts" (OuterVolumeSpecName: "scripts") pod "a7799fdb-4c3c-4792-be6d-f988852a6dad" (UID: "a7799fdb-4c3c-4792-be6d-f988852a6dad"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.153074 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7799fdb-4c3c-4792-be6d-f988852a6dad-kube-api-access-xdwt4" (OuterVolumeSpecName: "kube-api-access-xdwt4") pod "a7799fdb-4c3c-4792-be6d-f988852a6dad" (UID: "a7799fdb-4c3c-4792-be6d-f988852a6dad"). InnerVolumeSpecName "kube-api-access-xdwt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.157627 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a7799fdb-4c3c-4792-be6d-f988852a6dad" (UID: "a7799fdb-4c3c-4792-be6d-f988852a6dad"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.195472 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7799fdb-4c3c-4792-be6d-f988852a6dad" (UID: "a7799fdb-4c3c-4792-be6d-f988852a6dad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.207095 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-config-data" (OuterVolumeSpecName: "config-data") pod "a7799fdb-4c3c-4792-be6d-f988852a6dad" (UID: "a7799fdb-4c3c-4792-be6d-f988852a6dad"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.249485 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.249646 5094 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7799fdb-4c3c-4792-be6d-f988852a6dad-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.249676 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdwt4\" (UniqueName: \"kubernetes.io/projected/a7799fdb-4c3c-4792-be6d-f988852a6dad-kube-api-access-xdwt4\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.249752 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.249778 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.249814 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7799fdb-4c3c-4792-be6d-f988852a6dad-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.249827 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7799fdb-4c3c-4792-be6d-f988852a6dad-logs\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.388352 5094 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.401247 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.415817 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 20 08:47:40 crc kubenswrapper[5094]: E0220 08:47:40.416299 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7799fdb-4c3c-4792-be6d-f988852a6dad" containerName="cinder-api" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.416326 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7799fdb-4c3c-4792-be6d-f988852a6dad" containerName="cinder-api" Feb 20 08:47:40 crc kubenswrapper[5094]: E0220 08:47:40.416438 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7799fdb-4c3c-4792-be6d-f988852a6dad" containerName="cinder-api-log" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.416528 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7799fdb-4c3c-4792-be6d-f988852a6dad" containerName="cinder-api-log" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.417318 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7799fdb-4c3c-4792-be6d-f988852a6dad" containerName="cinder-api" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.417378 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7799fdb-4c3c-4792-be6d-f988852a6dad" containerName="cinder-api-log" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.418603 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.422548 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.454323 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b8551a6-6aac-4c12-b3ce-913397a5316f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.454420 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b8551a6-6aac-4c12-b3ce-913397a5316f-scripts\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.454539 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b8551a6-6aac-4c12-b3ce-913397a5316f-config-data\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.454591 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b8551a6-6aac-4c12-b3ce-913397a5316f-config-data-custom\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.454624 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b8551a6-6aac-4c12-b3ce-913397a5316f-logs\") pod \"cinder-api-0\" (UID: 
\"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.454651 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n5mc\" (UniqueName: \"kubernetes.io/projected/3b8551a6-6aac-4c12-b3ce-913397a5316f-kube-api-access-4n5mc\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.454684 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3b8551a6-6aac-4c12-b3ce-913397a5316f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.455046 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.556062 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b8551a6-6aac-4c12-b3ce-913397a5316f-config-data\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.556112 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b8551a6-6aac-4c12-b3ce-913397a5316f-config-data-custom\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.556138 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b8551a6-6aac-4c12-b3ce-913397a5316f-logs\") pod \"cinder-api-0\" (UID: 
\"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.556156 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n5mc\" (UniqueName: \"kubernetes.io/projected/3b8551a6-6aac-4c12-b3ce-913397a5316f-kube-api-access-4n5mc\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.556177 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3b8551a6-6aac-4c12-b3ce-913397a5316f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.556208 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b8551a6-6aac-4c12-b3ce-913397a5316f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.556251 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b8551a6-6aac-4c12-b3ce-913397a5316f-scripts\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.557015 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3b8551a6-6aac-4c12-b3ce-913397a5316f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.557596 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/3b8551a6-6aac-4c12-b3ce-913397a5316f-logs\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.561322 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3b8551a6-6aac-4c12-b3ce-913397a5316f-config-data-custom\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.561468 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b8551a6-6aac-4c12-b3ce-913397a5316f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.562142 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b8551a6-6aac-4c12-b3ce-913397a5316f-config-data\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.566424 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b8551a6-6aac-4c12-b3ce-913397a5316f-scripts\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.576608 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n5mc\" (UniqueName: \"kubernetes.io/projected/3b8551a6-6aac-4c12-b3ce-913397a5316f-kube-api-access-4n5mc\") pod \"cinder-api-0\" (UID: \"3b8551a6-6aac-4c12-b3ce-913397a5316f\") " pod="openstack/cinder-api-0" Feb 20 08:47:40 crc kubenswrapper[5094]: I0220 08:47:40.751491 5094 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 20 08:47:41 crc kubenswrapper[5094]: W0220 08:47:41.181101 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b8551a6_6aac_4c12_b3ce_913397a5316f.slice/crio-a80cdb5ee2bf9399b06e128091592f8303d539cda325cebf2ee2498728645ce1 WatchSource:0}: Error finding container a80cdb5ee2bf9399b06e128091592f8303d539cda325cebf2ee2498728645ce1: Status 404 returned error can't find the container with id a80cdb5ee2bf9399b06e128091592f8303d539cda325cebf2ee2498728645ce1 Feb 20 08:47:41 crc kubenswrapper[5094]: I0220 08:47:41.192765 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 20 08:47:41 crc kubenswrapper[5094]: I0220 08:47:41.841293 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:47:41 crc kubenswrapper[5094]: E0220 08:47:41.841857 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:47:41 crc kubenswrapper[5094]: I0220 08:47:41.861496 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7799fdb-4c3c-4792-be6d-f988852a6dad" path="/var/lib/kubelet/pods/a7799fdb-4c3c-4792-be6d-f988852a6dad/volumes" Feb 20 08:47:42 crc kubenswrapper[5094]: I0220 08:47:42.074999 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"3b8551a6-6aac-4c12-b3ce-913397a5316f","Type":"ContainerStarted","Data":"848984c5cec08fc4ac1f4f8a3a4487b094f0af4ff916255c85d282443bbbf29a"} Feb 20 08:47:42 crc kubenswrapper[5094]: I0220 08:47:42.075056 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3b8551a6-6aac-4c12-b3ce-913397a5316f","Type":"ContainerStarted","Data":"a80cdb5ee2bf9399b06e128091592f8303d539cda325cebf2ee2498728645ce1"} Feb 20 08:47:42 crc kubenswrapper[5094]: I0220 08:47:42.343101 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:43 crc kubenswrapper[5094]: I0220 08:47:43.040196 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Feb 20 08:47:43 crc kubenswrapper[5094]: I0220 08:47:43.084272 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3b8551a6-6aac-4c12-b3ce-913397a5316f","Type":"ContainerStarted","Data":"3f045c8da32a76424886c896e6d99cc15a91c8738009ed815181fed0d46683e4"} Feb 20 08:47:43 crc kubenswrapper[5094]: I0220 08:47:43.084453 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 20 08:47:43 crc kubenswrapper[5094]: I0220 08:47:43.106852 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.106830381 podStartE2EDuration="3.106830381s" podCreationTimestamp="2026-02-20 08:47:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:47:43.105147991 +0000 UTC m=+7277.977774702" watchObservedRunningTime="2026-02-20 08:47:43.106830381 +0000 UTC m=+7277.979457102" Feb 20 08:47:44 crc kubenswrapper[5094]: I0220 08:47:44.870290 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" 
podUID="a7799fdb-4c3c-4792-be6d-f988852a6dad" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.82:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 08:47:45 crc kubenswrapper[5094]: I0220 08:47:45.018278 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 20 08:47:45 crc kubenswrapper[5094]: I0220 08:47:45.073053 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 08:47:45 crc kubenswrapper[5094]: I0220 08:47:45.105848 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7cb209d2-d0d5-41b3-a452-ffe3fd846798" containerName="cinder-scheduler" containerID="cri-o://559602b89f51921b2a84d2e61fb334636cf26303ca35d00060fdb477994544a2" gracePeriod=30 Feb 20 08:47:45 crc kubenswrapper[5094]: I0220 08:47:45.105955 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7cb209d2-d0d5-41b3-a452-ffe3fd846798" containerName="probe" containerID="cri-o://6fea4c135338d1a075521aee446680ad85fdc4ca1be3734940b421558747f123" gracePeriod=30 Feb 20 08:47:46 crc kubenswrapper[5094]: I0220 08:47:46.120669 5094 generic.go:334] "Generic (PLEG): container finished" podID="7cb209d2-d0d5-41b3-a452-ffe3fd846798" containerID="6fea4c135338d1a075521aee446680ad85fdc4ca1be3734940b421558747f123" exitCode=0 Feb 20 08:47:46 crc kubenswrapper[5094]: I0220 08:47:46.120748 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7cb209d2-d0d5-41b3-a452-ffe3fd846798","Type":"ContainerDied","Data":"6fea4c135338d1a075521aee446680ad85fdc4ca1be3734940b421558747f123"} Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.143146 5094 generic.go:334] "Generic (PLEG): container finished" podID="7cb209d2-d0d5-41b3-a452-ffe3fd846798" 
containerID="559602b89f51921b2a84d2e61fb334636cf26303ca35d00060fdb477994544a2" exitCode=0 Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.143229 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7cb209d2-d0d5-41b3-a452-ffe3fd846798","Type":"ContainerDied","Data":"559602b89f51921b2a84d2e61fb334636cf26303ca35d00060fdb477994544a2"} Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.495260 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.576787 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.608827 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6zwv\" (UniqueName: \"kubernetes.io/projected/7cb209d2-d0d5-41b3-a452-ffe3fd846798-kube-api-access-k6zwv\") pod \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.609255 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7cb209d2-d0d5-41b3-a452-ffe3fd846798-etc-machine-id\") pod \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.609370 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-combined-ca-bundle\") pod \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.609373 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/7cb209d2-d0d5-41b3-a452-ffe3fd846798-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7cb209d2-d0d5-41b3-a452-ffe3fd846798" (UID: "7cb209d2-d0d5-41b3-a452-ffe3fd846798"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.609416 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-scripts\") pod \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.609467 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-config-data\") pod \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.609496 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-config-data-custom\") pod \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\" (UID: \"7cb209d2-d0d5-41b3-a452-ffe3fd846798\") " Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.610046 5094 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7cb209d2-d0d5-41b3-a452-ffe3fd846798-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.625393 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7cb209d2-d0d5-41b3-a452-ffe3fd846798" (UID: "7cb209d2-d0d5-41b3-a452-ffe3fd846798"). 
InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.625425 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cb209d2-d0d5-41b3-a452-ffe3fd846798-kube-api-access-k6zwv" (OuterVolumeSpecName: "kube-api-access-k6zwv") pod "7cb209d2-d0d5-41b3-a452-ffe3fd846798" (UID: "7cb209d2-d0d5-41b3-a452-ffe3fd846798"). InnerVolumeSpecName "kube-api-access-k6zwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.636920 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-scripts" (OuterVolumeSpecName: "scripts") pod "7cb209d2-d0d5-41b3-a452-ffe3fd846798" (UID: "7cb209d2-d0d5-41b3-a452-ffe3fd846798"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.666688 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cb209d2-d0d5-41b3-a452-ffe3fd846798" (UID: "7cb209d2-d0d5-41b3-a452-ffe3fd846798"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.712263 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6zwv\" (UniqueName: \"kubernetes.io/projected/7cb209d2-d0d5-41b3-a452-ffe3fd846798-kube-api-access-k6zwv\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.712302 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.712315 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.712329 5094 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.715450 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-config-data" (OuterVolumeSpecName: "config-data") pod "7cb209d2-d0d5-41b3-a452-ffe3fd846798" (UID: "7cb209d2-d0d5-41b3-a452-ffe3fd846798"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:47:47 crc kubenswrapper[5094]: I0220 08:47:47.813641 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cb209d2-d0d5-41b3-a452-ffe3fd846798-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.156640 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7cb209d2-d0d5-41b3-a452-ffe3fd846798","Type":"ContainerDied","Data":"96e6715e90a0b7ca9a4f361deda20ad4a5f2e1b7bb2403e2af62f561294f2544"} Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.156696 5094 scope.go:117] "RemoveContainer" containerID="6fea4c135338d1a075521aee446680ad85fdc4ca1be3734940b421558747f123" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.156786 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.185790 5094 scope.go:117] "RemoveContainer" containerID="559602b89f51921b2a84d2e61fb334636cf26303ca35d00060fdb477994544a2" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.204903 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.219132 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.235837 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 08:47:48 crc kubenswrapper[5094]: E0220 08:47:48.236308 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cb209d2-d0d5-41b3-a452-ffe3fd846798" containerName="probe" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.236328 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb209d2-d0d5-41b3-a452-ffe3fd846798" containerName="probe" Feb 20 
08:47:48 crc kubenswrapper[5094]: E0220 08:47:48.236351 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cb209d2-d0d5-41b3-a452-ffe3fd846798" containerName="cinder-scheduler" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.236358 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb209d2-d0d5-41b3-a452-ffe3fd846798" containerName="cinder-scheduler" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.236523 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cb209d2-d0d5-41b3-a452-ffe3fd846798" containerName="cinder-scheduler" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.236543 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cb209d2-d0d5-41b3-a452-ffe3fd846798" containerName="probe" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.237571 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.240041 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.253233 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.309307 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.322419 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88808044-5011-40de-9088-154284495e1a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.322792 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/88808044-5011-40de-9088-154284495e1a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.322831 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88808044-5011-40de-9088-154284495e1a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.322866 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjzzm\" (UniqueName: \"kubernetes.io/projected/88808044-5011-40de-9088-154284495e1a-kube-api-access-mjzzm\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.323003 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88808044-5011-40de-9088-154284495e1a-scripts\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.323310 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88808044-5011-40de-9088-154284495e1a-config-data\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.425466 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/88808044-5011-40de-9088-154284495e1a-config-data\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.425591 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88808044-5011-40de-9088-154284495e1a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.425621 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/88808044-5011-40de-9088-154284495e1a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.425640 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88808044-5011-40de-9088-154284495e1a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.425665 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjzzm\" (UniqueName: \"kubernetes.io/projected/88808044-5011-40de-9088-154284495e1a-kube-api-access-mjzzm\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.425693 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88808044-5011-40de-9088-154284495e1a-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.426370 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/88808044-5011-40de-9088-154284495e1a-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.430455 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88808044-5011-40de-9088-154284495e1a-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.433758 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88808044-5011-40de-9088-154284495e1a-config-data\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.436495 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88808044-5011-40de-9088-154284495e1a-scripts\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.436561 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/88808044-5011-40de-9088-154284495e1a-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.444101 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-mjzzm\" (UniqueName: \"kubernetes.io/projected/88808044-5011-40de-9088-154284495e1a-kube-api-access-mjzzm\") pod \"cinder-scheduler-0\" (UID: \"88808044-5011-40de-9088-154284495e1a\") " pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.563293 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 20 08:47:48 crc kubenswrapper[5094]: I0220 08:47:48.944971 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 20 08:47:49 crc kubenswrapper[5094]: I0220 08:47:49.168215 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"88808044-5011-40de-9088-154284495e1a","Type":"ContainerStarted","Data":"702907a7643fffb7875b919f007bb005ff748c49ecad63e3eb1db73d3953e8f0"} Feb 20 08:47:49 crc kubenswrapper[5094]: I0220 08:47:49.854891 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cb209d2-d0d5-41b3-a452-ffe3fd846798" path="/var/lib/kubelet/pods/7cb209d2-d0d5-41b3-a452-ffe3fd846798/volumes" Feb 20 08:47:50 crc kubenswrapper[5094]: I0220 08:47:50.178626 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"88808044-5011-40de-9088-154284495e1a","Type":"ContainerStarted","Data":"331e7f6fd087287847d0160f47f187ac0aa2f7b50a0f8c6e674fc48ad5b6acf5"} Feb 20 08:47:50 crc kubenswrapper[5094]: I0220 08:47:50.179019 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"88808044-5011-40de-9088-154284495e1a","Type":"ContainerStarted","Data":"ebba4ba9a94e19f6e1606438b8296ad93fc0dbb17dc965adce0aa681a076f877"} Feb 20 08:47:50 crc kubenswrapper[5094]: I0220 08:47:50.199542 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.199526044 podStartE2EDuration="2.199526044s" 
podCreationTimestamp="2026-02-20 08:47:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:47:50.198032868 +0000 UTC m=+7285.070659579" watchObservedRunningTime="2026-02-20 08:47:50.199526044 +0000 UTC m=+7285.072152755" Feb 20 08:47:52 crc kubenswrapper[5094]: I0220 08:47:52.496257 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 20 08:47:53 crc kubenswrapper[5094]: I0220 08:47:53.564243 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 20 08:47:53 crc kubenswrapper[5094]: I0220 08:47:53.840926 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:47:53 crc kubenswrapper[5094]: E0220 08:47:53.841475 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:47:58 crc kubenswrapper[5094]: I0220 08:47:58.853943 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 20 08:48:04 crc kubenswrapper[5094]: I0220 08:48:04.840029 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:48:04 crc kubenswrapper[5094]: E0220 08:48:04.840817 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:48:19 crc kubenswrapper[5094]: I0220 08:48:19.841430 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:48:19 crc kubenswrapper[5094]: E0220 08:48:19.842234 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:48:28 crc kubenswrapper[5094]: I0220 08:48:28.084049 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bhj65"] Feb 20 08:48:28 crc kubenswrapper[5094]: I0220 08:48:28.086998 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:28 crc kubenswrapper[5094]: I0220 08:48:28.127913 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bhj65"] Feb 20 08:48:28 crc kubenswrapper[5094]: I0220 08:48:28.190391 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz7ql\" (UniqueName: \"kubernetes.io/projected/dc1476cd-037c-4974-ac5b-7b914e175b0c-kube-api-access-jz7ql\") pod \"redhat-marketplace-bhj65\" (UID: \"dc1476cd-037c-4974-ac5b-7b914e175b0c\") " pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:28 crc kubenswrapper[5094]: I0220 08:48:28.190679 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc1476cd-037c-4974-ac5b-7b914e175b0c-catalog-content\") pod \"redhat-marketplace-bhj65\" (UID: \"dc1476cd-037c-4974-ac5b-7b914e175b0c\") " pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:28 crc kubenswrapper[5094]: I0220 08:48:28.190809 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc1476cd-037c-4974-ac5b-7b914e175b0c-utilities\") pod \"redhat-marketplace-bhj65\" (UID: \"dc1476cd-037c-4974-ac5b-7b914e175b0c\") " pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:28 crc kubenswrapper[5094]: I0220 08:48:28.291878 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz7ql\" (UniqueName: \"kubernetes.io/projected/dc1476cd-037c-4974-ac5b-7b914e175b0c-kube-api-access-jz7ql\") pod \"redhat-marketplace-bhj65\" (UID: \"dc1476cd-037c-4974-ac5b-7b914e175b0c\") " pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:28 crc kubenswrapper[5094]: I0220 08:48:28.291990 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc1476cd-037c-4974-ac5b-7b914e175b0c-catalog-content\") pod \"redhat-marketplace-bhj65\" (UID: \"dc1476cd-037c-4974-ac5b-7b914e175b0c\") " pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:28 crc kubenswrapper[5094]: I0220 08:48:28.292048 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc1476cd-037c-4974-ac5b-7b914e175b0c-utilities\") pod \"redhat-marketplace-bhj65\" (UID: \"dc1476cd-037c-4974-ac5b-7b914e175b0c\") " pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:28 crc kubenswrapper[5094]: I0220 08:48:28.292562 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc1476cd-037c-4974-ac5b-7b914e175b0c-catalog-content\") pod \"redhat-marketplace-bhj65\" (UID: \"dc1476cd-037c-4974-ac5b-7b914e175b0c\") " pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:28 crc kubenswrapper[5094]: I0220 08:48:28.292592 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc1476cd-037c-4974-ac5b-7b914e175b0c-utilities\") pod \"redhat-marketplace-bhj65\" (UID: \"dc1476cd-037c-4974-ac5b-7b914e175b0c\") " pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:28 crc kubenswrapper[5094]: I0220 08:48:28.324645 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz7ql\" (UniqueName: \"kubernetes.io/projected/dc1476cd-037c-4974-ac5b-7b914e175b0c-kube-api-access-jz7ql\") pod \"redhat-marketplace-bhj65\" (UID: \"dc1476cd-037c-4974-ac5b-7b914e175b0c\") " pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:28 crc kubenswrapper[5094]: I0220 08:48:28.409340 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:28 crc kubenswrapper[5094]: I0220 08:48:28.883608 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bhj65"] Feb 20 08:48:28 crc kubenswrapper[5094]: W0220 08:48:28.885264 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc1476cd_037c_4974_ac5b_7b914e175b0c.slice/crio-634742879a6c4ad93a0d5b994644cd639eb638b5c2df868afc86c0ba7e9263d6 WatchSource:0}: Error finding container 634742879a6c4ad93a0d5b994644cd639eb638b5c2df868afc86c0ba7e9263d6: Status 404 returned error can't find the container with id 634742879a6c4ad93a0d5b994644cd639eb638b5c2df868afc86c0ba7e9263d6 Feb 20 08:48:29 crc kubenswrapper[5094]: I0220 08:48:29.577105 5094 generic.go:334] "Generic (PLEG): container finished" podID="dc1476cd-037c-4974-ac5b-7b914e175b0c" containerID="d4dbd3d88d452cfbf96b4b013b71fabf22ef758279d3fbebd557e66db2963d68" exitCode=0 Feb 20 08:48:29 crc kubenswrapper[5094]: I0220 08:48:29.577201 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bhj65" event={"ID":"dc1476cd-037c-4974-ac5b-7b914e175b0c","Type":"ContainerDied","Data":"d4dbd3d88d452cfbf96b4b013b71fabf22ef758279d3fbebd557e66db2963d68"} Feb 20 08:48:29 crc kubenswrapper[5094]: I0220 08:48:29.577523 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bhj65" event={"ID":"dc1476cd-037c-4974-ac5b-7b914e175b0c","Type":"ContainerStarted","Data":"634742879a6c4ad93a0d5b994644cd639eb638b5c2df868afc86c0ba7e9263d6"} Feb 20 08:48:29 crc kubenswrapper[5094]: I0220 08:48:29.579857 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 08:48:30 crc kubenswrapper[5094]: I0220 08:48:30.590127 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-bhj65" event={"ID":"dc1476cd-037c-4974-ac5b-7b914e175b0c","Type":"ContainerStarted","Data":"c06c2015233b69995c50a06fa09831ff1aea6f2254ff515b4a70c854fdaab0ba"} Feb 20 08:48:31 crc kubenswrapper[5094]: I0220 08:48:31.599645 5094 generic.go:334] "Generic (PLEG): container finished" podID="dc1476cd-037c-4974-ac5b-7b914e175b0c" containerID="c06c2015233b69995c50a06fa09831ff1aea6f2254ff515b4a70c854fdaab0ba" exitCode=0 Feb 20 08:48:31 crc kubenswrapper[5094]: I0220 08:48:31.599687 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bhj65" event={"ID":"dc1476cd-037c-4974-ac5b-7b914e175b0c","Type":"ContainerDied","Data":"c06c2015233b69995c50a06fa09831ff1aea6f2254ff515b4a70c854fdaab0ba"} Feb 20 08:48:32 crc kubenswrapper[5094]: I0220 08:48:32.615307 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bhj65" event={"ID":"dc1476cd-037c-4974-ac5b-7b914e175b0c","Type":"ContainerStarted","Data":"4e748da98a0207a7e26602bc55b18b9204208e287bb9430cf800cbc8294b37f3"} Feb 20 08:48:32 crc kubenswrapper[5094]: I0220 08:48:32.641113 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bhj65" podStartSLOduration=2.154948426 podStartE2EDuration="4.641072942s" podCreationTimestamp="2026-02-20 08:48:28 +0000 UTC" firstStartedPulling="2026-02-20 08:48:29.579389523 +0000 UTC m=+7324.452016264" lastFinishedPulling="2026-02-20 08:48:32.065514059 +0000 UTC m=+7326.938140780" observedRunningTime="2026-02-20 08:48:32.635718253 +0000 UTC m=+7327.508344964" watchObservedRunningTime="2026-02-20 08:48:32.641072942 +0000 UTC m=+7327.513699703" Feb 20 08:48:32 crc kubenswrapper[5094]: I0220 08:48:32.841181 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:48:32 crc kubenswrapper[5094]: E0220 08:48:32.842056 5094 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:48:38 crc kubenswrapper[5094]: I0220 08:48:38.410546 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:38 crc kubenswrapper[5094]: I0220 08:48:38.411308 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:38 crc kubenswrapper[5094]: I0220 08:48:38.479380 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:38 crc kubenswrapper[5094]: I0220 08:48:38.719803 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:38 crc kubenswrapper[5094]: I0220 08:48:38.768138 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bhj65"] Feb 20 08:48:40 crc kubenswrapper[5094]: I0220 08:48:40.691797 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bhj65" podUID="dc1476cd-037c-4974-ac5b-7b914e175b0c" containerName="registry-server" containerID="cri-o://4e748da98a0207a7e26602bc55b18b9204208e287bb9430cf800cbc8294b37f3" gracePeriod=2 Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.178675 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.247369 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jz7ql\" (UniqueName: \"kubernetes.io/projected/dc1476cd-037c-4974-ac5b-7b914e175b0c-kube-api-access-jz7ql\") pod \"dc1476cd-037c-4974-ac5b-7b914e175b0c\" (UID: \"dc1476cd-037c-4974-ac5b-7b914e175b0c\") " Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.247472 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc1476cd-037c-4974-ac5b-7b914e175b0c-utilities\") pod \"dc1476cd-037c-4974-ac5b-7b914e175b0c\" (UID: \"dc1476cd-037c-4974-ac5b-7b914e175b0c\") " Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.247521 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc1476cd-037c-4974-ac5b-7b914e175b0c-catalog-content\") pod \"dc1476cd-037c-4974-ac5b-7b914e175b0c\" (UID: \"dc1476cd-037c-4974-ac5b-7b914e175b0c\") " Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.249136 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc1476cd-037c-4974-ac5b-7b914e175b0c-utilities" (OuterVolumeSpecName: "utilities") pod "dc1476cd-037c-4974-ac5b-7b914e175b0c" (UID: "dc1476cd-037c-4974-ac5b-7b914e175b0c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.263002 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc1476cd-037c-4974-ac5b-7b914e175b0c-kube-api-access-jz7ql" (OuterVolumeSpecName: "kube-api-access-jz7ql") pod "dc1476cd-037c-4974-ac5b-7b914e175b0c" (UID: "dc1476cd-037c-4974-ac5b-7b914e175b0c"). InnerVolumeSpecName "kube-api-access-jz7ql". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.274155 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc1476cd-037c-4974-ac5b-7b914e175b0c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dc1476cd-037c-4974-ac5b-7b914e175b0c" (UID: "dc1476cd-037c-4974-ac5b-7b914e175b0c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.350235 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jz7ql\" (UniqueName: \"kubernetes.io/projected/dc1476cd-037c-4974-ac5b-7b914e175b0c-kube-api-access-jz7ql\") on node \"crc\" DevicePath \"\"" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.350527 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc1476cd-037c-4974-ac5b-7b914e175b0c-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.350541 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc1476cd-037c-4974-ac5b-7b914e175b0c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.701780 5094 generic.go:334] "Generic (PLEG): container finished" podID="dc1476cd-037c-4974-ac5b-7b914e175b0c" containerID="4e748da98a0207a7e26602bc55b18b9204208e287bb9430cf800cbc8294b37f3" exitCode=0 Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.701824 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bhj65" event={"ID":"dc1476cd-037c-4974-ac5b-7b914e175b0c","Type":"ContainerDied","Data":"4e748da98a0207a7e26602bc55b18b9204208e287bb9430cf800cbc8294b37f3"} Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.701853 5094 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-bhj65" event={"ID":"dc1476cd-037c-4974-ac5b-7b914e175b0c","Type":"ContainerDied","Data":"634742879a6c4ad93a0d5b994644cd639eb638b5c2df868afc86c0ba7e9263d6"} Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.701848 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bhj65" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.701867 5094 scope.go:117] "RemoveContainer" containerID="4e748da98a0207a7e26602bc55b18b9204208e287bb9430cf800cbc8294b37f3" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.726048 5094 scope.go:117] "RemoveContainer" containerID="c06c2015233b69995c50a06fa09831ff1aea6f2254ff515b4a70c854fdaab0ba" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.732889 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bhj65"] Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.759027 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bhj65"] Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.776476 5094 scope.go:117] "RemoveContainer" containerID="d4dbd3d88d452cfbf96b4b013b71fabf22ef758279d3fbebd557e66db2963d68" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.815164 5094 scope.go:117] "RemoveContainer" containerID="4e748da98a0207a7e26602bc55b18b9204208e287bb9430cf800cbc8294b37f3" Feb 20 08:48:41 crc kubenswrapper[5094]: E0220 08:48:41.816148 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e748da98a0207a7e26602bc55b18b9204208e287bb9430cf800cbc8294b37f3\": container with ID starting with 4e748da98a0207a7e26602bc55b18b9204208e287bb9430cf800cbc8294b37f3 not found: ID does not exist" containerID="4e748da98a0207a7e26602bc55b18b9204208e287bb9430cf800cbc8294b37f3" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.816183 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e748da98a0207a7e26602bc55b18b9204208e287bb9430cf800cbc8294b37f3"} err="failed to get container status \"4e748da98a0207a7e26602bc55b18b9204208e287bb9430cf800cbc8294b37f3\": rpc error: code = NotFound desc = could not find container \"4e748da98a0207a7e26602bc55b18b9204208e287bb9430cf800cbc8294b37f3\": container with ID starting with 4e748da98a0207a7e26602bc55b18b9204208e287bb9430cf800cbc8294b37f3 not found: ID does not exist" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.816205 5094 scope.go:117] "RemoveContainer" containerID="c06c2015233b69995c50a06fa09831ff1aea6f2254ff515b4a70c854fdaab0ba" Feb 20 08:48:41 crc kubenswrapper[5094]: E0220 08:48:41.818660 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c06c2015233b69995c50a06fa09831ff1aea6f2254ff515b4a70c854fdaab0ba\": container with ID starting with c06c2015233b69995c50a06fa09831ff1aea6f2254ff515b4a70c854fdaab0ba not found: ID does not exist" containerID="c06c2015233b69995c50a06fa09831ff1aea6f2254ff515b4a70c854fdaab0ba" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.818689 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c06c2015233b69995c50a06fa09831ff1aea6f2254ff515b4a70c854fdaab0ba"} err="failed to get container status \"c06c2015233b69995c50a06fa09831ff1aea6f2254ff515b4a70c854fdaab0ba\": rpc error: code = NotFound desc = could not find container \"c06c2015233b69995c50a06fa09831ff1aea6f2254ff515b4a70c854fdaab0ba\": container with ID starting with c06c2015233b69995c50a06fa09831ff1aea6f2254ff515b4a70c854fdaab0ba not found: ID does not exist" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.818718 5094 scope.go:117] "RemoveContainer" containerID="d4dbd3d88d452cfbf96b4b013b71fabf22ef758279d3fbebd557e66db2963d68" Feb 20 08:48:41 crc kubenswrapper[5094]: E0220 
08:48:41.823817 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4dbd3d88d452cfbf96b4b013b71fabf22ef758279d3fbebd557e66db2963d68\": container with ID starting with d4dbd3d88d452cfbf96b4b013b71fabf22ef758279d3fbebd557e66db2963d68 not found: ID does not exist" containerID="d4dbd3d88d452cfbf96b4b013b71fabf22ef758279d3fbebd557e66db2963d68" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.823862 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4dbd3d88d452cfbf96b4b013b71fabf22ef758279d3fbebd557e66db2963d68"} err="failed to get container status \"d4dbd3d88d452cfbf96b4b013b71fabf22ef758279d3fbebd557e66db2963d68\": rpc error: code = NotFound desc = could not find container \"d4dbd3d88d452cfbf96b4b013b71fabf22ef758279d3fbebd557e66db2963d68\": container with ID starting with d4dbd3d88d452cfbf96b4b013b71fabf22ef758279d3fbebd557e66db2963d68 not found: ID does not exist" Feb 20 08:48:41 crc kubenswrapper[5094]: I0220 08:48:41.867360 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc1476cd-037c-4974-ac5b-7b914e175b0c" path="/var/lib/kubelet/pods/dc1476cd-037c-4974-ac5b-7b914e175b0c/volumes" Feb 20 08:48:42 crc kubenswrapper[5094]: I0220 08:48:42.037924 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-l2kgb"] Feb 20 08:48:42 crc kubenswrapper[5094]: I0220 08:48:42.047720 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-5af6-account-create-update-9f77r"] Feb 20 08:48:42 crc kubenswrapper[5094]: I0220 08:48:42.054805 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-l2kgb"] Feb 20 08:48:42 crc kubenswrapper[5094]: I0220 08:48:42.062478 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-5af6-account-create-update-9f77r"] Feb 20 08:48:43 crc kubenswrapper[5094]: I0220 08:48:43.854321 
5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d1678de-0344-47d5-98bb-d9ffd63912e7" path="/var/lib/kubelet/pods/1d1678de-0344-47d5-98bb-d9ffd63912e7/volumes" Feb 20 08:48:43 crc kubenswrapper[5094]: I0220 08:48:43.855614 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec59a7fc-e360-4e39-8c57-cfaa43d23566" path="/var/lib/kubelet/pods/ec59a7fc-e360-4e39-8c57-cfaa43d23566/volumes" Feb 20 08:48:46 crc kubenswrapper[5094]: I0220 08:48:46.840392 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:48:47 crc kubenswrapper[5094]: I0220 08:48:47.765974 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"e8403acee0c31e397e6e5d741268e12e6725d137dadeeb1b9238f72fcf352268"} Feb 20 08:48:53 crc kubenswrapper[5094]: I0220 08:48:53.041134 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-bh8cv"] Feb 20 08:48:53 crc kubenswrapper[5094]: I0220 08:48:53.061403 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-bh8cv"] Feb 20 08:48:53 crc kubenswrapper[5094]: I0220 08:48:53.850628 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81601ce5-f2ae-4f57-a829-6b235b7ae4df" path="/var/lib/kubelet/pods/81601ce5-f2ae-4f57-a829-6b235b7ae4df/volumes" Feb 20 08:49:06 crc kubenswrapper[5094]: I0220 08:49:06.057848 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-869rz"] Feb 20 08:49:06 crc kubenswrapper[5094]: I0220 08:49:06.071299 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-869rz"] Feb 20 08:49:07 crc kubenswrapper[5094]: I0220 08:49:07.851808 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5ab956a6-a68a-4da9-9065-6f09fb2a8f28" path="/var/lib/kubelet/pods/5ab956a6-a68a-4da9-9065-6f09fb2a8f28/volumes" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.088639 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-54879cd99c-v7mr5"] Feb 20 08:49:31 crc kubenswrapper[5094]: E0220 08:49:31.089771 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc1476cd-037c-4974-ac5b-7b914e175b0c" containerName="extract-content" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.089784 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc1476cd-037c-4974-ac5b-7b914e175b0c" containerName="extract-content" Feb 20 08:49:31 crc kubenswrapper[5094]: E0220 08:49:31.089801 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc1476cd-037c-4974-ac5b-7b914e175b0c" containerName="extract-utilities" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.089808 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc1476cd-037c-4974-ac5b-7b914e175b0c" containerName="extract-utilities" Feb 20 08:49:31 crc kubenswrapper[5094]: E0220 08:49:31.089824 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc1476cd-037c-4974-ac5b-7b914e175b0c" containerName="registry-server" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.089829 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc1476cd-037c-4974-ac5b-7b914e175b0c" containerName="registry-server" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.089987 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc1476cd-037c-4974-ac5b-7b914e175b0c" containerName="registry-server" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.090871 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.093043 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.093208 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.093417 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-mpk6z" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.093582 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.109015 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54879cd99c-v7mr5"] Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.142401 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.142631 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9ba9e313-83db-4e08-a308-376d5fdf5820" containerName="glance-log" containerID="cri-o://280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1" gracePeriod=30 Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.142748 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9ba9e313-83db-4e08-a308-376d5fdf5820" containerName="glance-httpd" containerID="cri-o://7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb" gracePeriod=30 Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.207780 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.208064 
5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0babde66-7106-44f9-8108-dc7123e64645" containerName="glance-log" containerID="cri-o://ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673" gracePeriod=30 Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.208238 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0babde66-7106-44f9-8108-dc7123e64645" containerName="glance-httpd" containerID="cri-o://d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54" gracePeriod=30 Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.222778 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7cd88dbb9c-w5xqj"] Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.224765 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.237601 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-horizon-secret-key\") pod \"horizon-54879cd99c-v7mr5\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.237674 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-logs\") pod \"horizon-54879cd99c-v7mr5\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.237731 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-scripts\") pod \"horizon-54879cd99c-v7mr5\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.237760 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-config-data\") pod \"horizon-54879cd99c-v7mr5\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.237824 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkx4x\" (UniqueName: \"kubernetes.io/projected/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-kube-api-access-kkx4x\") pod \"horizon-54879cd99c-v7mr5\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.265331 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7cd88dbb9c-w5xqj"] Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.339940 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9cjw\" (UniqueName: \"kubernetes.io/projected/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-kube-api-access-x9cjw\") pod \"horizon-7cd88dbb9c-w5xqj\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.340618 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-scripts\") pod \"horizon-54879cd99c-v7mr5\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 
08:49:31.340789 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-config-data\") pod \"horizon-54879cd99c-v7mr5\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.340897 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-config-data\") pod \"horizon-7cd88dbb9c-w5xqj\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.341374 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-scripts\") pod \"horizon-7cd88dbb9c-w5xqj\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.341481 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-logs\") pod \"horizon-7cd88dbb9c-w5xqj\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.341591 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkx4x\" (UniqueName: \"kubernetes.io/projected/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-kube-api-access-kkx4x\") pod \"horizon-54879cd99c-v7mr5\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.341943 5094 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-scripts\") pod \"horizon-54879cd99c-v7mr5\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.342157 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-horizon-secret-key\") pod \"horizon-54879cd99c-v7mr5\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.342328 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-horizon-secret-key\") pod \"horizon-7cd88dbb9c-w5xqj\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.342434 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-logs\") pod \"horizon-54879cd99c-v7mr5\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.342986 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-logs\") pod \"horizon-54879cd99c-v7mr5\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.343267 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-config-data\") pod 
\"horizon-54879cd99c-v7mr5\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.354231 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-horizon-secret-key\") pod \"horizon-54879cd99c-v7mr5\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.362622 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkx4x\" (UniqueName: \"kubernetes.io/projected/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-kube-api-access-kkx4x\") pod \"horizon-54879cd99c-v7mr5\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.420035 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.443582 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-horizon-secret-key\") pod \"horizon-7cd88dbb9c-w5xqj\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.443649 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9cjw\" (UniqueName: \"kubernetes.io/projected/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-kube-api-access-x9cjw\") pod \"horizon-7cd88dbb9c-w5xqj\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.443683 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-config-data\") pod \"horizon-7cd88dbb9c-w5xqj\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.443749 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-scripts\") pod \"horizon-7cd88dbb9c-w5xqj\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.443766 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-logs\") pod \"horizon-7cd88dbb9c-w5xqj\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.444115 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-logs\") pod \"horizon-7cd88dbb9c-w5xqj\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: E0220 08:49:31.445591 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0babde66_7106_44f9_8108_dc7123e64645.slice/crio-ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673.scope\": RecentStats: unable to find data in memory cache]" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.448496 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-config-data\") pod 
\"horizon-7cd88dbb9c-w5xqj\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.449899 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-scripts\") pod \"horizon-7cd88dbb9c-w5xqj\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.459120 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-horizon-secret-key\") pod \"horizon-7cd88dbb9c-w5xqj\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.462899 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9cjw\" (UniqueName: \"kubernetes.io/projected/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-kube-api-access-x9cjw\") pod \"horizon-7cd88dbb9c-w5xqj\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.611432 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.818498 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54879cd99c-v7mr5"] Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.869945 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6d54b4d569-kqd4s"] Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.872028 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.889689 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d54b4d569-kqd4s"] Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.923411 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54879cd99c-v7mr5"] Feb 20 08:49:31 crc kubenswrapper[5094]: W0220 08:49:31.971367 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa4baf13_0870_4bf6_9a0b_d4fd1fb598ce.slice/crio-cdb3de8f0c49a984e54c19f650258f82b6b14df1b84ec278f6a93e74f5a453ba WatchSource:0}: Error finding container cdb3de8f0c49a984e54c19f650258f82b6b14df1b84ec278f6a93e74f5a453ba: Status 404 returned error can't find the container with id cdb3de8f0c49a984e54c19f650258f82b6b14df1b84ec278f6a93e74f5a453ba Feb 20 08:49:31 crc kubenswrapper[5094]: I0220 08:49:31.976764 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7cd88dbb9c-w5xqj"] Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.055195 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-horizon-secret-key\") pod \"horizon-6d54b4d569-kqd4s\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.055250 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rkqz\" (UniqueName: \"kubernetes.io/projected/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-kube-api-access-2rkqz\") pod \"horizon-6d54b4d569-kqd4s\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.055279 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-scripts\") pod \"horizon-6d54b4d569-kqd4s\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.055333 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-config-data\") pod \"horizon-6d54b4d569-kqd4s\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.055364 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-logs\") pod \"horizon-6d54b4d569-kqd4s\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.157454 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-horizon-secret-key\") pod \"horizon-6d54b4d569-kqd4s\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.157509 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rkqz\" (UniqueName: \"kubernetes.io/projected/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-kube-api-access-2rkqz\") pod \"horizon-6d54b4d569-kqd4s\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.157540 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-scripts\") pod \"horizon-6d54b4d569-kqd4s\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.157595 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-config-data\") pod \"horizon-6d54b4d569-kqd4s\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.157631 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-logs\") pod \"horizon-6d54b4d569-kqd4s\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.158167 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-logs\") pod \"horizon-6d54b4d569-kqd4s\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.158418 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-scripts\") pod \"horizon-6d54b4d569-kqd4s\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.159195 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-config-data\") pod \"horizon-6d54b4d569-kqd4s\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " 
pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.163415 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-horizon-secret-key\") pod \"horizon-6d54b4d569-kqd4s\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.173021 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rkqz\" (UniqueName: \"kubernetes.io/projected/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-kube-api-access-2rkqz\") pod \"horizon-6d54b4d569-kqd4s\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.190078 5094 generic.go:334] "Generic (PLEG): container finished" podID="0babde66-7106-44f9-8108-dc7123e64645" containerID="ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673" exitCode=143 Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.190155 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0babde66-7106-44f9-8108-dc7123e64645","Type":"ContainerDied","Data":"ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673"} Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.192542 5094 generic.go:334] "Generic (PLEG): container finished" podID="9ba9e313-83db-4e08-a308-376d5fdf5820" containerID="280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1" exitCode=143 Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.192608 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9ba9e313-83db-4e08-a308-376d5fdf5820","Type":"ContainerDied","Data":"280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1"} Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 
08:49:32.194158 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54879cd99c-v7mr5" event={"ID":"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad","Type":"ContainerStarted","Data":"eb0248c1e16d27da1be9e878bc7100201452eaa9c36214b5074f678269d497a4"} Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.195873 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cd88dbb9c-w5xqj" event={"ID":"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce","Type":"ContainerStarted","Data":"cdb3de8f0c49a984e54c19f650258f82b6b14df1b84ec278f6a93e74f5a453ba"} Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.205759 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:49:32 crc kubenswrapper[5094]: I0220 08:49:32.761664 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6d54b4d569-kqd4s"] Feb 20 08:49:32 crc kubenswrapper[5094]: W0220 08:49:32.800001 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8eb2c0e1_59eb_4f7a_aeea_8965a35d861c.slice/crio-8a9bb3e82b6fd52dcf3df5d577a488c8f0daee979b4bcefb84c464a576031f35 WatchSource:0}: Error finding container 8a9bb3e82b6fd52dcf3df5d577a488c8f0daee979b4bcefb84c464a576031f35: Status 404 returned error can't find the container with id 8a9bb3e82b6fd52dcf3df5d577a488c8f0daee979b4bcefb84c464a576031f35 Feb 20 08:49:33 crc kubenswrapper[5094]: I0220 08:49:33.205957 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d54b4d569-kqd4s" event={"ID":"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c","Type":"ContainerStarted","Data":"8a9bb3e82b6fd52dcf3df5d577a488c8f0daee979b4bcefb84c464a576031f35"} Feb 20 08:49:34 crc kubenswrapper[5094]: I0220 08:49:34.816675 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 08:49:34 crc kubenswrapper[5094]: I0220 08:49:34.912412 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shfm9\" (UniqueName: \"kubernetes.io/projected/9ba9e313-83db-4e08-a308-376d5fdf5820-kube-api-access-shfm9\") pod \"9ba9e313-83db-4e08-a308-376d5fdf5820\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " Feb 20 08:49:34 crc kubenswrapper[5094]: I0220 08:49:34.912467 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9ba9e313-83db-4e08-a308-376d5fdf5820-ceph\") pod \"9ba9e313-83db-4e08-a308-376d5fdf5820\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " Feb 20 08:49:34 crc kubenswrapper[5094]: I0220 08:49:34.912552 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-combined-ca-bundle\") pod \"9ba9e313-83db-4e08-a308-376d5fdf5820\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " Feb 20 08:49:34 crc kubenswrapper[5094]: I0220 08:49:34.912743 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9ba9e313-83db-4e08-a308-376d5fdf5820-httpd-run\") pod \"9ba9e313-83db-4e08-a308-376d5fdf5820\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " Feb 20 08:49:34 crc kubenswrapper[5094]: I0220 08:49:34.912799 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-scripts\") pod \"9ba9e313-83db-4e08-a308-376d5fdf5820\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " Feb 20 08:49:34 crc kubenswrapper[5094]: I0220 08:49:34.912824 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-config-data\") pod \"9ba9e313-83db-4e08-a308-376d5fdf5820\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " Feb 20 08:49:34 crc kubenswrapper[5094]: I0220 08:49:34.912862 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ba9e313-83db-4e08-a308-376d5fdf5820-logs\") pod \"9ba9e313-83db-4e08-a308-376d5fdf5820\" (UID: \"9ba9e313-83db-4e08-a308-376d5fdf5820\") " Feb 20 08:49:34 crc kubenswrapper[5094]: I0220 08:49:34.913381 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ba9e313-83db-4e08-a308-376d5fdf5820-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9ba9e313-83db-4e08-a308-376d5fdf5820" (UID: "9ba9e313-83db-4e08-a308-376d5fdf5820"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:49:34 crc kubenswrapper[5094]: I0220 08:49:34.914229 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ba9e313-83db-4e08-a308-376d5fdf5820-logs" (OuterVolumeSpecName: "logs") pod "9ba9e313-83db-4e08-a308-376d5fdf5820" (UID: "9ba9e313-83db-4e08-a308-376d5fdf5820"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:49:34 crc kubenswrapper[5094]: I0220 08:49:34.919778 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-scripts" (OuterVolumeSpecName: "scripts") pod "9ba9e313-83db-4e08-a308-376d5fdf5820" (UID: "9ba9e313-83db-4e08-a308-376d5fdf5820"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:49:34 crc kubenswrapper[5094]: I0220 08:49:34.920314 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ba9e313-83db-4e08-a308-376d5fdf5820-kube-api-access-shfm9" (OuterVolumeSpecName: "kube-api-access-shfm9") pod "9ba9e313-83db-4e08-a308-376d5fdf5820" (UID: "9ba9e313-83db-4e08-a308-376d5fdf5820"). InnerVolumeSpecName "kube-api-access-shfm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:49:34 crc kubenswrapper[5094]: I0220 08:49:34.920550 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ba9e313-83db-4e08-a308-376d5fdf5820-ceph" (OuterVolumeSpecName: "ceph") pod "9ba9e313-83db-4e08-a308-376d5fdf5820" (UID: "9ba9e313-83db-4e08-a308-376d5fdf5820"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:49:34 crc kubenswrapper[5094]: I0220 08:49:34.947689 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 08:49:34 crc kubenswrapper[5094]: I0220 08:49:34.950469 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ba9e313-83db-4e08-a308-376d5fdf5820" (UID: "9ba9e313-83db-4e08-a308-376d5fdf5820"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.019489 5094 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9ba9e313-83db-4e08-a308-376d5fdf5820-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.019525 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.019535 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9ba9e313-83db-4e08-a308-376d5fdf5820-logs\") on node \"crc\" DevicePath \"\"" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.019546 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shfm9\" (UniqueName: \"kubernetes.io/projected/9ba9e313-83db-4e08-a308-376d5fdf5820-kube-api-access-shfm9\") on node \"crc\" DevicePath \"\"" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.019559 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/9ba9e313-83db-4e08-a308-376d5fdf5820-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.019568 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.034968 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-config-data" (OuterVolumeSpecName: "config-data") pod "9ba9e313-83db-4e08-a308-376d5fdf5820" (UID: "9ba9e313-83db-4e08-a308-376d5fdf5820"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.121296 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-config-data\") pod \"0babde66-7106-44f9-8108-dc7123e64645\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.121688 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0babde66-7106-44f9-8108-dc7123e64645-ceph\") pod \"0babde66-7106-44f9-8108-dc7123e64645\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.121809 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-combined-ca-bundle\") pod \"0babde66-7106-44f9-8108-dc7123e64645\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.121841 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0babde66-7106-44f9-8108-dc7123e64645-logs\") pod \"0babde66-7106-44f9-8108-dc7123e64645\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.122059 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-scripts\") pod \"0babde66-7106-44f9-8108-dc7123e64645\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.122129 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/0babde66-7106-44f9-8108-dc7123e64645-httpd-run\") pod \"0babde66-7106-44f9-8108-dc7123e64645\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.122213 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxvn\" (UniqueName: \"kubernetes.io/projected/0babde66-7106-44f9-8108-dc7123e64645-kube-api-access-pcxvn\") pod \"0babde66-7106-44f9-8108-dc7123e64645\" (UID: \"0babde66-7106-44f9-8108-dc7123e64645\") " Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.122840 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba9e313-83db-4e08-a308-376d5fdf5820-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.127084 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0babde66-7106-44f9-8108-dc7123e64645-kube-api-access-pcxvn" (OuterVolumeSpecName: "kube-api-access-pcxvn") pod "0babde66-7106-44f9-8108-dc7123e64645" (UID: "0babde66-7106-44f9-8108-dc7123e64645"). InnerVolumeSpecName "kube-api-access-pcxvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.127828 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0babde66-7106-44f9-8108-dc7123e64645-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0babde66-7106-44f9-8108-dc7123e64645" (UID: "0babde66-7106-44f9-8108-dc7123e64645"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.127938 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0babde66-7106-44f9-8108-dc7123e64645-logs" (OuterVolumeSpecName: "logs") pod "0babde66-7106-44f9-8108-dc7123e64645" (UID: "0babde66-7106-44f9-8108-dc7123e64645"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.129988 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0babde66-7106-44f9-8108-dc7123e64645-ceph" (OuterVolumeSpecName: "ceph") pod "0babde66-7106-44f9-8108-dc7123e64645" (UID: "0babde66-7106-44f9-8108-dc7123e64645"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.130288 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-scripts" (OuterVolumeSpecName: "scripts") pod "0babde66-7106-44f9-8108-dc7123e64645" (UID: "0babde66-7106-44f9-8108-dc7123e64645"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.153581 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0babde66-7106-44f9-8108-dc7123e64645" (UID: "0babde66-7106-44f9-8108-dc7123e64645"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.174724 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-config-data" (OuterVolumeSpecName: "config-data") pod "0babde66-7106-44f9-8108-dc7123e64645" (UID: "0babde66-7106-44f9-8108-dc7123e64645"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.224921 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.224966 5094 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0babde66-7106-44f9-8108-dc7123e64645-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.224981 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxvn\" (UniqueName: \"kubernetes.io/projected/0babde66-7106-44f9-8108-dc7123e64645-kube-api-access-pcxvn\") on node \"crc\" DevicePath \"\"" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.224993 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.225001 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0babde66-7106-44f9-8108-dc7123e64645-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.225009 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0babde66-7106-44f9-8108-dc7123e64645-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.225018 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0babde66-7106-44f9-8108-dc7123e64645-logs\") on node \"crc\" DevicePath \"\"" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.240446 5094 generic.go:334] "Generic (PLEG): container finished" podID="0babde66-7106-44f9-8108-dc7123e64645" containerID="d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54" exitCode=0 Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.240533 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.240534 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0babde66-7106-44f9-8108-dc7123e64645","Type":"ContainerDied","Data":"d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54"} Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.240598 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0babde66-7106-44f9-8108-dc7123e64645","Type":"ContainerDied","Data":"8d5bad0851a2161ffc5adb1e5293851f312f7324fa1146ff757f90b48ad072a5"} Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.240621 5094 scope.go:117] "RemoveContainer" containerID="d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.244579 5094 generic.go:334] "Generic (PLEG): container finished" podID="9ba9e313-83db-4e08-a308-376d5fdf5820" containerID="7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb" exitCode=0 Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.244603 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"9ba9e313-83db-4e08-a308-376d5fdf5820","Type":"ContainerDied","Data":"7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb"} Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.244627 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9ba9e313-83db-4e08-a308-376d5fdf5820","Type":"ContainerDied","Data":"b0e75d749acef441fc02419393d76ceab32d56e244c55548218d0246a2690c4a"} Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.244732 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.291051 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.304632 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.323345 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.343959 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 20 08:49:35 crc kubenswrapper[5094]: E0220 08:49:35.344412 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ba9e313-83db-4e08-a308-376d5fdf5820" containerName="glance-httpd" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.344427 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ba9e313-83db-4e08-a308-376d5fdf5820" containerName="glance-httpd" Feb 20 08:49:35 crc kubenswrapper[5094]: E0220 08:49:35.344436 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ba9e313-83db-4e08-a308-376d5fdf5820" containerName="glance-log" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 
08:49:35.344443 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ba9e313-83db-4e08-a308-376d5fdf5820" containerName="glance-log" Feb 20 08:49:35 crc kubenswrapper[5094]: E0220 08:49:35.344463 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0babde66-7106-44f9-8108-dc7123e64645" containerName="glance-httpd" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.344471 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0babde66-7106-44f9-8108-dc7123e64645" containerName="glance-httpd" Feb 20 08:49:35 crc kubenswrapper[5094]: E0220 08:49:35.344501 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0babde66-7106-44f9-8108-dc7123e64645" containerName="glance-log" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.344509 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0babde66-7106-44f9-8108-dc7123e64645" containerName="glance-log" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.344729 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ba9e313-83db-4e08-a308-376d5fdf5820" containerName="glance-log" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.344752 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="0babde66-7106-44f9-8108-dc7123e64645" containerName="glance-httpd" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.344760 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ba9e313-83db-4e08-a308-376d5fdf5820" containerName="glance-httpd" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.344769 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="0babde66-7106-44f9-8108-dc7123e64645" containerName="glance-log" Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.345869 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.347679 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.347861 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.347986 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-7xqwp"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.381279 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.390613 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.407813 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.410391 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.412527 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.418474 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.532437 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.532488 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-logs\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.532546 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-config-data\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.532625 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.532772 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-scripts\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.532846 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-ceph\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.532915 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.532991 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gsvl\" (UniqueName: \"kubernetes.io/projected/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-kube-api-access-4gsvl\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.533035 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-logs\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.533060 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.533174 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n645w\" (UniqueName: \"kubernetes.io/projected/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-kube-api-access-n645w\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.533206 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.533253 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.533278 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.634512 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-scripts\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.634557 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-ceph\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.634591 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.634638 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gsvl\" (UniqueName: \"kubernetes.io/projected/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-kube-api-access-4gsvl\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.634666 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-logs\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.634683 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.634725 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n645w\" (UniqueName: \"kubernetes.io/projected/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-kube-api-access-n645w\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.634742 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.634757 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.634778 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.634801 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-logs\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.634817 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.634841 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-config-data\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.634870 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.635278 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.636238 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-logs\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.636252 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.636502 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-logs\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.639219 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-ceph\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.645889 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.646040 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-scripts\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.646063 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.646078 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.646120 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-config-data\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.648683 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-ceph\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.650932 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.651213 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gsvl\" (UniqueName: \"kubernetes.io/projected/f67b4c32-25f3-4bc0-af69-ff9a9aa04404-kube-api-access-4gsvl\") pod \"glance-default-external-api-0\" (UID: \"f67b4c32-25f3-4bc0-af69-ff9a9aa04404\") " pod="openstack/glance-default-external-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.654360 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n645w\" (UniqueName: \"kubernetes.io/projected/e9c121ca-4074-4775-a8e5-0c7f8a00ce22-kube-api-access-n645w\") pod \"glance-default-internal-api-0\" (UID: \"e9c121ca-4074-4775-a8e5-0c7f8a00ce22\") " pod="openstack/glance-default-internal-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.713207 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.728488 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.854432 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0babde66-7106-44f9-8108-dc7123e64645" path="/var/lib/kubelet/pods/0babde66-7106-44f9-8108-dc7123e64645/volumes"
Feb 20 08:49:35 crc kubenswrapper[5094]: I0220 08:49:35.855563 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ba9e313-83db-4e08-a308-376d5fdf5820" path="/var/lib/kubelet/pods/9ba9e313-83db-4e08-a308-376d5fdf5820/volumes"
Feb 20 08:49:37 crc kubenswrapper[5094]: I0220 08:49:37.786091 5094 scope.go:117] "RemoveContainer" containerID="280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1"
Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.052765 5094 scope.go:117] "RemoveContainer" containerID="ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673"
Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.067371 5094 scope.go:117] "RemoveContainer" containerID="ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673"
Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.138277 5094 scope.go:117] "RemoveContainer" containerID="d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54"
Feb 20 08:49:40 crc kubenswrapper[5094]: E0220 08:49:40.138644 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54\": container with ID starting with d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54 not found: ID does not exist" containerID="d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54"
Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.138729 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54"} err="failed to get container status \"d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54\": rpc error: code = NotFound desc = could not find container \"d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54\": container with ID starting with d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54 not found: ID does not exist"
Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.138768 5094 scope.go:117] "RemoveContainer" containerID="ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673"
Feb 20 08:49:40 crc kubenswrapper[5094]: E0220 08:49:40.139105 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673\": container with ID starting with ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673 not found: ID does not exist" containerID="ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673"
Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.139133 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673"} err="failed to get container status \"ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673\": rpc error: code = NotFound desc = could not find container \"ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673\": container with ID starting with ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673 not found: ID does not exist"
Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.139151 5094 scope.go:117] "RemoveContainer" containerID="7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb"
Feb 20 08:49:40 crc kubenswrapper[5094]: E0220 08:49:40.149613 5094 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_glance-log_glance-default-external-api-0_openstack_0babde66-7106-44f9-8108-dc7123e64645_0 in pod sandbox 8d5bad0851a2161ffc5adb1e5293851f312f7324fa1146ff757f90b48ad072a5: identifier is not a container" containerID="ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673"
Feb 20 08:49:40 crc kubenswrapper[5094]: E0220 08:49:40.149669 5094 kuberuntime_gc.go:150] "Failed to remove container" err="rpc error: code = Unknown desc = failed to delete container k8s_glance-log_glance-default-external-api-0_openstack_0babde66-7106-44f9-8108-dc7123e64645_0 in pod sandbox 8d5bad0851a2161ffc5adb1e5293851f312f7324fa1146ff757f90b48ad072a5: identifier is not a container" containerID="ff37c929a2ebf8217ba92eb35f8313ba94223e649230e090e75079db4693a673"
Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.149693 5094 scope.go:117] "RemoveContainer" containerID="130895fd2e1c68f426f440f2c22e59759f2e0edcb44074b22c5181ef4b193c92"
Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.374233 5094 scope.go:117] "RemoveContainer" containerID="8731ce151a95e8d35ad8f8cec8f98989c881d304429dbe888614942996f9f454"
Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.395946 5094 scope.go:117] "RemoveContainer" containerID="280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1"
Feb 20 08:49:40 crc kubenswrapper[5094]: E0220 08:49:40.396466 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1\": container with ID starting with 280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1 not found: ID does not exist" containerID="280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1"
Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.396510 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1"} err="failed to get container status \"280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1\": rpc error: code = NotFound desc = could not find container \"280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1\": container with ID starting with 280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1 not found: ID does not exist"
Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.396544 5094 scope.go:117] "RemoveContainer" containerID="7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb"
Feb 20 08:49:40 crc kubenswrapper[5094]: E0220 08:49:40.397314 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb\": container with ID starting with 7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb not found: ID does not exist" containerID="7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb"
Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.397352 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb"} err="failed to get container status \"7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb\": rpc error: code = NotFound desc = could not find container \"7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb\": container with ID starting with 7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb not found: ID does not exist"
Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.397378 5094 scope.go:117] "RemoveContainer" containerID="280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1"
Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.397898 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1"} err="failed to get container status \"280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1\": rpc error: code = NotFound desc = could not find container \"280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1\": container with ID starting with 280e32f91521a973615fc7cec7c9c5fce98de6f601f159d43693f36acadf99e1 not found: ID does not exist"
Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.420171 5094 scope.go:117] "RemoveContainer" containerID="cd80f63c99e6776fe7e02e1fb80f9d75dc8541b9606921e566c011df6c0e65ac"
Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.442059 5094 scope.go:117] "RemoveContainer" containerID="7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb"
Feb 20 08:49:40 crc kubenswrapper[5094]: E0220 08:49:40.442547 5094 kuberuntime_gc.go:150] "Failed to remove container" err="failed to get container status \"7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb\": rpc error: code = NotFound desc = could not find container \"7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb\": container with ID starting with 7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb not found: ID does not exist" containerID="7fc0fc0549f99042c91beb9e79acba499e9574c9f1af96cce239ad2c03d51bbb"
Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.442601 5094 scope.go:117] "RemoveContainer" containerID="52db5b53565602a22b540482712ac73023427fa1b0c5c5dd0a43d58c9fbc73b5"
Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.475859 5094 scope.go:117] "RemoveContainer" containerID="d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54"
Feb 20 08:49:40 crc kubenswrapper[5094]: E0220 08:49:40.477046 5094 kuberuntime_gc.go:150] "Failed to remove container" err="failed to get container status \"d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54\": rpc error: code = NotFound desc = could not find container \"d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54\": container with ID starting with d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54 not found: ID does not exist" containerID="d79c5e8ec8398096b16b1678d54a619f6caa3a3b387c29b34f1626be7b236a54"
Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.691363 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 20 08:49:40 crc kubenswrapper[5094]: W0220 08:49:40.706532 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf67b4c32_25f3_4bc0_af69_ff9a9aa04404.slice/crio-a277f4d894779a45c93b1262a35dee5043817208ec1a19240c1c267bdba44161 WatchSource:0}: Error finding container a277f4d894779a45c93b1262a35dee5043817208ec1a19240c1c267bdba44161: Status 404 returned error can't find the container with id a277f4d894779a45c93b1262a35dee5043817208ec1a19240c1c267bdba44161
Feb 20 08:49:40 crc kubenswrapper[5094]: I0220 08:49:40.793508 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 20 08:49:40 crc kubenswrapper[5094]: W0220 08:49:40.797771 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9c121ca_4074_4775_a8e5_0c7f8a00ce22.slice/crio-10b7ab67f854b392550485e5d8cc257cd94034258f7c80ddceb26f28cb3de6ae WatchSource:0}: Error finding container 10b7ab67f854b392550485e5d8cc257cd94034258f7c80ddceb26f28cb3de6ae: Status 404 returned error can't find the container with id 10b7ab67f854b392550485e5d8cc257cd94034258f7c80ddceb26f28cb3de6ae
Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.347733 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e9c121ca-4074-4775-a8e5-0c7f8a00ce22","Type":"ContainerStarted","Data":"10b7ab67f854b392550485e5d8cc257cd94034258f7c80ddceb26f28cb3de6ae"}
Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.349430 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f67b4c32-25f3-4bc0-af69-ff9a9aa04404","Type":"ContainerStarted","Data":"a277f4d894779a45c93b1262a35dee5043817208ec1a19240c1c267bdba44161"}
Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.351466 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d54b4d569-kqd4s" event={"ID":"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c","Type":"ContainerStarted","Data":"383a27955fb0f86aaf1453d917dac5350556fafc41a927eb7c86a2bb5a520c4d"}
Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.351533 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d54b4d569-kqd4s" event={"ID":"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c","Type":"ContainerStarted","Data":"1d9312f8455277c3cb35f55a1583969cb70ad9d4c52d34eaec9448990ee953f6"}
Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.359435 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54879cd99c-v7mr5" event={"ID":"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad","Type":"ContainerStarted","Data":"a253a0cdc59df09716156eda440d5655d328fc7850d293b9093cf8b148e34b46"}
Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.359486 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54879cd99c-v7mr5" event={"ID":"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad","Type":"ContainerStarted","Data":"4c1355ff4c58f3d8da61df01ab4f55870452a5c2ce565bb148c221041dd245f6"}
Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.359622 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-54879cd99c-v7mr5" podUID="a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" containerName="horizon-log" containerID="cri-o://4c1355ff4c58f3d8da61df01ab4f55870452a5c2ce565bb148c221041dd245f6" gracePeriod=30
Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.359906 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-54879cd99c-v7mr5" podUID="a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" containerName="horizon" containerID="cri-o://a253a0cdc59df09716156eda440d5655d328fc7850d293b9093cf8b148e34b46" gracePeriod=30
Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.373932 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cd88dbb9c-w5xqj" event={"ID":"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce","Type":"ContainerStarted","Data":"f33835eabda10dc85467686f8b097753120431960d21d398e7ef1b424eade45e"}
Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.373989 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cd88dbb9c-w5xqj" event={"ID":"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce","Type":"ContainerStarted","Data":"b2a6d67a4ec9b9a83a00be4b0eec89aed7fd40eb1bfd2448407868b42be4653c"}
Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.387009 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6d54b4d569-kqd4s" podStartSLOduration=3.006720379 podStartE2EDuration="10.386974937s" podCreationTimestamp="2026-02-20 08:49:31 +0000 UTC" firstStartedPulling="2026-02-20 08:49:32.802230749 +0000 UTC m=+7387.674857460" lastFinishedPulling="2026-02-20 08:49:40.182485307 +0000 UTC m=+7395.055112018" observedRunningTime="2026-02-20 08:49:41.378568844 +0000 UTC m=+7396.251195555" watchObservedRunningTime="2026-02-20 08:49:41.386974937 +0000 UTC m=+7396.259601648"
Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.415777 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-54879cd99c-v7mr5" podStartSLOduration=2.12496127 podStartE2EDuration="10.415755009s" podCreationTimestamp="2026-02-20 08:49:31 +0000 UTC" firstStartedPulling="2026-02-20 08:49:31.932887869 +0000 UTC m=+7386.805514580" lastFinishedPulling="2026-02-20 08:49:40.223681608 +0000 UTC m=+7395.096308319" observedRunningTime="2026-02-20 08:49:41.395633786 +0000 UTC m=+7396.268260497" watchObservedRunningTime="2026-02-20 08:49:41.415755009 +0000 UTC m=+7396.288381720"
Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.420557 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-54879cd99c-v7mr5"
Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.426108 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7cd88dbb9c-w5xqj" podStartSLOduration=2.221772728 podStartE2EDuration="10.426086177s" podCreationTimestamp="2026-02-20 08:49:31 +0000 UTC" firstStartedPulling="2026-02-20 08:49:31.974752856 +0000 UTC m=+7386.847379567" lastFinishedPulling="2026-02-20 08:49:40.179066305 +0000 UTC m=+7395.051693016" observedRunningTime="2026-02-20 08:49:41.418657139 +0000 UTC m=+7396.291283850" watchObservedRunningTime="2026-02-20 08:49:41.426086177 +0000 UTC m=+7396.298712888"
Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.611515 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7cd88dbb9c-w5xqj"
Feb 20 08:49:41 crc kubenswrapper[5094]: I0220 08:49:41.611606 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7cd88dbb9c-w5xqj"
Feb 20 08:49:42 crc kubenswrapper[5094]: I0220 08:49:42.206795 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6d54b4d569-kqd4s"
Feb 20 08:49:42 crc kubenswrapper[5094]: I0220 08:49:42.208098 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6d54b4d569-kqd4s"
Feb 20 08:49:42 crc kubenswrapper[5094]: I0220 08:49:42.404018 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e9c121ca-4074-4775-a8e5-0c7f8a00ce22","Type":"ContainerStarted","Data":"ebfce51c6e65003d83ecac4575539cbe89ef33ed9ef179649160e40ffb0d9424"}
Feb 20 08:49:42 crc kubenswrapper[5094]: I0220 08:49:42.404075 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e9c121ca-4074-4775-a8e5-0c7f8a00ce22","Type":"ContainerStarted","Data":"851ae34f377a0b4f477b2e669be692d9a489046997a716b0cd48cdf5ebcf1c96"}
Feb 20 08:49:42 crc kubenswrapper[5094]: I0220 08:49:42.413393 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f67b4c32-25f3-4bc0-af69-ff9a9aa04404","Type":"ContainerStarted","Data":"5e925d2c3ff306cc61595db457c21ce5ed0a6c83a5364ed398652c314c9004df"}
Feb 20 08:49:42 crc kubenswrapper[5094]: I0220 08:49:42.413438 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f67b4c32-25f3-4bc0-af69-ff9a9aa04404","Type":"ContainerStarted","Data":"b7058d3a04484a92e5290bb3b23f6ae51eb873831f097864bdd8b116c4fd9b7a"}
Feb 20 08:49:42 crc kubenswrapper[5094]: I0220 08:49:42.436857 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.436834608 podStartE2EDuration="7.436834608s" podCreationTimestamp="2026-02-20 08:49:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:49:42.428384945 +0000 UTC m=+7397.301011656" watchObservedRunningTime="2026-02-20 08:49:42.436834608 +0000 UTC m=+7397.309461329"
Feb 20 08:49:42 crc kubenswrapper[5094]: I0220 08:49:42.455986 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.455969889 podStartE2EDuration="7.455969889s" podCreationTimestamp="2026-02-20 08:49:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:49:42.45310212 +0000 UTC m=+7397.325728831" watchObservedRunningTime="2026-02-20 08:49:42.455969889 +0000 UTC m=+7397.328596600"
Feb 20 08:49:45 crc kubenswrapper[5094]: I0220 08:49:45.715024 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 20 08:49:45 crc kubenswrapper[5094]: I0220 08:49:45.717195 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 20 08:49:45 crc kubenswrapper[5094]: I0220 08:49:45.729336 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 20 08:49:45 crc kubenswrapper[5094]: I0220 08:49:45.729414 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Feb 20 08:49:45 crc kubenswrapper[5094]: I0220 08:49:45.757315 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 20 08:49:45 crc kubenswrapper[5094]: I0220 08:49:45.771830 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 20 08:49:45 crc kubenswrapper[5094]: I0220 08:49:45.779514 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 20 08:49:45 crc kubenswrapper[5094]: I0220 08:49:45.789246 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Feb 20 08:49:46 crc kubenswrapper[5094]: I0220 08:49:46.456134 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 20 08:49:46 crc kubenswrapper[5094]: I0220 08:49:46.456212 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 20 08:49:46 crc kubenswrapper[5094]: I0220 08:49:46.456252 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status=""
pod="openstack/glance-default-internal-api-0" Feb 20 08:49:46 crc kubenswrapper[5094]: I0220 08:49:46.456263 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 20 08:49:48 crc kubenswrapper[5094]: I0220 08:49:48.489631 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 20 08:49:48 crc kubenswrapper[5094]: I0220 08:49:48.522307 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 20 08:49:48 crc kubenswrapper[5094]: I0220 08:49:48.528204 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 20 08:49:49 crc kubenswrapper[5094]: I0220 08:49:49.603630 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 20 08:49:51 crc kubenswrapper[5094]: I0220 08:49:51.613177 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7cd88dbb9c-w5xqj" podUID="aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.96:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.96:8080: connect: connection refused" Feb 20 08:49:52 crc kubenswrapper[5094]: I0220 08:49:52.208975 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6d54b4d569-kqd4s" podUID="8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.97:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.97:8080: connect: connection refused" Feb 20 08:50:03 crc kubenswrapper[5094]: I0220 08:50:03.507020 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:50:03 crc kubenswrapper[5094]: I0220 08:50:03.931674 5094 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:50:05 crc kubenswrapper[5094]: I0220 08:50:05.264881 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:50:05 crc kubenswrapper[5094]: I0220 08:50:05.633158 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:50:05 crc kubenswrapper[5094]: I0220 08:50:05.694739 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7cd88dbb9c-w5xqj"] Feb 20 08:50:05 crc kubenswrapper[5094]: I0220 08:50:05.695297 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7cd88dbb9c-w5xqj" podUID="aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" containerName="horizon-log" containerID="cri-o://b2a6d67a4ec9b9a83a00be4b0eec89aed7fd40eb1bfd2448407868b42be4653c" gracePeriod=30 Feb 20 08:50:05 crc kubenswrapper[5094]: I0220 08:50:05.695424 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7cd88dbb9c-w5xqj" podUID="aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" containerName="horizon" containerID="cri-o://f33835eabda10dc85467686f8b097753120431960d21d398e7ef1b424eade45e" gracePeriod=30 Feb 20 08:50:06 crc kubenswrapper[5094]: I0220 08:50:06.451261 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-csb4t"] Feb 20 08:50:06 crc kubenswrapper[5094]: I0220 08:50:06.453164 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:06 crc kubenswrapper[5094]: I0220 08:50:06.471225 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-csb4t"] Feb 20 08:50:06 crc kubenswrapper[5094]: I0220 08:50:06.576573 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-utilities\") pod \"redhat-operators-csb4t\" (UID: \"488d08d3-57d4-47fe-a49a-65c71e0e0c6e\") " pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:06 crc kubenswrapper[5094]: I0220 08:50:06.576690 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-catalog-content\") pod \"redhat-operators-csb4t\" (UID: \"488d08d3-57d4-47fe-a49a-65c71e0e0c6e\") " pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:06 crc kubenswrapper[5094]: I0220 08:50:06.576786 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqz4g\" (UniqueName: \"kubernetes.io/projected/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-kube-api-access-zqz4g\") pod \"redhat-operators-csb4t\" (UID: \"488d08d3-57d4-47fe-a49a-65c71e0e0c6e\") " pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:06 crc kubenswrapper[5094]: I0220 08:50:06.678452 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-utilities\") pod \"redhat-operators-csb4t\" (UID: \"488d08d3-57d4-47fe-a49a-65c71e0e0c6e\") " pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:06 crc kubenswrapper[5094]: I0220 08:50:06.678525 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-catalog-content\") pod \"redhat-operators-csb4t\" (UID: \"488d08d3-57d4-47fe-a49a-65c71e0e0c6e\") " pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:06 crc kubenswrapper[5094]: I0220 08:50:06.678574 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqz4g\" (UniqueName: \"kubernetes.io/projected/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-kube-api-access-zqz4g\") pod \"redhat-operators-csb4t\" (UID: \"488d08d3-57d4-47fe-a49a-65c71e0e0c6e\") " pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:06 crc kubenswrapper[5094]: I0220 08:50:06.679109 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-catalog-content\") pod \"redhat-operators-csb4t\" (UID: \"488d08d3-57d4-47fe-a49a-65c71e0e0c6e\") " pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:06 crc kubenswrapper[5094]: I0220 08:50:06.679221 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-utilities\") pod \"redhat-operators-csb4t\" (UID: \"488d08d3-57d4-47fe-a49a-65c71e0e0c6e\") " pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:06 crc kubenswrapper[5094]: I0220 08:50:06.698649 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqz4g\" (UniqueName: \"kubernetes.io/projected/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-kube-api-access-zqz4g\") pod \"redhat-operators-csb4t\" (UID: \"488d08d3-57d4-47fe-a49a-65c71e0e0c6e\") " pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:06 crc kubenswrapper[5094]: I0220 08:50:06.788478 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:07 crc kubenswrapper[5094]: I0220 08:50:07.269649 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-csb4t"] Feb 20 08:50:07 crc kubenswrapper[5094]: I0220 08:50:07.672484 5094 generic.go:334] "Generic (PLEG): container finished" podID="488d08d3-57d4-47fe-a49a-65c71e0e0c6e" containerID="543cf7ed92ceb54ac0ce3676ff68db26877706c2eb49f4a878eeda170e2830ad" exitCode=0 Feb 20 08:50:07 crc kubenswrapper[5094]: I0220 08:50:07.672533 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csb4t" event={"ID":"488d08d3-57d4-47fe-a49a-65c71e0e0c6e","Type":"ContainerDied","Data":"543cf7ed92ceb54ac0ce3676ff68db26877706c2eb49f4a878eeda170e2830ad"} Feb 20 08:50:07 crc kubenswrapper[5094]: I0220 08:50:07.672558 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csb4t" event={"ID":"488d08d3-57d4-47fe-a49a-65c71e0e0c6e","Type":"ContainerStarted","Data":"71cc0a54d9dc6f0261bfd2197724a6801fb7455d85a516d26f0e7df693b59ef1"} Feb 20 08:50:09 crc kubenswrapper[5094]: I0220 08:50:09.697881 5094 generic.go:334] "Generic (PLEG): container finished" podID="488d08d3-57d4-47fe-a49a-65c71e0e0c6e" containerID="503bbbb1d2206ab997b04b7307eae096f0b1864d14ed076cb276c3526d85055c" exitCode=0 Feb 20 08:50:09 crc kubenswrapper[5094]: I0220 08:50:09.697974 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csb4t" event={"ID":"488d08d3-57d4-47fe-a49a-65c71e0e0c6e","Type":"ContainerDied","Data":"503bbbb1d2206ab997b04b7307eae096f0b1864d14ed076cb276c3526d85055c"} Feb 20 08:50:09 crc kubenswrapper[5094]: I0220 08:50:09.702783 5094 generic.go:334] "Generic (PLEG): container finished" podID="aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" containerID="f33835eabda10dc85467686f8b097753120431960d21d398e7ef1b424eade45e" exitCode=0 Feb 20 08:50:09 crc 
kubenswrapper[5094]: I0220 08:50:09.702826 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cd88dbb9c-w5xqj" event={"ID":"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce","Type":"ContainerDied","Data":"f33835eabda10dc85467686f8b097753120431960d21d398e7ef1b424eade45e"} Feb 20 08:50:10 crc kubenswrapper[5094]: I0220 08:50:10.715287 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csb4t" event={"ID":"488d08d3-57d4-47fe-a49a-65c71e0e0c6e","Type":"ContainerStarted","Data":"00b00a71decc0a7270a2637de7cfb4d1cda3db7c0bd357bae2508a76daddba23"} Feb 20 08:50:10 crc kubenswrapper[5094]: I0220 08:50:10.738427 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-csb4t" podStartSLOduration=2.198908806 podStartE2EDuration="4.738408066s" podCreationTimestamp="2026-02-20 08:50:06 +0000 UTC" firstStartedPulling="2026-02-20 08:50:07.675553208 +0000 UTC m=+7422.548179919" lastFinishedPulling="2026-02-20 08:50:10.215052458 +0000 UTC m=+7425.087679179" observedRunningTime="2026-02-20 08:50:10.733824405 +0000 UTC m=+7425.606451126" watchObservedRunningTime="2026-02-20 08:50:10.738408066 +0000 UTC m=+7425.611034777" Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.612463 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7cd88dbb9c-w5xqj" podUID="aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.96:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.96:8080: connect: connection refused" Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.725489 5094 generic.go:334] "Generic (PLEG): container finished" podID="a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" containerID="a253a0cdc59df09716156eda440d5655d328fc7850d293b9093cf8b148e34b46" exitCode=137 Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.725526 5094 generic.go:334] "Generic (PLEG): container 
finished" podID="a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" containerID="4c1355ff4c58f3d8da61df01ab4f55870452a5c2ce565bb148c221041dd245f6" exitCode=137 Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.725522 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54879cd99c-v7mr5" event={"ID":"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad","Type":"ContainerDied","Data":"a253a0cdc59df09716156eda440d5655d328fc7850d293b9093cf8b148e34b46"} Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.725566 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54879cd99c-v7mr5" event={"ID":"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad","Type":"ContainerDied","Data":"4c1355ff4c58f3d8da61df01ab4f55870452a5c2ce565bb148c221041dd245f6"} Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.725594 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54879cd99c-v7mr5" event={"ID":"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad","Type":"ContainerDied","Data":"eb0248c1e16d27da1be9e878bc7100201452eaa9c36214b5074f678269d497a4"} Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.725606 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb0248c1e16d27da1be9e878bc7100201452eaa9c36214b5074f678269d497a4" Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.814403 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.984297 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-config-data\") pod \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.984426 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkx4x\" (UniqueName: \"kubernetes.io/projected/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-kube-api-access-kkx4x\") pod \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.984457 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-scripts\") pod \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.984544 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-horizon-secret-key\") pod \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.984619 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-logs\") pod \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\" (UID: \"a42f98d6-6f5b-40c1-a6af-fc44035ed4ad\") " Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.985164 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-logs" (OuterVolumeSpecName: "logs") pod "a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" (UID: "a42f98d6-6f5b-40c1-a6af-fc44035ed4ad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.992176 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-kube-api-access-kkx4x" (OuterVolumeSpecName: "kube-api-access-kkx4x") pod "a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" (UID: "a42f98d6-6f5b-40c1-a6af-fc44035ed4ad"). InnerVolumeSpecName "kube-api-access-kkx4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:50:11 crc kubenswrapper[5094]: I0220 08:50:11.993937 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" (UID: "a42f98d6-6f5b-40c1-a6af-fc44035ed4ad"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:50:12 crc kubenswrapper[5094]: I0220 08:50:12.007070 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-config-data" (OuterVolumeSpecName: "config-data") pod "a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" (UID: "a42f98d6-6f5b-40c1-a6af-fc44035ed4ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:50:12 crc kubenswrapper[5094]: I0220 08:50:12.022602 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-scripts" (OuterVolumeSpecName: "scripts") pod "a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" (UID: "a42f98d6-6f5b-40c1-a6af-fc44035ed4ad"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:50:12 crc kubenswrapper[5094]: I0220 08:50:12.086401 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:50:12 crc kubenswrapper[5094]: I0220 08:50:12.086444 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkx4x\" (UniqueName: \"kubernetes.io/projected/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-kube-api-access-kkx4x\") on node \"crc\" DevicePath \"\"" Feb 20 08:50:12 crc kubenswrapper[5094]: I0220 08:50:12.086459 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:50:12 crc kubenswrapper[5094]: I0220 08:50:12.086470 5094 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 20 08:50:12 crc kubenswrapper[5094]: I0220 08:50:12.086481 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad-logs\") on node \"crc\" DevicePath \"\"" Feb 20 08:50:12 crc kubenswrapper[5094]: I0220 08:50:12.732315 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-54879cd99c-v7mr5" Feb 20 08:50:12 crc kubenswrapper[5094]: I0220 08:50:12.769609 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54879cd99c-v7mr5"] Feb 20 08:50:12 crc kubenswrapper[5094]: I0220 08:50:12.783810 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-54879cd99c-v7mr5"] Feb 20 08:50:13 crc kubenswrapper[5094]: I0220 08:50:13.849208 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" path="/var/lib/kubelet/pods/a42f98d6-6f5b-40c1-a6af-fc44035ed4ad/volumes" Feb 20 08:50:16 crc kubenswrapper[5094]: I0220 08:50:16.790747 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:16 crc kubenswrapper[5094]: I0220 08:50:16.791091 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:16 crc kubenswrapper[5094]: I0220 08:50:16.860309 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:17 crc kubenswrapper[5094]: I0220 08:50:17.810199 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:17 crc kubenswrapper[5094]: I0220 08:50:17.863678 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-csb4t"] Feb 20 08:50:19 crc kubenswrapper[5094]: I0220 08:50:19.793461 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-csb4t" podUID="488d08d3-57d4-47fe-a49a-65c71e0e0c6e" containerName="registry-server" containerID="cri-o://00b00a71decc0a7270a2637de7cfb4d1cda3db7c0bd357bae2508a76daddba23" gracePeriod=2 Feb 20 08:50:20 crc kubenswrapper[5094]: I0220 08:50:20.807049 5094 
generic.go:334] "Generic (PLEG): container finished" podID="488d08d3-57d4-47fe-a49a-65c71e0e0c6e" containerID="00b00a71decc0a7270a2637de7cfb4d1cda3db7c0bd357bae2508a76daddba23" exitCode=0 Feb 20 08:50:20 crc kubenswrapper[5094]: I0220 08:50:20.807105 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csb4t" event={"ID":"488d08d3-57d4-47fe-a49a-65c71e0e0c6e","Type":"ContainerDied","Data":"00b00a71decc0a7270a2637de7cfb4d1cda3db7c0bd357bae2508a76daddba23"} Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.395071 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.471787 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-utilities\") pod \"488d08d3-57d4-47fe-a49a-65c71e0e0c6e\" (UID: \"488d08d3-57d4-47fe-a49a-65c71e0e0c6e\") " Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.471957 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-catalog-content\") pod \"488d08d3-57d4-47fe-a49a-65c71e0e0c6e\" (UID: \"488d08d3-57d4-47fe-a49a-65c71e0e0c6e\") " Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.472110 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqz4g\" (UniqueName: \"kubernetes.io/projected/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-kube-api-access-zqz4g\") pod \"488d08d3-57d4-47fe-a49a-65c71e0e0c6e\" (UID: \"488d08d3-57d4-47fe-a49a-65c71e0e0c6e\") " Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.472749 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-utilities" 
(OuterVolumeSpecName: "utilities") pod "488d08d3-57d4-47fe-a49a-65c71e0e0c6e" (UID: "488d08d3-57d4-47fe-a49a-65c71e0e0c6e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.477653 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-kube-api-access-zqz4g" (OuterVolumeSpecName: "kube-api-access-zqz4g") pod "488d08d3-57d4-47fe-a49a-65c71e0e0c6e" (UID: "488d08d3-57d4-47fe-a49a-65c71e0e0c6e"). InnerVolumeSpecName "kube-api-access-zqz4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.574378 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqz4g\" (UniqueName: \"kubernetes.io/projected/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-kube-api-access-zqz4g\") on node \"crc\" DevicePath \"\"" Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.574408 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.612419 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7cd88dbb9c-w5xqj" podUID="aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.96:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.96:8080: connect: connection refused" Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.616493 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "488d08d3-57d4-47fe-a49a-65c71e0e0c6e" (UID: "488d08d3-57d4-47fe-a49a-65c71e0e0c6e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.676744 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/488d08d3-57d4-47fe-a49a-65c71e0e0c6e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.821226 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-csb4t" event={"ID":"488d08d3-57d4-47fe-a49a-65c71e0e0c6e","Type":"ContainerDied","Data":"71cc0a54d9dc6f0261bfd2197724a6801fb7455d85a516d26f0e7df693b59ef1"} Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.821291 5094 scope.go:117] "RemoveContainer" containerID="00b00a71decc0a7270a2637de7cfb4d1cda3db7c0bd357bae2508a76daddba23" Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.821374 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-csb4t" Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.870516 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-csb4t"] Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.873118 5094 scope.go:117] "RemoveContainer" containerID="503bbbb1d2206ab997b04b7307eae096f0b1864d14ed076cb276c3526d85055c" Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.880806 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-csb4t"] Feb 20 08:50:21 crc kubenswrapper[5094]: I0220 08:50:21.897312 5094 scope.go:117] "RemoveContainer" containerID="543cf7ed92ceb54ac0ce3676ff68db26877706c2eb49f4a878eeda170e2830ad" Feb 20 08:50:23 crc kubenswrapper[5094]: I0220 08:50:23.850111 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="488d08d3-57d4-47fe-a49a-65c71e0e0c6e" path="/var/lib/kubelet/pods/488d08d3-57d4-47fe-a49a-65c71e0e0c6e/volumes" Feb 20 08:50:31 crc 
kubenswrapper[5094]: I0220 08:50:31.612255 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7cd88dbb9c-w5xqj" podUID="aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.96:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.96:8080: connect: connection refused" Feb 20 08:50:31 crc kubenswrapper[5094]: I0220 08:50:31.612936 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:50:35 crc kubenswrapper[5094]: E0220 08:50:35.939750 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa4baf13_0870_4bf6_9a0b_d4fd1fb598ce.slice/crio-conmon-b2a6d67a4ec9b9a83a00be4b0eec89aed7fd40eb1bfd2448407868b42be4653c.scope\": RecentStats: unable to find data in memory cache]" Feb 20 08:50:35 crc kubenswrapper[5094]: I0220 08:50:35.979303 5094 generic.go:334] "Generic (PLEG): container finished" podID="aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" containerID="b2a6d67a4ec9b9a83a00be4b0eec89aed7fd40eb1bfd2448407868b42be4653c" exitCode=137 Feb 20 08:50:35 crc kubenswrapper[5094]: I0220 08:50:35.979348 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cd88dbb9c-w5xqj" event={"ID":"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce","Type":"ContainerDied","Data":"b2a6d67a4ec9b9a83a00be4b0eec89aed7fd40eb1bfd2448407868b42be4653c"} Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.083119 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.191280 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-horizon-secret-key\") pod \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.192012 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9cjw\" (UniqueName: \"kubernetes.io/projected/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-kube-api-access-x9cjw\") pod \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.192475 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-scripts\") pod \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.192761 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-logs\") pod \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.193170 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-config-data\") pod \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\" (UID: \"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce\") " Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.193376 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-logs" (OuterVolumeSpecName: "logs") pod "aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" (UID: "aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.194134 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-logs\") on node \"crc\" DevicePath \"\"" Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.197096 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-kube-api-access-x9cjw" (OuterVolumeSpecName: "kube-api-access-x9cjw") pod "aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" (UID: "aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce"). InnerVolumeSpecName "kube-api-access-x9cjw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.197184 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" (UID: "aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.214978 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-scripts" (OuterVolumeSpecName: "scripts") pod "aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" (UID: "aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.215404 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-config-data" (OuterVolumeSpecName: "config-data") pod "aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" (UID: "aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.295787 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.296001 5094 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.296015 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9cjw\" (UniqueName: \"kubernetes.io/projected/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-kube-api-access-x9cjw\") on node \"crc\" DevicePath \"\"" Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.296027 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.995947 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7cd88dbb9c-w5xqj" event={"ID":"aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce","Type":"ContainerDied","Data":"cdb3de8f0c49a984e54c19f650258f82b6b14df1b84ec278f6a93e74f5a453ba"} Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.996041 5094 scope.go:117] "RemoveContainer" 
containerID="f33835eabda10dc85467686f8b097753120431960d21d398e7ef1b424eade45e" Feb 20 08:50:36 crc kubenswrapper[5094]: I0220 08:50:36.996248 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7cd88dbb9c-w5xqj" Feb 20 08:50:37 crc kubenswrapper[5094]: I0220 08:50:37.062146 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7cd88dbb9c-w5xqj"] Feb 20 08:50:37 crc kubenswrapper[5094]: I0220 08:50:37.070495 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7cd88dbb9c-w5xqj"] Feb 20 08:50:37 crc kubenswrapper[5094]: I0220 08:50:37.218859 5094 scope.go:117] "RemoveContainer" containerID="b2a6d67a4ec9b9a83a00be4b0eec89aed7fd40eb1bfd2448407868b42be4653c" Feb 20 08:50:37 crc kubenswrapper[5094]: I0220 08:50:37.856954 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" path="/var/lib/kubelet/pods/aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce/volumes" Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.856033 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-85f686b8b5-kz5d4"] Feb 20 08:50:48 crc kubenswrapper[5094]: E0220 08:50:48.857279 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="488d08d3-57d4-47fe-a49a-65c71e0e0c6e" containerName="registry-server" Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.857299 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="488d08d3-57d4-47fe-a49a-65c71e0e0c6e" containerName="registry-server" Feb 20 08:50:48 crc kubenswrapper[5094]: E0220 08:50:48.857321 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" containerName="horizon" Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.857329 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" containerName="horizon" Feb 20 08:50:48 crc kubenswrapper[5094]: E0220 
08:50:48.857342 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" containerName="horizon-log" Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.857351 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" containerName="horizon-log" Feb 20 08:50:48 crc kubenswrapper[5094]: E0220 08:50:48.857367 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" containerName="horizon" Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.857375 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" containerName="horizon" Feb 20 08:50:48 crc kubenswrapper[5094]: E0220 08:50:48.857393 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" containerName="horizon-log" Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.857400 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" containerName="horizon-log" Feb 20 08:50:48 crc kubenswrapper[5094]: E0220 08:50:48.857429 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="488d08d3-57d4-47fe-a49a-65c71e0e0c6e" containerName="extract-content" Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.857436 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="488d08d3-57d4-47fe-a49a-65c71e0e0c6e" containerName="extract-content" Feb 20 08:50:48 crc kubenswrapper[5094]: E0220 08:50:48.857459 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="488d08d3-57d4-47fe-a49a-65c71e0e0c6e" containerName="extract-utilities" Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.857467 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="488d08d3-57d4-47fe-a49a-65c71e0e0c6e" containerName="extract-utilities" Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.857694 5094 
memory_manager.go:354] "RemoveStaleState removing state" podUID="aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" containerName="horizon-log" Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.857732 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" containerName="horizon-log" Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.857747 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa4baf13-0870-4bf6-9a0b-d4fd1fb598ce" containerName="horizon" Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.857771 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a42f98d6-6f5b-40c1-a6af-fc44035ed4ad" containerName="horizon" Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.857785 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="488d08d3-57d4-47fe-a49a-65c71e0e0c6e" containerName="registry-server" Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.858957 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-85f686b8b5-kz5d4" Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.878627 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-85f686b8b5-kz5d4"] Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.967837 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd051d85-41b3-420b-9999-5c9dee9aafe3-logs\") pod \"horizon-85f686b8b5-kz5d4\" (UID: \"dd051d85-41b3-420b-9999-5c9dee9aafe3\") " pod="openstack/horizon-85f686b8b5-kz5d4" Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.967886 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd051d85-41b3-420b-9999-5c9dee9aafe3-horizon-secret-key\") pod \"horizon-85f686b8b5-kz5d4\" (UID: \"dd051d85-41b3-420b-9999-5c9dee9aafe3\") " pod="openstack/horizon-85f686b8b5-kz5d4" Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.967907 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd051d85-41b3-420b-9999-5c9dee9aafe3-config-data\") pod \"horizon-85f686b8b5-kz5d4\" (UID: \"dd051d85-41b3-420b-9999-5c9dee9aafe3\") " pod="openstack/horizon-85f686b8b5-kz5d4" Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.967955 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd051d85-41b3-420b-9999-5c9dee9aafe3-scripts\") pod \"horizon-85f686b8b5-kz5d4\" (UID: \"dd051d85-41b3-420b-9999-5c9dee9aafe3\") " pod="openstack/horizon-85f686b8b5-kz5d4" Feb 20 08:50:48 crc kubenswrapper[5094]: I0220 08:50:48.967980 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxpfb\" (UniqueName: 
\"kubernetes.io/projected/dd051d85-41b3-420b-9999-5c9dee9aafe3-kube-api-access-hxpfb\") pod \"horizon-85f686b8b5-kz5d4\" (UID: \"dd051d85-41b3-420b-9999-5c9dee9aafe3\") " pod="openstack/horizon-85f686b8b5-kz5d4" Feb 20 08:50:49 crc kubenswrapper[5094]: I0220 08:50:49.070767 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd051d85-41b3-420b-9999-5c9dee9aafe3-logs\") pod \"horizon-85f686b8b5-kz5d4\" (UID: \"dd051d85-41b3-420b-9999-5c9dee9aafe3\") " pod="openstack/horizon-85f686b8b5-kz5d4" Feb 20 08:50:49 crc kubenswrapper[5094]: I0220 08:50:49.070855 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd051d85-41b3-420b-9999-5c9dee9aafe3-horizon-secret-key\") pod \"horizon-85f686b8b5-kz5d4\" (UID: \"dd051d85-41b3-420b-9999-5c9dee9aafe3\") " pod="openstack/horizon-85f686b8b5-kz5d4" Feb 20 08:50:49 crc kubenswrapper[5094]: I0220 08:50:49.070885 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd051d85-41b3-420b-9999-5c9dee9aafe3-config-data\") pod \"horizon-85f686b8b5-kz5d4\" (UID: \"dd051d85-41b3-420b-9999-5c9dee9aafe3\") " pod="openstack/horizon-85f686b8b5-kz5d4" Feb 20 08:50:49 crc kubenswrapper[5094]: I0220 08:50:49.071086 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd051d85-41b3-420b-9999-5c9dee9aafe3-scripts\") pod \"horizon-85f686b8b5-kz5d4\" (UID: \"dd051d85-41b3-420b-9999-5c9dee9aafe3\") " pod="openstack/horizon-85f686b8b5-kz5d4" Feb 20 08:50:49 crc kubenswrapper[5094]: I0220 08:50:49.071747 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd051d85-41b3-420b-9999-5c9dee9aafe3-scripts\") pod \"horizon-85f686b8b5-kz5d4\" (UID: 
\"dd051d85-41b3-420b-9999-5c9dee9aafe3\") " pod="openstack/horizon-85f686b8b5-kz5d4" Feb 20 08:50:49 crc kubenswrapper[5094]: I0220 08:50:49.071922 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxpfb\" (UniqueName: \"kubernetes.io/projected/dd051d85-41b3-420b-9999-5c9dee9aafe3-kube-api-access-hxpfb\") pod \"horizon-85f686b8b5-kz5d4\" (UID: \"dd051d85-41b3-420b-9999-5c9dee9aafe3\") " pod="openstack/horizon-85f686b8b5-kz5d4" Feb 20 08:50:49 crc kubenswrapper[5094]: I0220 08:50:49.072431 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd051d85-41b3-420b-9999-5c9dee9aafe3-config-data\") pod \"horizon-85f686b8b5-kz5d4\" (UID: \"dd051d85-41b3-420b-9999-5c9dee9aafe3\") " pod="openstack/horizon-85f686b8b5-kz5d4" Feb 20 08:50:49 crc kubenswrapper[5094]: I0220 08:50:49.072661 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd051d85-41b3-420b-9999-5c9dee9aafe3-logs\") pod \"horizon-85f686b8b5-kz5d4\" (UID: \"dd051d85-41b3-420b-9999-5c9dee9aafe3\") " pod="openstack/horizon-85f686b8b5-kz5d4" Feb 20 08:50:49 crc kubenswrapper[5094]: I0220 08:50:49.090320 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd051d85-41b3-420b-9999-5c9dee9aafe3-horizon-secret-key\") pod \"horizon-85f686b8b5-kz5d4\" (UID: \"dd051d85-41b3-420b-9999-5c9dee9aafe3\") " pod="openstack/horizon-85f686b8b5-kz5d4" Feb 20 08:50:49 crc kubenswrapper[5094]: I0220 08:50:49.095395 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxpfb\" (UniqueName: \"kubernetes.io/projected/dd051d85-41b3-420b-9999-5c9dee9aafe3-kube-api-access-hxpfb\") pod \"horizon-85f686b8b5-kz5d4\" (UID: \"dd051d85-41b3-420b-9999-5c9dee9aafe3\") " pod="openstack/horizon-85f686b8b5-kz5d4" Feb 20 08:50:49 crc 
kubenswrapper[5094]: I0220 08:50:49.180401 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-85f686b8b5-kz5d4" Feb 20 08:50:49 crc kubenswrapper[5094]: I0220 08:50:49.656833 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-85f686b8b5-kz5d4"] Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.049377 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-zdzg9"] Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.051346 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-zdzg9" Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.071160 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-zdzg9"] Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.090844 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tbqv\" (UniqueName: \"kubernetes.io/projected/1068d86d-d730-4dab-8aaf-12c5a5c62a70-kube-api-access-5tbqv\") pod \"heat-db-create-zdzg9\" (UID: \"1068d86d-d730-4dab-8aaf-12c5a5c62a70\") " pod="openstack/heat-db-create-zdzg9" Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.091274 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1068d86d-d730-4dab-8aaf-12c5a5c62a70-operator-scripts\") pod \"heat-db-create-zdzg9\" (UID: \"1068d86d-d730-4dab-8aaf-12c5a5c62a70\") " pod="openstack/heat-db-create-zdzg9" Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.136039 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-b8a1-account-create-update-27rkz"] Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.137729 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85f686b8b5-kz5d4" 
event={"ID":"dd051d85-41b3-420b-9999-5c9dee9aafe3","Type":"ContainerStarted","Data":"aea2110b1c796c7236f073b2306c6f3a761f8d1b2287bc63c9f068d41c82bb34"} Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.137769 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85f686b8b5-kz5d4" event={"ID":"dd051d85-41b3-420b-9999-5c9dee9aafe3","Type":"ContainerStarted","Data":"3c77126b664be918f03fe1aeab230480b2cfbabc1695d0b6dd41fb92b1c61748"} Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.137784 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-85f686b8b5-kz5d4" event={"ID":"dd051d85-41b3-420b-9999-5c9dee9aafe3","Type":"ContainerStarted","Data":"8a41fe68af7c453825038647e557a4d036841eb76b42e3bc4f40c7cf5415ecbf"} Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.137880 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-b8a1-account-create-update-27rkz" Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.139734 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.143556 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-b8a1-account-create-update-27rkz"] Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.162151 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-85f686b8b5-kz5d4" podStartSLOduration=2.162114098 podStartE2EDuration="2.162114098s" podCreationTimestamp="2026-02-20 08:50:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:50:50.154886973 +0000 UTC m=+7465.027513684" watchObservedRunningTime="2026-02-20 08:50:50.162114098 +0000 UTC m=+7465.034740799" Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.199016 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/674e60ac-3253-4c4c-8e5b-7a59ed2e8989-operator-scripts\") pod \"heat-b8a1-account-create-update-27rkz\" (UID: \"674e60ac-3253-4c4c-8e5b-7a59ed2e8989\") " pod="openstack/heat-b8a1-account-create-update-27rkz" Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.199286 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1068d86d-d730-4dab-8aaf-12c5a5c62a70-operator-scripts\") pod \"heat-db-create-zdzg9\" (UID: \"1068d86d-d730-4dab-8aaf-12c5a5c62a70\") " pod="openstack/heat-db-create-zdzg9" Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.199427 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwnfz\" (UniqueName: \"kubernetes.io/projected/674e60ac-3253-4c4c-8e5b-7a59ed2e8989-kube-api-access-dwnfz\") pod \"heat-b8a1-account-create-update-27rkz\" (UID: \"674e60ac-3253-4c4c-8e5b-7a59ed2e8989\") " pod="openstack/heat-b8a1-account-create-update-27rkz" Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.199667 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tbqv\" (UniqueName: \"kubernetes.io/projected/1068d86d-d730-4dab-8aaf-12c5a5c62a70-kube-api-access-5tbqv\") pod \"heat-db-create-zdzg9\" (UID: \"1068d86d-d730-4dab-8aaf-12c5a5c62a70\") " pod="openstack/heat-db-create-zdzg9" Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.200392 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1068d86d-d730-4dab-8aaf-12c5a5c62a70-operator-scripts\") pod \"heat-db-create-zdzg9\" (UID: \"1068d86d-d730-4dab-8aaf-12c5a5c62a70\") " pod="openstack/heat-db-create-zdzg9" Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.223816 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tbqv\" (UniqueName: \"kubernetes.io/projected/1068d86d-d730-4dab-8aaf-12c5a5c62a70-kube-api-access-5tbqv\") pod \"heat-db-create-zdzg9\" (UID: \"1068d86d-d730-4dab-8aaf-12c5a5c62a70\") " pod="openstack/heat-db-create-zdzg9" Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.301739 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/674e60ac-3253-4c4c-8e5b-7a59ed2e8989-operator-scripts\") pod \"heat-b8a1-account-create-update-27rkz\" (UID: \"674e60ac-3253-4c4c-8e5b-7a59ed2e8989\") " pod="openstack/heat-b8a1-account-create-update-27rkz" Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.301808 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwnfz\" (UniqueName: \"kubernetes.io/projected/674e60ac-3253-4c4c-8e5b-7a59ed2e8989-kube-api-access-dwnfz\") pod \"heat-b8a1-account-create-update-27rkz\" (UID: \"674e60ac-3253-4c4c-8e5b-7a59ed2e8989\") " pod="openstack/heat-b8a1-account-create-update-27rkz" Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.302424 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/674e60ac-3253-4c4c-8e5b-7a59ed2e8989-operator-scripts\") pod \"heat-b8a1-account-create-update-27rkz\" (UID: \"674e60ac-3253-4c4c-8e5b-7a59ed2e8989\") " pod="openstack/heat-b8a1-account-create-update-27rkz" Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.318122 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwnfz\" (UniqueName: \"kubernetes.io/projected/674e60ac-3253-4c4c-8e5b-7a59ed2e8989-kube-api-access-dwnfz\") pod \"heat-b8a1-account-create-update-27rkz\" (UID: \"674e60ac-3253-4c4c-8e5b-7a59ed2e8989\") " pod="openstack/heat-b8a1-account-create-update-27rkz" Feb 20 08:50:50 crc kubenswrapper[5094]: 
I0220 08:50:50.368741 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-zdzg9" Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.455510 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-b8a1-account-create-update-27rkz" Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.811990 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-zdzg9"] Feb 20 08:50:50 crc kubenswrapper[5094]: W0220 08:50:50.813138 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1068d86d_d730_4dab_8aaf_12c5a5c62a70.slice/crio-11ad34db8d48c9d73cf719f9c6912ab2ef74a189f571f10511dc0154749afce4 WatchSource:0}: Error finding container 11ad34db8d48c9d73cf719f9c6912ab2ef74a189f571f10511dc0154749afce4: Status 404 returned error can't find the container with id 11ad34db8d48c9d73cf719f9c6912ab2ef74a189f571f10511dc0154749afce4 Feb 20 08:50:50 crc kubenswrapper[5094]: I0220 08:50:50.935301 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-b8a1-account-create-update-27rkz"] Feb 20 08:50:50 crc kubenswrapper[5094]: W0220 08:50:50.940509 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod674e60ac_3253_4c4c_8e5b_7a59ed2e8989.slice/crio-677bf1051097abf0cb68e4029808cf02e28e5e6a02e28f101c2fe08b01aa7d56 WatchSource:0}: Error finding container 677bf1051097abf0cb68e4029808cf02e28e5e6a02e28f101c2fe08b01aa7d56: Status 404 returned error can't find the container with id 677bf1051097abf0cb68e4029808cf02e28e5e6a02e28f101c2fe08b01aa7d56 Feb 20 08:50:51 crc kubenswrapper[5094]: I0220 08:50:51.146279 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-zdzg9" 
event={"ID":"1068d86d-d730-4dab-8aaf-12c5a5c62a70","Type":"ContainerStarted","Data":"54e699ae44cd59613f594b23e884598e72478881e5bbc4353851926cc53ca349"} Feb 20 08:50:51 crc kubenswrapper[5094]: I0220 08:50:51.146587 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-zdzg9" event={"ID":"1068d86d-d730-4dab-8aaf-12c5a5c62a70","Type":"ContainerStarted","Data":"11ad34db8d48c9d73cf719f9c6912ab2ef74a189f571f10511dc0154749afce4"} Feb 20 08:50:51 crc kubenswrapper[5094]: I0220 08:50:51.148853 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-b8a1-account-create-update-27rkz" event={"ID":"674e60ac-3253-4c4c-8e5b-7a59ed2e8989","Type":"ContainerStarted","Data":"f3933aedef06f045c06e2f5aeaa2d2665f532f7df1042dc925032507e64641fd"} Feb 20 08:50:51 crc kubenswrapper[5094]: I0220 08:50:51.148906 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-b8a1-account-create-update-27rkz" event={"ID":"674e60ac-3253-4c4c-8e5b-7a59ed2e8989","Type":"ContainerStarted","Data":"677bf1051097abf0cb68e4029808cf02e28e5e6a02e28f101c2fe08b01aa7d56"} Feb 20 08:50:51 crc kubenswrapper[5094]: I0220 08:50:51.172340 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-create-zdzg9" podStartSLOduration=1.172319784 podStartE2EDuration="1.172319784s" podCreationTimestamp="2026-02-20 08:50:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:50:51.160416169 +0000 UTC m=+7466.033042880" watchObservedRunningTime="2026-02-20 08:50:51.172319784 +0000 UTC m=+7466.044946495" Feb 20 08:50:51 crc kubenswrapper[5094]: I0220 08:50:51.176650 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-b8a1-account-create-update-27rkz" podStartSLOduration=1.176627308 podStartE2EDuration="1.176627308s" podCreationTimestamp="2026-02-20 08:50:50 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:50:51.172031328 +0000 UTC m=+7466.044658039" watchObservedRunningTime="2026-02-20 08:50:51.176627308 +0000 UTC m=+7466.049254019" Feb 20 08:50:52 crc kubenswrapper[5094]: I0220 08:50:52.163230 5094 generic.go:334] "Generic (PLEG): container finished" podID="674e60ac-3253-4c4c-8e5b-7a59ed2e8989" containerID="f3933aedef06f045c06e2f5aeaa2d2665f532f7df1042dc925032507e64641fd" exitCode=0 Feb 20 08:50:52 crc kubenswrapper[5094]: I0220 08:50:52.163320 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-b8a1-account-create-update-27rkz" event={"ID":"674e60ac-3253-4c4c-8e5b-7a59ed2e8989","Type":"ContainerDied","Data":"f3933aedef06f045c06e2f5aeaa2d2665f532f7df1042dc925032507e64641fd"} Feb 20 08:50:52 crc kubenswrapper[5094]: I0220 08:50:52.166696 5094 generic.go:334] "Generic (PLEG): container finished" podID="1068d86d-d730-4dab-8aaf-12c5a5c62a70" containerID="54e699ae44cd59613f594b23e884598e72478881e5bbc4353851926cc53ca349" exitCode=0 Feb 20 08:50:52 crc kubenswrapper[5094]: I0220 08:50:52.166801 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-zdzg9" event={"ID":"1068d86d-d730-4dab-8aaf-12c5a5c62a70","Type":"ContainerDied","Data":"54e699ae44cd59613f594b23e884598e72478881e5bbc4353851926cc53ca349"} Feb 20 08:50:53 crc kubenswrapper[5094]: I0220 08:50:53.628908 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-b8a1-account-create-update-27rkz" Feb 20 08:50:53 crc kubenswrapper[5094]: I0220 08:50:53.634972 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-zdzg9" Feb 20 08:50:53 crc kubenswrapper[5094]: I0220 08:50:53.663772 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tbqv\" (UniqueName: \"kubernetes.io/projected/1068d86d-d730-4dab-8aaf-12c5a5c62a70-kube-api-access-5tbqv\") pod \"1068d86d-d730-4dab-8aaf-12c5a5c62a70\" (UID: \"1068d86d-d730-4dab-8aaf-12c5a5c62a70\") " Feb 20 08:50:53 crc kubenswrapper[5094]: I0220 08:50:53.663952 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/674e60ac-3253-4c4c-8e5b-7a59ed2e8989-operator-scripts\") pod \"674e60ac-3253-4c4c-8e5b-7a59ed2e8989\" (UID: \"674e60ac-3253-4c4c-8e5b-7a59ed2e8989\") " Feb 20 08:50:53 crc kubenswrapper[5094]: I0220 08:50:53.664114 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwnfz\" (UniqueName: \"kubernetes.io/projected/674e60ac-3253-4c4c-8e5b-7a59ed2e8989-kube-api-access-dwnfz\") pod \"674e60ac-3253-4c4c-8e5b-7a59ed2e8989\" (UID: \"674e60ac-3253-4c4c-8e5b-7a59ed2e8989\") " Feb 20 08:50:53 crc kubenswrapper[5094]: I0220 08:50:53.664207 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1068d86d-d730-4dab-8aaf-12c5a5c62a70-operator-scripts\") pod \"1068d86d-d730-4dab-8aaf-12c5a5c62a70\" (UID: \"1068d86d-d730-4dab-8aaf-12c5a5c62a70\") " Feb 20 08:50:53 crc kubenswrapper[5094]: I0220 08:50:53.667084 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/674e60ac-3253-4c4c-8e5b-7a59ed2e8989-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "674e60ac-3253-4c4c-8e5b-7a59ed2e8989" (UID: "674e60ac-3253-4c4c-8e5b-7a59ed2e8989"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:50:53 crc kubenswrapper[5094]: I0220 08:50:53.667479 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1068d86d-d730-4dab-8aaf-12c5a5c62a70-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1068d86d-d730-4dab-8aaf-12c5a5c62a70" (UID: "1068d86d-d730-4dab-8aaf-12c5a5c62a70"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:50:53 crc kubenswrapper[5094]: I0220 08:50:53.673233 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1068d86d-d730-4dab-8aaf-12c5a5c62a70-kube-api-access-5tbqv" (OuterVolumeSpecName: "kube-api-access-5tbqv") pod "1068d86d-d730-4dab-8aaf-12c5a5c62a70" (UID: "1068d86d-d730-4dab-8aaf-12c5a5c62a70"). InnerVolumeSpecName "kube-api-access-5tbqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:50:53 crc kubenswrapper[5094]: I0220 08:50:53.674339 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/674e60ac-3253-4c4c-8e5b-7a59ed2e8989-kube-api-access-dwnfz" (OuterVolumeSpecName: "kube-api-access-dwnfz") pod "674e60ac-3253-4c4c-8e5b-7a59ed2e8989" (UID: "674e60ac-3253-4c4c-8e5b-7a59ed2e8989"). InnerVolumeSpecName "kube-api-access-dwnfz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:50:53 crc kubenswrapper[5094]: I0220 08:50:53.766542 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwnfz\" (UniqueName: \"kubernetes.io/projected/674e60ac-3253-4c4c-8e5b-7a59ed2e8989-kube-api-access-dwnfz\") on node \"crc\" DevicePath \"\"" Feb 20 08:50:53 crc kubenswrapper[5094]: I0220 08:50:53.766579 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1068d86d-d730-4dab-8aaf-12c5a5c62a70-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:50:53 crc kubenswrapper[5094]: I0220 08:50:53.766590 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tbqv\" (UniqueName: \"kubernetes.io/projected/1068d86d-d730-4dab-8aaf-12c5a5c62a70-kube-api-access-5tbqv\") on node \"crc\" DevicePath \"\"" Feb 20 08:50:53 crc kubenswrapper[5094]: I0220 08:50:53.766599 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/674e60ac-3253-4c4c-8e5b-7a59ed2e8989-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:50:54 crc kubenswrapper[5094]: I0220 08:50:54.200303 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-b8a1-account-create-update-27rkz" Feb 20 08:50:54 crc kubenswrapper[5094]: I0220 08:50:54.200295 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-b8a1-account-create-update-27rkz" event={"ID":"674e60ac-3253-4c4c-8e5b-7a59ed2e8989","Type":"ContainerDied","Data":"677bf1051097abf0cb68e4029808cf02e28e5e6a02e28f101c2fe08b01aa7d56"} Feb 20 08:50:54 crc kubenswrapper[5094]: I0220 08:50:54.200655 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="677bf1051097abf0cb68e4029808cf02e28e5e6a02e28f101c2fe08b01aa7d56" Feb 20 08:50:54 crc kubenswrapper[5094]: I0220 08:50:54.202931 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-zdzg9" event={"ID":"1068d86d-d730-4dab-8aaf-12c5a5c62a70","Type":"ContainerDied","Data":"11ad34db8d48c9d73cf719f9c6912ab2ef74a189f571f10511dc0154749afce4"} Feb 20 08:50:54 crc kubenswrapper[5094]: I0220 08:50:54.202969 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11ad34db8d48c9d73cf719f9c6912ab2ef74a189f571f10511dc0154749afce4" Feb 20 08:50:54 crc kubenswrapper[5094]: I0220 08:50:54.203025 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-zdzg9" Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.327592 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-fmrw9"] Feb 20 08:50:55 crc kubenswrapper[5094]: E0220 08:50:55.328402 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="674e60ac-3253-4c4c-8e5b-7a59ed2e8989" containerName="mariadb-account-create-update" Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.328417 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="674e60ac-3253-4c4c-8e5b-7a59ed2e8989" containerName="mariadb-account-create-update" Feb 20 08:50:55 crc kubenswrapper[5094]: E0220 08:50:55.328442 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1068d86d-d730-4dab-8aaf-12c5a5c62a70" containerName="mariadb-database-create" Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.328449 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1068d86d-d730-4dab-8aaf-12c5a5c62a70" containerName="mariadb-database-create" Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.328668 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="1068d86d-d730-4dab-8aaf-12c5a5c62a70" containerName="mariadb-database-create" Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.328694 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="674e60ac-3253-4c4c-8e5b-7a59ed2e8989" containerName="mariadb-account-create-update" Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.329557 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-fmrw9" Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.336140 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.336224 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-v8n4z" Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.348533 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-fmrw9"] Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.416847 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/791e2b3b-9d51-41fd-bf38-5b66849b5b77-combined-ca-bundle\") pod \"heat-db-sync-fmrw9\" (UID: \"791e2b3b-9d51-41fd-bf38-5b66849b5b77\") " pod="openstack/heat-db-sync-fmrw9" Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.417224 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/791e2b3b-9d51-41fd-bf38-5b66849b5b77-config-data\") pod \"heat-db-sync-fmrw9\" (UID: \"791e2b3b-9d51-41fd-bf38-5b66849b5b77\") " pod="openstack/heat-db-sync-fmrw9" Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.417569 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-522wj\" (UniqueName: \"kubernetes.io/projected/791e2b3b-9d51-41fd-bf38-5b66849b5b77-kube-api-access-522wj\") pod \"heat-db-sync-fmrw9\" (UID: \"791e2b3b-9d51-41fd-bf38-5b66849b5b77\") " pod="openstack/heat-db-sync-fmrw9" Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.518837 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-522wj\" (UniqueName: \"kubernetes.io/projected/791e2b3b-9d51-41fd-bf38-5b66849b5b77-kube-api-access-522wj\") pod 
\"heat-db-sync-fmrw9\" (UID: \"791e2b3b-9d51-41fd-bf38-5b66849b5b77\") " pod="openstack/heat-db-sync-fmrw9" Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.519102 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/791e2b3b-9d51-41fd-bf38-5b66849b5b77-combined-ca-bundle\") pod \"heat-db-sync-fmrw9\" (UID: \"791e2b3b-9d51-41fd-bf38-5b66849b5b77\") " pod="openstack/heat-db-sync-fmrw9" Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.519244 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/791e2b3b-9d51-41fd-bf38-5b66849b5b77-config-data\") pod \"heat-db-sync-fmrw9\" (UID: \"791e2b3b-9d51-41fd-bf38-5b66849b5b77\") " pod="openstack/heat-db-sync-fmrw9" Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.524390 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/791e2b3b-9d51-41fd-bf38-5b66849b5b77-combined-ca-bundle\") pod \"heat-db-sync-fmrw9\" (UID: \"791e2b3b-9d51-41fd-bf38-5b66849b5b77\") " pod="openstack/heat-db-sync-fmrw9" Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.524795 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/791e2b3b-9d51-41fd-bf38-5b66849b5b77-config-data\") pod \"heat-db-sync-fmrw9\" (UID: \"791e2b3b-9d51-41fd-bf38-5b66849b5b77\") " pod="openstack/heat-db-sync-fmrw9" Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.539462 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-522wj\" (UniqueName: \"kubernetes.io/projected/791e2b3b-9d51-41fd-bf38-5b66849b5b77-kube-api-access-522wj\") pod \"heat-db-sync-fmrw9\" (UID: \"791e2b3b-9d51-41fd-bf38-5b66849b5b77\") " pod="openstack/heat-db-sync-fmrw9" Feb 20 08:50:55 crc kubenswrapper[5094]: I0220 08:50:55.654654 
5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-fmrw9" Feb 20 08:50:56 crc kubenswrapper[5094]: I0220 08:50:56.095241 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-fmrw9"] Feb 20 08:50:56 crc kubenswrapper[5094]: I0220 08:50:56.219913 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-fmrw9" event={"ID":"791e2b3b-9d51-41fd-bf38-5b66849b5b77","Type":"ContainerStarted","Data":"037c8ccbfa67d05f5ae9f5822422a7180a181ab854dbbf2a204acef3cafe0f42"} Feb 20 08:50:59 crc kubenswrapper[5094]: I0220 08:50:59.180889 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-85f686b8b5-kz5d4" Feb 20 08:50:59 crc kubenswrapper[5094]: I0220 08:50:59.181938 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-85f686b8b5-kz5d4" Feb 20 08:50:59 crc kubenswrapper[5094]: I0220 08:50:59.184796 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-85f686b8b5-kz5d4" podUID="dd051d85-41b3-420b-9999-5c9dee9aafe3" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.101:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.101:8080: connect: connection refused" Feb 20 08:51:04 crc kubenswrapper[5094]: I0220 08:51:04.106529 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:51:04 crc kubenswrapper[5094]: I0220 08:51:04.107324 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:51:04 crc kubenswrapper[5094]: I0220 08:51:04.304151 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-fmrw9" event={"ID":"791e2b3b-9d51-41fd-bf38-5b66849b5b77","Type":"ContainerStarted","Data":"da9d0c1b854591282ea3fc19512d062edc11f81e6daa7e4388a1d825d28d97b3"} Feb 20 08:51:04 crc kubenswrapper[5094]: I0220 08:51:04.325742 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-fmrw9" podStartSLOduration=1.818657999 podStartE2EDuration="9.325697267s" podCreationTimestamp="2026-02-20 08:50:55 +0000 UTC" firstStartedPulling="2026-02-20 08:50:56.102551626 +0000 UTC m=+7470.975178337" lastFinishedPulling="2026-02-20 08:51:03.609590894 +0000 UTC m=+7478.482217605" observedRunningTime="2026-02-20 08:51:04.325046052 +0000 UTC m=+7479.197672763" watchObservedRunningTime="2026-02-20 08:51:04.325697267 +0000 UTC m=+7479.198323998" Feb 20 08:51:06 crc kubenswrapper[5094]: I0220 08:51:06.327448 5094 generic.go:334] "Generic (PLEG): container finished" podID="791e2b3b-9d51-41fd-bf38-5b66849b5b77" containerID="da9d0c1b854591282ea3fc19512d062edc11f81e6daa7e4388a1d825d28d97b3" exitCode=0 Feb 20 08:51:06 crc kubenswrapper[5094]: I0220 08:51:06.327562 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-fmrw9" event={"ID":"791e2b3b-9d51-41fd-bf38-5b66849b5b77","Type":"ContainerDied","Data":"da9d0c1b854591282ea3fc19512d062edc11f81e6daa7e4388a1d825d28d97b3"} Feb 20 08:51:07 crc kubenswrapper[5094]: I0220 08:51:07.680062 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-fmrw9" Feb 20 08:51:07 crc kubenswrapper[5094]: I0220 08:51:07.753104 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/791e2b3b-9d51-41fd-bf38-5b66849b5b77-config-data\") pod \"791e2b3b-9d51-41fd-bf38-5b66849b5b77\" (UID: \"791e2b3b-9d51-41fd-bf38-5b66849b5b77\") " Feb 20 08:51:07 crc kubenswrapper[5094]: I0220 08:51:07.753405 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-522wj\" (UniqueName: \"kubernetes.io/projected/791e2b3b-9d51-41fd-bf38-5b66849b5b77-kube-api-access-522wj\") pod \"791e2b3b-9d51-41fd-bf38-5b66849b5b77\" (UID: \"791e2b3b-9d51-41fd-bf38-5b66849b5b77\") " Feb 20 08:51:07 crc kubenswrapper[5094]: I0220 08:51:07.754177 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/791e2b3b-9d51-41fd-bf38-5b66849b5b77-combined-ca-bundle\") pod \"791e2b3b-9d51-41fd-bf38-5b66849b5b77\" (UID: \"791e2b3b-9d51-41fd-bf38-5b66849b5b77\") " Feb 20 08:51:07 crc kubenswrapper[5094]: I0220 08:51:07.758328 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/791e2b3b-9d51-41fd-bf38-5b66849b5b77-kube-api-access-522wj" (OuterVolumeSpecName: "kube-api-access-522wj") pod "791e2b3b-9d51-41fd-bf38-5b66849b5b77" (UID: "791e2b3b-9d51-41fd-bf38-5b66849b5b77"). InnerVolumeSpecName "kube-api-access-522wj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:51:07 crc kubenswrapper[5094]: I0220 08:51:07.791272 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/791e2b3b-9d51-41fd-bf38-5b66849b5b77-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "791e2b3b-9d51-41fd-bf38-5b66849b5b77" (UID: "791e2b3b-9d51-41fd-bf38-5b66849b5b77"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:51:07 crc kubenswrapper[5094]: I0220 08:51:07.839078 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/791e2b3b-9d51-41fd-bf38-5b66849b5b77-config-data" (OuterVolumeSpecName: "config-data") pod "791e2b3b-9d51-41fd-bf38-5b66849b5b77" (UID: "791e2b3b-9d51-41fd-bf38-5b66849b5b77"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:51:07 crc kubenswrapper[5094]: I0220 08:51:07.861731 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/791e2b3b-9d51-41fd-bf38-5b66849b5b77-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:51:07 crc kubenswrapper[5094]: I0220 08:51:07.861811 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-522wj\" (UniqueName: \"kubernetes.io/projected/791e2b3b-9d51-41fd-bf38-5b66849b5b77-kube-api-access-522wj\") on node \"crc\" DevicePath \"\"" Feb 20 08:51:07 crc kubenswrapper[5094]: I0220 08:51:07.861831 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/791e2b3b-9d51-41fd-bf38-5b66849b5b77-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:51:08 crc kubenswrapper[5094]: I0220 08:51:08.349200 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-fmrw9" event={"ID":"791e2b3b-9d51-41fd-bf38-5b66849b5b77","Type":"ContainerDied","Data":"037c8ccbfa67d05f5ae9f5822422a7180a181ab854dbbf2a204acef3cafe0f42"} Feb 20 08:51:08 crc kubenswrapper[5094]: I0220 08:51:08.349242 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="037c8ccbfa67d05f5ae9f5822422a7180a181ab854dbbf2a204acef3cafe0f42" Feb 20 08:51:08 crc kubenswrapper[5094]: I0220 08:51:08.349293 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-fmrw9" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.325539 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-68d6fbc7c5-czl7r"] Feb 20 08:51:09 crc kubenswrapper[5094]: E0220 08:51:09.326216 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="791e2b3b-9d51-41fd-bf38-5b66849b5b77" containerName="heat-db-sync" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.326228 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="791e2b3b-9d51-41fd-bf38-5b66849b5b77" containerName="heat-db-sync" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.326486 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="791e2b3b-9d51-41fd-bf38-5b66849b5b77" containerName="heat-db-sync" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.327243 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.332036 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.332051 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.332288 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-v8n4z" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.359201 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-68d6fbc7c5-czl7r"] Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.390091 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4697fe9-ee95-4003-81d9-c6d7935b46cd-combined-ca-bundle\") pod \"heat-engine-68d6fbc7c5-czl7r\" (UID: 
\"f4697fe9-ee95-4003-81d9-c6d7935b46cd\") " pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.390354 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f4697fe9-ee95-4003-81d9-c6d7935b46cd-config-data-custom\") pod \"heat-engine-68d6fbc7c5-czl7r\" (UID: \"f4697fe9-ee95-4003-81d9-c6d7935b46cd\") " pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.390498 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xxl6\" (UniqueName: \"kubernetes.io/projected/f4697fe9-ee95-4003-81d9-c6d7935b46cd-kube-api-access-9xxl6\") pod \"heat-engine-68d6fbc7c5-czl7r\" (UID: \"f4697fe9-ee95-4003-81d9-c6d7935b46cd\") " pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.390681 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4697fe9-ee95-4003-81d9-c6d7935b46cd-config-data\") pod \"heat-engine-68d6fbc7c5-czl7r\" (UID: \"f4697fe9-ee95-4003-81d9-c6d7935b46cd\") " pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.492239 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4697fe9-ee95-4003-81d9-c6d7935b46cd-config-data\") pod \"heat-engine-68d6fbc7c5-czl7r\" (UID: \"f4697fe9-ee95-4003-81d9-c6d7935b46cd\") " pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.492326 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4697fe9-ee95-4003-81d9-c6d7935b46cd-combined-ca-bundle\") pod 
\"heat-engine-68d6fbc7c5-czl7r\" (UID: \"f4697fe9-ee95-4003-81d9-c6d7935b46cd\") " pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.492375 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f4697fe9-ee95-4003-81d9-c6d7935b46cd-config-data-custom\") pod \"heat-engine-68d6fbc7c5-czl7r\" (UID: \"f4697fe9-ee95-4003-81d9-c6d7935b46cd\") " pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.492410 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xxl6\" (UniqueName: \"kubernetes.io/projected/f4697fe9-ee95-4003-81d9-c6d7935b46cd-kube-api-access-9xxl6\") pod \"heat-engine-68d6fbc7c5-czl7r\" (UID: \"f4697fe9-ee95-4003-81d9-c6d7935b46cd\") " pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.501926 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4697fe9-ee95-4003-81d9-c6d7935b46cd-combined-ca-bundle\") pod \"heat-engine-68d6fbc7c5-czl7r\" (UID: \"f4697fe9-ee95-4003-81d9-c6d7935b46cd\") " pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.504001 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4697fe9-ee95-4003-81d9-c6d7935b46cd-config-data\") pod \"heat-engine-68d6fbc7c5-czl7r\" (UID: \"f4697fe9-ee95-4003-81d9-c6d7935b46cd\") " pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.506138 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f4697fe9-ee95-4003-81d9-c6d7935b46cd-config-data-custom\") pod \"heat-engine-68d6fbc7c5-czl7r\" (UID: 
\"f4697fe9-ee95-4003-81d9-c6d7935b46cd\") " pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.511770 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-76899f657-g7f8m"] Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.513235 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.519064 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xxl6\" (UniqueName: \"kubernetes.io/projected/f4697fe9-ee95-4003-81d9-c6d7935b46cd-kube-api-access-9xxl6\") pod \"heat-engine-68d6fbc7c5-czl7r\" (UID: \"f4697fe9-ee95-4003-81d9-c6d7935b46cd\") " pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.519468 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.525491 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-76899f657-g7f8m"] Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.553837 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-6cc7f55d5c-lvdts"] Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.555141 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.560945 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.595122 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rpqn\" (UniqueName: \"kubernetes.io/projected/891348e7-69c8-46e3-a5c2-86c001574a89-kube-api-access-2rpqn\") pod \"heat-cfnapi-76899f657-g7f8m\" (UID: \"891348e7-69c8-46e3-a5c2-86c001574a89\") " pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.595165 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/891348e7-69c8-46e3-a5c2-86c001574a89-combined-ca-bundle\") pod \"heat-cfnapi-76899f657-g7f8m\" (UID: \"891348e7-69c8-46e3-a5c2-86c001574a89\") " pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.595258 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/891348e7-69c8-46e3-a5c2-86c001574a89-config-data-custom\") pod \"heat-cfnapi-76899f657-g7f8m\" (UID: \"891348e7-69c8-46e3-a5c2-86c001574a89\") " pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.595289 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/891348e7-69c8-46e3-a5c2-86c001574a89-config-data\") pod \"heat-cfnapi-76899f657-g7f8m\" (UID: \"891348e7-69c8-46e3-a5c2-86c001574a89\") " pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.606394 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/heat-api-6cc7f55d5c-lvdts"] Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.650139 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.697048 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jlhw\" (UniqueName: \"kubernetes.io/projected/128b27b4-464a-4392-af17-51d79bdd1e1e-kube-api-access-8jlhw\") pod \"heat-api-6cc7f55d5c-lvdts\" (UID: \"128b27b4-464a-4392-af17-51d79bdd1e1e\") " pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.697221 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/128b27b4-464a-4392-af17-51d79bdd1e1e-config-data\") pod \"heat-api-6cc7f55d5c-lvdts\" (UID: \"128b27b4-464a-4392-af17-51d79bdd1e1e\") " pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.697277 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/891348e7-69c8-46e3-a5c2-86c001574a89-config-data-custom\") pod \"heat-cfnapi-76899f657-g7f8m\" (UID: \"891348e7-69c8-46e3-a5c2-86c001574a89\") " pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.697345 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/891348e7-69c8-46e3-a5c2-86c001574a89-config-data\") pod \"heat-cfnapi-76899f657-g7f8m\" (UID: \"891348e7-69c8-46e3-a5c2-86c001574a89\") " pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.697373 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/128b27b4-464a-4392-af17-51d79bdd1e1e-combined-ca-bundle\") pod \"heat-api-6cc7f55d5c-lvdts\" (UID: \"128b27b4-464a-4392-af17-51d79bdd1e1e\") " pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.697406 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/128b27b4-464a-4392-af17-51d79bdd1e1e-config-data-custom\") pod \"heat-api-6cc7f55d5c-lvdts\" (UID: \"128b27b4-464a-4392-af17-51d79bdd1e1e\") " pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.697481 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rpqn\" (UniqueName: \"kubernetes.io/projected/891348e7-69c8-46e3-a5c2-86c001574a89-kube-api-access-2rpqn\") pod \"heat-cfnapi-76899f657-g7f8m\" (UID: \"891348e7-69c8-46e3-a5c2-86c001574a89\") " pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.697508 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/891348e7-69c8-46e3-a5c2-86c001574a89-combined-ca-bundle\") pod \"heat-cfnapi-76899f657-g7f8m\" (UID: \"891348e7-69c8-46e3-a5c2-86c001574a89\") " pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.702535 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/891348e7-69c8-46e3-a5c2-86c001574a89-combined-ca-bundle\") pod \"heat-cfnapi-76899f657-g7f8m\" (UID: \"891348e7-69c8-46e3-a5c2-86c001574a89\") " pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.702671 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/891348e7-69c8-46e3-a5c2-86c001574a89-config-data\") pod \"heat-cfnapi-76899f657-g7f8m\" (UID: \"891348e7-69c8-46e3-a5c2-86c001574a89\") " pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.704961 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/891348e7-69c8-46e3-a5c2-86c001574a89-config-data-custom\") pod \"heat-cfnapi-76899f657-g7f8m\" (UID: \"891348e7-69c8-46e3-a5c2-86c001574a89\") " pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.720456 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rpqn\" (UniqueName: \"kubernetes.io/projected/891348e7-69c8-46e3-a5c2-86c001574a89-kube-api-access-2rpqn\") pod \"heat-cfnapi-76899f657-g7f8m\" (UID: \"891348e7-69c8-46e3-a5c2-86c001574a89\") " pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.799570 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jlhw\" (UniqueName: \"kubernetes.io/projected/128b27b4-464a-4392-af17-51d79bdd1e1e-kube-api-access-8jlhw\") pod \"heat-api-6cc7f55d5c-lvdts\" (UID: \"128b27b4-464a-4392-af17-51d79bdd1e1e\") " pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.799950 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/128b27b4-464a-4392-af17-51d79bdd1e1e-config-data\") pod \"heat-api-6cc7f55d5c-lvdts\" (UID: \"128b27b4-464a-4392-af17-51d79bdd1e1e\") " pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.799996 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/128b27b4-464a-4392-af17-51d79bdd1e1e-combined-ca-bundle\") pod \"heat-api-6cc7f55d5c-lvdts\" (UID: \"128b27b4-464a-4392-af17-51d79bdd1e1e\") " pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.800022 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/128b27b4-464a-4392-af17-51d79bdd1e1e-config-data-custom\") pod \"heat-api-6cc7f55d5c-lvdts\" (UID: \"128b27b4-464a-4392-af17-51d79bdd1e1e\") " pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.806251 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/128b27b4-464a-4392-af17-51d79bdd1e1e-config-data\") pod \"heat-api-6cc7f55d5c-lvdts\" (UID: \"128b27b4-464a-4392-af17-51d79bdd1e1e\") " pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.806947 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/128b27b4-464a-4392-af17-51d79bdd1e1e-config-data-custom\") pod \"heat-api-6cc7f55d5c-lvdts\" (UID: \"128b27b4-464a-4392-af17-51d79bdd1e1e\") " pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.814297 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/128b27b4-464a-4392-af17-51d79bdd1e1e-combined-ca-bundle\") pod \"heat-api-6cc7f55d5c-lvdts\" (UID: \"128b27b4-464a-4392-af17-51d79bdd1e1e\") " pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.815562 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jlhw\" (UniqueName: \"kubernetes.io/projected/128b27b4-464a-4392-af17-51d79bdd1e1e-kube-api-access-8jlhw\") pod 
\"heat-api-6cc7f55d5c-lvdts\" (UID: \"128b27b4-464a-4392-af17-51d79bdd1e1e\") " pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.904619 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:09 crc kubenswrapper[5094]: I0220 08:51:09.913934 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:10 crc kubenswrapper[5094]: I0220 08:51:10.153594 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-68d6fbc7c5-czl7r"] Feb 20 08:51:10 crc kubenswrapper[5094]: I0220 08:51:10.382206 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-68d6fbc7c5-czl7r" event={"ID":"f4697fe9-ee95-4003-81d9-c6d7935b46cd","Type":"ContainerStarted","Data":"794f615ce1880f7ab8bbfd5bb19e5fa1f93a61809b6001dcd8fad2d8bdeb5a3b"} Feb 20 08:51:10 crc kubenswrapper[5094]: W0220 08:51:10.576090 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod891348e7_69c8_46e3_a5c2_86c001574a89.slice/crio-e1d16a4445e10f53c28f7932d390dda2275444fc811e7d7853fb365c85fa8ccc WatchSource:0}: Error finding container e1d16a4445e10f53c28f7932d390dda2275444fc811e7d7853fb365c85fa8ccc: Status 404 returned error can't find the container with id e1d16a4445e10f53c28f7932d390dda2275444fc811e7d7853fb365c85fa8ccc Feb 20 08:51:10 crc kubenswrapper[5094]: I0220 08:51:10.585355 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-76899f657-g7f8m"] Feb 20 08:51:10 crc kubenswrapper[5094]: I0220 08:51:10.642647 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6cc7f55d5c-lvdts"] Feb 20 08:51:10 crc kubenswrapper[5094]: W0220 08:51:10.644882 5094 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod128b27b4_464a_4392_af17_51d79bdd1e1e.slice/crio-b53d024cdde88c8ef4dfde6aa81007a1b01a4de1a95899cdd6621adcbc8d65c8 WatchSource:0}: Error finding container b53d024cdde88c8ef4dfde6aa81007a1b01a4de1a95899cdd6621adcbc8d65c8: Status 404 returned error can't find the container with id b53d024cdde88c8ef4dfde6aa81007a1b01a4de1a95899cdd6621adcbc8d65c8 Feb 20 08:51:11 crc kubenswrapper[5094]: I0220 08:51:11.417638 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6cc7f55d5c-lvdts" event={"ID":"128b27b4-464a-4392-af17-51d79bdd1e1e","Type":"ContainerStarted","Data":"b53d024cdde88c8ef4dfde6aa81007a1b01a4de1a95899cdd6621adcbc8d65c8"} Feb 20 08:51:11 crc kubenswrapper[5094]: I0220 08:51:11.421302 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-68d6fbc7c5-czl7r" event={"ID":"f4697fe9-ee95-4003-81d9-c6d7935b46cd","Type":"ContainerStarted","Data":"fc2d8d6d91d1e0d6b463340683d1d9746cd3dfead8e2fdf2bb05e8b2d2af0428"} Feb 20 08:51:11 crc kubenswrapper[5094]: I0220 08:51:11.421505 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:11 crc kubenswrapper[5094]: I0220 08:51:11.423005 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76899f657-g7f8m" event={"ID":"891348e7-69c8-46e3-a5c2-86c001574a89","Type":"ContainerStarted","Data":"e1d16a4445e10f53c28f7932d390dda2275444fc811e7d7853fb365c85fa8ccc"} Feb 20 08:51:11 crc kubenswrapper[5094]: I0220 08:51:11.444155 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-68d6fbc7c5-czl7r" podStartSLOduration=2.444136869 podStartE2EDuration="2.444136869s" podCreationTimestamp="2026-02-20 08:51:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:51:11.43545695 +0000 UTC 
m=+7486.308083661" watchObservedRunningTime="2026-02-20 08:51:11.444136869 +0000 UTC m=+7486.316763580" Feb 20 08:51:11 crc kubenswrapper[5094]: I0220 08:51:11.477904 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-85f686b8b5-kz5d4" Feb 20 08:51:12 crc kubenswrapper[5094]: I0220 08:51:12.434133 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6cc7f55d5c-lvdts" event={"ID":"128b27b4-464a-4392-af17-51d79bdd1e1e","Type":"ContainerStarted","Data":"215e02f645838bc960bfaa8b0aaa38d199c008cca27afa3cccc2a634176bed2e"} Feb 20 08:51:12 crc kubenswrapper[5094]: I0220 08:51:12.434671 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:12 crc kubenswrapper[5094]: I0220 08:51:12.436620 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76899f657-g7f8m" event={"ID":"891348e7-69c8-46e3-a5c2-86c001574a89","Type":"ContainerStarted","Data":"47b3d2aa2aa8743d5d34690f2e29891b2ab75a019e1ab5624c6628a1fb916c7a"} Feb 20 08:51:12 crc kubenswrapper[5094]: I0220 08:51:12.453810 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-6cc7f55d5c-lvdts" podStartSLOduration=1.961544452 podStartE2EDuration="3.453788234s" podCreationTimestamp="2026-02-20 08:51:09 +0000 UTC" firstStartedPulling="2026-02-20 08:51:10.646862403 +0000 UTC m=+7485.519489114" lastFinishedPulling="2026-02-20 08:51:12.139106185 +0000 UTC m=+7487.011732896" observedRunningTime="2026-02-20 08:51:12.451859937 +0000 UTC m=+7487.324486648" watchObservedRunningTime="2026-02-20 08:51:12.453788234 +0000 UTC m=+7487.326414965" Feb 20 08:51:12 crc kubenswrapper[5094]: I0220 08:51:12.473440 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-76899f657-g7f8m" podStartSLOduration=1.915759051 podStartE2EDuration="3.473415186s" podCreationTimestamp="2026-02-20 08:51:09 +0000 
UTC" firstStartedPulling="2026-02-20 08:51:10.578372046 +0000 UTC m=+7485.450998757" lastFinishedPulling="2026-02-20 08:51:12.136028181 +0000 UTC m=+7487.008654892" observedRunningTime="2026-02-20 08:51:12.467033752 +0000 UTC m=+7487.339660463" watchObservedRunningTime="2026-02-20 08:51:12.473415186 +0000 UTC m=+7487.346041897" Feb 20 08:51:13 crc kubenswrapper[5094]: I0220 08:51:13.348310 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-85f686b8b5-kz5d4" Feb 20 08:51:13 crc kubenswrapper[5094]: I0220 08:51:13.414569 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6d54b4d569-kqd4s"] Feb 20 08:51:13 crc kubenswrapper[5094]: I0220 08:51:13.414936 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6d54b4d569-kqd4s" podUID="8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" containerName="horizon" containerID="cri-o://383a27955fb0f86aaf1453d917dac5350556fafc41a927eb7c86a2bb5a520c4d" gracePeriod=30 Feb 20 08:51:13 crc kubenswrapper[5094]: I0220 08:51:13.415177 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6d54b4d569-kqd4s" podUID="8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" containerName="horizon-log" containerID="cri-o://1d9312f8455277c3cb35f55a1583969cb70ad9d4c52d34eaec9448990ee953f6" gracePeriod=30 Feb 20 08:51:13 crc kubenswrapper[5094]: I0220 08:51:13.452644 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:17 crc kubenswrapper[5094]: I0220 08:51:17.497171 5094 generic.go:334] "Generic (PLEG): container finished" podID="8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" containerID="383a27955fb0f86aaf1453d917dac5350556fafc41a927eb7c86a2bb5a520c4d" exitCode=0 Feb 20 08:51:17 crc kubenswrapper[5094]: I0220 08:51:17.497292 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d54b4d569-kqd4s" 
event={"ID":"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c","Type":"ContainerDied","Data":"383a27955fb0f86aaf1453d917dac5350556fafc41a927eb7c86a2bb5a520c4d"} Feb 20 08:51:21 crc kubenswrapper[5094]: I0220 08:51:21.285000 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-6cc7f55d5c-lvdts" Feb 20 08:51:21 crc kubenswrapper[5094]: I0220 08:51:21.346585 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-76899f657-g7f8m" Feb 20 08:51:22 crc kubenswrapper[5094]: I0220 08:51:22.207081 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6d54b4d569-kqd4s" podUID="8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.97:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.97:8080: connect: connection refused" Feb 20 08:51:25 crc kubenswrapper[5094]: I0220 08:51:25.049280 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-2tqmv"] Feb 20 08:51:25 crc kubenswrapper[5094]: I0220 08:51:25.119590 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-0a46-account-create-update-6bs8s"] Feb 20 08:51:25 crc kubenswrapper[5094]: I0220 08:51:25.141832 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-2tqmv"] Feb 20 08:51:25 crc kubenswrapper[5094]: I0220 08:51:25.163349 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-0a46-account-create-update-6bs8s"] Feb 20 08:51:25 crc kubenswrapper[5094]: I0220 08:51:25.854029 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61baa8c9-4a07-47bc-94c8-b7e3f3846ff2" path="/var/lib/kubelet/pods/61baa8c9-4a07-47bc-94c8-b7e3f3846ff2/volumes" Feb 20 08:51:25 crc kubenswrapper[5094]: I0220 08:51:25.854991 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb71d5b0-a19d-4900-be92-77b1abeaf856" 
path="/var/lib/kubelet/pods/eb71d5b0-a19d-4900-be92-77b1abeaf856/volumes" Feb 20 08:51:29 crc kubenswrapper[5094]: I0220 08:51:29.678141 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-68d6fbc7c5-czl7r" Feb 20 08:51:32 crc kubenswrapper[5094]: I0220 08:51:32.206419 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6d54b4d569-kqd4s" podUID="8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.97:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.97:8080: connect: connection refused" Feb 20 08:51:34 crc kubenswrapper[5094]: I0220 08:51:34.107190 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:51:34 crc kubenswrapper[5094]: I0220 08:51:34.107660 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:51:36 crc kubenswrapper[5094]: I0220 08:51:36.038466 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-9jfqj"] Feb 20 08:51:36 crc kubenswrapper[5094]: I0220 08:51:36.046691 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-9jfqj"] Feb 20 08:51:37 crc kubenswrapper[5094]: I0220 08:51:37.850648 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b241ede-085a-44b3-857b-f64e36b7b14f" path="/var/lib/kubelet/pods/7b241ede-085a-44b3-857b-f64e36b7b14f/volumes" Feb 20 08:51:40 crc kubenswrapper[5094]: 
I0220 08:51:40.694954 5094 scope.go:117] "RemoveContainer" containerID="4231927e6f52319c4c7cbbaa5766e18430942afbbae151ea27a85c1b2eed2b12" Feb 20 08:51:40 crc kubenswrapper[5094]: I0220 08:51:40.725006 5094 scope.go:117] "RemoveContainer" containerID="ba2973a00772608356b5c6835b97549aca8d7d662bffa9fb35da2b972f2aa0f4" Feb 20 08:51:40 crc kubenswrapper[5094]: I0220 08:51:40.795543 5094 scope.go:117] "RemoveContainer" containerID="4f87cc562d40739a0734989e8f19246c6cf1e1144b307f5249bd8e950afcfbb0" Feb 20 08:51:41 crc kubenswrapper[5094]: I0220 08:51:41.250771 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz"] Feb 20 08:51:41 crc kubenswrapper[5094]: I0220 08:51:41.253850 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" Feb 20 08:51:41 crc kubenswrapper[5094]: I0220 08:51:41.268084 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz"] Feb 20 08:51:41 crc kubenswrapper[5094]: I0220 08:51:41.302875 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 20 08:51:41 crc kubenswrapper[5094]: I0220 08:51:41.405801 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n7j5\" (UniqueName: \"kubernetes.io/projected/77007c08-6c58-4a19-9c49-09c1677b9070-kube-api-access-7n7j5\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz\" (UID: \"77007c08-6c58-4a19-9c49-09c1677b9070\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" Feb 20 08:51:41 crc kubenswrapper[5094]: I0220 08:51:41.405930 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77007c08-6c58-4a19-9c49-09c1677b9070-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz\" (UID: \"77007c08-6c58-4a19-9c49-09c1677b9070\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" Feb 20 08:51:41 crc kubenswrapper[5094]: I0220 08:51:41.406126 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77007c08-6c58-4a19-9c49-09c1677b9070-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz\" (UID: \"77007c08-6c58-4a19-9c49-09c1677b9070\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" Feb 20 08:51:41 crc kubenswrapper[5094]: I0220 08:51:41.508088 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77007c08-6c58-4a19-9c49-09c1677b9070-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz\" (UID: \"77007c08-6c58-4a19-9c49-09c1677b9070\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" Feb 20 08:51:41 crc kubenswrapper[5094]: I0220 08:51:41.508245 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77007c08-6c58-4a19-9c49-09c1677b9070-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz\" (UID: \"77007c08-6c58-4a19-9c49-09c1677b9070\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" Feb 20 08:51:41 crc kubenswrapper[5094]: I0220 08:51:41.508817 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77007c08-6c58-4a19-9c49-09c1677b9070-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz\" (UID: 
\"77007c08-6c58-4a19-9c49-09c1677b9070\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" Feb 20 08:51:41 crc kubenswrapper[5094]: I0220 08:51:41.508817 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77007c08-6c58-4a19-9c49-09c1677b9070-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz\" (UID: \"77007c08-6c58-4a19-9c49-09c1677b9070\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" Feb 20 08:51:41 crc kubenswrapper[5094]: I0220 08:51:41.509132 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n7j5\" (UniqueName: \"kubernetes.io/projected/77007c08-6c58-4a19-9c49-09c1677b9070-kube-api-access-7n7j5\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz\" (UID: \"77007c08-6c58-4a19-9c49-09c1677b9070\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" Feb 20 08:51:41 crc kubenswrapper[5094]: I0220 08:51:41.528693 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n7j5\" (UniqueName: \"kubernetes.io/projected/77007c08-6c58-4a19-9c49-09c1677b9070-kube-api-access-7n7j5\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz\" (UID: \"77007c08-6c58-4a19-9c49-09c1677b9070\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" Feb 20 08:51:41 crc kubenswrapper[5094]: I0220 08:51:41.650950 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" Feb 20 08:51:42 crc kubenswrapper[5094]: I0220 08:51:42.170642 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz"] Feb 20 08:51:42 crc kubenswrapper[5094]: I0220 08:51:42.207914 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-6d54b4d569-kqd4s" podUID="8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.97:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.97:8080: connect: connection refused" Feb 20 08:51:42 crc kubenswrapper[5094]: I0220 08:51:42.208065 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:51:42 crc kubenswrapper[5094]: I0220 08:51:42.740519 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" event={"ID":"77007c08-6c58-4a19-9c49-09c1677b9070","Type":"ContainerStarted","Data":"3cb356662f4541cb557f9f84de510f207980149256804e45c03e701292ad6e86"} Feb 20 08:51:42 crc kubenswrapper[5094]: I0220 08:51:42.740868 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" event={"ID":"77007c08-6c58-4a19-9c49-09c1677b9070","Type":"ContainerStarted","Data":"2fda3078b24f95f337aee86c4263f656c6c005cd02148d7411e16695a2ec7a86"} Feb 20 08:51:43 crc kubenswrapper[5094]: I0220 08:51:43.751097 5094 generic.go:334] "Generic (PLEG): container finished" podID="77007c08-6c58-4a19-9c49-09c1677b9070" containerID="3cb356662f4541cb557f9f84de510f207980149256804e45c03e701292ad6e86" exitCode=0 Feb 20 08:51:43 crc kubenswrapper[5094]: I0220 08:51:43.751249 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" event={"ID":"77007c08-6c58-4a19-9c49-09c1677b9070","Type":"ContainerDied","Data":"3cb356662f4541cb557f9f84de510f207980149256804e45c03e701292ad6e86"} Feb 20 08:51:43 crc kubenswrapper[5094]: I0220 08:51:43.757279 5094 generic.go:334] "Generic (PLEG): container finished" podID="8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" containerID="1d9312f8455277c3cb35f55a1583969cb70ad9d4c52d34eaec9448990ee953f6" exitCode=137 Feb 20 08:51:43 crc kubenswrapper[5094]: I0220 08:51:43.757316 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d54b4d569-kqd4s" event={"ID":"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c","Type":"ContainerDied","Data":"1d9312f8455277c3cb35f55a1583969cb70ad9d4c52d34eaec9448990ee953f6"} Feb 20 08:51:43 crc kubenswrapper[5094]: I0220 08:51:43.886205 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:51:43 crc kubenswrapper[5094]: I0220 08:51:43.952107 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-scripts\") pod \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " Feb 20 08:51:43 crc kubenswrapper[5094]: I0220 08:51:43.952158 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rkqz\" (UniqueName: \"kubernetes.io/projected/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-kube-api-access-2rkqz\") pod \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " Feb 20 08:51:43 crc kubenswrapper[5094]: I0220 08:51:43.952208 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-logs\") pod \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\" (UID: 
\"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " Feb 20 08:51:43 crc kubenswrapper[5094]: I0220 08:51:43.952315 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-horizon-secret-key\") pod \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " Feb 20 08:51:43 crc kubenswrapper[5094]: I0220 08:51:43.952381 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-config-data\") pod \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\" (UID: \"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c\") " Feb 20 08:51:43 crc kubenswrapper[5094]: I0220 08:51:43.953229 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-logs" (OuterVolumeSpecName: "logs") pod "8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" (UID: "8eb2c0e1-59eb-4f7a-aeea-8965a35d861c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:51:43 crc kubenswrapper[5094]: I0220 08:51:43.958025 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" (UID: "8eb2c0e1-59eb-4f7a-aeea-8965a35d861c"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:51:43 crc kubenswrapper[5094]: I0220 08:51:43.958128 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-kube-api-access-2rkqz" (OuterVolumeSpecName: "kube-api-access-2rkqz") pod "8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" (UID: "8eb2c0e1-59eb-4f7a-aeea-8965a35d861c"). 
InnerVolumeSpecName "kube-api-access-2rkqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:51:43 crc kubenswrapper[5094]: I0220 08:51:43.981333 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-scripts" (OuterVolumeSpecName: "scripts") pod "8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" (UID: "8eb2c0e1-59eb-4f7a-aeea-8965a35d861c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:51:43 crc kubenswrapper[5094]: I0220 08:51:43.985807 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-config-data" (OuterVolumeSpecName: "config-data") pod "8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" (UID: "8eb2c0e1-59eb-4f7a-aeea-8965a35d861c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:51:44 crc kubenswrapper[5094]: I0220 08:51:44.054232 5094 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 20 08:51:44 crc kubenswrapper[5094]: I0220 08:51:44.054487 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:51:44 crc kubenswrapper[5094]: I0220 08:51:44.054495 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:51:44 crc kubenswrapper[5094]: I0220 08:51:44.054504 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rkqz\" (UniqueName: \"kubernetes.io/projected/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-kube-api-access-2rkqz\") on 
node \"crc\" DevicePath \"\"" Feb 20 08:51:44 crc kubenswrapper[5094]: I0220 08:51:44.054514 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c-logs\") on node \"crc\" DevicePath \"\"" Feb 20 08:51:44 crc kubenswrapper[5094]: I0220 08:51:44.773815 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6d54b4d569-kqd4s" event={"ID":"8eb2c0e1-59eb-4f7a-aeea-8965a35d861c","Type":"ContainerDied","Data":"8a9bb3e82b6fd52dcf3df5d577a488c8f0daee979b4bcefb84c464a576031f35"} Feb 20 08:51:44 crc kubenswrapper[5094]: I0220 08:51:44.773907 5094 scope.go:117] "RemoveContainer" containerID="383a27955fb0f86aaf1453d917dac5350556fafc41a927eb7c86a2bb5a520c4d" Feb 20 08:51:44 crc kubenswrapper[5094]: I0220 08:51:44.773946 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6d54b4d569-kqd4s" Feb 20 08:51:44 crc kubenswrapper[5094]: I0220 08:51:44.834951 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6d54b4d569-kqd4s"] Feb 20 08:51:44 crc kubenswrapper[5094]: I0220 08:51:44.851543 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6d54b4d569-kqd4s"] Feb 20 08:51:45 crc kubenswrapper[5094]: I0220 08:51:45.021268 5094 scope.go:117] "RemoveContainer" containerID="1d9312f8455277c3cb35f55a1583969cb70ad9d4c52d34eaec9448990ee953f6" Feb 20 08:51:45 crc kubenswrapper[5094]: I0220 08:51:45.786608 5094 generic.go:334] "Generic (PLEG): container finished" podID="77007c08-6c58-4a19-9c49-09c1677b9070" containerID="64c504c7c1cea457bbff3398c963e65cd43b025824a9b551288752d71fb56546" exitCode=0 Feb 20 08:51:45 crc kubenswrapper[5094]: I0220 08:51:45.786646 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" 
event={"ID":"77007c08-6c58-4a19-9c49-09c1677b9070","Type":"ContainerDied","Data":"64c504c7c1cea457bbff3398c963e65cd43b025824a9b551288752d71fb56546"} Feb 20 08:51:45 crc kubenswrapper[5094]: I0220 08:51:45.861354 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" path="/var/lib/kubelet/pods/8eb2c0e1-59eb-4f7a-aeea-8965a35d861c/volumes" Feb 20 08:51:46 crc kubenswrapper[5094]: I0220 08:51:46.803176 5094 generic.go:334] "Generic (PLEG): container finished" podID="77007c08-6c58-4a19-9c49-09c1677b9070" containerID="844a0746cb4596152109695e95d4e6d419894f770e1fe1f8e26fe5fe78f38751" exitCode=0 Feb 20 08:51:46 crc kubenswrapper[5094]: I0220 08:51:46.803303 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" event={"ID":"77007c08-6c58-4a19-9c49-09c1677b9070","Type":"ContainerDied","Data":"844a0746cb4596152109695e95d4e6d419894f770e1fe1f8e26fe5fe78f38751"} Feb 20 08:51:48 crc kubenswrapper[5094]: I0220 08:51:48.232032 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" Feb 20 08:51:48 crc kubenswrapper[5094]: I0220 08:51:48.346345 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77007c08-6c58-4a19-9c49-09c1677b9070-bundle\") pod \"77007c08-6c58-4a19-9c49-09c1677b9070\" (UID: \"77007c08-6c58-4a19-9c49-09c1677b9070\") " Feb 20 08:51:48 crc kubenswrapper[5094]: I0220 08:51:48.346802 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77007c08-6c58-4a19-9c49-09c1677b9070-util\") pod \"77007c08-6c58-4a19-9c49-09c1677b9070\" (UID: \"77007c08-6c58-4a19-9c49-09c1677b9070\") " Feb 20 08:51:48 crc kubenswrapper[5094]: I0220 08:51:48.346845 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n7j5\" (UniqueName: \"kubernetes.io/projected/77007c08-6c58-4a19-9c49-09c1677b9070-kube-api-access-7n7j5\") pod \"77007c08-6c58-4a19-9c49-09c1677b9070\" (UID: \"77007c08-6c58-4a19-9c49-09c1677b9070\") " Feb 20 08:51:48 crc kubenswrapper[5094]: I0220 08:51:48.348790 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77007c08-6c58-4a19-9c49-09c1677b9070-bundle" (OuterVolumeSpecName: "bundle") pod "77007c08-6c58-4a19-9c49-09c1677b9070" (UID: "77007c08-6c58-4a19-9c49-09c1677b9070"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:51:48 crc kubenswrapper[5094]: I0220 08:51:48.356076 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77007c08-6c58-4a19-9c49-09c1677b9070-kube-api-access-7n7j5" (OuterVolumeSpecName: "kube-api-access-7n7j5") pod "77007c08-6c58-4a19-9c49-09c1677b9070" (UID: "77007c08-6c58-4a19-9c49-09c1677b9070"). InnerVolumeSpecName "kube-api-access-7n7j5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:51:48 crc kubenswrapper[5094]: I0220 08:51:48.359891 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77007c08-6c58-4a19-9c49-09c1677b9070-util" (OuterVolumeSpecName: "util") pod "77007c08-6c58-4a19-9c49-09c1677b9070" (UID: "77007c08-6c58-4a19-9c49-09c1677b9070"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:51:48 crc kubenswrapper[5094]: I0220 08:51:48.449181 5094 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/77007c08-6c58-4a19-9c49-09c1677b9070-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:51:48 crc kubenswrapper[5094]: I0220 08:51:48.449220 5094 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/77007c08-6c58-4a19-9c49-09c1677b9070-util\") on node \"crc\" DevicePath \"\"" Feb 20 08:51:48 crc kubenswrapper[5094]: I0220 08:51:48.449233 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7n7j5\" (UniqueName: \"kubernetes.io/projected/77007c08-6c58-4a19-9c49-09c1677b9070-kube-api-access-7n7j5\") on node \"crc\" DevicePath \"\"" Feb 20 08:51:48 crc kubenswrapper[5094]: I0220 08:51:48.824977 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" event={"ID":"77007c08-6c58-4a19-9c49-09c1677b9070","Type":"ContainerDied","Data":"2fda3078b24f95f337aee86c4263f656c6c005cd02148d7411e16695a2ec7a86"} Feb 20 08:51:48 crc kubenswrapper[5094]: I0220 08:51:48.825020 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fda3078b24f95f337aee86c4263f656c6c005cd02148d7411e16695a2ec7a86" Feb 20 08:51:48 crc kubenswrapper[5094]: I0220 08:51:48.825052 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz" Feb 20 08:51:56 crc kubenswrapper[5094]: I0220 08:51:56.047893 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-rbwv2"] Feb 20 08:51:56 crc kubenswrapper[5094]: I0220 08:51:56.059936 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-fc68-account-create-update-ctxrq"] Feb 20 08:51:56 crc kubenswrapper[5094]: I0220 08:51:56.076180 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-rbwv2"] Feb 20 08:51:56 crc kubenswrapper[5094]: I0220 08:51:56.090132 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-fc68-account-create-update-ctxrq"] Feb 20 08:51:57 crc kubenswrapper[5094]: I0220 08:51:57.851999 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f3a4acd-5b68-467c-b024-b518d0f4d27e" path="/var/lib/kubelet/pods/0f3a4acd-5b68-467c-b024-b518d0f4d27e/volumes" Feb 20 08:51:57 crc kubenswrapper[5094]: I0220 08:51:57.853184 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8c2373d-6a69-460a-8622-d001dc53efc0" path="/var/lib/kubelet/pods/d8c2373d-6a69-460a-8622-d001dc53efc0/volumes" Feb 20 08:51:58 crc kubenswrapper[5094]: I0220 08:51:58.829224 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nx84l"] Feb 20 08:51:58 crc kubenswrapper[5094]: E0220 08:51:58.829652 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77007c08-6c58-4a19-9c49-09c1677b9070" containerName="extract" Feb 20 08:51:58 crc kubenswrapper[5094]: I0220 08:51:58.829670 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="77007c08-6c58-4a19-9c49-09c1677b9070" containerName="extract" Feb 20 08:51:58 crc kubenswrapper[5094]: E0220 08:51:58.829684 5094 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="77007c08-6c58-4a19-9c49-09c1677b9070" containerName="util" Feb 20 08:51:58 crc kubenswrapper[5094]: I0220 08:51:58.829691 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="77007c08-6c58-4a19-9c49-09c1677b9070" containerName="util" Feb 20 08:51:58 crc kubenswrapper[5094]: E0220 08:51:58.829711 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77007c08-6c58-4a19-9c49-09c1677b9070" containerName="pull" Feb 20 08:51:58 crc kubenswrapper[5094]: I0220 08:51:58.829717 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="77007c08-6c58-4a19-9c49-09c1677b9070" containerName="pull" Feb 20 08:51:58 crc kubenswrapper[5094]: E0220 08:51:58.829738 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" containerName="horizon-log" Feb 20 08:51:58 crc kubenswrapper[5094]: I0220 08:51:58.829744 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" containerName="horizon-log" Feb 20 08:51:58 crc kubenswrapper[5094]: E0220 08:51:58.829753 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" containerName="horizon" Feb 20 08:51:58 crc kubenswrapper[5094]: I0220 08:51:58.829759 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" containerName="horizon" Feb 20 08:51:58 crc kubenswrapper[5094]: I0220 08:51:58.829950 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="77007c08-6c58-4a19-9c49-09c1677b9070" containerName="extract" Feb 20 08:51:58 crc kubenswrapper[5094]: I0220 08:51:58.829965 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" containerName="horizon-log" Feb 20 08:51:58 crc kubenswrapper[5094]: I0220 08:51:58.829977 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eb2c0e1-59eb-4f7a-aeea-8965a35d861c" containerName="horizon" Feb 20 08:51:58 
crc kubenswrapper[5094]: I0220 08:51:58.831224 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:51:58 crc kubenswrapper[5094]: I0220 08:51:58.844899 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nx84l"] Feb 20 08:51:58 crc kubenswrapper[5094]: I0220 08:51:58.980071 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0394b5a4-125e-479d-b699-d9bd69bf812f-catalog-content\") pod \"certified-operators-nx84l\" (UID: \"0394b5a4-125e-479d-b699-d9bd69bf812f\") " pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:51:58 crc kubenswrapper[5094]: I0220 08:51:58.980502 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0394b5a4-125e-479d-b699-d9bd69bf812f-utilities\") pod \"certified-operators-nx84l\" (UID: \"0394b5a4-125e-479d-b699-d9bd69bf812f\") " pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:51:58 crc kubenswrapper[5094]: I0220 08:51:58.980549 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxwtt\" (UniqueName: \"kubernetes.io/projected/0394b5a4-125e-479d-b699-d9bd69bf812f-kube-api-access-fxwtt\") pod \"certified-operators-nx84l\" (UID: \"0394b5a4-125e-479d-b699-d9bd69bf812f\") " pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.082946 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0394b5a4-125e-479d-b699-d9bd69bf812f-catalog-content\") pod \"certified-operators-nx84l\" (UID: \"0394b5a4-125e-479d-b699-d9bd69bf812f\") " pod="openshift-marketplace/certified-operators-nx84l" Feb 
20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.083065 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0394b5a4-125e-479d-b699-d9bd69bf812f-utilities\") pod \"certified-operators-nx84l\" (UID: \"0394b5a4-125e-479d-b699-d9bd69bf812f\") " pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.083119 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxwtt\" (UniqueName: \"kubernetes.io/projected/0394b5a4-125e-479d-b699-d9bd69bf812f-kube-api-access-fxwtt\") pod \"certified-operators-nx84l\" (UID: \"0394b5a4-125e-479d-b699-d9bd69bf812f\") " pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.084239 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0394b5a4-125e-479d-b699-d9bd69bf812f-catalog-content\") pod \"certified-operators-nx84l\" (UID: \"0394b5a4-125e-479d-b699-d9bd69bf812f\") " pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.084525 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0394b5a4-125e-479d-b699-d9bd69bf812f-utilities\") pod \"certified-operators-nx84l\" (UID: \"0394b5a4-125e-479d-b699-d9bd69bf812f\") " pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.118031 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxwtt\" (UniqueName: \"kubernetes.io/projected/0394b5a4-125e-479d-b699-d9bd69bf812f-kube-api-access-fxwtt\") pod \"certified-operators-nx84l\" (UID: \"0394b5a4-125e-479d-b699-d9bd69bf812f\") " pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:51:59 crc kubenswrapper[5094]: 
I0220 08:51:59.155697 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.348815 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-kxcc8"] Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.350759 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kxcc8" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.355612 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-6fs65" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.355888 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.356001 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.358698 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-kxcc8"] Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.403770 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9mzr\" (UniqueName: \"kubernetes.io/projected/5a9736b1-aca8-4880-9d94-2d7c37efce50-kube-api-access-g9mzr\") pod \"obo-prometheus-operator-68bc856cb9-kxcc8\" (UID: \"5a9736b1-aca8-4880-9d94-2d7c37efce50\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kxcc8" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.505402 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9mzr\" (UniqueName: 
\"kubernetes.io/projected/5a9736b1-aca8-4880-9d94-2d7c37efce50-kube-api-access-g9mzr\") pod \"obo-prometheus-operator-68bc856cb9-kxcc8\" (UID: \"5a9736b1-aca8-4880-9d94-2d7c37efce50\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kxcc8" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.559094 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-47vjf"] Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.560749 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-47vjf" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.574877 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-d7v85"] Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.576076 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-d7v85" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.577110 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9mzr\" (UniqueName: \"kubernetes.io/projected/5a9736b1-aca8-4880-9d94-2d7c37efce50-kube-api-access-g9mzr\") pod \"obo-prometheus-operator-68bc856cb9-kxcc8\" (UID: \"5a9736b1-aca8-4880-9d94-2d7c37efce50\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kxcc8" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.587085 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-qvgrr" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.587399 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 20 08:51:59 crc kubenswrapper[5094]: 
I0220 08:51:59.606657 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b0fb9831-f265-4976-9a1d-14ed3e08daf5-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-56764c7d84-47vjf\" (UID: \"b0fb9831-f265-4976-9a1d-14ed3e08daf5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-47vjf" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.606789 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c70d95ea-5321-43fa-8df8-6d1138f0a732-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-56764c7d84-d7v85\" (UID: \"c70d95ea-5321-43fa-8df8-6d1138f0a732\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-d7v85" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.606834 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c70d95ea-5321-43fa-8df8-6d1138f0a732-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-56764c7d84-d7v85\" (UID: \"c70d95ea-5321-43fa-8df8-6d1138f0a732\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-d7v85" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.606863 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b0fb9831-f265-4976-9a1d-14ed3e08daf5-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-56764c7d84-47vjf\" (UID: \"b0fb9831-f265-4976-9a1d-14ed3e08daf5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-47vjf" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.629155 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-47vjf"] Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.683264 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kxcc8" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.707554 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b0fb9831-f265-4976-9a1d-14ed3e08daf5-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-56764c7d84-47vjf\" (UID: \"b0fb9831-f265-4976-9a1d-14ed3e08daf5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-47vjf" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.707661 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c70d95ea-5321-43fa-8df8-6d1138f0a732-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-56764c7d84-d7v85\" (UID: \"c70d95ea-5321-43fa-8df8-6d1138f0a732\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-d7v85" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.707730 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c70d95ea-5321-43fa-8df8-6d1138f0a732-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-56764c7d84-d7v85\" (UID: \"c70d95ea-5321-43fa-8df8-6d1138f0a732\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-d7v85" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.707781 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b0fb9831-f265-4976-9a1d-14ed3e08daf5-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-56764c7d84-47vjf\" (UID: 
\"b0fb9831-f265-4976-9a1d-14ed3e08daf5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-47vjf" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.717536 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b0fb9831-f265-4976-9a1d-14ed3e08daf5-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-56764c7d84-47vjf\" (UID: \"b0fb9831-f265-4976-9a1d-14ed3e08daf5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-47vjf" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.728243 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c70d95ea-5321-43fa-8df8-6d1138f0a732-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-56764c7d84-d7v85\" (UID: \"c70d95ea-5321-43fa-8df8-6d1138f0a732\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-d7v85" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.729816 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-d7v85"] Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.733221 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c70d95ea-5321-43fa-8df8-6d1138f0a732-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-56764c7d84-d7v85\" (UID: \"c70d95ea-5321-43fa-8df8-6d1138f0a732\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-d7v85" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.733597 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b0fb9831-f265-4976-9a1d-14ed3e08daf5-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-56764c7d84-47vjf\" (UID: 
\"b0fb9831-f265-4976-9a1d-14ed3e08daf5\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-47vjf" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.792451 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-dcm8l"] Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.794113 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-dcm8l" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.797976 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-jnhxq" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.798151 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.813584 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/724c1050-e6d7-49c3-8b63-a89a3de26894-observability-operator-tls\") pod \"observability-operator-59bdc8b94-dcm8l\" (UID: \"724c1050-e6d7-49c3-8b63-a89a3de26894\") " pod="openshift-operators/observability-operator-59bdc8b94-dcm8l" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.813661 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7qzw\" (UniqueName: \"kubernetes.io/projected/724c1050-e6d7-49c3-8b63-a89a3de26894-kube-api-access-p7qzw\") pod \"observability-operator-59bdc8b94-dcm8l\" (UID: \"724c1050-e6d7-49c3-8b63-a89a3de26894\") " pod="openshift-operators/observability-operator-59bdc8b94-dcm8l" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.829846 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-dcm8l"] Feb 20 
08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.902195 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-9hzgx"] Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.903571 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-9hzgx" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.908110 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-p4tqg" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.910553 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-9hzgx"] Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.916236 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vf9p\" (UniqueName: \"kubernetes.io/projected/1ab531ae-b53c-4de1-b927-ca32c159c244-kube-api-access-8vf9p\") pod \"perses-operator-5bf474d74f-9hzgx\" (UID: \"1ab531ae-b53c-4de1-b927-ca32c159c244\") " pod="openshift-operators/perses-operator-5bf474d74f-9hzgx" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.916361 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/724c1050-e6d7-49c3-8b63-a89a3de26894-observability-operator-tls\") pod \"observability-operator-59bdc8b94-dcm8l\" (UID: \"724c1050-e6d7-49c3-8b63-a89a3de26894\") " pod="openshift-operators/observability-operator-59bdc8b94-dcm8l" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.916404 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7qzw\" (UniqueName: \"kubernetes.io/projected/724c1050-e6d7-49c3-8b63-a89a3de26894-kube-api-access-p7qzw\") pod \"observability-operator-59bdc8b94-dcm8l\" (UID: \"724c1050-e6d7-49c3-8b63-a89a3de26894\") " 
pod="openshift-operators/observability-operator-59bdc8b94-dcm8l" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.916457 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1ab531ae-b53c-4de1-b927-ca32c159c244-openshift-service-ca\") pod \"perses-operator-5bf474d74f-9hzgx\" (UID: \"1ab531ae-b53c-4de1-b927-ca32c159c244\") " pod="openshift-operators/perses-operator-5bf474d74f-9hzgx" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.920374 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/724c1050-e6d7-49c3-8b63-a89a3de26894-observability-operator-tls\") pod \"observability-operator-59bdc8b94-dcm8l\" (UID: \"724c1050-e6d7-49c3-8b63-a89a3de26894\") " pod="openshift-operators/observability-operator-59bdc8b94-dcm8l" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.933385 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-47vjf" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.949123 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7qzw\" (UniqueName: \"kubernetes.io/projected/724c1050-e6d7-49c3-8b63-a89a3de26894-kube-api-access-p7qzw\") pod \"observability-operator-59bdc8b94-dcm8l\" (UID: \"724c1050-e6d7-49c3-8b63-a89a3de26894\") " pod="openshift-operators/observability-operator-59bdc8b94-dcm8l" Feb 20 08:51:59 crc kubenswrapper[5094]: I0220 08:51:59.967368 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-d7v85" Feb 20 08:52:00 crc kubenswrapper[5094]: I0220 08:52:00.019427 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1ab531ae-b53c-4de1-b927-ca32c159c244-openshift-service-ca\") pod \"perses-operator-5bf474d74f-9hzgx\" (UID: \"1ab531ae-b53c-4de1-b927-ca32c159c244\") " pod="openshift-operators/perses-operator-5bf474d74f-9hzgx" Feb 20 08:52:00 crc kubenswrapper[5094]: I0220 08:52:00.019541 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vf9p\" (UniqueName: \"kubernetes.io/projected/1ab531ae-b53c-4de1-b927-ca32c159c244-kube-api-access-8vf9p\") pod \"perses-operator-5bf474d74f-9hzgx\" (UID: \"1ab531ae-b53c-4de1-b927-ca32c159c244\") " pod="openshift-operators/perses-operator-5bf474d74f-9hzgx" Feb 20 08:52:00 crc kubenswrapper[5094]: I0220 08:52:00.020750 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/1ab531ae-b53c-4de1-b927-ca32c159c244-openshift-service-ca\") pod \"perses-operator-5bf474d74f-9hzgx\" (UID: \"1ab531ae-b53c-4de1-b927-ca32c159c244\") " pod="openshift-operators/perses-operator-5bf474d74f-9hzgx" Feb 20 08:52:00 crc kubenswrapper[5094]: I0220 08:52:00.048649 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vf9p\" (UniqueName: \"kubernetes.io/projected/1ab531ae-b53c-4de1-b927-ca32c159c244-kube-api-access-8vf9p\") pod \"perses-operator-5bf474d74f-9hzgx\" (UID: \"1ab531ae-b53c-4de1-b927-ca32c159c244\") " pod="openshift-operators/perses-operator-5bf474d74f-9hzgx" Feb 20 08:52:00 crc kubenswrapper[5094]: I0220 08:52:00.074196 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nx84l"] Feb 20 08:52:00 crc kubenswrapper[5094]: I0220 
08:52:00.132313 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-dcm8l" Feb 20 08:52:00 crc kubenswrapper[5094]: I0220 08:52:00.250146 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-9hzgx" Feb 20 08:52:00 crc kubenswrapper[5094]: I0220 08:52:00.430468 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-kxcc8"] Feb 20 08:52:00 crc kubenswrapper[5094]: W0220 08:52:00.456313 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a9736b1_aca8_4880_9d94_2d7c37efce50.slice/crio-0425fc7f6b5bdc103dfedc7ccddf9bf765c34c44f2b4b044230b9c903ac7a82e WatchSource:0}: Error finding container 0425fc7f6b5bdc103dfedc7ccddf9bf765c34c44f2b4b044230b9c903ac7a82e: Status 404 returned error can't find the container with id 0425fc7f6b5bdc103dfedc7ccddf9bf765c34c44f2b4b044230b9c903ac7a82e Feb 20 08:52:00 crc kubenswrapper[5094]: I0220 08:52:00.794608 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-d7v85"] Feb 20 08:52:00 crc kubenswrapper[5094]: I0220 08:52:00.860494 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-47vjf"] Feb 20 08:52:00 crc kubenswrapper[5094]: I0220 08:52:00.954370 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-dcm8l"] Feb 20 08:52:00 crc kubenswrapper[5094]: W0220 08:52:00.968452 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod724c1050_e6d7_49c3_8b63_a89a3de26894.slice/crio-5af3d6f292cfabc9ca13a35792ca5350ce7027d9b7ba0dc7d8892d027791fc4c WatchSource:0}: 
Error finding container 5af3d6f292cfabc9ca13a35792ca5350ce7027d9b7ba0dc7d8892d027791fc4c: Status 404 returned error can't find the container with id 5af3d6f292cfabc9ca13a35792ca5350ce7027d9b7ba0dc7d8892d027791fc4c Feb 20 08:52:01 crc kubenswrapper[5094]: I0220 08:52:01.010504 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kxcc8" event={"ID":"5a9736b1-aca8-4880-9d94-2d7c37efce50","Type":"ContainerStarted","Data":"0425fc7f6b5bdc103dfedc7ccddf9bf765c34c44f2b4b044230b9c903ac7a82e"} Feb 20 08:52:01 crc kubenswrapper[5094]: I0220 08:52:01.016085 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-d7v85" event={"ID":"c70d95ea-5321-43fa-8df8-6d1138f0a732","Type":"ContainerStarted","Data":"054684d1c1fc5b91f1db72fb93c700a235c03254e2cd71a609d2b1a8cace63d7"} Feb 20 08:52:01 crc kubenswrapper[5094]: I0220 08:52:01.024653 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-9hzgx"] Feb 20 08:52:01 crc kubenswrapper[5094]: I0220 08:52:01.026060 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-47vjf" event={"ID":"b0fb9831-f265-4976-9a1d-14ed3e08daf5","Type":"ContainerStarted","Data":"5b7fa806041218d240f6e687f2f2ae05330fa4bf1ad31b3f4d2f760ab8e97838"} Feb 20 08:52:01 crc kubenswrapper[5094]: I0220 08:52:01.031409 5094 generic.go:334] "Generic (PLEG): container finished" podID="0394b5a4-125e-479d-b699-d9bd69bf812f" containerID="7efb0ca9f17319c818fe9f84b6b2ebce31cd83d827cfa03850f26ab199e47cc2" exitCode=0 Feb 20 08:52:01 crc kubenswrapper[5094]: I0220 08:52:01.031461 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nx84l" 
event={"ID":"0394b5a4-125e-479d-b699-d9bd69bf812f","Type":"ContainerDied","Data":"7efb0ca9f17319c818fe9f84b6b2ebce31cd83d827cfa03850f26ab199e47cc2"} Feb 20 08:52:01 crc kubenswrapper[5094]: I0220 08:52:01.031493 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nx84l" event={"ID":"0394b5a4-125e-479d-b699-d9bd69bf812f","Type":"ContainerStarted","Data":"1df1d0ed97e3ac7409ed3dcedbc351adad4489c377b93acbbc87e23e667f53ac"} Feb 20 08:52:02 crc kubenswrapper[5094]: I0220 08:52:02.066830 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-dcm8l" event={"ID":"724c1050-e6d7-49c3-8b63-a89a3de26894","Type":"ContainerStarted","Data":"5af3d6f292cfabc9ca13a35792ca5350ce7027d9b7ba0dc7d8892d027791fc4c"} Feb 20 08:52:02 crc kubenswrapper[5094]: I0220 08:52:02.075879 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nx84l" event={"ID":"0394b5a4-125e-479d-b699-d9bd69bf812f","Type":"ContainerStarted","Data":"af131ecba7c2b936069eb198796d5d9b2fc1ea0d244e549b881018f904406df8"} Feb 20 08:52:02 crc kubenswrapper[5094]: I0220 08:52:02.085130 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-9hzgx" event={"ID":"1ab531ae-b53c-4de1-b927-ca32c159c244","Type":"ContainerStarted","Data":"9d5eba9e678dc50f7e4465f743bd52d6d4a633dcd87ade03b99eb12c677c9d5f"} Feb 20 08:52:03 crc kubenswrapper[5094]: I0220 08:52:03.123260 5094 generic.go:334] "Generic (PLEG): container finished" podID="0394b5a4-125e-479d-b699-d9bd69bf812f" containerID="af131ecba7c2b936069eb198796d5d9b2fc1ea0d244e549b881018f904406df8" exitCode=0 Feb 20 08:52:03 crc kubenswrapper[5094]: I0220 08:52:03.123315 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nx84l" 
event={"ID":"0394b5a4-125e-479d-b699-d9bd69bf812f","Type":"ContainerDied","Data":"af131ecba7c2b936069eb198796d5d9b2fc1ea0d244e549b881018f904406df8"} Feb 20 08:52:04 crc kubenswrapper[5094]: I0220 08:52:04.106344 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 08:52:04 crc kubenswrapper[5094]: I0220 08:52:04.106808 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 08:52:04 crc kubenswrapper[5094]: I0220 08:52:04.106848 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 08:52:04 crc kubenswrapper[5094]: I0220 08:52:04.107570 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e8403acee0c31e397e6e5d741268e12e6725d137dadeeb1b9238f72fcf352268"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 08:52:04 crc kubenswrapper[5094]: I0220 08:52:04.107622 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://e8403acee0c31e397e6e5d741268e12e6725d137dadeeb1b9238f72fcf352268" gracePeriod=600 Feb 20 08:52:04 crc kubenswrapper[5094]: I0220 08:52:04.143952 5094 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nx84l" event={"ID":"0394b5a4-125e-479d-b699-d9bd69bf812f","Type":"ContainerStarted","Data":"63c8ab29f73793fd740bde02158cc28b743c87e1cd91233ed13edda13a34d972"} Feb 20 08:52:04 crc kubenswrapper[5094]: I0220 08:52:04.168982 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nx84l" podStartSLOduration=3.572531461 podStartE2EDuration="6.168963911s" podCreationTimestamp="2026-02-20 08:51:58 +0000 UTC" firstStartedPulling="2026-02-20 08:52:01.032858611 +0000 UTC m=+7535.905485322" lastFinishedPulling="2026-02-20 08:52:03.629291061 +0000 UTC m=+7538.501917772" observedRunningTime="2026-02-20 08:52:04.161659485 +0000 UTC m=+7539.034286196" watchObservedRunningTime="2026-02-20 08:52:04.168963911 +0000 UTC m=+7539.041590612" Feb 20 08:52:05 crc kubenswrapper[5094]: I0220 08:52:05.037263 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-j7lxk"] Feb 20 08:52:05 crc kubenswrapper[5094]: I0220 08:52:05.049875 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-j7lxk"] Feb 20 08:52:05 crc kubenswrapper[5094]: I0220 08:52:05.268461 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="e8403acee0c31e397e6e5d741268e12e6725d137dadeeb1b9238f72fcf352268" exitCode=0 Feb 20 08:52:05 crc kubenswrapper[5094]: I0220 08:52:05.268964 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"e8403acee0c31e397e6e5d741268e12e6725d137dadeeb1b9238f72fcf352268"} Feb 20 08:52:05 crc kubenswrapper[5094]: I0220 08:52:05.269022 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c"} Feb 20 08:52:05 crc kubenswrapper[5094]: I0220 08:52:05.269046 5094 scope.go:117] "RemoveContainer" containerID="f0c83f098c8b80a5dca3990c77dcb28ca86dc20cfa265b37bc9d16d6c922f7ec" Feb 20 08:52:05 crc kubenswrapper[5094]: I0220 08:52:05.862032 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcbd09e1-8a1b-468e-9238-0691cafda43e" path="/var/lib/kubelet/pods/bcbd09e1-8a1b-468e-9238-0691cafda43e/volumes" Feb 20 08:52:09 crc kubenswrapper[5094]: I0220 08:52:09.157021 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:52:09 crc kubenswrapper[5094]: I0220 08:52:09.157472 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:52:10 crc kubenswrapper[5094]: I0220 08:52:10.204960 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-nx84l" podUID="0394b5a4-125e-479d-b699-d9bd69bf812f" containerName="registry-server" probeResult="failure" output=< Feb 20 08:52:10 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 08:52:10 crc kubenswrapper[5094]: > Feb 20 08:52:17 crc kubenswrapper[5094]: I0220 08:52:17.439627 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-9hzgx" event={"ID":"1ab531ae-b53c-4de1-b927-ca32c159c244","Type":"ContainerStarted","Data":"a6aeeebe3452765baf4fc4df8f2da74b6f5cfde3a173f15eb09905e67cdca1ea"} Feb 20 08:52:17 crc kubenswrapper[5094]: I0220 08:52:17.440532 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-9hzgx" Feb 20 08:52:17 crc kubenswrapper[5094]: I0220 08:52:17.442810 5094 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-d7v85" event={"ID":"c70d95ea-5321-43fa-8df8-6d1138f0a732","Type":"ContainerStarted","Data":"d6ae85af8c0ad22bb7579b710f16c4e76e31031ca21f693d7ff02d09c3d3c194"} Feb 20 08:52:17 crc kubenswrapper[5094]: I0220 08:52:17.445260 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-dcm8l" event={"ID":"724c1050-e6d7-49c3-8b63-a89a3de26894","Type":"ContainerStarted","Data":"e5816d6a41fbf36dfd4eecd83bdc52eb0b23e5586239f148b01393e04c99d76d"} Feb 20 08:52:17 crc kubenswrapper[5094]: I0220 08:52:17.445535 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-dcm8l" Feb 20 08:52:17 crc kubenswrapper[5094]: I0220 08:52:17.448321 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-dcm8l" Feb 20 08:52:17 crc kubenswrapper[5094]: I0220 08:52:17.449057 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-47vjf" event={"ID":"b0fb9831-f265-4976-9a1d-14ed3e08daf5","Type":"ContainerStarted","Data":"16777c714efd6d0a3b065ff3b29685a6fc83e7a9d28335e6c63bcd1571c3c03e"} Feb 20 08:52:17 crc kubenswrapper[5094]: I0220 08:52:17.452037 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kxcc8" event={"ID":"5a9736b1-aca8-4880-9d94-2d7c37efce50","Type":"ContainerStarted","Data":"95b38ec28736aee248de3d23f584738f59fdb508c5b2fd50abe3103583bfa3f8"} Feb 20 08:52:17 crc kubenswrapper[5094]: I0220 08:52:17.498925 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-9hzgx" podStartSLOduration=3.30291945 podStartE2EDuration="18.498908332s" podCreationTimestamp="2026-02-20 08:51:59 +0000 
UTC" firstStartedPulling="2026-02-20 08:52:01.026011358 +0000 UTC m=+7535.898638069" lastFinishedPulling="2026-02-20 08:52:16.22200024 +0000 UTC m=+7551.094626951" observedRunningTime="2026-02-20 08:52:17.493882331 +0000 UTC m=+7552.366509042" watchObservedRunningTime="2026-02-20 08:52:17.498908332 +0000 UTC m=+7552.371535043" Feb 20 08:52:17 crc kubenswrapper[5094]: I0220 08:52:17.530376 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-dcm8l" podStartSLOduration=3.379313426 podStartE2EDuration="18.530357218s" podCreationTimestamp="2026-02-20 08:51:59 +0000 UTC" firstStartedPulling="2026-02-20 08:52:01.018905966 +0000 UTC m=+7535.891532667" lastFinishedPulling="2026-02-20 08:52:16.169949748 +0000 UTC m=+7551.042576459" observedRunningTime="2026-02-20 08:52:17.52211539 +0000 UTC m=+7552.394742101" watchObservedRunningTime="2026-02-20 08:52:17.530357218 +0000 UTC m=+7552.402983929" Feb 20 08:52:17 crc kubenswrapper[5094]: I0220 08:52:17.548357 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-d7v85" podStartSLOduration=3.195497476 podStartE2EDuration="18.548340161s" podCreationTimestamp="2026-02-20 08:51:59 +0000 UTC" firstStartedPulling="2026-02-20 08:52:00.802921041 +0000 UTC m=+7535.675547752" lastFinishedPulling="2026-02-20 08:52:16.155763726 +0000 UTC m=+7551.028390437" observedRunningTime="2026-02-20 08:52:17.545229496 +0000 UTC m=+7552.417856207" watchObservedRunningTime="2026-02-20 08:52:17.548340161 +0000 UTC m=+7552.420966872" Feb 20 08:52:17 crc kubenswrapper[5094]: I0220 08:52:17.645176 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-kxcc8" podStartSLOduration=2.933400952 podStartE2EDuration="18.645145069s" podCreationTimestamp="2026-02-20 08:51:59 +0000 UTC" firstStartedPulling="2026-02-20 
08:52:00.45902244 +0000 UTC m=+7535.331649151" lastFinishedPulling="2026-02-20 08:52:16.170766557 +0000 UTC m=+7551.043393268" observedRunningTime="2026-02-20 08:52:17.60859234 +0000 UTC m=+7552.481219051" watchObservedRunningTime="2026-02-20 08:52:17.645145069 +0000 UTC m=+7552.517771780" Feb 20 08:52:17 crc kubenswrapper[5094]: I0220 08:52:17.681425 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-56764c7d84-47vjf" podStartSLOduration=3.413205961 podStartE2EDuration="18.681401821s" podCreationTimestamp="2026-02-20 08:51:59 +0000 UTC" firstStartedPulling="2026-02-20 08:52:00.86485021 +0000 UTC m=+7535.737476911" lastFinishedPulling="2026-02-20 08:52:16.13304606 +0000 UTC m=+7551.005672771" observedRunningTime="2026-02-20 08:52:17.639462042 +0000 UTC m=+7552.512088753" watchObservedRunningTime="2026-02-20 08:52:17.681401821 +0000 UTC m=+7552.554028532" Feb 20 08:52:20 crc kubenswrapper[5094]: I0220 08:52:20.203436 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-nx84l" podUID="0394b5a4-125e-479d-b699-d9bd69bf812f" containerName="registry-server" probeResult="failure" output=< Feb 20 08:52:20 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 08:52:20 crc kubenswrapper[5094]: > Feb 20 08:52:29 crc kubenswrapper[5094]: I0220 08:52:29.226283 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:52:29 crc kubenswrapper[5094]: I0220 08:52:29.282082 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:52:30 crc kubenswrapper[5094]: I0220 08:52:30.034951 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nx84l"] Feb 20 08:52:30 crc kubenswrapper[5094]: I0220 08:52:30.253637 5094 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-9hzgx" Feb 20 08:52:30 crc kubenswrapper[5094]: I0220 08:52:30.560452 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nx84l" podUID="0394b5a4-125e-479d-b699-d9bd69bf812f" containerName="registry-server" containerID="cri-o://63c8ab29f73793fd740bde02158cc28b743c87e1cd91233ed13edda13a34d972" gracePeriod=2 Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.229977 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.388634 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxwtt\" (UniqueName: \"kubernetes.io/projected/0394b5a4-125e-479d-b699-d9bd69bf812f-kube-api-access-fxwtt\") pod \"0394b5a4-125e-479d-b699-d9bd69bf812f\" (UID: \"0394b5a4-125e-479d-b699-d9bd69bf812f\") " Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.388689 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0394b5a4-125e-479d-b699-d9bd69bf812f-catalog-content\") pod \"0394b5a4-125e-479d-b699-d9bd69bf812f\" (UID: \"0394b5a4-125e-479d-b699-d9bd69bf812f\") " Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.388740 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0394b5a4-125e-479d-b699-d9bd69bf812f-utilities\") pod \"0394b5a4-125e-479d-b699-d9bd69bf812f\" (UID: \"0394b5a4-125e-479d-b699-d9bd69bf812f\") " Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.389339 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0394b5a4-125e-479d-b699-d9bd69bf812f-utilities" 
(OuterVolumeSpecName: "utilities") pod "0394b5a4-125e-479d-b699-d9bd69bf812f" (UID: "0394b5a4-125e-479d-b699-d9bd69bf812f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.395435 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0394b5a4-125e-479d-b699-d9bd69bf812f-kube-api-access-fxwtt" (OuterVolumeSpecName: "kube-api-access-fxwtt") pod "0394b5a4-125e-479d-b699-d9bd69bf812f" (UID: "0394b5a4-125e-479d-b699-d9bd69bf812f"). InnerVolumeSpecName "kube-api-access-fxwtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.439225 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0394b5a4-125e-479d-b699-d9bd69bf812f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0394b5a4-125e-479d-b699-d9bd69bf812f" (UID: "0394b5a4-125e-479d-b699-d9bd69bf812f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.491860 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxwtt\" (UniqueName: \"kubernetes.io/projected/0394b5a4-125e-479d-b699-d9bd69bf812f-kube-api-access-fxwtt\") on node \"crc\" DevicePath \"\"" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.491902 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0394b5a4-125e-479d-b699-d9bd69bf812f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.491916 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0394b5a4-125e-479d-b699-d9bd69bf812f-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.569571 5094 generic.go:334] "Generic (PLEG): container finished" podID="0394b5a4-125e-479d-b699-d9bd69bf812f" containerID="63c8ab29f73793fd740bde02158cc28b743c87e1cd91233ed13edda13a34d972" exitCode=0 Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.569616 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nx84l" event={"ID":"0394b5a4-125e-479d-b699-d9bd69bf812f","Type":"ContainerDied","Data":"63c8ab29f73793fd740bde02158cc28b743c87e1cd91233ed13edda13a34d972"} Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.569645 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nx84l" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.569669 5094 scope.go:117] "RemoveContainer" containerID="63c8ab29f73793fd740bde02158cc28b743c87e1cd91233ed13edda13a34d972" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.569657 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nx84l" event={"ID":"0394b5a4-125e-479d-b699-d9bd69bf812f","Type":"ContainerDied","Data":"1df1d0ed97e3ac7409ed3dcedbc351adad4489c377b93acbbc87e23e667f53ac"} Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.615966 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nx84l"] Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.619143 5094 scope.go:117] "RemoveContainer" containerID="af131ecba7c2b936069eb198796d5d9b2fc1ea0d244e549b881018f904406df8" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.630858 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nx84l"] Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.652881 5094 scope.go:117] "RemoveContainer" containerID="7efb0ca9f17319c818fe9f84b6b2ebce31cd83d827cfa03850f26ab199e47cc2" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.704456 5094 scope.go:117] "RemoveContainer" containerID="63c8ab29f73793fd740bde02158cc28b743c87e1cd91233ed13edda13a34d972" Feb 20 08:52:31 crc kubenswrapper[5094]: E0220 08:52:31.705040 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63c8ab29f73793fd740bde02158cc28b743c87e1cd91233ed13edda13a34d972\": container with ID starting with 63c8ab29f73793fd740bde02158cc28b743c87e1cd91233ed13edda13a34d972 not found: ID does not exist" containerID="63c8ab29f73793fd740bde02158cc28b743c87e1cd91233ed13edda13a34d972" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.705089 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63c8ab29f73793fd740bde02158cc28b743c87e1cd91233ed13edda13a34d972"} err="failed to get container status \"63c8ab29f73793fd740bde02158cc28b743c87e1cd91233ed13edda13a34d972\": rpc error: code = NotFound desc = could not find container \"63c8ab29f73793fd740bde02158cc28b743c87e1cd91233ed13edda13a34d972\": container with ID starting with 63c8ab29f73793fd740bde02158cc28b743c87e1cd91233ed13edda13a34d972 not found: ID does not exist" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.705110 5094 scope.go:117] "RemoveContainer" containerID="af131ecba7c2b936069eb198796d5d9b2fc1ea0d244e549b881018f904406df8" Feb 20 08:52:31 crc kubenswrapper[5094]: E0220 08:52:31.705526 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af131ecba7c2b936069eb198796d5d9b2fc1ea0d244e549b881018f904406df8\": container with ID starting with af131ecba7c2b936069eb198796d5d9b2fc1ea0d244e549b881018f904406df8 not found: ID does not exist" containerID="af131ecba7c2b936069eb198796d5d9b2fc1ea0d244e549b881018f904406df8" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.705567 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af131ecba7c2b936069eb198796d5d9b2fc1ea0d244e549b881018f904406df8"} err="failed to get container status \"af131ecba7c2b936069eb198796d5d9b2fc1ea0d244e549b881018f904406df8\": rpc error: code = NotFound desc = could not find container \"af131ecba7c2b936069eb198796d5d9b2fc1ea0d244e549b881018f904406df8\": container with ID starting with af131ecba7c2b936069eb198796d5d9b2fc1ea0d244e549b881018f904406df8 not found: ID does not exist" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.705591 5094 scope.go:117] "RemoveContainer" containerID="7efb0ca9f17319c818fe9f84b6b2ebce31cd83d827cfa03850f26ab199e47cc2" Feb 20 08:52:31 crc kubenswrapper[5094]: E0220 
08:52:31.705894 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7efb0ca9f17319c818fe9f84b6b2ebce31cd83d827cfa03850f26ab199e47cc2\": container with ID starting with 7efb0ca9f17319c818fe9f84b6b2ebce31cd83d827cfa03850f26ab199e47cc2 not found: ID does not exist" containerID="7efb0ca9f17319c818fe9f84b6b2ebce31cd83d827cfa03850f26ab199e47cc2" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.705940 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7efb0ca9f17319c818fe9f84b6b2ebce31cd83d827cfa03850f26ab199e47cc2"} err="failed to get container status \"7efb0ca9f17319c818fe9f84b6b2ebce31cd83d827cfa03850f26ab199e47cc2\": rpc error: code = NotFound desc = could not find container \"7efb0ca9f17319c818fe9f84b6b2ebce31cd83d827cfa03850f26ab199e47cc2\": container with ID starting with 7efb0ca9f17319c818fe9f84b6b2ebce31cd83d827cfa03850f26ab199e47cc2 not found: ID does not exist" Feb 20 08:52:31 crc kubenswrapper[5094]: I0220 08:52:31.854973 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0394b5a4-125e-479d-b699-d9bd69bf812f" path="/var/lib/kubelet/pods/0394b5a4-125e-479d-b699-d9bd69bf812f/volumes" Feb 20 08:52:32 crc kubenswrapper[5094]: I0220 08:52:32.864481 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 20 08:52:32 crc kubenswrapper[5094]: I0220 08:52:32.865836 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="b4afe958-0e78-49e9-b05a-08ff4c42f602" containerName="openstackclient" containerID="cri-o://cb4acd794ebf0ed75ac1a7120cae6e99da13e366872ae910d88b143eaec6e181" gracePeriod=2 Feb 20 08:52:32 crc kubenswrapper[5094]: I0220 08:52:32.872660 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 20 08:52:32 crc kubenswrapper[5094]: I0220 08:52:32.941689 5094 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 20 08:52:32 crc kubenswrapper[5094]: E0220 08:52:32.942478 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0394b5a4-125e-479d-b699-d9bd69bf812f" containerName="registry-server" Feb 20 08:52:32 crc kubenswrapper[5094]: I0220 08:52:32.942552 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0394b5a4-125e-479d-b699-d9bd69bf812f" containerName="registry-server" Feb 20 08:52:32 crc kubenswrapper[5094]: E0220 08:52:32.942633 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0394b5a4-125e-479d-b699-d9bd69bf812f" containerName="extract-content" Feb 20 08:52:32 crc kubenswrapper[5094]: I0220 08:52:32.942689 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0394b5a4-125e-479d-b699-d9bd69bf812f" containerName="extract-content" Feb 20 08:52:32 crc kubenswrapper[5094]: E0220 08:52:32.942778 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4afe958-0e78-49e9-b05a-08ff4c42f602" containerName="openstackclient" Feb 20 08:52:32 crc kubenswrapper[5094]: I0220 08:52:32.942953 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4afe958-0e78-49e9-b05a-08ff4c42f602" containerName="openstackclient" Feb 20 08:52:32 crc kubenswrapper[5094]: E0220 08:52:32.943020 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0394b5a4-125e-479d-b699-d9bd69bf812f" containerName="extract-utilities" Feb 20 08:52:32 crc kubenswrapper[5094]: I0220 08:52:32.943078 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0394b5a4-125e-479d-b699-d9bd69bf812f" containerName="extract-utilities" Feb 20 08:52:32 crc kubenswrapper[5094]: I0220 08:52:32.943312 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4afe958-0e78-49e9-b05a-08ff4c42f602" containerName="openstackclient" Feb 20 08:52:32 crc kubenswrapper[5094]: I0220 08:52:32.943397 5094 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0394b5a4-125e-479d-b699-d9bd69bf812f" containerName="registry-server" Feb 20 08:52:32 crc kubenswrapper[5094]: I0220 08:52:32.944071 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 20 08:52:32 crc kubenswrapper[5094]: I0220 08:52:32.963363 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 20 08:52:32 crc kubenswrapper[5094]: I0220 08:52:32.978949 5094 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="b4afe958-0e78-49e9-b05a-08ff4c42f602" podUID="3c21f8d0-ca22-4206-9cdf-26edee70eac2" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.120947 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3c21f8d0-ca22-4206-9cdf-26edee70eac2-openstack-config-secret\") pod \"openstackclient\" (UID: \"3c21f8d0-ca22-4206-9cdf-26edee70eac2\") " pod="openstack/openstackclient" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.120993 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3c21f8d0-ca22-4206-9cdf-26edee70eac2-openstack-config\") pod \"openstackclient\" (UID: \"3c21f8d0-ca22-4206-9cdf-26edee70eac2\") " pod="openstack/openstackclient" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.121118 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pncd7\" (UniqueName: \"kubernetes.io/projected/3c21f8d0-ca22-4206-9cdf-26edee70eac2-kube-api-access-pncd7\") pod \"openstackclient\" (UID: \"3c21f8d0-ca22-4206-9cdf-26edee70eac2\") " pod="openstack/openstackclient" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.124544 5094 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/kube-state-metrics-0"] Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.125838 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.137014 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-ck746" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.141995 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.224022 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3c21f8d0-ca22-4206-9cdf-26edee70eac2-openstack-config-secret\") pod \"openstackclient\" (UID: \"3c21f8d0-ca22-4206-9cdf-26edee70eac2\") " pod="openstack/openstackclient" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.224072 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3c21f8d0-ca22-4206-9cdf-26edee70eac2-openstack-config\") pod \"openstackclient\" (UID: \"3c21f8d0-ca22-4206-9cdf-26edee70eac2\") " pod="openstack/openstackclient" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.224169 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pncd7\" (UniqueName: \"kubernetes.io/projected/3c21f8d0-ca22-4206-9cdf-26edee70eac2-kube-api-access-pncd7\") pod \"openstackclient\" (UID: \"3c21f8d0-ca22-4206-9cdf-26edee70eac2\") " pod="openstack/openstackclient" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.228356 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3c21f8d0-ca22-4206-9cdf-26edee70eac2-openstack-config\") pod \"openstackclient\" (UID: \"3c21f8d0-ca22-4206-9cdf-26edee70eac2\") 
" pod="openstack/openstackclient" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.250427 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3c21f8d0-ca22-4206-9cdf-26edee70eac2-openstack-config-secret\") pod \"openstackclient\" (UID: \"3c21f8d0-ca22-4206-9cdf-26edee70eac2\") " pod="openstack/openstackclient" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.271134 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pncd7\" (UniqueName: \"kubernetes.io/projected/3c21f8d0-ca22-4206-9cdf-26edee70eac2-kube-api-access-pncd7\") pod \"openstackclient\" (UID: \"3c21f8d0-ca22-4206-9cdf-26edee70eac2\") " pod="openstack/openstackclient" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.326168 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrt96\" (UniqueName: \"kubernetes.io/projected/640e24e6-f89c-45ee-999a-e5aa0816aab2-kube-api-access-hrt96\") pod \"kube-state-metrics-0\" (UID: \"640e24e6-f89c-45ee-999a-e5aa0816aab2\") " pod="openstack/kube-state-metrics-0" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.428084 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrt96\" (UniqueName: \"kubernetes.io/projected/640e24e6-f89c-45ee-999a-e5aa0816aab2-kube-api-access-hrt96\") pod \"kube-state-metrics-0\" (UID: \"640e24e6-f89c-45ee-999a-e5aa0816aab2\") " pod="openstack/kube-state-metrics-0" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.451620 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrt96\" (UniqueName: \"kubernetes.io/projected/640e24e6-f89c-45ee-999a-e5aa0816aab2-kube-api-access-hrt96\") pod \"kube-state-metrics-0\" (UID: \"640e24e6-f89c-45ee-999a-e5aa0816aab2\") " pod="openstack/kube-state-metrics-0" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 
08:52:33.562586 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.749753 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.868865 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.872451 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.875682 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.876203 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-wx2qf" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.876391 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.876626 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.876416 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.885450 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.968496 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/c09fbf6b-1221-4e3d-b29d-6432848a564b-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.968641 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c09fbf6b-1221-4e3d-b29d-6432848a564b-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.968732 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c09fbf6b-1221-4e3d-b29d-6432848a564b-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.968814 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c09fbf6b-1221-4e3d-b29d-6432848a564b-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.968916 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkbk4\" (UniqueName: \"kubernetes.io/projected/c09fbf6b-1221-4e3d-b29d-6432848a564b-kube-api-access-bkbk4\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.968960 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c09fbf6b-1221-4e3d-b29d-6432848a564b-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:33 crc kubenswrapper[5094]: I0220 08:52:33.969001 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/c09fbf6b-1221-4e3d-b29d-6432848a564b-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.070956 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/c09fbf6b-1221-4e3d-b29d-6432848a564b-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.071037 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c09fbf6b-1221-4e3d-b29d-6432848a564b-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.071099 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c09fbf6b-1221-4e3d-b29d-6432848a564b-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.071125 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c09fbf6b-1221-4e3d-b29d-6432848a564b-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.071165 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c09fbf6b-1221-4e3d-b29d-6432848a564b-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.071202 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkbk4\" (UniqueName: \"kubernetes.io/projected/c09fbf6b-1221-4e3d-b29d-6432848a564b-kube-api-access-bkbk4\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.071224 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c09fbf6b-1221-4e3d-b29d-6432848a564b-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.072029 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/c09fbf6b-1221-4e3d-b29d-6432848a564b-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.081512 5094 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c09fbf6b-1221-4e3d-b29d-6432848a564b-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.082911 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c09fbf6b-1221-4e3d-b29d-6432848a564b-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.090240 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c09fbf6b-1221-4e3d-b29d-6432848a564b-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.090675 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c09fbf6b-1221-4e3d-b29d-6432848a564b-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.091415 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c09fbf6b-1221-4e3d-b29d-6432848a564b-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.106719 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkbk4\" (UniqueName: 
\"kubernetes.io/projected/c09fbf6b-1221-4e3d-b29d-6432848a564b-kube-api-access-bkbk4\") pod \"alertmanager-metric-storage-0\" (UID: \"c09fbf6b-1221-4e3d-b29d-6432848a564b\") " pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.245793 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.371501 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.488409 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.492137 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.522172 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.522684 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.522795 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.522833 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.522221 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.522861 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 20 
08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.522984 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.523009 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-65nkr" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.560901 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.604780 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.674402 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"640e24e6-f89c-45ee-999a-e5aa0816aab2","Type":"ContainerStarted","Data":"1608dc353a7cc7eecd85ae0f251fb9d8d8e618c90e96fbbc4fdc692c3ab5e942"} Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.683382 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"3c21f8d0-ca22-4206-9cdf-26edee70eac2","Type":"ContainerStarted","Data":"7efd800e3876958580b04f199acc7c1bd9fc79868ef56b29f76534fff745b5a0"} Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.689664 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfmlj\" (UniqueName: \"kubernetes.io/projected/22516d8a-bb80-405e-8258-01fd733495ef-kube-api-access-xfmlj\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.689723 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/22516d8a-bb80-405e-8258-01fd733495ef-config\") pod 
\"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.689766 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/22516d8a-bb80-405e-8258-01fd733495ef-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.689804 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/22516d8a-bb80-405e-8258-01fd733495ef-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.689839 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/22516d8a-bb80-405e-8258-01fd733495ef-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.689859 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/22516d8a-bb80-405e-8258-01fd733495ef-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.689908 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/22516d8a-bb80-405e-8258-01fd733495ef-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.689946 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/22516d8a-bb80-405e-8258-01fd733495ef-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.689973 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/22516d8a-bb80-405e-8258-01fd733495ef-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.690000 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7adbca82-75fc-4744-beaf-74d298170209\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7adbca82-75fc-4744-beaf-74d298170209\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.792525 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/22516d8a-bb80-405e-8258-01fd733495ef-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc 
kubenswrapper[5094]: I0220 08:52:34.792589 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/22516d8a-bb80-405e-8258-01fd733495ef-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.792618 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7adbca82-75fc-4744-beaf-74d298170209\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7adbca82-75fc-4744-beaf-74d298170209\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.792664 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfmlj\" (UniqueName: \"kubernetes.io/projected/22516d8a-bb80-405e-8258-01fd733495ef-kube-api-access-xfmlj\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.792682 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/22516d8a-bb80-405e-8258-01fd733495ef-config\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.792735 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/22516d8a-bb80-405e-8258-01fd733495ef-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " 
pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.792791 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/22516d8a-bb80-405e-8258-01fd733495ef-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.792823 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/22516d8a-bb80-405e-8258-01fd733495ef-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.792844 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/22516d8a-bb80-405e-8258-01fd733495ef-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.792894 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/22516d8a-bb80-405e-8258-01fd733495ef-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.796823 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/22516d8a-bb80-405e-8258-01fd733495ef-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" 
(UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.797435 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/22516d8a-bb80-405e-8258-01fd733495ef-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.798727 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/22516d8a-bb80-405e-8258-01fd733495ef-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.804117 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/22516d8a-bb80-405e-8258-01fd733495ef-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.804939 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/22516d8a-bb80-405e-8258-01fd733495ef-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.809491 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/22516d8a-bb80-405e-8258-01fd733495ef-tls-assets\") pod 
\"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.823366 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/22516d8a-bb80-405e-8258-01fd733495ef-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.832210 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/22516d8a-bb80-405e-8258-01fd733495ef-config\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.851647 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfmlj\" (UniqueName: \"kubernetes.io/projected/22516d8a-bb80-405e-8258-01fd733495ef-kube-api-access-xfmlj\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.861329 5094 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 20 08:52:34 crc kubenswrapper[5094]: I0220 08:52:34.861387 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7adbca82-75fc-4744-beaf-74d298170209\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7adbca82-75fc-4744-beaf-74d298170209\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2a5e5c320d42bf051c2648299b41e130990427635def81e3e854b40dad0c11aa/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.010368 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.022842 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7adbca82-75fc-4744-beaf-74d298170209\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7adbca82-75fc-4744-beaf-74d298170209\") pod \"prometheus-metric-storage-0\" (UID: \"22516d8a-bb80-405e-8258-01fd733495ef\") " pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.152921 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.486823 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.610925 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b4afe958-0e78-49e9-b05a-08ff4c42f602-openstack-config-secret\") pod \"b4afe958-0e78-49e9-b05a-08ff4c42f602\" (UID: \"b4afe958-0e78-49e9-b05a-08ff4c42f602\") " Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.611104 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b4afe958-0e78-49e9-b05a-08ff4c42f602-openstack-config\") pod \"b4afe958-0e78-49e9-b05a-08ff4c42f602\" (UID: \"b4afe958-0e78-49e9-b05a-08ff4c42f602\") " Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.611248 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd86c\" (UniqueName: \"kubernetes.io/projected/b4afe958-0e78-49e9-b05a-08ff4c42f602-kube-api-access-kd86c\") pod \"b4afe958-0e78-49e9-b05a-08ff4c42f602\" (UID: \"b4afe958-0e78-49e9-b05a-08ff4c42f602\") " Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.631370 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4afe958-0e78-49e9-b05a-08ff4c42f602-kube-api-access-kd86c" (OuterVolumeSpecName: "kube-api-access-kd86c") pod "b4afe958-0e78-49e9-b05a-08ff4c42f602" (UID: "b4afe958-0e78-49e9-b05a-08ff4c42f602"). InnerVolumeSpecName "kube-api-access-kd86c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.637262 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4afe958-0e78-49e9-b05a-08ff4c42f602-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "b4afe958-0e78-49e9-b05a-08ff4c42f602" (UID: "b4afe958-0e78-49e9-b05a-08ff4c42f602"). 
InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.713574 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4afe958-0e78-49e9-b05a-08ff4c42f602-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "b4afe958-0e78-49e9-b05a-08ff4c42f602" (UID: "b4afe958-0e78-49e9-b05a-08ff4c42f602"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.714993 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd86c\" (UniqueName: \"kubernetes.io/projected/b4afe958-0e78-49e9-b05a-08ff4c42f602-kube-api-access-kd86c\") on node \"crc\" DevicePath \"\"" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.715029 5094 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b4afe958-0e78-49e9-b05a-08ff4c42f602-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.715041 5094 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b4afe958-0e78-49e9-b05a-08ff4c42f602-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.718885 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"640e24e6-f89c-45ee-999a-e5aa0816aab2","Type":"ContainerStarted","Data":"fdad93376df51125a09e0b7e4c9aea575f5d0c65e55b1bcc0aad25c449686491"} Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.719559 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.725264 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/prometheus-metric-storage-0"] Feb 20 08:52:35 crc kubenswrapper[5094]: W0220 08:52:35.730163 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22516d8a_bb80_405e_8258_01fd733495ef.slice/crio-e3bfe29aea66f3f369d626303d40b9ea0a93f55fd4d02016e9ee891eca04c692 WatchSource:0}: Error finding container e3bfe29aea66f3f369d626303d40b9ea0a93f55fd4d02016e9ee891eca04c692: Status 404 returned error can't find the container with id e3bfe29aea66f3f369d626303d40b9ea0a93f55fd4d02016e9ee891eca04c692 Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.747507 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.224774564 podStartE2EDuration="2.747492025s" podCreationTimestamp="2026-02-20 08:52:33 +0000 UTC" firstStartedPulling="2026-02-20 08:52:34.62303149 +0000 UTC m=+7569.495658201" lastFinishedPulling="2026-02-20 08:52:35.145748951 +0000 UTC m=+7570.018375662" observedRunningTime="2026-02-20 08:52:35.746554112 +0000 UTC m=+7570.619180823" watchObservedRunningTime="2026-02-20 08:52:35.747492025 +0000 UTC m=+7570.620118736" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.754651 5094 generic.go:334] "Generic (PLEG): container finished" podID="b4afe958-0e78-49e9-b05a-08ff4c42f602" containerID="cb4acd794ebf0ed75ac1a7120cae6e99da13e366872ae910d88b143eaec6e181" exitCode=137 Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.754876 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.755655 5094 scope.go:117] "RemoveContainer" containerID="cb4acd794ebf0ed75ac1a7120cae6e99da13e366872ae910d88b143eaec6e181" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.760049 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"c09fbf6b-1221-4e3d-b29d-6432848a564b","Type":"ContainerStarted","Data":"5d16b227891bdbf1374415e0506890350e5bcb7672f0639b3366d1fd133d340c"} Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.763428 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"3c21f8d0-ca22-4206-9cdf-26edee70eac2","Type":"ContainerStarted","Data":"27267b5502fecc6a43056632683dfee580dedecd9796c21b815075674efcc79e"} Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.787895 5094 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="b4afe958-0e78-49e9-b05a-08ff4c42f602" podUID="3c21f8d0-ca22-4206-9cdf-26edee70eac2" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.796937 5094 scope.go:117] "RemoveContainer" containerID="cb4acd794ebf0ed75ac1a7120cae6e99da13e366872ae910d88b143eaec6e181" Feb 20 08:52:35 crc kubenswrapper[5094]: E0220 08:52:35.797888 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb4acd794ebf0ed75ac1a7120cae6e99da13e366872ae910d88b143eaec6e181\": container with ID starting with cb4acd794ebf0ed75ac1a7120cae6e99da13e366872ae910d88b143eaec6e181 not found: ID does not exist" containerID="cb4acd794ebf0ed75ac1a7120cae6e99da13e366872ae910d88b143eaec6e181" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.797946 5094 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cb4acd794ebf0ed75ac1a7120cae6e99da13e366872ae910d88b143eaec6e181"} err="failed to get container status \"cb4acd794ebf0ed75ac1a7120cae6e99da13e366872ae910d88b143eaec6e181\": rpc error: code = NotFound desc = could not find container \"cb4acd794ebf0ed75ac1a7120cae6e99da13e366872ae910d88b143eaec6e181\": container with ID starting with cb4acd794ebf0ed75ac1a7120cae6e99da13e366872ae910d88b143eaec6e181 not found: ID does not exist" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.800313 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.800292575 podStartE2EDuration="3.800292575s" podCreationTimestamp="2026-02-20 08:52:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:52:35.781865412 +0000 UTC m=+7570.654492123" watchObservedRunningTime="2026-02-20 08:52:35.800292575 +0000 UTC m=+7570.672919286" Feb 20 08:52:35 crc kubenswrapper[5094]: I0220 08:52:35.853285 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4afe958-0e78-49e9-b05a-08ff4c42f602" path="/var/lib/kubelet/pods/b4afe958-0e78-49e9-b05a-08ff4c42f602/volumes" Feb 20 08:52:36 crc kubenswrapper[5094]: I0220 08:52:36.782052 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"22516d8a-bb80-405e-8258-01fd733495ef","Type":"ContainerStarted","Data":"e3bfe29aea66f3f369d626303d40b9ea0a93f55fd4d02016e9ee891eca04c692"} Feb 20 08:52:40 crc kubenswrapper[5094]: I0220 08:52:40.838850 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"22516d8a-bb80-405e-8258-01fd733495ef","Type":"ContainerStarted","Data":"d747fb06a90682657b4ae142d3d94a412957a924fc193ade70b70b69eda6b31a"} Feb 20 08:52:40 crc kubenswrapper[5094]: I0220 08:52:40.842551 5094 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"c09fbf6b-1221-4e3d-b29d-6432848a564b","Type":"ContainerStarted","Data":"db342d9226fdf538f0f218b110f06cb19147593032abe4184c062b1e1763c716"} Feb 20 08:52:40 crc kubenswrapper[5094]: I0220 08:52:40.897350 5094 scope.go:117] "RemoveContainer" containerID="5af35aa0d974ec2be3d578b66402a33233be4efbd611deaf5976f2b6d54c4e72" Feb 20 08:52:40 crc kubenswrapper[5094]: I0220 08:52:40.943884 5094 scope.go:117] "RemoveContainer" containerID="710b367e8a0475d2f89ae71b4ffcf7ae41da63c436f5361dbf27d4bd07bdf660" Feb 20 08:52:41 crc kubenswrapper[5094]: I0220 08:52:41.140873 5094 scope.go:117] "RemoveContainer" containerID="4495a0b785b56a81800453fd2516a41bac0676f202c2358f07c81e7849110742" Feb 20 08:52:43 crc kubenswrapper[5094]: I0220 08:52:43.765149 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 20 08:52:47 crc kubenswrapper[5094]: I0220 08:52:47.055937 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-bgw44"] Feb 20 08:52:47 crc kubenswrapper[5094]: I0220 08:52:47.068732 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-bgw44"] Feb 20 08:52:47 crc kubenswrapper[5094]: I0220 08:52:47.851453 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d39890b-bbcb-4fcb-9f5e-6f74782fc661" path="/var/lib/kubelet/pods/5d39890b-bbcb-4fcb-9f5e-6f74782fc661/volumes" Feb 20 08:52:47 crc kubenswrapper[5094]: I0220 08:52:47.918226 5094 generic.go:334] "Generic (PLEG): container finished" podID="22516d8a-bb80-405e-8258-01fd733495ef" containerID="d747fb06a90682657b4ae142d3d94a412957a924fc193ade70b70b69eda6b31a" exitCode=0 Feb 20 08:52:47 crc kubenswrapper[5094]: I0220 08:52:47.918310 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"22516d8a-bb80-405e-8258-01fd733495ef","Type":"ContainerDied","Data":"d747fb06a90682657b4ae142d3d94a412957a924fc193ade70b70b69eda6b31a"} Feb 20 08:52:47 crc kubenswrapper[5094]: I0220 08:52:47.921155 5094 generic.go:334] "Generic (PLEG): container finished" podID="c09fbf6b-1221-4e3d-b29d-6432848a564b" containerID="db342d9226fdf538f0f218b110f06cb19147593032abe4184c062b1e1763c716" exitCode=0 Feb 20 08:52:47 crc kubenswrapper[5094]: I0220 08:52:47.921187 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"c09fbf6b-1221-4e3d-b29d-6432848a564b","Type":"ContainerDied","Data":"db342d9226fdf538f0f218b110f06cb19147593032abe4184c062b1e1763c716"} Feb 20 08:52:48 crc kubenswrapper[5094]: I0220 08:52:48.039622 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-80d8-account-create-update-2lhxz"] Feb 20 08:52:48 crc kubenswrapper[5094]: I0220 08:52:48.049494 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-80d8-account-create-update-2lhxz"] Feb 20 08:52:49 crc kubenswrapper[5094]: I0220 08:52:49.855506 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0fbd49a-25e7-44de-a81d-f324feba0dff" path="/var/lib/kubelet/pods/f0fbd49a-25e7-44de-a81d-f324feba0dff/volumes" Feb 20 08:52:50 crc kubenswrapper[5094]: I0220 08:52:50.953805 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"c09fbf6b-1221-4e3d-b29d-6432848a564b","Type":"ContainerStarted","Data":"41f30873b500df1cc75c82c266570b32e40f31aa3508f4f529cfba349449edb0"} Feb 20 08:52:53 crc kubenswrapper[5094]: I0220 08:52:53.985785 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"22516d8a-bb80-405e-8258-01fd733495ef","Type":"ContainerStarted","Data":"962cb6f946961a50a912015b26a5b50b4fca505d54ba58164b9ff45aa52116d8"} Feb 20 08:52:53 crc kubenswrapper[5094]: I0220 
08:52:53.988411 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"c09fbf6b-1221-4e3d-b29d-6432848a564b","Type":"ContainerStarted","Data":"5abfa5fefa6943b0f00687b18c9cec4287624f61d4fa8fe28ac28fa89cc8b86a"} Feb 20 08:52:53 crc kubenswrapper[5094]: I0220 08:52:53.989262 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:53 crc kubenswrapper[5094]: I0220 08:52:53.991575 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Feb 20 08:52:54 crc kubenswrapper[5094]: I0220 08:52:54.028889 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=5.820937738 podStartE2EDuration="21.028865546s" podCreationTimestamp="2026-02-20 08:52:33 +0000 UTC" firstStartedPulling="2026-02-20 08:52:35.03131773 +0000 UTC m=+7569.903944441" lastFinishedPulling="2026-02-20 08:52:50.239245548 +0000 UTC m=+7585.111872249" observedRunningTime="2026-02-20 08:52:54.021098619 +0000 UTC m=+7588.893725370" watchObservedRunningTime="2026-02-20 08:52:54.028865546 +0000 UTC m=+7588.901492257" Feb 20 08:52:58 crc kubenswrapper[5094]: I0220 08:52:58.037228 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"22516d8a-bb80-405e-8258-01fd733495ef","Type":"ContainerStarted","Data":"91e1b6af9e2520978b5d06e517655629b996d9d120f47e987ff5902a730f1390"} Feb 20 08:53:02 crc kubenswrapper[5094]: I0220 08:53:02.085904 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"22516d8a-bb80-405e-8258-01fd733495ef","Type":"ContainerStarted","Data":"7e2aefc274a4b17af715cf73d71ebe7aa89df2a703b9bb2948244f3ccf5dd608"} Feb 20 08:53:02 crc kubenswrapper[5094]: I0220 08:53:02.130513 5094 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.374768993 podStartE2EDuration="29.130483886s" podCreationTimestamp="2026-02-20 08:52:33 +0000 UTC" firstStartedPulling="2026-02-20 08:52:35.758247944 +0000 UTC m=+7570.630874655" lastFinishedPulling="2026-02-20 08:53:01.513962837 +0000 UTC m=+7596.386589548" observedRunningTime="2026-02-20 08:53:02.111861158 +0000 UTC m=+7596.984487909" watchObservedRunningTime="2026-02-20 08:53:02.130483886 +0000 UTC m=+7597.003110627" Feb 20 08:53:05 crc kubenswrapper[5094]: I0220 08:53:05.154610 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 20 08:53:05 crc kubenswrapper[5094]: I0220 08:53:05.155313 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 20 08:53:05 crc kubenswrapper[5094]: I0220 08:53:05.159215 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 20 08:53:06 crc kubenswrapper[5094]: I0220 08:53:06.124356 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.363884 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.367197 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.369671 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.369991 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.377245 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.481717 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-scripts\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.481786 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.481805 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-config-data\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.481825 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d5a7349-432b-4431-bbf3-5079cbad3819-log-httpd\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " 
pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.481879 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.481968 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d5a7349-432b-4431-bbf3-5079cbad3819-run-httpd\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.482224 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb4wq\" (UniqueName: \"kubernetes.io/projected/3d5a7349-432b-4431-bbf3-5079cbad3819-kube-api-access-wb4wq\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.584589 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb4wq\" (UniqueName: \"kubernetes.io/projected/3d5a7349-432b-4431-bbf3-5079cbad3819-kube-api-access-wb4wq\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.584679 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-scripts\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.584734 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.584756 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-config-data\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.584784 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d5a7349-432b-4431-bbf3-5079cbad3819-log-httpd\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.584818 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.585050 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d5a7349-432b-4431-bbf3-5079cbad3819-run-httpd\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.585750 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d5a7349-432b-4431-bbf3-5079cbad3819-run-httpd\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " 
pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.586409 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d5a7349-432b-4431-bbf3-5079cbad3819-log-httpd\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.591132 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.591618 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-config-data\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.592273 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-scripts\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.592580 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.602088 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb4wq\" (UniqueName: 
\"kubernetes.io/projected/3d5a7349-432b-4431-bbf3-5079cbad3819-kube-api-access-wb4wq\") pod \"ceilometer-0\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " pod="openstack/ceilometer-0" Feb 20 08:53:08 crc kubenswrapper[5094]: I0220 08:53:08.690181 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 08:53:09 crc kubenswrapper[5094]: I0220 08:53:09.336290 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 08:53:10 crc kubenswrapper[5094]: I0220 08:53:10.161753 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d5a7349-432b-4431-bbf3-5079cbad3819","Type":"ContainerStarted","Data":"8d3046356c6ab6376d54ed64d8a87d4b911f9de1922da02bb0c47ec017f24b8c"} Feb 20 08:53:13 crc kubenswrapper[5094]: I0220 08:53:13.192421 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d5a7349-432b-4431-bbf3-5079cbad3819","Type":"ContainerStarted","Data":"65a8886591306ad8879b6a7a385867de5351348368c26ea731d0f97822cb7df6"} Feb 20 08:53:14 crc kubenswrapper[5094]: I0220 08:53:14.205253 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d5a7349-432b-4431-bbf3-5079cbad3819","Type":"ContainerStarted","Data":"2857012eb54d9c2168b148c823755eb27da69fbf090698f1429d8cb1752e4e87"} Feb 20 08:53:15 crc kubenswrapper[5094]: I0220 08:53:15.222445 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d5a7349-432b-4431-bbf3-5079cbad3819","Type":"ContainerStarted","Data":"36383798cb5cce5f003b1a93a9f58a7a048970022e6f16ba4a5b228fc5f0ee8b"} Feb 20 08:53:17 crc kubenswrapper[5094]: I0220 08:53:17.246697 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d5a7349-432b-4431-bbf3-5079cbad3819","Type":"ContainerStarted","Data":"ebf23270078580538e17397e37d3095ef7fab2a602d5fa4d1a4d6c4d39282be2"} 
Feb 20 08:53:17 crc kubenswrapper[5094]: I0220 08:53:17.248203 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 08:53:17 crc kubenswrapper[5094]: I0220 08:53:17.282408 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.4558336069999998 podStartE2EDuration="9.282391067s" podCreationTimestamp="2026-02-20 08:53:08 +0000 UTC" firstStartedPulling="2026-02-20 08:53:09.331508154 +0000 UTC m=+7604.204134865" lastFinishedPulling="2026-02-20 08:53:16.158065604 +0000 UTC m=+7611.030692325" observedRunningTime="2026-02-20 08:53:17.268658356 +0000 UTC m=+7612.141285067" watchObservedRunningTime="2026-02-20 08:53:17.282391067 +0000 UTC m=+7612.155017768" Feb 20 08:53:20 crc kubenswrapper[5094]: I0220 08:53:20.049971 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-2fzgg"] Feb 20 08:53:20 crc kubenswrapper[5094]: I0220 08:53:20.060745 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-2fzgg"] Feb 20 08:53:21 crc kubenswrapper[5094]: I0220 08:53:21.858985 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b382ec69-4b87-43f5-b964-eba4282bcc42" path="/var/lib/kubelet/pods/b382ec69-4b87-43f5-b964-eba4282bcc42/volumes" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.176937 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-drwqv"] Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.178832 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-drwqv" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.191718 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-drwqv"] Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.239496 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl967\" (UniqueName: \"kubernetes.io/projected/ae290c11-18c8-4d9a-90d3-8f2219084a78-kube-api-access-hl967\") pod \"aodh-db-create-drwqv\" (UID: \"ae290c11-18c8-4d9a-90d3-8f2219084a78\") " pod="openstack/aodh-db-create-drwqv" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.239605 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae290c11-18c8-4d9a-90d3-8f2219084a78-operator-scripts\") pod \"aodh-db-create-drwqv\" (UID: \"ae290c11-18c8-4d9a-90d3-8f2219084a78\") " pod="openstack/aodh-db-create-drwqv" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.307574 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-4eea-account-create-update-rqsxn"] Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.309184 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-4eea-account-create-update-rqsxn" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.311787 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.317823 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-4eea-account-create-update-rqsxn"] Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.340869 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbtzz\" (UniqueName: \"kubernetes.io/projected/537981f5-8e74-406f-9199-8bac8aa60903-kube-api-access-gbtzz\") pod \"aodh-4eea-account-create-update-rqsxn\" (UID: \"537981f5-8e74-406f-9199-8bac8aa60903\") " pod="openstack/aodh-4eea-account-create-update-rqsxn" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.340923 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl967\" (UniqueName: \"kubernetes.io/projected/ae290c11-18c8-4d9a-90d3-8f2219084a78-kube-api-access-hl967\") pod \"aodh-db-create-drwqv\" (UID: \"ae290c11-18c8-4d9a-90d3-8f2219084a78\") " pod="openstack/aodh-db-create-drwqv" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.340962 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/537981f5-8e74-406f-9199-8bac8aa60903-operator-scripts\") pod \"aodh-4eea-account-create-update-rqsxn\" (UID: \"537981f5-8e74-406f-9199-8bac8aa60903\") " pod="openstack/aodh-4eea-account-create-update-rqsxn" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.341002 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae290c11-18c8-4d9a-90d3-8f2219084a78-operator-scripts\") pod \"aodh-db-create-drwqv\" (UID: 
\"ae290c11-18c8-4d9a-90d3-8f2219084a78\") " pod="openstack/aodh-db-create-drwqv" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.341746 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae290c11-18c8-4d9a-90d3-8f2219084a78-operator-scripts\") pod \"aodh-db-create-drwqv\" (UID: \"ae290c11-18c8-4d9a-90d3-8f2219084a78\") " pod="openstack/aodh-db-create-drwqv" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.363588 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl967\" (UniqueName: \"kubernetes.io/projected/ae290c11-18c8-4d9a-90d3-8f2219084a78-kube-api-access-hl967\") pod \"aodh-db-create-drwqv\" (UID: \"ae290c11-18c8-4d9a-90d3-8f2219084a78\") " pod="openstack/aodh-db-create-drwqv" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.442930 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbtzz\" (UniqueName: \"kubernetes.io/projected/537981f5-8e74-406f-9199-8bac8aa60903-kube-api-access-gbtzz\") pod \"aodh-4eea-account-create-update-rqsxn\" (UID: \"537981f5-8e74-406f-9199-8bac8aa60903\") " pod="openstack/aodh-4eea-account-create-update-rqsxn" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.442991 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/537981f5-8e74-406f-9199-8bac8aa60903-operator-scripts\") pod \"aodh-4eea-account-create-update-rqsxn\" (UID: \"537981f5-8e74-406f-9199-8bac8aa60903\") " pod="openstack/aodh-4eea-account-create-update-rqsxn" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.443609 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/537981f5-8e74-406f-9199-8bac8aa60903-operator-scripts\") pod \"aodh-4eea-account-create-update-rqsxn\" (UID: 
\"537981f5-8e74-406f-9199-8bac8aa60903\") " pod="openstack/aodh-4eea-account-create-update-rqsxn" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.471548 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbtzz\" (UniqueName: \"kubernetes.io/projected/537981f5-8e74-406f-9199-8bac8aa60903-kube-api-access-gbtzz\") pod \"aodh-4eea-account-create-update-rqsxn\" (UID: \"537981f5-8e74-406f-9199-8bac8aa60903\") " pod="openstack/aodh-4eea-account-create-update-rqsxn" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.573461 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-drwqv" Feb 20 08:53:25 crc kubenswrapper[5094]: I0220 08:53:25.629857 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-4eea-account-create-update-rqsxn" Feb 20 08:53:26 crc kubenswrapper[5094]: I0220 08:53:26.064471 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-drwqv"] Feb 20 08:53:26 crc kubenswrapper[5094]: I0220 08:53:26.182332 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-4eea-account-create-update-rqsxn"] Feb 20 08:53:26 crc kubenswrapper[5094]: W0220 08:53:26.185105 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod537981f5_8e74_406f_9199_8bac8aa60903.slice/crio-1a5a211997fe0479356df9c5ca80e3866ae06de0c730278dd3a3e5750105b695 WatchSource:0}: Error finding container 1a5a211997fe0479356df9c5ca80e3866ae06de0c730278dd3a3e5750105b695: Status 404 returned error can't find the container with id 1a5a211997fe0479356df9c5ca80e3866ae06de0c730278dd3a3e5750105b695 Feb 20 08:53:26 crc kubenswrapper[5094]: I0220 08:53:26.190810 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Feb 20 08:53:26 crc kubenswrapper[5094]: I0220 08:53:26.352266 5094 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/aodh-4eea-account-create-update-rqsxn" event={"ID":"537981f5-8e74-406f-9199-8bac8aa60903","Type":"ContainerStarted","Data":"1a5a211997fe0479356df9c5ca80e3866ae06de0c730278dd3a3e5750105b695"} Feb 20 08:53:26 crc kubenswrapper[5094]: I0220 08:53:26.354044 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-drwqv" event={"ID":"ae290c11-18c8-4d9a-90d3-8f2219084a78","Type":"ContainerStarted","Data":"7d6c489e10960f3fa343c9c151323e484166523ad5c54d095e27280ce6e1cbd6"} Feb 20 08:53:26 crc kubenswrapper[5094]: I0220 08:53:26.354068 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-drwqv" event={"ID":"ae290c11-18c8-4d9a-90d3-8f2219084a78","Type":"ContainerStarted","Data":"94e8abe12427ae34e22b9ee3dcb41fee4c0f9b52a7ed17799a36e248a73e58e5"} Feb 20 08:53:26 crc kubenswrapper[5094]: I0220 08:53:26.369302 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-create-drwqv" podStartSLOduration=1.369284483 podStartE2EDuration="1.369284483s" podCreationTimestamp="2026-02-20 08:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:53:26.368155236 +0000 UTC m=+7621.240781947" watchObservedRunningTime="2026-02-20 08:53:26.369284483 +0000 UTC m=+7621.241911194" Feb 20 08:53:27 crc kubenswrapper[5094]: I0220 08:53:27.364168 5094 generic.go:334] "Generic (PLEG): container finished" podID="537981f5-8e74-406f-9199-8bac8aa60903" containerID="d3221fcda11fc25108efa9fb80c6774c8d350491f8d20f83e1f5fae473f8e306" exitCode=0 Feb 20 08:53:27 crc kubenswrapper[5094]: I0220 08:53:27.364210 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-4eea-account-create-update-rqsxn" event={"ID":"537981f5-8e74-406f-9199-8bac8aa60903","Type":"ContainerDied","Data":"d3221fcda11fc25108efa9fb80c6774c8d350491f8d20f83e1f5fae473f8e306"} Feb 20 
08:53:27 crc kubenswrapper[5094]: I0220 08:53:27.366460 5094 generic.go:334] "Generic (PLEG): container finished" podID="ae290c11-18c8-4d9a-90d3-8f2219084a78" containerID="7d6c489e10960f3fa343c9c151323e484166523ad5c54d095e27280ce6e1cbd6" exitCode=0 Feb 20 08:53:27 crc kubenswrapper[5094]: I0220 08:53:27.366493 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-drwqv" event={"ID":"ae290c11-18c8-4d9a-90d3-8f2219084a78","Type":"ContainerDied","Data":"7d6c489e10960f3fa343c9c151323e484166523ad5c54d095e27280ce6e1cbd6"} Feb 20 08:53:28 crc kubenswrapper[5094]: I0220 08:53:28.892966 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-drwqv" Feb 20 08:53:28 crc kubenswrapper[5094]: I0220 08:53:28.897944 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-4eea-account-create-update-rqsxn" Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.012750 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae290c11-18c8-4d9a-90d3-8f2219084a78-operator-scripts\") pod \"ae290c11-18c8-4d9a-90d3-8f2219084a78\" (UID: \"ae290c11-18c8-4d9a-90d3-8f2219084a78\") " Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.012801 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/537981f5-8e74-406f-9199-8bac8aa60903-operator-scripts\") pod \"537981f5-8e74-406f-9199-8bac8aa60903\" (UID: \"537981f5-8e74-406f-9199-8bac8aa60903\") " Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.012843 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl967\" (UniqueName: \"kubernetes.io/projected/ae290c11-18c8-4d9a-90d3-8f2219084a78-kube-api-access-hl967\") pod \"ae290c11-18c8-4d9a-90d3-8f2219084a78\" (UID: 
\"ae290c11-18c8-4d9a-90d3-8f2219084a78\") " Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.012972 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbtzz\" (UniqueName: \"kubernetes.io/projected/537981f5-8e74-406f-9199-8bac8aa60903-kube-api-access-gbtzz\") pod \"537981f5-8e74-406f-9199-8bac8aa60903\" (UID: \"537981f5-8e74-406f-9199-8bac8aa60903\") " Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.013699 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/537981f5-8e74-406f-9199-8bac8aa60903-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "537981f5-8e74-406f-9199-8bac8aa60903" (UID: "537981f5-8e74-406f-9199-8bac8aa60903"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.014404 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae290c11-18c8-4d9a-90d3-8f2219084a78-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ae290c11-18c8-4d9a-90d3-8f2219084a78" (UID: "ae290c11-18c8-4d9a-90d3-8f2219084a78"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.018936 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae290c11-18c8-4d9a-90d3-8f2219084a78-kube-api-access-hl967" (OuterVolumeSpecName: "kube-api-access-hl967") pod "ae290c11-18c8-4d9a-90d3-8f2219084a78" (UID: "ae290c11-18c8-4d9a-90d3-8f2219084a78"). InnerVolumeSpecName "kube-api-access-hl967". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.019594 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/537981f5-8e74-406f-9199-8bac8aa60903-kube-api-access-gbtzz" (OuterVolumeSpecName: "kube-api-access-gbtzz") pod "537981f5-8e74-406f-9199-8bac8aa60903" (UID: "537981f5-8e74-406f-9199-8bac8aa60903"). InnerVolumeSpecName "kube-api-access-gbtzz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.115634 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ae290c11-18c8-4d9a-90d3-8f2219084a78-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.116013 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/537981f5-8e74-406f-9199-8bac8aa60903-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.116034 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hl967\" (UniqueName: \"kubernetes.io/projected/ae290c11-18c8-4d9a-90d3-8f2219084a78-kube-api-access-hl967\") on node \"crc\" DevicePath \"\"" Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.116053 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbtzz\" (UniqueName: \"kubernetes.io/projected/537981f5-8e74-406f-9199-8bac8aa60903-kube-api-access-gbtzz\") on node \"crc\" DevicePath \"\"" Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.383785 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-4eea-account-create-update-rqsxn" event={"ID":"537981f5-8e74-406f-9199-8bac8aa60903","Type":"ContainerDied","Data":"1a5a211997fe0479356df9c5ca80e3866ae06de0c730278dd3a3e5750105b695"} Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 
08:53:29.383832 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a5a211997fe0479356df9c5ca80e3866ae06de0c730278dd3a3e5750105b695" Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.383836 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-4eea-account-create-update-rqsxn" Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.387116 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-drwqv" event={"ID":"ae290c11-18c8-4d9a-90d3-8f2219084a78","Type":"ContainerDied","Data":"94e8abe12427ae34e22b9ee3dcb41fee4c0f9b52a7ed17799a36e248a73e58e5"} Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.387163 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94e8abe12427ae34e22b9ee3dcb41fee4c0f9b52a7ed17799a36e248a73e58e5" Feb 20 08:53:29 crc kubenswrapper[5094]: I0220 08:53:29.387274 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-drwqv" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.634064 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-bg4qh"] Feb 20 08:53:30 crc kubenswrapper[5094]: E0220 08:53:30.634838 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="537981f5-8e74-406f-9199-8bac8aa60903" containerName="mariadb-account-create-update" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.634852 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="537981f5-8e74-406f-9199-8bac8aa60903" containerName="mariadb-account-create-update" Feb 20 08:53:30 crc kubenswrapper[5094]: E0220 08:53:30.634872 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae290c11-18c8-4d9a-90d3-8f2219084a78" containerName="mariadb-database-create" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.634878 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae290c11-18c8-4d9a-90d3-8f2219084a78" containerName="mariadb-database-create" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.635079 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="537981f5-8e74-406f-9199-8bac8aa60903" containerName="mariadb-account-create-update" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.635095 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae290c11-18c8-4d9a-90d3-8f2219084a78" containerName="mariadb-database-create" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.635891 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.641170 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.644101 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.644851 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-bg4qh"] Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.646578 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.646845 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-zl6jq" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.766118 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9rjz\" (UniqueName: \"kubernetes.io/projected/2950b502-5079-4a08-8aaf-f0b5d376a3f2-kube-api-access-l9rjz\") pod \"aodh-db-sync-bg4qh\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.766170 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-combined-ca-bundle\") pod \"aodh-db-sync-bg4qh\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.766615 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-scripts\") pod \"aodh-db-sync-bg4qh\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " 
pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.766840 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-config-data\") pod \"aodh-db-sync-bg4qh\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.868319 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-config-data\") pod \"aodh-db-sync-bg4qh\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.868457 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9rjz\" (UniqueName: \"kubernetes.io/projected/2950b502-5079-4a08-8aaf-f0b5d376a3f2-kube-api-access-l9rjz\") pod \"aodh-db-sync-bg4qh\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.868496 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-combined-ca-bundle\") pod \"aodh-db-sync-bg4qh\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.869823 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-scripts\") pod \"aodh-db-sync-bg4qh\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.876640 5094 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-combined-ca-bundle\") pod \"aodh-db-sync-bg4qh\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.878527 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-config-data\") pod \"aodh-db-sync-bg4qh\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.883604 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-scripts\") pod \"aodh-db-sync-bg4qh\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.885207 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9rjz\" (UniqueName: \"kubernetes.io/projected/2950b502-5079-4a08-8aaf-f0b5d376a3f2-kube-api-access-l9rjz\") pod \"aodh-db-sync-bg4qh\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:30 crc kubenswrapper[5094]: I0220 08:53:30.951498 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:31 crc kubenswrapper[5094]: W0220 08:53:31.610872 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2950b502_5079_4a08_8aaf_f0b5d376a3f2.slice/crio-0c2efecd63bd3804e02924264075112d6f9b735af119df1237adc505949e0e15 WatchSource:0}: Error finding container 0c2efecd63bd3804e02924264075112d6f9b735af119df1237adc505949e0e15: Status 404 returned error can't find the container with id 0c2efecd63bd3804e02924264075112d6f9b735af119df1237adc505949e0e15 Feb 20 08:53:31 crc kubenswrapper[5094]: I0220 08:53:31.614017 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 08:53:31 crc kubenswrapper[5094]: I0220 08:53:31.627363 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-bg4qh"] Feb 20 08:53:32 crc kubenswrapper[5094]: I0220 08:53:32.423831 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-bg4qh" event={"ID":"2950b502-5079-4a08-8aaf-f0b5d376a3f2","Type":"ContainerStarted","Data":"0c2efecd63bd3804e02924264075112d6f9b735af119df1237adc505949e0e15"} Feb 20 08:53:37 crc kubenswrapper[5094]: I0220 08:53:37.467155 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-bg4qh" event={"ID":"2950b502-5079-4a08-8aaf-f0b5d376a3f2","Type":"ContainerStarted","Data":"b71c497029e9f5437c06dfb05b6008f8e4f7c93cd886b8f44f838d87660036a7"} Feb 20 08:53:37 crc kubenswrapper[5094]: I0220 08:53:37.493604 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-bg4qh" podStartSLOduration=2.411351447 podStartE2EDuration="7.493581694s" podCreationTimestamp="2026-02-20 08:53:30 +0000 UTC" firstStartedPulling="2026-02-20 08:53:31.613564868 +0000 UTC m=+7626.486191619" lastFinishedPulling="2026-02-20 08:53:36.695795155 +0000 UTC m=+7631.568421866" 
observedRunningTime="2026-02-20 08:53:37.480576461 +0000 UTC m=+7632.353203182" watchObservedRunningTime="2026-02-20 08:53:37.493581694 +0000 UTC m=+7632.366208415" Feb 20 08:53:38 crc kubenswrapper[5094]: I0220 08:53:38.703465 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 20 08:53:40 crc kubenswrapper[5094]: I0220 08:53:40.493991 5094 generic.go:334] "Generic (PLEG): container finished" podID="2950b502-5079-4a08-8aaf-f0b5d376a3f2" containerID="b71c497029e9f5437c06dfb05b6008f8e4f7c93cd886b8f44f838d87660036a7" exitCode=0 Feb 20 08:53:40 crc kubenswrapper[5094]: I0220 08:53:40.494059 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-bg4qh" event={"ID":"2950b502-5079-4a08-8aaf-f0b5d376a3f2","Type":"ContainerDied","Data":"b71c497029e9f5437c06dfb05b6008f8e4f7c93cd886b8f44f838d87660036a7"} Feb 20 08:53:41 crc kubenswrapper[5094]: I0220 08:53:41.410895 5094 scope.go:117] "RemoveContainer" containerID="26fb92ef592af820040ca30c6c02e7ed550cd4c5268296895eb9386dd13d2c0a" Feb 20 08:53:41 crc kubenswrapper[5094]: I0220 08:53:41.435799 5094 scope.go:117] "RemoveContainer" containerID="0517d39a1199d93e907c142d0fd23dddc068f1fcc10b13d41993d7946b0ef46a" Feb 20 08:53:41 crc kubenswrapper[5094]: I0220 08:53:41.489401 5094 scope.go:117] "RemoveContainer" containerID="d77d5b604322e5a963ae828151741fdab64a97bfd2a29b72e3a01f5ffe6ac7d2" Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.018818 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.160932 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-config-data\") pod \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.160987 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-scripts\") pod \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.161126 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9rjz\" (UniqueName: \"kubernetes.io/projected/2950b502-5079-4a08-8aaf-f0b5d376a3f2-kube-api-access-l9rjz\") pod \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.161144 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-combined-ca-bundle\") pod \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\" (UID: \"2950b502-5079-4a08-8aaf-f0b5d376a3f2\") " Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.166077 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-scripts" (OuterVolumeSpecName: "scripts") pod "2950b502-5079-4a08-8aaf-f0b5d376a3f2" (UID: "2950b502-5079-4a08-8aaf-f0b5d376a3f2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.166367 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2950b502-5079-4a08-8aaf-f0b5d376a3f2-kube-api-access-l9rjz" (OuterVolumeSpecName: "kube-api-access-l9rjz") pod "2950b502-5079-4a08-8aaf-f0b5d376a3f2" (UID: "2950b502-5079-4a08-8aaf-f0b5d376a3f2"). InnerVolumeSpecName "kube-api-access-l9rjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.188661 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2950b502-5079-4a08-8aaf-f0b5d376a3f2" (UID: "2950b502-5079-4a08-8aaf-f0b5d376a3f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.190058 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-config-data" (OuterVolumeSpecName: "config-data") pod "2950b502-5079-4a08-8aaf-f0b5d376a3f2" (UID: "2950b502-5079-4a08-8aaf-f0b5d376a3f2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.263385 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.263415 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.263426 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9rjz\" (UniqueName: \"kubernetes.io/projected/2950b502-5079-4a08-8aaf-f0b5d376a3f2-kube-api-access-l9rjz\") on node \"crc\" DevicePath \"\"" Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.263437 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2950b502-5079-4a08-8aaf-f0b5d376a3f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.526539 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-bg4qh" event={"ID":"2950b502-5079-4a08-8aaf-f0b5d376a3f2","Type":"ContainerDied","Data":"0c2efecd63bd3804e02924264075112d6f9b735af119df1237adc505949e0e15"} Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.526611 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c2efecd63bd3804e02924264075112d6f9b735af119df1237adc505949e0e15" Feb 20 08:53:42 crc kubenswrapper[5094]: I0220 08:53:42.526797 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-bg4qh" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.240746 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Feb 20 08:53:45 crc kubenswrapper[5094]: E0220 08:53:45.243150 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2950b502-5079-4a08-8aaf-f0b5d376a3f2" containerName="aodh-db-sync" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.243178 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="2950b502-5079-4a08-8aaf-f0b5d376a3f2" containerName="aodh-db-sync" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.243406 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="2950b502-5079-4a08-8aaf-f0b5d376a3f2" containerName="aodh-db-sync" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.246836 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.251188 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.251200 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-zl6jq" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.251400 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.260285 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.334477 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpjvq\" (UniqueName: \"kubernetes.io/projected/4f38802d-49cb-413c-ac61-665d5c77a1a3-kube-api-access-wpjvq\") pod \"aodh-0\" (UID: \"4f38802d-49cb-413c-ac61-665d5c77a1a3\") " pod="openstack/aodh-0" Feb 20 08:53:45 crc 
kubenswrapper[5094]: I0220 08:53:45.334581 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f38802d-49cb-413c-ac61-665d5c77a1a3-scripts\") pod \"aodh-0\" (UID: \"4f38802d-49cb-413c-ac61-665d5c77a1a3\") " pod="openstack/aodh-0" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.334666 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f38802d-49cb-413c-ac61-665d5c77a1a3-config-data\") pod \"aodh-0\" (UID: \"4f38802d-49cb-413c-ac61-665d5c77a1a3\") " pod="openstack/aodh-0" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.334763 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f38802d-49cb-413c-ac61-665d5c77a1a3-combined-ca-bundle\") pod \"aodh-0\" (UID: \"4f38802d-49cb-413c-ac61-665d5c77a1a3\") " pod="openstack/aodh-0" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.436247 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f38802d-49cb-413c-ac61-665d5c77a1a3-combined-ca-bundle\") pod \"aodh-0\" (UID: \"4f38802d-49cb-413c-ac61-665d5c77a1a3\") " pod="openstack/aodh-0" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.436376 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpjvq\" (UniqueName: \"kubernetes.io/projected/4f38802d-49cb-413c-ac61-665d5c77a1a3-kube-api-access-wpjvq\") pod \"aodh-0\" (UID: \"4f38802d-49cb-413c-ac61-665d5c77a1a3\") " pod="openstack/aodh-0" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.436434 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f38802d-49cb-413c-ac61-665d5c77a1a3-scripts\") 
pod \"aodh-0\" (UID: \"4f38802d-49cb-413c-ac61-665d5c77a1a3\") " pod="openstack/aodh-0" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.436560 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f38802d-49cb-413c-ac61-665d5c77a1a3-config-data\") pod \"aodh-0\" (UID: \"4f38802d-49cb-413c-ac61-665d5c77a1a3\") " pod="openstack/aodh-0" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.444558 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f38802d-49cb-413c-ac61-665d5c77a1a3-combined-ca-bundle\") pod \"aodh-0\" (UID: \"4f38802d-49cb-413c-ac61-665d5c77a1a3\") " pod="openstack/aodh-0" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.448452 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f38802d-49cb-413c-ac61-665d5c77a1a3-config-data\") pod \"aodh-0\" (UID: \"4f38802d-49cb-413c-ac61-665d5c77a1a3\") " pod="openstack/aodh-0" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.454045 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f38802d-49cb-413c-ac61-665d5c77a1a3-scripts\") pod \"aodh-0\" (UID: \"4f38802d-49cb-413c-ac61-665d5c77a1a3\") " pod="openstack/aodh-0" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.455740 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpjvq\" (UniqueName: \"kubernetes.io/projected/4f38802d-49cb-413c-ac61-665d5c77a1a3-kube-api-access-wpjvq\") pod \"aodh-0\" (UID: \"4f38802d-49cb-413c-ac61-665d5c77a1a3\") " pod="openstack/aodh-0" Feb 20 08:53:45 crc kubenswrapper[5094]: I0220 08:53:45.573883 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Feb 20 08:53:46 crc kubenswrapper[5094]: I0220 08:53:46.089283 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Feb 20 08:53:46 crc kubenswrapper[5094]: I0220 08:53:46.561524 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"4f38802d-49cb-413c-ac61-665d5c77a1a3","Type":"ContainerStarted","Data":"730b6ecf3ba7d016cc450efeb07b413166b9838f947281daf1abfc31fde01ce8"} Feb 20 08:53:46 crc kubenswrapper[5094]: I0220 08:53:46.561901 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"4f38802d-49cb-413c-ac61-665d5c77a1a3","Type":"ContainerStarted","Data":"3237a18f11e311c42d21d546d8c4ea50d1f8ff2f86a649e7d2c75e24ae2594ef"} Feb 20 08:53:47 crc kubenswrapper[5094]: I0220 08:53:47.498251 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 08:53:47 crc kubenswrapper[5094]: I0220 08:53:47.498877 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="ceilometer-central-agent" containerID="cri-o://65a8886591306ad8879b6a7a385867de5351348368c26ea731d0f97822cb7df6" gracePeriod=30 Feb 20 08:53:47 crc kubenswrapper[5094]: I0220 08:53:47.498902 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="proxy-httpd" containerID="cri-o://ebf23270078580538e17397e37d3095ef7fab2a602d5fa4d1a4d6c4d39282be2" gracePeriod=30 Feb 20 08:53:47 crc kubenswrapper[5094]: I0220 08:53:47.498961 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="ceilometer-notification-agent" containerID="cri-o://2857012eb54d9c2168b148c823755eb27da69fbf090698f1429d8cb1752e4e87" gracePeriod=30 Feb 20 08:53:47 
crc kubenswrapper[5094]: I0220 08:53:47.498952 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="sg-core" containerID="cri-o://36383798cb5cce5f003b1a93a9f58a7a048970022e6f16ba4a5b228fc5f0ee8b" gracePeriod=30 Feb 20 08:53:48 crc kubenswrapper[5094]: I0220 08:53:48.599918 5094 generic.go:334] "Generic (PLEG): container finished" podID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerID="ebf23270078580538e17397e37d3095ef7fab2a602d5fa4d1a4d6c4d39282be2" exitCode=0 Feb 20 08:53:48 crc kubenswrapper[5094]: I0220 08:53:48.601117 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d5a7349-432b-4431-bbf3-5079cbad3819","Type":"ContainerDied","Data":"ebf23270078580538e17397e37d3095ef7fab2a602d5fa4d1a4d6c4d39282be2"} Feb 20 08:53:48 crc kubenswrapper[5094]: I0220 08:53:48.601118 5094 generic.go:334] "Generic (PLEG): container finished" podID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerID="36383798cb5cce5f003b1a93a9f58a7a048970022e6f16ba4a5b228fc5f0ee8b" exitCode=2 Feb 20 08:53:48 crc kubenswrapper[5094]: I0220 08:53:48.601150 5094 generic.go:334] "Generic (PLEG): container finished" podID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerID="65a8886591306ad8879b6a7a385867de5351348368c26ea731d0f97822cb7df6" exitCode=0 Feb 20 08:53:48 crc kubenswrapper[5094]: I0220 08:53:48.601361 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d5a7349-432b-4431-bbf3-5079cbad3819","Type":"ContainerDied","Data":"36383798cb5cce5f003b1a93a9f58a7a048970022e6f16ba4a5b228fc5f0ee8b"} Feb 20 08:53:48 crc kubenswrapper[5094]: I0220 08:53:48.601380 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d5a7349-432b-4431-bbf3-5079cbad3819","Type":"ContainerDied","Data":"65a8886591306ad8879b6a7a385867de5351348368c26ea731d0f97822cb7df6"} Feb 20 08:53:48 crc 
kubenswrapper[5094]: I0220 08:53:48.610087 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"4f38802d-49cb-413c-ac61-665d5c77a1a3","Type":"ContainerStarted","Data":"9ac908b0382fd3ade38084ba83187b903fd00f472508cca86774b10ee2f78f8c"} Feb 20 08:53:49 crc kubenswrapper[5094]: I0220 08:53:49.620963 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"4f38802d-49cb-413c-ac61-665d5c77a1a3","Type":"ContainerStarted","Data":"a0a9eae3512bc254e98755e973d0349cd0928b410aed8d756eb5fb675b1e04f4"} Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.041437 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-63f8-account-create-update-szqmc"] Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.053901 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-63f8-account-create-update-szqmc"] Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.634750 5094 generic.go:334] "Generic (PLEG): container finished" podID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerID="2857012eb54d9c2168b148c823755eb27da69fbf090698f1429d8cb1752e4e87" exitCode=0 Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.634810 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d5a7349-432b-4431-bbf3-5079cbad3819","Type":"ContainerDied","Data":"2857012eb54d9c2168b148c823755eb27da69fbf090698f1429d8cb1752e4e87"} Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.641238 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"4f38802d-49cb-413c-ac61-665d5c77a1a3","Type":"ContainerStarted","Data":"d880101af76eba9ae1c04df323918f531d2f04c0e84bcf88b07c2a52d7a1c082"} Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.662321 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.945613372 podStartE2EDuration="5.662295096s" 
podCreationTimestamp="2026-02-20 08:53:45 +0000 UTC" firstStartedPulling="2026-02-20 08:53:46.096546231 +0000 UTC m=+7640.969172942" lastFinishedPulling="2026-02-20 08:53:49.813227955 +0000 UTC m=+7644.685854666" observedRunningTime="2026-02-20 08:53:50.659179591 +0000 UTC m=+7645.531806302" watchObservedRunningTime="2026-02-20 08:53:50.662295096 +0000 UTC m=+7645.534921807" Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.740571 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.836810 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-combined-ca-bundle\") pod \"3d5a7349-432b-4431-bbf3-5079cbad3819\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.836875 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wb4wq\" (UniqueName: \"kubernetes.io/projected/3d5a7349-432b-4431-bbf3-5079cbad3819-kube-api-access-wb4wq\") pod \"3d5a7349-432b-4431-bbf3-5079cbad3819\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.836904 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-scripts\") pod \"3d5a7349-432b-4431-bbf3-5079cbad3819\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.836930 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d5a7349-432b-4431-bbf3-5079cbad3819-run-httpd\") pod \"3d5a7349-432b-4431-bbf3-5079cbad3819\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " Feb 20 08:53:50 crc 
kubenswrapper[5094]: I0220 08:53:50.836957 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d5a7349-432b-4431-bbf3-5079cbad3819-log-httpd\") pod \"3d5a7349-432b-4431-bbf3-5079cbad3819\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.837000 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-config-data\") pod \"3d5a7349-432b-4431-bbf3-5079cbad3819\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.837027 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-sg-core-conf-yaml\") pod \"3d5a7349-432b-4431-bbf3-5079cbad3819\" (UID: \"3d5a7349-432b-4431-bbf3-5079cbad3819\") " Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.840306 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d5a7349-432b-4431-bbf3-5079cbad3819-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3d5a7349-432b-4431-bbf3-5079cbad3819" (UID: "3d5a7349-432b-4431-bbf3-5079cbad3819"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.843634 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d5a7349-432b-4431-bbf3-5079cbad3819-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3d5a7349-432b-4431-bbf3-5079cbad3819" (UID: "3d5a7349-432b-4431-bbf3-5079cbad3819"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.846370 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d5a7349-432b-4431-bbf3-5079cbad3819-kube-api-access-wb4wq" (OuterVolumeSpecName: "kube-api-access-wb4wq") pod "3d5a7349-432b-4431-bbf3-5079cbad3819" (UID: "3d5a7349-432b-4431-bbf3-5079cbad3819"). InnerVolumeSpecName "kube-api-access-wb4wq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.847304 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-scripts" (OuterVolumeSpecName: "scripts") pod "3d5a7349-432b-4431-bbf3-5079cbad3819" (UID: "3d5a7349-432b-4431-bbf3-5079cbad3819"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.892354 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3d5a7349-432b-4431-bbf3-5079cbad3819" (UID: "3d5a7349-432b-4431-bbf3-5079cbad3819"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.915753 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d5a7349-432b-4431-bbf3-5079cbad3819" (UID: "3d5a7349-432b-4431-bbf3-5079cbad3819"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.939983 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.940030 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wb4wq\" (UniqueName: \"kubernetes.io/projected/3d5a7349-432b-4431-bbf3-5079cbad3819-kube-api-access-wb4wq\") on node \"crc\" DevicePath \"\"" Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.940044 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.940057 5094 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d5a7349-432b-4431-bbf3-5079cbad3819-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.940067 5094 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3d5a7349-432b-4431-bbf3-5079cbad3819-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.940076 5094 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 08:53:50 crc kubenswrapper[5094]: I0220 08:53:50.986842 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-config-data" (OuterVolumeSpecName: "config-data") pod "3d5a7349-432b-4431-bbf3-5079cbad3819" (UID: "3d5a7349-432b-4431-bbf3-5079cbad3819"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.033314 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-r5qjd"] Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.042390 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d5a7349-432b-4431-bbf3-5079cbad3819-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.045579 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-r5qjd"] Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.656808 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.656821 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3d5a7349-432b-4431-bbf3-5079cbad3819","Type":"ContainerDied","Data":"8d3046356c6ab6376d54ed64d8a87d4b911f9de1922da02bb0c47ec017f24b8c"} Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.657489 5094 scope.go:117] "RemoveContainer" containerID="ebf23270078580538e17397e37d3095ef7fab2a602d5fa4d1a4d6c4d39282be2" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.679502 5094 scope.go:117] "RemoveContainer" containerID="36383798cb5cce5f003b1a93a9f58a7a048970022e6f16ba4a5b228fc5f0ee8b" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.694853 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.707873 5094 scope.go:117] "RemoveContainer" containerID="2857012eb54d9c2168b148c823755eb27da69fbf090698f1429d8cb1752e4e87" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.709029 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 08:53:51 crc 
kubenswrapper[5094]: I0220 08:53:51.729763 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 08:53:51 crc kubenswrapper[5094]: E0220 08:53:51.730191 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="proxy-httpd" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.730213 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="proxy-httpd" Feb 20 08:53:51 crc kubenswrapper[5094]: E0220 08:53:51.730239 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="sg-core" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.730246 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="sg-core" Feb 20 08:53:51 crc kubenswrapper[5094]: E0220 08:53:51.730262 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="ceilometer-notification-agent" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.730270 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="ceilometer-notification-agent" Feb 20 08:53:51 crc kubenswrapper[5094]: E0220 08:53:51.730279 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="ceilometer-central-agent" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.730286 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="ceilometer-central-agent" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.730494 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="ceilometer-notification-agent" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 
08:53:51.730510 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="proxy-httpd" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.730519 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="sg-core" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.730532 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" containerName="ceilometer-central-agent" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.732283 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.748340 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.765623 5094 scope.go:117] "RemoveContainer" containerID="65a8886591306ad8879b6a7a385867de5351348368c26ea731d0f97822cb7df6" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.767118 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.767374 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.767246 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.768838 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-scripts\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.769015 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg2cf\" (UniqueName: \"kubernetes.io/projected/4faa0d21-cfca-4eae-a05a-3ec287395c30-kube-api-access-lg2cf\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.769149 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4faa0d21-cfca-4eae-a05a-3ec287395c30-run-httpd\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.769220 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.769287 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-config-data\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.769462 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4faa0d21-cfca-4eae-a05a-3ec287395c30-log-httpd\") pod \"ceilometer-0\" (UID: 
\"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.855969 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e67bd4c-454a-4166-9e28-49c348795b29" path="/var/lib/kubelet/pods/0e67bd4c-454a-4166-9e28-49c348795b29/volumes" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.856729 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d5a7349-432b-4431-bbf3-5079cbad3819" path="/var/lib/kubelet/pods/3d5a7349-432b-4431-bbf3-5079cbad3819/volumes" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.857542 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5bcef59-b989-4157-8233-6482f9f3abab" path="/var/lib/kubelet/pods/d5bcef59-b989-4157-8233-6482f9f3abab/volumes" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.870718 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg2cf\" (UniqueName: \"kubernetes.io/projected/4faa0d21-cfca-4eae-a05a-3ec287395c30-kube-api-access-lg2cf\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.870781 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4faa0d21-cfca-4eae-a05a-3ec287395c30-run-httpd\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.870812 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.870845 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-config-data\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.870905 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4faa0d21-cfca-4eae-a05a-3ec287395c30-log-httpd\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.870967 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.870986 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-scripts\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.873922 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4faa0d21-cfca-4eae-a05a-3ec287395c30-run-httpd\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.874390 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4faa0d21-cfca-4eae-a05a-3ec287395c30-log-httpd\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0" Feb 20 08:53:51 
crc kubenswrapper[5094]: I0220 08:53:51.897032 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.897266 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-scripts\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.900283 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.901160 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-config-data\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0" Feb 20 08:53:51 crc kubenswrapper[5094]: I0220 08:53:51.913325 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg2cf\" (UniqueName: \"kubernetes.io/projected/4faa0d21-cfca-4eae-a05a-3ec287395c30-kube-api-access-lg2cf\") pod \"ceilometer-0\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " pod="openstack/ceilometer-0" Feb 20 08:53:52 crc kubenswrapper[5094]: I0220 08:53:52.098664 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 08:53:52 crc kubenswrapper[5094]: I0220 08:53:52.583390 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 08:53:52 crc kubenswrapper[5094]: W0220 08:53:52.583394 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4faa0d21_cfca_4eae_a05a_3ec287395c30.slice/crio-4282bf0047b56410178151ee0ba808962a8b92244967608282436fe0dae092a8 WatchSource:0}: Error finding container 4282bf0047b56410178151ee0ba808962a8b92244967608282436fe0dae092a8: Status 404 returned error can't find the container with id 4282bf0047b56410178151ee0ba808962a8b92244967608282436fe0dae092a8 Feb 20 08:53:52 crc kubenswrapper[5094]: I0220 08:53:52.671038 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4faa0d21-cfca-4eae-a05a-3ec287395c30","Type":"ContainerStarted","Data":"4282bf0047b56410178151ee0ba808962a8b92244967608282436fe0dae092a8"} Feb 20 08:53:53 crc kubenswrapper[5094]: I0220 08:53:53.684876 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4faa0d21-cfca-4eae-a05a-3ec287395c30","Type":"ContainerStarted","Data":"da457492d3fcb44eb3db9f843668f654d08851b042c34f3918cf3c58cc836b51"} Feb 20 08:53:53 crc kubenswrapper[5094]: I0220 08:53:53.685199 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4faa0d21-cfca-4eae-a05a-3ec287395c30","Type":"ContainerStarted","Data":"6f7c3a64e5e40bcc8a0965c47b2906e4a7a28eb59209f746785649c8a13d38c1"} Feb 20 08:53:54 crc kubenswrapper[5094]: I0220 08:53:54.695395 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4faa0d21-cfca-4eae-a05a-3ec287395c30","Type":"ContainerStarted","Data":"63388c3e7e7eb77a160e75b227ead6c24b91ba4a2895c59cd363b4b43675b164"} Feb 20 08:53:55 crc kubenswrapper[5094]: I0220 
08:53:55.711547 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4faa0d21-cfca-4eae-a05a-3ec287395c30","Type":"ContainerStarted","Data":"68eb7f14cec84db171cf8515f584e0954ccbe005bb13c77ec56c3620e8c5a7ec"} Feb 20 08:53:55 crc kubenswrapper[5094]: I0220 08:53:55.712214 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 20 08:53:55 crc kubenswrapper[5094]: I0220 08:53:55.744334 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.017267019 podStartE2EDuration="4.744095363s" podCreationTimestamp="2026-02-20 08:53:51 +0000 UTC" firstStartedPulling="2026-02-20 08:53:52.585519254 +0000 UTC m=+7647.458145965" lastFinishedPulling="2026-02-20 08:53:55.312347598 +0000 UTC m=+7650.184974309" observedRunningTime="2026-02-20 08:53:55.737660498 +0000 UTC m=+7650.610287209" watchObservedRunningTime="2026-02-20 08:53:55.744095363 +0000 UTC m=+7650.616722074" Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.115999 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-mtnn8"] Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.117545 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-mtnn8" Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.135772 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-mtnn8"] Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.262610 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnfgd\" (UniqueName: \"kubernetes.io/projected/c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3-kube-api-access-wnfgd\") pod \"manila-db-create-mtnn8\" (UID: \"c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3\") " pod="openstack/manila-db-create-mtnn8" Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.262748 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3-operator-scripts\") pod \"manila-db-create-mtnn8\" (UID: \"c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3\") " pod="openstack/manila-db-create-mtnn8" Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.325414 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-560b-account-create-update-6s8n2"] Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.326921 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-560b-account-create-update-6s8n2" Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.329791 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.335214 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-560b-account-create-update-6s8n2"] Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.364640 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnfgd\" (UniqueName: \"kubernetes.io/projected/c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3-kube-api-access-wnfgd\") pod \"manila-db-create-mtnn8\" (UID: \"c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3\") " pod="openstack/manila-db-create-mtnn8" Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.364784 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3-operator-scripts\") pod \"manila-db-create-mtnn8\" (UID: \"c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3\") " pod="openstack/manila-db-create-mtnn8" Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.365571 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3-operator-scripts\") pod \"manila-db-create-mtnn8\" (UID: \"c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3\") " pod="openstack/manila-db-create-mtnn8" Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.388538 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnfgd\" (UniqueName: \"kubernetes.io/projected/c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3-kube-api-access-wnfgd\") pod \"manila-db-create-mtnn8\" (UID: \"c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3\") " pod="openstack/manila-db-create-mtnn8" Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 
08:53:57.437126 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-mtnn8" Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.466248 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cxwm\" (UniqueName: \"kubernetes.io/projected/6bad0291-15d9-4dc5-acd6-26bc8d8aad76-kube-api-access-6cxwm\") pod \"manila-560b-account-create-update-6s8n2\" (UID: \"6bad0291-15d9-4dc5-acd6-26bc8d8aad76\") " pod="openstack/manila-560b-account-create-update-6s8n2" Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.466782 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bad0291-15d9-4dc5-acd6-26bc8d8aad76-operator-scripts\") pod \"manila-560b-account-create-update-6s8n2\" (UID: \"6bad0291-15d9-4dc5-acd6-26bc8d8aad76\") " pod="openstack/manila-560b-account-create-update-6s8n2" Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.569382 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bad0291-15d9-4dc5-acd6-26bc8d8aad76-operator-scripts\") pod \"manila-560b-account-create-update-6s8n2\" (UID: \"6bad0291-15d9-4dc5-acd6-26bc8d8aad76\") " pod="openstack/manila-560b-account-create-update-6s8n2" Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.569462 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cxwm\" (UniqueName: \"kubernetes.io/projected/6bad0291-15d9-4dc5-acd6-26bc8d8aad76-kube-api-access-6cxwm\") pod \"manila-560b-account-create-update-6s8n2\" (UID: \"6bad0291-15d9-4dc5-acd6-26bc8d8aad76\") " pod="openstack/manila-560b-account-create-update-6s8n2" Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.570923 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bad0291-15d9-4dc5-acd6-26bc8d8aad76-operator-scripts\") pod \"manila-560b-account-create-update-6s8n2\" (UID: \"6bad0291-15d9-4dc5-acd6-26bc8d8aad76\") " pod="openstack/manila-560b-account-create-update-6s8n2" Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.589632 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cxwm\" (UniqueName: \"kubernetes.io/projected/6bad0291-15d9-4dc5-acd6-26bc8d8aad76-kube-api-access-6cxwm\") pod \"manila-560b-account-create-update-6s8n2\" (UID: \"6bad0291-15d9-4dc5-acd6-26bc8d8aad76\") " pod="openstack/manila-560b-account-create-update-6s8n2" Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.646473 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-560b-account-create-update-6s8n2" Feb 20 08:53:57 crc kubenswrapper[5094]: I0220 08:53:57.965801 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-mtnn8"] Feb 20 08:53:57 crc kubenswrapper[5094]: W0220 08:53:57.974478 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5cc3f76_dfd1_4d7c_8adb_08cbc55636a3.slice/crio-4fb224ed39ff04c2136c9646f40baaacccbff7b5c61381309178d49896dc2b40 WatchSource:0}: Error finding container 4fb224ed39ff04c2136c9646f40baaacccbff7b5c61381309178d49896dc2b40: Status 404 returned error can't find the container with id 4fb224ed39ff04c2136c9646f40baaacccbff7b5c61381309178d49896dc2b40 Feb 20 08:53:58 crc kubenswrapper[5094]: I0220 08:53:58.168534 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-560b-account-create-update-6s8n2"] Feb 20 08:53:58 crc kubenswrapper[5094]: W0220 08:53:58.173176 5094 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bad0291_15d9_4dc5_acd6_26bc8d8aad76.slice/crio-cf6c4ea050a878c441827f0883727cadb0b05d7a626b34e4059bc274c7f4322d WatchSource:0}: Error finding container cf6c4ea050a878c441827f0883727cadb0b05d7a626b34e4059bc274c7f4322d: Status 404 returned error can't find the container with id cf6c4ea050a878c441827f0883727cadb0b05d7a626b34e4059bc274c7f4322d Feb 20 08:53:58 crc kubenswrapper[5094]: I0220 08:53:58.748328 5094 generic.go:334] "Generic (PLEG): container finished" podID="6bad0291-15d9-4dc5-acd6-26bc8d8aad76" containerID="fcec52d45c535185ac325065c4cab11c829c4a1ebad6b2123939c3a35f4b9360" exitCode=0 Feb 20 08:53:58 crc kubenswrapper[5094]: I0220 08:53:58.748409 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-560b-account-create-update-6s8n2" event={"ID":"6bad0291-15d9-4dc5-acd6-26bc8d8aad76","Type":"ContainerDied","Data":"fcec52d45c535185ac325065c4cab11c829c4a1ebad6b2123939c3a35f4b9360"} Feb 20 08:53:58 crc kubenswrapper[5094]: I0220 08:53:58.748734 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-560b-account-create-update-6s8n2" event={"ID":"6bad0291-15d9-4dc5-acd6-26bc8d8aad76","Type":"ContainerStarted","Data":"cf6c4ea050a878c441827f0883727cadb0b05d7a626b34e4059bc274c7f4322d"} Feb 20 08:53:58 crc kubenswrapper[5094]: I0220 08:53:58.749923 5094 generic.go:334] "Generic (PLEG): container finished" podID="c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3" containerID="40e80a7f49d2a8cd8ede69f04221413d74bc3298b5502921432bc86e342a4f7d" exitCode=0 Feb 20 08:53:58 crc kubenswrapper[5094]: I0220 08:53:58.749959 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-mtnn8" event={"ID":"c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3","Type":"ContainerDied","Data":"40e80a7f49d2a8cd8ede69f04221413d74bc3298b5502921432bc86e342a4f7d"} Feb 20 08:53:58 crc kubenswrapper[5094]: I0220 08:53:58.749974 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/manila-db-create-mtnn8" event={"ID":"c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3","Type":"ContainerStarted","Data":"4fb224ed39ff04c2136c9646f40baaacccbff7b5c61381309178d49896dc2b40"} Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.067143 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-smd54"] Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.128738 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-smd54"] Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.446754 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-mtnn8" Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.451288 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-560b-account-create-update-6s8n2" Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.536578 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bad0291-15d9-4dc5-acd6-26bc8d8aad76-operator-scripts\") pod \"6bad0291-15d9-4dc5-acd6-26bc8d8aad76\" (UID: \"6bad0291-15d9-4dc5-acd6-26bc8d8aad76\") " Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.536670 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3-operator-scripts\") pod \"c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3\" (UID: \"c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3\") " Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.536746 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnfgd\" (UniqueName: \"kubernetes.io/projected/c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3-kube-api-access-wnfgd\") pod \"c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3\" (UID: \"c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3\") " Feb 20 08:54:00 crc 
kubenswrapper[5094]: I0220 08:54:00.536811 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cxwm\" (UniqueName: \"kubernetes.io/projected/6bad0291-15d9-4dc5-acd6-26bc8d8aad76-kube-api-access-6cxwm\") pod \"6bad0291-15d9-4dc5-acd6-26bc8d8aad76\" (UID: \"6bad0291-15d9-4dc5-acd6-26bc8d8aad76\") "
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.537173 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3" (UID: "c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.537236 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bad0291-15d9-4dc5-acd6-26bc8d8aad76-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6bad0291-15d9-4dc5-acd6-26bc8d8aad76" (UID: "6bad0291-15d9-4dc5-acd6-26bc8d8aad76"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.545979 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bad0291-15d9-4dc5-acd6-26bc8d8aad76-kube-api-access-6cxwm" (OuterVolumeSpecName: "kube-api-access-6cxwm") pod "6bad0291-15d9-4dc5-acd6-26bc8d8aad76" (UID: "6bad0291-15d9-4dc5-acd6-26bc8d8aad76"). InnerVolumeSpecName "kube-api-access-6cxwm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.550285 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3-kube-api-access-wnfgd" (OuterVolumeSpecName: "kube-api-access-wnfgd") pod "c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3" (UID: "c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3"). InnerVolumeSpecName "kube-api-access-wnfgd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.641237 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bad0291-15d9-4dc5-acd6-26bc8d8aad76-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.641282 5094 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.641295 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnfgd\" (UniqueName: \"kubernetes.io/projected/c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3-kube-api-access-wnfgd\") on node \"crc\" DevicePath \"\""
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.641308 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cxwm\" (UniqueName: \"kubernetes.io/projected/6bad0291-15d9-4dc5-acd6-26bc8d8aad76-kube-api-access-6cxwm\") on node \"crc\" DevicePath \"\""
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.779079 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-560b-account-create-update-6s8n2" event={"ID":"6bad0291-15d9-4dc5-acd6-26bc8d8aad76","Type":"ContainerDied","Data":"cf6c4ea050a878c441827f0883727cadb0b05d7a626b34e4059bc274c7f4322d"}
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.779127 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf6c4ea050a878c441827f0883727cadb0b05d7a626b34e4059bc274c7f4322d"
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.779186 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-560b-account-create-update-6s8n2"
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.783145 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-mtnn8" event={"ID":"c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3","Type":"ContainerDied","Data":"4fb224ed39ff04c2136c9646f40baaacccbff7b5c61381309178d49896dc2b40"}
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.783194 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fb224ed39ff04c2136c9646f40baaacccbff7b5c61381309178d49896dc2b40"
Feb 20 08:54:00 crc kubenswrapper[5094]: I0220 08:54:00.783436 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-mtnn8"
Feb 20 08:54:01 crc kubenswrapper[5094]: I0220 08:54:01.853016 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b63f3e88-3e2a-43db-88de-8cf778187671" path="/var/lib/kubelet/pods/b63f3e88-3e2a-43db-88de-8cf778187671/volumes"
Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.774407 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-87lwb"]
Feb 20 08:54:02 crc kubenswrapper[5094]: E0220 08:54:02.775503 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3" containerName="mariadb-database-create"
Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.775609 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3" containerName="mariadb-database-create"
Feb 20 08:54:02 crc kubenswrapper[5094]: E0220 08:54:02.775753 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bad0291-15d9-4dc5-acd6-26bc8d8aad76" containerName="mariadb-account-create-update"
Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.775845 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bad0291-15d9-4dc5-acd6-26bc8d8aad76" containerName="mariadb-account-create-update"
Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.776169 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bad0291-15d9-4dc5-acd6-26bc8d8aad76" containerName="mariadb-account-create-update"
Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.776284 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3" containerName="mariadb-database-create"
Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.777309 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-87lwb"
Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.781478 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-zwzzl"
Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.783001 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data"
Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.789684 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-87lwb"]
Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.892179 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-combined-ca-bundle\") pod \"manila-db-sync-87lwb\" (UID: \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") " pod="openstack/manila-db-sync-87lwb"
Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.892414 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mdxr\" (UniqueName: \"kubernetes.io/projected/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-kube-api-access-6mdxr\") pod \"manila-db-sync-87lwb\" (UID: \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") " pod="openstack/manila-db-sync-87lwb"
Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.892510 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-config-data\") pod \"manila-db-sync-87lwb\" (UID: \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") " pod="openstack/manila-db-sync-87lwb"
Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.892539 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-job-config-data\") pod \"manila-db-sync-87lwb\" (UID: \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") " pod="openstack/manila-db-sync-87lwb"
Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.993885 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-combined-ca-bundle\") pod \"manila-db-sync-87lwb\" (UID: \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") " pod="openstack/manila-db-sync-87lwb"
Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.994205 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mdxr\" (UniqueName: \"kubernetes.io/projected/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-kube-api-access-6mdxr\") pod \"manila-db-sync-87lwb\" (UID: \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") " pod="openstack/manila-db-sync-87lwb"
Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.994945 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-config-data\") pod \"manila-db-sync-87lwb\" (UID: \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") " pod="openstack/manila-db-sync-87lwb"
Feb 20 08:54:02 crc kubenswrapper[5094]: I0220 08:54:02.995045 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-job-config-data\") pod \"manila-db-sync-87lwb\" (UID: \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") " pod="openstack/manila-db-sync-87lwb"
Feb 20 08:54:03 crc kubenswrapper[5094]: I0220 08:54:03.002094 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-combined-ca-bundle\") pod \"manila-db-sync-87lwb\" (UID: \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") " pod="openstack/manila-db-sync-87lwb"
Feb 20 08:54:03 crc kubenswrapper[5094]: I0220 08:54:03.002633 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-job-config-data\") pod \"manila-db-sync-87lwb\" (UID: \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") " pod="openstack/manila-db-sync-87lwb"
Feb 20 08:54:03 crc kubenswrapper[5094]: I0220 08:54:03.003813 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-config-data\") pod \"manila-db-sync-87lwb\" (UID: \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") " pod="openstack/manila-db-sync-87lwb"
Feb 20 08:54:03 crc kubenswrapper[5094]: I0220 08:54:03.021870 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mdxr\" (UniqueName: \"kubernetes.io/projected/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-kube-api-access-6mdxr\") pod \"manila-db-sync-87lwb\" (UID: \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") " pod="openstack/manila-db-sync-87lwb"
Feb 20 08:54:03 crc kubenswrapper[5094]: I0220 08:54:03.117476 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-87lwb"
Feb 20 08:54:03 crc kubenswrapper[5094]: I0220 08:54:03.947978 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-87lwb"]
Feb 20 08:54:04 crc kubenswrapper[5094]: I0220 08:54:04.106883 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 08:54:04 crc kubenswrapper[5094]: I0220 08:54:04.106936 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 08:54:04 crc kubenswrapper[5094]: I0220 08:54:04.822139 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-87lwb" event={"ID":"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00","Type":"ContainerStarted","Data":"ab10d89216ed579b6386deb45a07ae8b41f6f27357e9c40b4d0986f26b63851f"}
Feb 20 08:54:10 crc kubenswrapper[5094]: I0220 08:54:10.881163 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-87lwb" event={"ID":"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00","Type":"ContainerStarted","Data":"8325d32b9b097dfce89183c5462e4d5dad44f1febeae290e0597bb55bbe59825"}
Feb 20 08:54:10 crc kubenswrapper[5094]: I0220 08:54:10.898671 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-87lwb" podStartSLOduration=3.115585194 podStartE2EDuration="8.898653328s" podCreationTimestamp="2026-02-20 08:54:02 +0000 UTC" firstStartedPulling="2026-02-20 08:54:03.964260663 +0000 UTC m=+7658.836887374" lastFinishedPulling="2026-02-20 08:54:09.747328787 +0000 UTC m=+7664.619955508" observedRunningTime="2026-02-20 08:54:10.894669213 +0000 UTC m=+7665.767295924" watchObservedRunningTime="2026-02-20 08:54:10.898653328 +0000 UTC m=+7665.771280039"
Feb 20 08:54:11 crc kubenswrapper[5094]: I0220 08:54:11.778043 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zrlvp"]
Feb 20 08:54:11 crc kubenswrapper[5094]: I0220 08:54:11.781880 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zrlvp"
Feb 20 08:54:11 crc kubenswrapper[5094]: I0220 08:54:11.796601 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zrlvp"]
Feb 20 08:54:11 crc kubenswrapper[5094]: I0220 08:54:11.880126 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jll2n\" (UniqueName: \"kubernetes.io/projected/7f62335a-f07f-45c0-9db2-5fbb91ed2588-kube-api-access-jll2n\") pod \"community-operators-zrlvp\" (UID: \"7f62335a-f07f-45c0-9db2-5fbb91ed2588\") " pod="openshift-marketplace/community-operators-zrlvp"
Feb 20 08:54:11 crc kubenswrapper[5094]: I0220 08:54:11.880522 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f62335a-f07f-45c0-9db2-5fbb91ed2588-utilities\") pod \"community-operators-zrlvp\" (UID: \"7f62335a-f07f-45c0-9db2-5fbb91ed2588\") " pod="openshift-marketplace/community-operators-zrlvp"
Feb 20 08:54:11 crc kubenswrapper[5094]: I0220 08:54:11.880725 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f62335a-f07f-45c0-9db2-5fbb91ed2588-catalog-content\") pod \"community-operators-zrlvp\" (UID: \"7f62335a-f07f-45c0-9db2-5fbb91ed2588\") " pod="openshift-marketplace/community-operators-zrlvp"
Feb 20 08:54:11 crc kubenswrapper[5094]: I0220 08:54:11.982789 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f62335a-f07f-45c0-9db2-5fbb91ed2588-utilities\") pod \"community-operators-zrlvp\" (UID: \"7f62335a-f07f-45c0-9db2-5fbb91ed2588\") " pod="openshift-marketplace/community-operators-zrlvp"
Feb 20 08:54:11 crc kubenswrapper[5094]: I0220 08:54:11.982932 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f62335a-f07f-45c0-9db2-5fbb91ed2588-catalog-content\") pod \"community-operators-zrlvp\" (UID: \"7f62335a-f07f-45c0-9db2-5fbb91ed2588\") " pod="openshift-marketplace/community-operators-zrlvp"
Feb 20 08:54:11 crc kubenswrapper[5094]: I0220 08:54:11.983117 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jll2n\" (UniqueName: \"kubernetes.io/projected/7f62335a-f07f-45c0-9db2-5fbb91ed2588-kube-api-access-jll2n\") pod \"community-operators-zrlvp\" (UID: \"7f62335a-f07f-45c0-9db2-5fbb91ed2588\") " pod="openshift-marketplace/community-operators-zrlvp"
Feb 20 08:54:11 crc kubenswrapper[5094]: I0220 08:54:11.983350 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f62335a-f07f-45c0-9db2-5fbb91ed2588-utilities\") pod \"community-operators-zrlvp\" (UID: \"7f62335a-f07f-45c0-9db2-5fbb91ed2588\") " pod="openshift-marketplace/community-operators-zrlvp"
Feb 20 08:54:11 crc kubenswrapper[5094]: I0220 08:54:11.983432 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f62335a-f07f-45c0-9db2-5fbb91ed2588-catalog-content\") pod \"community-operators-zrlvp\" (UID: \"7f62335a-f07f-45c0-9db2-5fbb91ed2588\") " pod="openshift-marketplace/community-operators-zrlvp"
Feb 20 08:54:12 crc kubenswrapper[5094]: I0220 08:54:12.013752 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jll2n\" (UniqueName: \"kubernetes.io/projected/7f62335a-f07f-45c0-9db2-5fbb91ed2588-kube-api-access-jll2n\") pod \"community-operators-zrlvp\" (UID: \"7f62335a-f07f-45c0-9db2-5fbb91ed2588\") " pod="openshift-marketplace/community-operators-zrlvp"
Feb 20 08:54:12 crc kubenswrapper[5094]: I0220 08:54:12.149100 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zrlvp"
Feb 20 08:54:12 crc kubenswrapper[5094]: I0220 08:54:12.599752 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zrlvp"]
Feb 20 08:54:12 crc kubenswrapper[5094]: W0220 08:54:12.602558 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f62335a_f07f_45c0_9db2_5fbb91ed2588.slice/crio-d5f59bb757af9e7257cafd6f92f50a0f4fa521fad3dffe794d557e3c053860eb WatchSource:0}: Error finding container d5f59bb757af9e7257cafd6f92f50a0f4fa521fad3dffe794d557e3c053860eb: Status 404 returned error can't find the container with id d5f59bb757af9e7257cafd6f92f50a0f4fa521fad3dffe794d557e3c053860eb
Feb 20 08:54:12 crc kubenswrapper[5094]: I0220 08:54:12.962508 5094 generic.go:334] "Generic (PLEG): container finished" podID="7f62335a-f07f-45c0-9db2-5fbb91ed2588" containerID="5295dd5a17ff5868f1250b18a2e2107bdc0361a941732b23ea2dd7611017a966" exitCode=0
Feb 20 08:54:12 crc kubenswrapper[5094]: I0220 08:54:12.962885 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrlvp" event={"ID":"7f62335a-f07f-45c0-9db2-5fbb91ed2588","Type":"ContainerDied","Data":"5295dd5a17ff5868f1250b18a2e2107bdc0361a941732b23ea2dd7611017a966"}
Feb 20 08:54:12 crc kubenswrapper[5094]: I0220 08:54:12.962962 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrlvp" event={"ID":"7f62335a-f07f-45c0-9db2-5fbb91ed2588","Type":"ContainerStarted","Data":"d5f59bb757af9e7257cafd6f92f50a0f4fa521fad3dffe794d557e3c053860eb"}
Feb 20 08:54:12 crc kubenswrapper[5094]: I0220 08:54:12.967874 5094 generic.go:334] "Generic (PLEG): container finished" podID="8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00" containerID="8325d32b9b097dfce89183c5462e4d5dad44f1febeae290e0597bb55bbe59825" exitCode=0
Feb 20 08:54:12 crc kubenswrapper[5094]: I0220 08:54:12.967934 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-87lwb" event={"ID":"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00","Type":"ContainerDied","Data":"8325d32b9b097dfce89183c5462e4d5dad44f1febeae290e0597bb55bbe59825"}
Feb 20 08:54:13 crc kubenswrapper[5094]: I0220 08:54:13.978490 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrlvp" event={"ID":"7f62335a-f07f-45c0-9db2-5fbb91ed2588","Type":"ContainerStarted","Data":"e84829db4bf3833d220042d25b4548026fdd5015eed418466ddd3615e038e54f"}
Feb 20 08:54:14 crc kubenswrapper[5094]: I0220 08:54:14.447144 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-87lwb"
Feb 20 08:54:14 crc kubenswrapper[5094]: I0220 08:54:14.535353 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-config-data\") pod \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\" (UID: \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") "
Feb 20 08:54:14 crc kubenswrapper[5094]: I0220 08:54:14.535436 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-combined-ca-bundle\") pod \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\" (UID: \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") "
Feb 20 08:54:14 crc kubenswrapper[5094]: I0220 08:54:14.535692 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mdxr\" (UniqueName: \"kubernetes.io/projected/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-kube-api-access-6mdxr\") pod \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\" (UID: \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") "
Feb 20 08:54:14 crc kubenswrapper[5094]: I0220 08:54:14.535814 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-job-config-data\") pod \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\" (UID: \"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00\") "
Feb 20 08:54:14 crc kubenswrapper[5094]: I0220 08:54:14.547679 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00" (UID: "8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:54:14 crc kubenswrapper[5094]: I0220 08:54:14.547841 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-kube-api-access-6mdxr" (OuterVolumeSpecName: "kube-api-access-6mdxr") pod "8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00" (UID: "8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00"). InnerVolumeSpecName "kube-api-access-6mdxr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:54:14 crc kubenswrapper[5094]: I0220 08:54:14.547923 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-config-data" (OuterVolumeSpecName: "config-data") pod "8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00" (UID: "8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:54:14 crc kubenswrapper[5094]: I0220 08:54:14.574669 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00" (UID: "8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 08:54:14 crc kubenswrapper[5094]: I0220 08:54:14.637685 5094 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-job-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 08:54:14 crc kubenswrapper[5094]: I0220 08:54:14.637902 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 08:54:14 crc kubenswrapper[5094]: I0220 08:54:14.637967 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 08:54:14 crc kubenswrapper[5094]: I0220 08:54:14.638031 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mdxr\" (UniqueName: \"kubernetes.io/projected/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00-kube-api-access-6mdxr\") on node \"crc\" DevicePath \"\""
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.010598 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-87lwb" event={"ID":"8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00","Type":"ContainerDied","Data":"ab10d89216ed579b6386deb45a07ae8b41f6f27357e9c40b4d0986f26b63851f"}
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.010661 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab10d89216ed579b6386deb45a07ae8b41f6f27357e9c40b4d0986f26b63851f"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.011998 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-87lwb"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.450214 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"]
Feb 20 08:54:15 crc kubenswrapper[5094]: E0220 08:54:15.451208 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00" containerName="manila-db-sync"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.451327 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00" containerName="manila-db-sync"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.451650 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00" containerName="manila-db-sync"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.453237 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.457587 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.457864 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-zwzzl"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.458260 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.458548 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.468802 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"]
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.471002 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.475217 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.491215 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"]
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.513207 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"]
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.539644 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58b99f595-hbwr4"]
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.541801 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58b99f595-hbwr4"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.550219 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58b99f595-hbwr4"]
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.566944 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.567008 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29ad14f6-de76-4992-b46a-29f0822654c7-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.567078 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29ad14f6-de76-4992-b46a-29f0822654c7-config-data\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.567190 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-ceph\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.567208 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-config-data\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.567249 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.567264 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29ad14f6-de76-4992-b46a-29f0822654c7-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.567735 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.567796 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29ad14f6-de76-4992-b46a-29f0822654c7-scripts\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.567908 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skkj2\" (UniqueName: \"kubernetes.io/projected/29ad14f6-de76-4992-b46a-29f0822654c7-kube-api-access-skkj2\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.567950 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmjhb\" (UniqueName: \"kubernetes.io/projected/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-kube-api-access-zmjhb\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.567984 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29ad14f6-de76-4992-b46a-29f0822654c7-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.568013 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.568095 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-scripts\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.669809 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxvk6\" (UniqueName: \"kubernetes.io/projected/d0e42f57-46ec-4ef5-a4f0-f262ce003602-kube-api-access-lxvk6\") pod \"dnsmasq-dns-58b99f595-hbwr4\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " pod="openstack/dnsmasq-dns-58b99f595-hbwr4"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.670101 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmjhb\" (UniqueName: \"kubernetes.io/projected/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-kube-api-access-zmjhb\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.670216 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29ad14f6-de76-4992-b46a-29f0822654c7-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.670322 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.670461 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-scripts\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.670572 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.670631 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.670741 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29ad14f6-de76-4992-b46a-29f0822654c7-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.670841 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-ovsdbserver-sb\") pod \"dnsmasq-dns-58b99f595-hbwr4\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " pod="openstack/dnsmasq-dns-58b99f595-hbwr4"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.670923 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29ad14f6-de76-4992-b46a-29f0822654c7-config-data\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.671011 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-ovsdbserver-nb\") pod \"dnsmasq-dns-58b99f595-hbwr4\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " pod="openstack/dnsmasq-dns-58b99f595-hbwr4"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.670760 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29ad14f6-de76-4992-b46a-29f0822654c7-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.671161 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-config\") pod \"dnsmasq-dns-58b99f595-hbwr4\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " pod="openstack/dnsmasq-dns-58b99f595-hbwr4"
Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.671284 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-config-data\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0"
Feb 20 08:54:15 crc 
kubenswrapper[5094]: I0220 08:54:15.671364 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-ceph\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.671457 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.671537 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29ad14f6-de76-4992-b46a-29f0822654c7-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.671643 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.671779 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29ad14f6-de76-4992-b46a-29f0822654c7-scripts\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.671878 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-dns-svc\") pod \"dnsmasq-dns-58b99f595-hbwr4\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " pod="openstack/dnsmasq-dns-58b99f595-hbwr4" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.671971 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skkj2\" (UniqueName: \"kubernetes.io/projected/29ad14f6-de76-4992-b46a-29f0822654c7-kube-api-access-skkj2\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.680612 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-ceph\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.681153 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-scripts\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.682686 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.682766 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-config-data\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " 
pod="openstack/manila-share-share1-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.684056 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.690463 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29ad14f6-de76-4992-b46a-29f0822654c7-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.690888 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.690893 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29ad14f6-de76-4992-b46a-29f0822654c7-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.691206 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29ad14f6-de76-4992-b46a-29f0822654c7-scripts\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.691572 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/29ad14f6-de76-4992-b46a-29f0822654c7-config-data\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.697893 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skkj2\" (UniqueName: \"kubernetes.io/projected/29ad14f6-de76-4992-b46a-29f0822654c7-kube-api-access-skkj2\") pod \"manila-scheduler-0\" (UID: \"29ad14f6-de76-4992-b46a-29f0822654c7\") " pod="openstack/manila-scheduler-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.698915 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmjhb\" (UniqueName: \"kubernetes.io/projected/e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b-kube-api-access-zmjhb\") pod \"manila-share-share1-0\" (UID: \"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b\") " pod="openstack/manila-share-share1-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.748697 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.751036 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.766248 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.773948 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-config\") pod \"dnsmasq-dns-58b99f595-hbwr4\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " pod="openstack/dnsmasq-dns-58b99f595-hbwr4" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.774107 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-dns-svc\") pod \"dnsmasq-dns-58b99f595-hbwr4\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " pod="openstack/dnsmasq-dns-58b99f595-hbwr4" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.774166 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxvk6\" (UniqueName: \"kubernetes.io/projected/d0e42f57-46ec-4ef5-a4f0-f262ce003602-kube-api-access-lxvk6\") pod \"dnsmasq-dns-58b99f595-hbwr4\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " pod="openstack/dnsmasq-dns-58b99f595-hbwr4" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.774273 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-ovsdbserver-sb\") pod \"dnsmasq-dns-58b99f595-hbwr4\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " pod="openstack/dnsmasq-dns-58b99f595-hbwr4" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.774315 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-ovsdbserver-nb\") pod \"dnsmasq-dns-58b99f595-hbwr4\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " pod="openstack/dnsmasq-dns-58b99f595-hbwr4" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.776123 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-ovsdbserver-sb\") pod \"dnsmasq-dns-58b99f595-hbwr4\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " pod="openstack/dnsmasq-dns-58b99f595-hbwr4" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.776213 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-config\") pod \"dnsmasq-dns-58b99f595-hbwr4\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " pod="openstack/dnsmasq-dns-58b99f595-hbwr4" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.776236 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-ovsdbserver-nb\") pod \"dnsmasq-dns-58b99f595-hbwr4\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " pod="openstack/dnsmasq-dns-58b99f595-hbwr4" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.776745 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-dns-svc\") pod \"dnsmasq-dns-58b99f595-hbwr4\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " pod="openstack/dnsmasq-dns-58b99f595-hbwr4" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.783649 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.784383 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.794412 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.803680 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxvk6\" (UniqueName: \"kubernetes.io/projected/d0e42f57-46ec-4ef5-a4f0-f262ce003602-kube-api-access-lxvk6\") pod \"dnsmasq-dns-58b99f595-hbwr4\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " pod="openstack/dnsmasq-dns-58b99f595-hbwr4" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.879703 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc6c3c7a-374b-49fc-98d5-852785c56ee7-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.879772 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc6c3c7a-374b-49fc-98d5-852785c56ee7-config-data\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.879809 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc6c3c7a-374b-49fc-98d5-852785c56ee7-scripts\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.879854 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/fc6c3c7a-374b-49fc-98d5-852785c56ee7-etc-machine-id\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.879919 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdksh\" (UniqueName: \"kubernetes.io/projected/fc6c3c7a-374b-49fc-98d5-852785c56ee7-kube-api-access-qdksh\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.879941 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc6c3c7a-374b-49fc-98d5-852785c56ee7-config-data-custom\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.879960 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc6c3c7a-374b-49fc-98d5-852785c56ee7-logs\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.880375 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58b99f595-hbwr4" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.982228 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc6c3c7a-374b-49fc-98d5-852785c56ee7-etc-machine-id\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.981940 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fc6c3c7a-374b-49fc-98d5-852785c56ee7-etc-machine-id\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.982999 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdksh\" (UniqueName: \"kubernetes.io/projected/fc6c3c7a-374b-49fc-98d5-852785c56ee7-kube-api-access-qdksh\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.983395 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc6c3c7a-374b-49fc-98d5-852785c56ee7-config-data-custom\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.984153 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc6c3c7a-374b-49fc-98d5-852785c56ee7-logs\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.984142 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/fc6c3c7a-374b-49fc-98d5-852785c56ee7-logs\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.984502 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc6c3c7a-374b-49fc-98d5-852785c56ee7-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.984665 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc6c3c7a-374b-49fc-98d5-852785c56ee7-config-data\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.984856 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc6c3c7a-374b-49fc-98d5-852785c56ee7-scripts\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.990847 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc6c3c7a-374b-49fc-98d5-852785c56ee7-scripts\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.991719 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fc6c3c7a-374b-49fc-98d5-852785c56ee7-config-data-custom\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.994990 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc6c3c7a-374b-49fc-98d5-852785c56ee7-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0" Feb 20 08:54:15 crc kubenswrapper[5094]: I0220 08:54:15.995621 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc6c3c7a-374b-49fc-98d5-852785c56ee7-config-data\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0" Feb 20 08:54:16 crc kubenswrapper[5094]: I0220 08:54:16.006815 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdksh\" (UniqueName: \"kubernetes.io/projected/fc6c3c7a-374b-49fc-98d5-852785c56ee7-kube-api-access-qdksh\") pod \"manila-api-0\" (UID: \"fc6c3c7a-374b-49fc-98d5-852785c56ee7\") " pod="openstack/manila-api-0" Feb 20 08:54:16 crc kubenswrapper[5094]: I0220 08:54:16.265519 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Feb 20 08:54:16 crc kubenswrapper[5094]: I0220 08:54:16.679227 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Feb 20 08:54:16 crc kubenswrapper[5094]: I0220 08:54:16.715349 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58b99f595-hbwr4"] Feb 20 08:54:16 crc kubenswrapper[5094]: I0220 08:54:16.797402 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Feb 20 08:54:17 crc kubenswrapper[5094]: I0220 08:54:17.038644 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b","Type":"ContainerStarted","Data":"85208af740c940e58fba2853db780aefef08364e87cd43dea1d7ff6e20571fce"} Feb 20 08:54:17 crc kubenswrapper[5094]: I0220 08:54:17.041013 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"29ad14f6-de76-4992-b46a-29f0822654c7","Type":"ContainerStarted","Data":"8a6b028345388dd91c8ed8789e2f8c0b1b3c4e5ec81e75142289bb1a80a2f173"} Feb 20 08:54:17 crc kubenswrapper[5094]: I0220 08:54:17.043240 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b99f595-hbwr4" event={"ID":"d0e42f57-46ec-4ef5-a4f0-f262ce003602","Type":"ContainerStarted","Data":"0a945c3e150afe6769cf7a5b87b40b125560d57829dfe7b7047e2bf325ea9a2b"} Feb 20 08:54:17 crc kubenswrapper[5094]: I0220 08:54:17.045963 5094 generic.go:334] "Generic (PLEG): container finished" podID="7f62335a-f07f-45c0-9db2-5fbb91ed2588" containerID="e84829db4bf3833d220042d25b4548026fdd5015eed418466ddd3615e038e54f" exitCode=0 Feb 20 08:54:17 crc kubenswrapper[5094]: I0220 08:54:17.046007 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrlvp" 
event={"ID":"7f62335a-f07f-45c0-9db2-5fbb91ed2588","Type":"ContainerDied","Data":"e84829db4bf3833d220042d25b4548026fdd5015eed418466ddd3615e038e54f"} Feb 20 08:54:17 crc kubenswrapper[5094]: I0220 08:54:17.229298 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Feb 20 08:54:17 crc kubenswrapper[5094]: W0220 08:54:17.239887 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc6c3c7a_374b_49fc_98d5_852785c56ee7.slice/crio-72e41d453e08da9623bc506188ffe6016255c278e2aa0fb2dc8ffe38d2515827 WatchSource:0}: Error finding container 72e41d453e08da9623bc506188ffe6016255c278e2aa0fb2dc8ffe38d2515827: Status 404 returned error can't find the container with id 72e41d453e08da9623bc506188ffe6016255c278e2aa0fb2dc8ffe38d2515827 Feb 20 08:54:18 crc kubenswrapper[5094]: I0220 08:54:18.078493 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"fc6c3c7a-374b-49fc-98d5-852785c56ee7","Type":"ContainerStarted","Data":"697b34c22592b7300fcfa008f6af9269cc58d1392d91cf6428dfd278683fd31d"} Feb 20 08:54:18 crc kubenswrapper[5094]: I0220 08:54:18.079064 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"fc6c3c7a-374b-49fc-98d5-852785c56ee7","Type":"ContainerStarted","Data":"72e41d453e08da9623bc506188ffe6016255c278e2aa0fb2dc8ffe38d2515827"} Feb 20 08:54:18 crc kubenswrapper[5094]: I0220 08:54:18.088911 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"29ad14f6-de76-4992-b46a-29f0822654c7","Type":"ContainerStarted","Data":"02c2e4acc5f6ea8d4625a64c6e75382a69f1aaa6a4719c07c6f25196ceeb481d"} Feb 20 08:54:18 crc kubenswrapper[5094]: I0220 08:54:18.099620 5094 generic.go:334] "Generic (PLEG): container finished" podID="d0e42f57-46ec-4ef5-a4f0-f262ce003602" containerID="4a2f6d7a9fa1436f2f214f0a9fae369a7481837477a3433e0f286a8f3d99cdc1" exitCode=0 
Feb 20 08:54:18 crc kubenswrapper[5094]: I0220 08:54:18.099694 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b99f595-hbwr4" event={"ID":"d0e42f57-46ec-4ef5-a4f0-f262ce003602","Type":"ContainerDied","Data":"4a2f6d7a9fa1436f2f214f0a9fae369a7481837477a3433e0f286a8f3d99cdc1"} Feb 20 08:54:19 crc kubenswrapper[5094]: I0220 08:54:19.109909 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b99f595-hbwr4" event={"ID":"d0e42f57-46ec-4ef5-a4f0-f262ce003602","Type":"ContainerStarted","Data":"355d244659b681399735fcdacac24c858003f7b54d0a83a0b5de3821ffa260e9"} Feb 20 08:54:19 crc kubenswrapper[5094]: I0220 08:54:19.110467 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58b99f595-hbwr4" Feb 20 08:54:19 crc kubenswrapper[5094]: I0220 08:54:19.111558 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"fc6c3c7a-374b-49fc-98d5-852785c56ee7","Type":"ContainerStarted","Data":"b6e4e6054b823428e4103a0b80aa4680f017b35ffcd5ff94e955860a002c75ba"} Feb 20 08:54:19 crc kubenswrapper[5094]: I0220 08:54:19.111698 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Feb 20 08:54:19 crc kubenswrapper[5094]: I0220 08:54:19.116775 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrlvp" event={"ID":"7f62335a-f07f-45c0-9db2-5fbb91ed2588","Type":"ContainerStarted","Data":"03beb55f3b7a57c08ea38dda78b9f937a49b96f4577557bb63adad7f81d02e35"} Feb 20 08:54:19 crc kubenswrapper[5094]: I0220 08:54:19.119181 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"29ad14f6-de76-4992-b46a-29f0822654c7","Type":"ContainerStarted","Data":"f21b439125592db20581223d0556db95ff4be1555bdefe40714a2539f53b8b67"} Feb 20 08:54:19 crc kubenswrapper[5094]: I0220 08:54:19.131112 5094 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/dnsmasq-dns-58b99f595-hbwr4" podStartSLOduration=4.1310913639999995 podStartE2EDuration="4.131091364s" podCreationTimestamp="2026-02-20 08:54:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:54:19.128568773 +0000 UTC m=+7674.001195484" watchObservedRunningTime="2026-02-20 08:54:19.131091364 +0000 UTC m=+7674.003718075" Feb 20 08:54:19 crc kubenswrapper[5094]: I0220 08:54:19.150666 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=4.150647195 podStartE2EDuration="4.150647195s" podCreationTimestamp="2026-02-20 08:54:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:54:19.148250597 +0000 UTC m=+7674.020877308" watchObservedRunningTime="2026-02-20 08:54:19.150647195 +0000 UTC m=+7674.023273906" Feb 20 08:54:19 crc kubenswrapper[5094]: I0220 08:54:19.170318 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zrlvp" podStartSLOduration=3.507867857 podStartE2EDuration="8.170303317s" podCreationTimestamp="2026-02-20 08:54:11 +0000 UTC" firstStartedPulling="2026-02-20 08:54:12.966455333 +0000 UTC m=+7667.839082034" lastFinishedPulling="2026-02-20 08:54:17.628890783 +0000 UTC m=+7672.501517494" observedRunningTime="2026-02-20 08:54:19.164295773 +0000 UTC m=+7674.036922484" watchObservedRunningTime="2026-02-20 08:54:19.170303317 +0000 UTC m=+7674.042930028" Feb 20 08:54:22 crc kubenswrapper[5094]: I0220 08:54:22.149526 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zrlvp" Feb 20 08:54:22 crc kubenswrapper[5094]: I0220 08:54:22.150140 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-zrlvp" Feb 20 08:54:22 crc kubenswrapper[5094]: I0220 08:54:22.156866 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 20 08:54:22 crc kubenswrapper[5094]: I0220 08:54:22.188051 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=6.747042442 podStartE2EDuration="7.188032449s" podCreationTimestamp="2026-02-20 08:54:15 +0000 UTC" firstStartedPulling="2026-02-20 08:54:16.723260101 +0000 UTC m=+7671.595886812" lastFinishedPulling="2026-02-20 08:54:17.164250108 +0000 UTC m=+7672.036876819" observedRunningTime="2026-02-20 08:54:19.188029134 +0000 UTC m=+7674.060655845" watchObservedRunningTime="2026-02-20 08:54:22.188032449 +0000 UTC m=+7677.060659160" Feb 20 08:54:23 crc kubenswrapper[5094]: I0220 08:54:23.212170 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-zrlvp" podUID="7f62335a-f07f-45c0-9db2-5fbb91ed2588" containerName="registry-server" probeResult="failure" output=< Feb 20 08:54:23 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 08:54:23 crc kubenswrapper[5094]: > Feb 20 08:54:25 crc kubenswrapper[5094]: I0220 08:54:25.796860 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Feb 20 08:54:25 crc kubenswrapper[5094]: I0220 08:54:25.894930 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58b99f595-hbwr4" Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.002755 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-644b874bd7-7xjnn"] Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.002980 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" 
podUID="4909c4ac-65fa-412c-990d-974868b0f104" containerName="dnsmasq-dns" containerID="cri-o://08c36e9a1d8ff13d6dc18d5d5c0ee6433ecb5624744ee3a3486155201c35db6a" gracePeriod=10 Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.235236 5094 generic.go:334] "Generic (PLEG): container finished" podID="4909c4ac-65fa-412c-990d-974868b0f104" containerID="08c36e9a1d8ff13d6dc18d5d5c0ee6433ecb5624744ee3a3486155201c35db6a" exitCode=0 Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.235322 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" event={"ID":"4909c4ac-65fa-412c-990d-974868b0f104","Type":"ContainerDied","Data":"08c36e9a1d8ff13d6dc18d5d5c0ee6433ecb5624744ee3a3486155201c35db6a"} Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.590045 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.674972 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-dns-svc\") pod \"4909c4ac-65fa-412c-990d-974868b0f104\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.675076 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-ovsdbserver-sb\") pod \"4909c4ac-65fa-412c-990d-974868b0f104\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.675167 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjv28\" (UniqueName: \"kubernetes.io/projected/4909c4ac-65fa-412c-990d-974868b0f104-kube-api-access-sjv28\") pod \"4909c4ac-65fa-412c-990d-974868b0f104\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " Feb 
20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.675200 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-config\") pod \"4909c4ac-65fa-412c-990d-974868b0f104\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.675229 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-ovsdbserver-nb\") pod \"4909c4ac-65fa-412c-990d-974868b0f104\" (UID: \"4909c4ac-65fa-412c-990d-974868b0f104\") " Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.682353 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4909c4ac-65fa-412c-990d-974868b0f104-kube-api-access-sjv28" (OuterVolumeSpecName: "kube-api-access-sjv28") pod "4909c4ac-65fa-412c-990d-974868b0f104" (UID: "4909c4ac-65fa-412c-990d-974868b0f104"). InnerVolumeSpecName "kube-api-access-sjv28". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.728689 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-config" (OuterVolumeSpecName: "config") pod "4909c4ac-65fa-412c-990d-974868b0f104" (UID: "4909c4ac-65fa-412c-990d-974868b0f104"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.729161 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4909c4ac-65fa-412c-990d-974868b0f104" (UID: "4909c4ac-65fa-412c-990d-974868b0f104"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.732131 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4909c4ac-65fa-412c-990d-974868b0f104" (UID: "4909c4ac-65fa-412c-990d-974868b0f104"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.744199 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4909c4ac-65fa-412c-990d-974868b0f104" (UID: "4909c4ac-65fa-412c-990d-974868b0f104"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.777560 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.777603 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.777621 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjv28\" (UniqueName: \"kubernetes.io/projected/4909c4ac-65fa-412c-990d-974868b0f104-kube-api-access-sjv28\") on node \"crc\" DevicePath \"\"" Feb 20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.777631 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-config\") on node \"crc\" DevicePath \"\"" Feb 
20 08:54:26 crc kubenswrapper[5094]: I0220 08:54:26.777640 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4909c4ac-65fa-412c-990d-974868b0f104-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 08:54:27 crc kubenswrapper[5094]: I0220 08:54:27.245382 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" event={"ID":"4909c4ac-65fa-412c-990d-974868b0f104","Type":"ContainerDied","Data":"8084f28745ebf13a7935e0af610ee153d1789476c3995420facfd289029eaab4"} Feb 20 08:54:27 crc kubenswrapper[5094]: I0220 08:54:27.245396 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-644b874bd7-7xjnn" Feb 20 08:54:27 crc kubenswrapper[5094]: I0220 08:54:27.245746 5094 scope.go:117] "RemoveContainer" containerID="08c36e9a1d8ff13d6dc18d5d5c0ee6433ecb5624744ee3a3486155201c35db6a" Feb 20 08:54:27 crc kubenswrapper[5094]: I0220 08:54:27.247334 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b","Type":"ContainerStarted","Data":"da55c28e144015c61d6361855066371e38dc4ac0636485764059023ff3e5347c"} Feb 20 08:54:27 crc kubenswrapper[5094]: I0220 08:54:27.247360 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b","Type":"ContainerStarted","Data":"7136c5df75041d4d9854b7cef7d18bb2ad0c2f3f8413bda279a9327c80d8f1b6"} Feb 20 08:54:27 crc kubenswrapper[5094]: I0220 08:54:27.273288 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.030862391 podStartE2EDuration="12.273269129s" podCreationTimestamp="2026-02-20 08:54:15 +0000 UTC" firstStartedPulling="2026-02-20 08:54:16.811021352 +0000 UTC m=+7671.683648063" lastFinishedPulling="2026-02-20 08:54:26.05342809 +0000 UTC 
m=+7680.926054801" observedRunningTime="2026-02-20 08:54:27.265777789 +0000 UTC m=+7682.138404500" watchObservedRunningTime="2026-02-20 08:54:27.273269129 +0000 UTC m=+7682.145895840" Feb 20 08:54:27 crc kubenswrapper[5094]: I0220 08:54:27.274069 5094 scope.go:117] "RemoveContainer" containerID="f5767cd62a5a9e26fc88ffbe25eb74c9c4932ee6d1de8eb39356b77614dedec0" Feb 20 08:54:27 crc kubenswrapper[5094]: I0220 08:54:27.297799 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-644b874bd7-7xjnn"] Feb 20 08:54:27 crc kubenswrapper[5094]: I0220 08:54:27.305582 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-644b874bd7-7xjnn"] Feb 20 08:54:27 crc kubenswrapper[5094]: I0220 08:54:27.851091 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4909c4ac-65fa-412c-990d-974868b0f104" path="/var/lib/kubelet/pods/4909c4ac-65fa-412c-990d-974868b0f104/volumes" Feb 20 08:54:28 crc kubenswrapper[5094]: I0220 08:54:28.408369 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 08:54:28 crc kubenswrapper[5094]: I0220 08:54:28.408937 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="ceilometer-central-agent" containerID="cri-o://6f7c3a64e5e40bcc8a0965c47b2906e4a7a28eb59209f746785649c8a13d38c1" gracePeriod=30 Feb 20 08:54:28 crc kubenswrapper[5094]: I0220 08:54:28.409008 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="ceilometer-notification-agent" containerID="cri-o://da457492d3fcb44eb3db9f843668f654d08851b042c34f3918cf3c58cc836b51" gracePeriod=30 Feb 20 08:54:28 crc kubenswrapper[5094]: I0220 08:54:28.408998 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="proxy-httpd" containerID="cri-o://68eb7f14cec84db171cf8515f584e0954ccbe005bb13c77ec56c3620e8c5a7ec" gracePeriod=30 Feb 20 08:54:28 crc kubenswrapper[5094]: I0220 08:54:28.409028 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="sg-core" containerID="cri-o://63388c3e7e7eb77a160e75b227ead6c24b91ba4a2895c59cd363b4b43675b164" gracePeriod=30 Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.291970 5094 generic.go:334] "Generic (PLEG): container finished" podID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerID="68eb7f14cec84db171cf8515f584e0954ccbe005bb13c77ec56c3620e8c5a7ec" exitCode=0 Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.292276 5094 generic.go:334] "Generic (PLEG): container finished" podID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerID="63388c3e7e7eb77a160e75b227ead6c24b91ba4a2895c59cd363b4b43675b164" exitCode=2 Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.292286 5094 generic.go:334] "Generic (PLEG): container finished" podID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerID="da457492d3fcb44eb3db9f843668f654d08851b042c34f3918cf3c58cc836b51" exitCode=0 Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.292057 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4faa0d21-cfca-4eae-a05a-3ec287395c30","Type":"ContainerDied","Data":"68eb7f14cec84db171cf8515f584e0954ccbe005bb13c77ec56c3620e8c5a7ec"} Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.292328 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4faa0d21-cfca-4eae-a05a-3ec287395c30","Type":"ContainerDied","Data":"63388c3e7e7eb77a160e75b227ead6c24b91ba4a2895c59cd363b4b43675b164"} Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.292341 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"4faa0d21-cfca-4eae-a05a-3ec287395c30","Type":"ContainerDied","Data":"da457492d3fcb44eb3db9f843668f654d08851b042c34f3918cf3c58cc836b51"} Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.292351 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4faa0d21-cfca-4eae-a05a-3ec287395c30","Type":"ContainerDied","Data":"6f7c3a64e5e40bcc8a0965c47b2906e4a7a28eb59209f746785649c8a13d38c1"} Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.292295 5094 generic.go:334] "Generic (PLEG): container finished" podID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerID="6f7c3a64e5e40bcc8a0965c47b2906e4a7a28eb59209f746785649c8a13d38c1" exitCode=0 Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.630869 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.738025 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg2cf\" (UniqueName: \"kubernetes.io/projected/4faa0d21-cfca-4eae-a05a-3ec287395c30-kube-api-access-lg2cf\") pod \"4faa0d21-cfca-4eae-a05a-3ec287395c30\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.738112 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-sg-core-conf-yaml\") pod \"4faa0d21-cfca-4eae-a05a-3ec287395c30\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.738222 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4faa0d21-cfca-4eae-a05a-3ec287395c30-run-httpd\") pod \"4faa0d21-cfca-4eae-a05a-3ec287395c30\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " Feb 20 08:54:29 crc 
kubenswrapper[5094]: I0220 08:54:29.738250 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4faa0d21-cfca-4eae-a05a-3ec287395c30-log-httpd\") pod \"4faa0d21-cfca-4eae-a05a-3ec287395c30\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.738278 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-config-data\") pod \"4faa0d21-cfca-4eae-a05a-3ec287395c30\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.738359 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-combined-ca-bundle\") pod \"4faa0d21-cfca-4eae-a05a-3ec287395c30\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.738409 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-scripts\") pod \"4faa0d21-cfca-4eae-a05a-3ec287395c30\" (UID: \"4faa0d21-cfca-4eae-a05a-3ec287395c30\") " Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.739060 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4faa0d21-cfca-4eae-a05a-3ec287395c30-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4faa0d21-cfca-4eae-a05a-3ec287395c30" (UID: "4faa0d21-cfca-4eae-a05a-3ec287395c30"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.739413 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4faa0d21-cfca-4eae-a05a-3ec287395c30-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4faa0d21-cfca-4eae-a05a-3ec287395c30" (UID: "4faa0d21-cfca-4eae-a05a-3ec287395c30"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.743970 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-scripts" (OuterVolumeSpecName: "scripts") pod "4faa0d21-cfca-4eae-a05a-3ec287395c30" (UID: "4faa0d21-cfca-4eae-a05a-3ec287395c30"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.744140 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4faa0d21-cfca-4eae-a05a-3ec287395c30-kube-api-access-lg2cf" (OuterVolumeSpecName: "kube-api-access-lg2cf") pod "4faa0d21-cfca-4eae-a05a-3ec287395c30" (UID: "4faa0d21-cfca-4eae-a05a-3ec287395c30"). InnerVolumeSpecName "kube-api-access-lg2cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.771943 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4faa0d21-cfca-4eae-a05a-3ec287395c30" (UID: "4faa0d21-cfca-4eae-a05a-3ec287395c30"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.841005 5094 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-scripts\") on node \"crc\" DevicePath \"\"" Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.841233 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg2cf\" (UniqueName: \"kubernetes.io/projected/4faa0d21-cfca-4eae-a05a-3ec287395c30-kube-api-access-lg2cf\") on node \"crc\" DevicePath \"\"" Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.841569 5094 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.841738 5094 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4faa0d21-cfca-4eae-a05a-3ec287395c30-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.841759 5094 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4faa0d21-cfca-4eae-a05a-3ec287395c30-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.841986 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4faa0d21-cfca-4eae-a05a-3ec287395c30" (UID: "4faa0d21-cfca-4eae-a05a-3ec287395c30"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.860682 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-config-data" (OuterVolumeSpecName: "config-data") pod "4faa0d21-cfca-4eae-a05a-3ec287395c30" (UID: "4faa0d21-cfca-4eae-a05a-3ec287395c30"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.944096 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 08:54:29 crc kubenswrapper[5094]: I0220 08:54:29.944125 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4faa0d21-cfca-4eae-a05a-3ec287395c30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.303640 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4faa0d21-cfca-4eae-a05a-3ec287395c30","Type":"ContainerDied","Data":"4282bf0047b56410178151ee0ba808962a8b92244967608282436fe0dae092a8"} Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.303712 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.303705 5094 scope.go:117] "RemoveContainer" containerID="68eb7f14cec84db171cf8515f584e0954ccbe005bb13c77ec56c3620e8c5a7ec" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.337772 5094 scope.go:117] "RemoveContainer" containerID="63388c3e7e7eb77a160e75b227ead6c24b91ba4a2895c59cd363b4b43675b164" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.340850 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.358056 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.372447 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 20 08:54:30 crc kubenswrapper[5094]: E0220 08:54:30.372979 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4909c4ac-65fa-412c-990d-974868b0f104" containerName="init" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.373004 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4909c4ac-65fa-412c-990d-974868b0f104" containerName="init" Feb 20 08:54:30 crc kubenswrapper[5094]: E0220 08:54:30.373025 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="ceilometer-central-agent" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.373032 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="ceilometer-central-agent" Feb 20 08:54:30 crc kubenswrapper[5094]: E0220 08:54:30.373067 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="proxy-httpd" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.373076 5094 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="proxy-httpd" Feb 20 08:54:30 crc kubenswrapper[5094]: E0220 08:54:30.373084 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="sg-core" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.373090 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="sg-core" Feb 20 08:54:30 crc kubenswrapper[5094]: E0220 08:54:30.373105 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="ceilometer-notification-agent" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.373113 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="ceilometer-notification-agent" Feb 20 08:54:30 crc kubenswrapper[5094]: E0220 08:54:30.373151 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4909c4ac-65fa-412c-990d-974868b0f104" containerName="dnsmasq-dns" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.373159 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4909c4ac-65fa-412c-990d-974868b0f104" containerName="dnsmasq-dns" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.373369 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4909c4ac-65fa-412c-990d-974868b0f104" containerName="dnsmasq-dns" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.373381 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="sg-core" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.373396 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="ceilometer-central-agent" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.373405 5094 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="ceilometer-notification-agent" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.373418 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" containerName="proxy-httpd" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.375307 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.381156 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.384065 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.384172 5094 scope.go:117] "RemoveContainer" containerID="da457492d3fcb44eb3db9f843668f654d08851b042c34f3918cf3c58cc836b51" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.384309 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.417648 5094 scope.go:117] "RemoveContainer" containerID="6f7c3a64e5e40bcc8a0965c47b2906e4a7a28eb59209f746785649c8a13d38c1" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.452573 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f751c26-9b9c-4a25-a388-cc52b0934ab6-config-data\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.452654 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f751c26-9b9c-4a25-a388-cc52b0934ab6-run-httpd\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " 
pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.452772 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m78lk\" (UniqueName: \"kubernetes.io/projected/2f751c26-9b9c-4a25-a388-cc52b0934ab6-kube-api-access-m78lk\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.453047 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f751c26-9b9c-4a25-a388-cc52b0934ab6-log-httpd\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.453079 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f751c26-9b9c-4a25-a388-cc52b0934ab6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.453278 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f751c26-9b9c-4a25-a388-cc52b0934ab6-scripts\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.453364 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f751c26-9b9c-4a25-a388-cc52b0934ab6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0" Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.555312 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f751c26-9b9c-4a25-a388-cc52b0934ab6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0"
Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.555387 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f751c26-9b9c-4a25-a388-cc52b0934ab6-config-data\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0"
Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.555459 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f751c26-9b9c-4a25-a388-cc52b0934ab6-run-httpd\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0"
Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.555501 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m78lk\" (UniqueName: \"kubernetes.io/projected/2f751c26-9b9c-4a25-a388-cc52b0934ab6-kube-api-access-m78lk\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0"
Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.555638 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f751c26-9b9c-4a25-a388-cc52b0934ab6-log-httpd\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0"
Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.555682 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f751c26-9b9c-4a25-a388-cc52b0934ab6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0"
Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.555751 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f751c26-9b9c-4a25-a388-cc52b0934ab6-scripts\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0"
Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.555977 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f751c26-9b9c-4a25-a388-cc52b0934ab6-run-httpd\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0"
Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.556198 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2f751c26-9b9c-4a25-a388-cc52b0934ab6-log-httpd\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0"
Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.559428 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2f751c26-9b9c-4a25-a388-cc52b0934ab6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0"
Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.561387 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f751c26-9b9c-4a25-a388-cc52b0934ab6-scripts\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0"
Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.562007 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f751c26-9b9c-4a25-a388-cc52b0934ab6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0"
Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.562605 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f751c26-9b9c-4a25-a388-cc52b0934ab6-config-data\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0"
Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.580026 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m78lk\" (UniqueName: \"kubernetes.io/projected/2f751c26-9b9c-4a25-a388-cc52b0934ab6-kube-api-access-m78lk\") pod \"ceilometer-0\" (UID: \"2f751c26-9b9c-4a25-a388-cc52b0934ab6\") " pod="openstack/ceilometer-0"
Feb 20 08:54:30 crc kubenswrapper[5094]: I0220 08:54:30.693750 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 20 08:54:31 crc kubenswrapper[5094]: W0220 08:54:31.220156 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f751c26_9b9c_4a25_a388_cc52b0934ab6.slice/crio-f51cfa853f9b360febacfa1c6dcf6a12af9a666bc15b73ca70a6c801dfb5c241 WatchSource:0}: Error finding container f51cfa853f9b360febacfa1c6dcf6a12af9a666bc15b73ca70a6c801dfb5c241: Status 404 returned error can't find the container with id f51cfa853f9b360febacfa1c6dcf6a12af9a666bc15b73ca70a6c801dfb5c241
Feb 20 08:54:31 crc kubenswrapper[5094]: I0220 08:54:31.222411 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 20 08:54:31 crc kubenswrapper[5094]: I0220 08:54:31.324628 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f751c26-9b9c-4a25-a388-cc52b0934ab6","Type":"ContainerStarted","Data":"f51cfa853f9b360febacfa1c6dcf6a12af9a666bc15b73ca70a6c801dfb5c241"}
Feb 20 08:54:31 crc kubenswrapper[5094]: I0220 08:54:31.853279 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4faa0d21-cfca-4eae-a05a-3ec287395c30" path="/var/lib/kubelet/pods/4faa0d21-cfca-4eae-a05a-3ec287395c30/volumes"
Feb 20 08:54:32 crc kubenswrapper[5094]: I0220 08:54:32.198069 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zrlvp"
Feb 20 08:54:32 crc kubenswrapper[5094]: I0220 08:54:32.255151 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zrlvp"
Feb 20 08:54:32 crc kubenswrapper[5094]: I0220 08:54:32.339381 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f751c26-9b9c-4a25-a388-cc52b0934ab6","Type":"ContainerStarted","Data":"fa85623f0dc63071a2c04c1a12c55922711803a3d0d7da69877462774dd83d87"}
Feb 20 08:54:32 crc kubenswrapper[5094]: I0220 08:54:32.339426 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f751c26-9b9c-4a25-a388-cc52b0934ab6","Type":"ContainerStarted","Data":"a334e0060f193223f9416b979637d2c6744c09e5e7365cbe9cad13dc86ce4a73"}
Feb 20 08:54:32 crc kubenswrapper[5094]: I0220 08:54:32.438467 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zrlvp"]
Feb 20 08:54:33 crc kubenswrapper[5094]: I0220 08:54:33.350314 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f751c26-9b9c-4a25-a388-cc52b0934ab6","Type":"ContainerStarted","Data":"d8ed66e59ddc31f00fbe729caf0c99ab348bd906da78040a6607314a54f0742c"}
Feb 20 08:54:33 crc kubenswrapper[5094]: I0220 08:54:33.350450 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zrlvp" podUID="7f62335a-f07f-45c0-9db2-5fbb91ed2588" containerName="registry-server" containerID="cri-o://03beb55f3b7a57c08ea38dda78b9f937a49b96f4577557bb63adad7f81d02e35" gracePeriod=2
Feb 20 08:54:33 crc kubenswrapper[5094]: I0220 08:54:33.887426 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zrlvp"
Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.030522 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jll2n\" (UniqueName: \"kubernetes.io/projected/7f62335a-f07f-45c0-9db2-5fbb91ed2588-kube-api-access-jll2n\") pod \"7f62335a-f07f-45c0-9db2-5fbb91ed2588\" (UID: \"7f62335a-f07f-45c0-9db2-5fbb91ed2588\") "
Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.030614 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f62335a-f07f-45c0-9db2-5fbb91ed2588-catalog-content\") pod \"7f62335a-f07f-45c0-9db2-5fbb91ed2588\" (UID: \"7f62335a-f07f-45c0-9db2-5fbb91ed2588\") "
Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.030654 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f62335a-f07f-45c0-9db2-5fbb91ed2588-utilities\") pod \"7f62335a-f07f-45c0-9db2-5fbb91ed2588\" (UID: \"7f62335a-f07f-45c0-9db2-5fbb91ed2588\") "
Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.031571 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f62335a-f07f-45c0-9db2-5fbb91ed2588-utilities" (OuterVolumeSpecName: "utilities") pod "7f62335a-f07f-45c0-9db2-5fbb91ed2588" (UID: "7f62335a-f07f-45c0-9db2-5fbb91ed2588"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.052670 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f62335a-f07f-45c0-9db2-5fbb91ed2588-kube-api-access-jll2n" (OuterVolumeSpecName: "kube-api-access-jll2n") pod "7f62335a-f07f-45c0-9db2-5fbb91ed2588" (UID: "7f62335a-f07f-45c0-9db2-5fbb91ed2588"). InnerVolumeSpecName "kube-api-access-jll2n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.100381 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f62335a-f07f-45c0-9db2-5fbb91ed2588-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7f62335a-f07f-45c0-9db2-5fbb91ed2588" (UID: "7f62335a-f07f-45c0-9db2-5fbb91ed2588"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.107575 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.107631 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.133383 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f62335a-f07f-45c0-9db2-5fbb91ed2588-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.133447 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f62335a-f07f-45c0-9db2-5fbb91ed2588-utilities\") on node \"crc\" DevicePath \"\""
Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.133461 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jll2n\" (UniqueName: \"kubernetes.io/projected/7f62335a-f07f-45c0-9db2-5fbb91ed2588-kube-api-access-jll2n\") on node \"crc\" DevicePath \"\""
Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.362290 5094 generic.go:334] "Generic (PLEG): container finished" podID="7f62335a-f07f-45c0-9db2-5fbb91ed2588" containerID="03beb55f3b7a57c08ea38dda78b9f937a49b96f4577557bb63adad7f81d02e35" exitCode=0
Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.362342 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrlvp" event={"ID":"7f62335a-f07f-45c0-9db2-5fbb91ed2588","Type":"ContainerDied","Data":"03beb55f3b7a57c08ea38dda78b9f937a49b96f4577557bb63adad7f81d02e35"}
Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.362392 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zrlvp" event={"ID":"7f62335a-f07f-45c0-9db2-5fbb91ed2588","Type":"ContainerDied","Data":"d5f59bb757af9e7257cafd6f92f50a0f4fa521fad3dffe794d557e3c053860eb"}
Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.362409 5094 scope.go:117] "RemoveContainer" containerID="03beb55f3b7a57c08ea38dda78b9f937a49b96f4577557bb63adad7f81d02e35"
Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.362572 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zrlvp"
Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.406184 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zrlvp"]
Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.411262 5094 scope.go:117] "RemoveContainer" containerID="e84829db4bf3833d220042d25b4548026fdd5015eed418466ddd3615e038e54f"
Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.419263 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zrlvp"]
Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.537764 5094 scope.go:117] "RemoveContainer" containerID="5295dd5a17ff5868f1250b18a2e2107bdc0361a941732b23ea2dd7611017a966"
Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.560848 5094 scope.go:117] "RemoveContainer" containerID="03beb55f3b7a57c08ea38dda78b9f937a49b96f4577557bb63adad7f81d02e35"
Feb 20 08:54:34 crc kubenswrapper[5094]: E0220 08:54:34.561256 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03beb55f3b7a57c08ea38dda78b9f937a49b96f4577557bb63adad7f81d02e35\": container with ID starting with 03beb55f3b7a57c08ea38dda78b9f937a49b96f4577557bb63adad7f81d02e35 not found: ID does not exist" containerID="03beb55f3b7a57c08ea38dda78b9f937a49b96f4577557bb63adad7f81d02e35"
Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.561285 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03beb55f3b7a57c08ea38dda78b9f937a49b96f4577557bb63adad7f81d02e35"} err="failed to get container status \"03beb55f3b7a57c08ea38dda78b9f937a49b96f4577557bb63adad7f81d02e35\": rpc error: code = NotFound desc = could not find container \"03beb55f3b7a57c08ea38dda78b9f937a49b96f4577557bb63adad7f81d02e35\": container with ID starting with 03beb55f3b7a57c08ea38dda78b9f937a49b96f4577557bb63adad7f81d02e35 not found: ID does not exist"
Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.561304 5094 scope.go:117] "RemoveContainer" containerID="e84829db4bf3833d220042d25b4548026fdd5015eed418466ddd3615e038e54f"
Feb 20 08:54:34 crc kubenswrapper[5094]: E0220 08:54:34.561605 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e84829db4bf3833d220042d25b4548026fdd5015eed418466ddd3615e038e54f\": container with ID starting with e84829db4bf3833d220042d25b4548026fdd5015eed418466ddd3615e038e54f not found: ID does not exist" containerID="e84829db4bf3833d220042d25b4548026fdd5015eed418466ddd3615e038e54f"
Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.561625 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e84829db4bf3833d220042d25b4548026fdd5015eed418466ddd3615e038e54f"} err="failed to get container status \"e84829db4bf3833d220042d25b4548026fdd5015eed418466ddd3615e038e54f\": rpc error: code = NotFound desc = could not find container \"e84829db4bf3833d220042d25b4548026fdd5015eed418466ddd3615e038e54f\": container with ID starting with e84829db4bf3833d220042d25b4548026fdd5015eed418466ddd3615e038e54f not found: ID does not exist"
Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.561637 5094 scope.go:117] "RemoveContainer" containerID="5295dd5a17ff5868f1250b18a2e2107bdc0361a941732b23ea2dd7611017a966"
Feb 20 08:54:34 crc kubenswrapper[5094]: E0220 08:54:34.561897 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5295dd5a17ff5868f1250b18a2e2107bdc0361a941732b23ea2dd7611017a966\": container with ID starting with 5295dd5a17ff5868f1250b18a2e2107bdc0361a941732b23ea2dd7611017a966 not found: ID does not exist" containerID="5295dd5a17ff5868f1250b18a2e2107bdc0361a941732b23ea2dd7611017a966"
Feb 20 08:54:34 crc kubenswrapper[5094]: I0220 08:54:34.561947 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5295dd5a17ff5868f1250b18a2e2107bdc0361a941732b23ea2dd7611017a966"} err="failed to get container status \"5295dd5a17ff5868f1250b18a2e2107bdc0361a941732b23ea2dd7611017a966\": rpc error: code = NotFound desc = could not find container \"5295dd5a17ff5868f1250b18a2e2107bdc0361a941732b23ea2dd7611017a966\": container with ID starting with 5295dd5a17ff5868f1250b18a2e2107bdc0361a941732b23ea2dd7611017a966 not found: ID does not exist"
Feb 20 08:54:35 crc kubenswrapper[5094]: I0220 08:54:35.376182 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2f751c26-9b9c-4a25-a388-cc52b0934ab6","Type":"ContainerStarted","Data":"0bdddbfa20ccd1fd60f8dd4da4d1b072906e9ab932c9fbc6487a0462f0ae21e4"}
Feb 20 08:54:35 crc kubenswrapper[5094]: I0220 08:54:35.377595 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 20 08:54:35 crc kubenswrapper[5094]: I0220 08:54:35.414020 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.215743126 podStartE2EDuration="5.4139993s" podCreationTimestamp="2026-02-20 08:54:30 +0000 UTC" firstStartedPulling="2026-02-20 08:54:31.224858033 +0000 UTC m=+7686.097484744" lastFinishedPulling="2026-02-20 08:54:34.423114207 +0000 UTC m=+7689.295740918" observedRunningTime="2026-02-20 08:54:35.398120207 +0000 UTC m=+7690.270746918" watchObservedRunningTime="2026-02-20 08:54:35.4139993 +0000 UTC m=+7690.286626011"
Feb 20 08:54:35 crc kubenswrapper[5094]: I0220 08:54:35.785683 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0"
Feb 20 08:54:35 crc kubenswrapper[5094]: I0220 08:54:35.851411 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f62335a-f07f-45c0-9db2-5fbb91ed2588" path="/var/lib/kubelet/pods/7f62335a-f07f-45c0-9db2-5fbb91ed2588/volumes"
Feb 20 08:54:37 crc kubenswrapper[5094]: I0220 08:54:37.373616 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0"
Feb 20 08:54:37 crc kubenswrapper[5094]: I0220 08:54:37.693459 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0"
Feb 20 08:54:41 crc kubenswrapper[5094]: I0220 08:54:41.620276 5094 scope.go:117] "RemoveContainer" containerID="3996b425ffaed37f9d76c6d868da173f9ea735a39c30061d9efa5a940e6f7333"
Feb 20 08:54:41 crc kubenswrapper[5094]: I0220 08:54:41.660770 5094 scope.go:117] "RemoveContainer" containerID="9b130573754c208a821c2a5aa00744abfcde1ec2f224d985ae00e81ebcaa218e"
Feb 20 08:54:41 crc kubenswrapper[5094]: I0220 08:54:41.701203 5094 scope.go:117] "RemoveContainer" containerID="8dfc18891e7f2cecc2e704cc07266d7a47a98f1dcf9f167194c7d37d347b850e"
Feb 20 08:54:47 crc kubenswrapper[5094]: I0220 08:54:47.356917 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0"
Feb 20 08:55:00 crc kubenswrapper[5094]: I0220 08:55:00.055074 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-437f-account-create-update-cc6d4"]
Feb 20 08:55:00 crc kubenswrapper[5094]: I0220 08:55:00.073436 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-qwnhp"]
Feb 20 08:55:00 crc kubenswrapper[5094]: I0220 08:55:00.082913 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-437f-account-create-update-cc6d4"]
Feb 20 08:55:00 crc kubenswrapper[5094]: I0220 08:55:00.091829 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-q8cvq"]
Feb 20 08:55:00 crc kubenswrapper[5094]: I0220 08:55:00.101009 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-vjfql"]
Feb 20 08:55:00 crc kubenswrapper[5094]: I0220 08:55:00.109825 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-qwnhp"]
Feb 20 08:55:00 crc kubenswrapper[5094]: I0220 08:55:00.118623 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-q8cvq"]
Feb 20 08:55:00 crc kubenswrapper[5094]: I0220 08:55:00.127277 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-vjfql"]
Feb 20 08:55:00 crc kubenswrapper[5094]: I0220 08:55:00.700260 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Feb 20 08:55:01 crc kubenswrapper[5094]: I0220 08:55:01.032521 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-1c22-account-create-update-wl44h"]
Feb 20 08:55:01 crc kubenswrapper[5094]: I0220 08:55:01.041023 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-9129-account-create-update-kcqsj"]
Feb 20 08:55:01 crc kubenswrapper[5094]: I0220 08:55:01.049823 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-1c22-account-create-update-wl44h"]
Feb 20 08:55:01 crc kubenswrapper[5094]: I0220 08:55:01.058419 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-9129-account-create-update-kcqsj"]
Feb 20 08:55:01 crc kubenswrapper[5094]: I0220 08:55:01.850277 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ab2f8a8-e11c-4b13-a12f-7006756e4d56" path="/var/lib/kubelet/pods/2ab2f8a8-e11c-4b13-a12f-7006756e4d56/volumes"
Feb 20 08:55:01 crc kubenswrapper[5094]: I0220 08:55:01.851128 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33e893dc-597d-4b0d-b59d-04c636d58ce4" path="/var/lib/kubelet/pods/33e893dc-597d-4b0d-b59d-04c636d58ce4/volumes"
Feb 20 08:55:01 crc kubenswrapper[5094]: I0220 08:55:01.851711 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62afc590-4a32-45a1-b7e9-bde09c7f0b6a" path="/var/lib/kubelet/pods/62afc590-4a32-45a1-b7e9-bde09c7f0b6a/volumes"
Feb 20 08:55:01 crc kubenswrapper[5094]: I0220 08:55:01.852352 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90396e9c-2602-41dd-92c3-da38bb5f7be7" path="/var/lib/kubelet/pods/90396e9c-2602-41dd-92c3-da38bb5f7be7/volumes"
Feb 20 08:55:01 crc kubenswrapper[5094]: I0220 08:55:01.853409 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95274e98-2b48-4b4d-b0c5-5dedafedc43f" path="/var/lib/kubelet/pods/95274e98-2b48-4b4d-b0c5-5dedafedc43f/volumes"
Feb 20 08:55:01 crc kubenswrapper[5094]: I0220 08:55:01.854060 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e24ca1b9-7440-432c-a0eb-58a17f83a8ee" path="/var/lib/kubelet/pods/e24ca1b9-7440-432c-a0eb-58a17f83a8ee/volumes"
Feb 20 08:55:04 crc kubenswrapper[5094]: I0220 08:55:04.107125 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 20 08:55:04 crc kubenswrapper[5094]: I0220 08:55:04.107540 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 20 08:55:04 crc kubenswrapper[5094]: I0220 08:55:04.107594 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq"
Feb 20 08:55:04 crc kubenswrapper[5094]: I0220 08:55:04.108692 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 20 08:55:04 crc kubenswrapper[5094]: I0220 08:55:04.108787 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" gracePeriod=600
Feb 20 08:55:04 crc kubenswrapper[5094]: E0220 08:55:04.238235 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:55:04 crc kubenswrapper[5094]: I0220 08:55:04.679450 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" exitCode=0
Feb 20 08:55:04 crc kubenswrapper[5094]: I0220 08:55:04.679500 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c"}
Feb 20 08:55:04 crc kubenswrapper[5094]: I0220 08:55:04.679541 5094 scope.go:117] "RemoveContainer" containerID="e8403acee0c31e397e6e5d741268e12e6725d137dadeeb1b9238f72fcf352268"
Feb 20 08:55:04 crc kubenswrapper[5094]: I0220 08:55:04.681380 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c"
Feb 20 08:55:04 crc kubenswrapper[5094]: E0220 08:55:04.681847 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:55:18 crc kubenswrapper[5094]: I0220 08:55:18.037080 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mkwlr"]
Feb 20 08:55:18 crc kubenswrapper[5094]: I0220 08:55:18.051313 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mkwlr"]
Feb 20 08:55:19 crc kubenswrapper[5094]: I0220 08:55:19.840693 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c"
Feb 20 08:55:19 crc kubenswrapper[5094]: E0220 08:55:19.841260 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:55:19 crc kubenswrapper[5094]: I0220 08:55:19.851407 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91" path="/var/lib/kubelet/pods/c8cc89fb-1ef5-4f62-afbc-a06a1d75fa91/volumes"
Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.111415 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55d7f5f657-jsw6v"]
Feb 20 08:55:29 crc kubenswrapper[5094]: E0220 08:55:29.112348 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f62335a-f07f-45c0-9db2-5fbb91ed2588" containerName="extract-utilities"
Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.112361 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f62335a-f07f-45c0-9db2-5fbb91ed2588" containerName="extract-utilities"
Feb 20 08:55:29 crc kubenswrapper[5094]: E0220 08:55:29.112387 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f62335a-f07f-45c0-9db2-5fbb91ed2588" containerName="extract-content"
Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.112393 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f62335a-f07f-45c0-9db2-5fbb91ed2588" containerName="extract-content"
Feb 20 08:55:29 crc kubenswrapper[5094]: E0220 08:55:29.112407 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f62335a-f07f-45c0-9db2-5fbb91ed2588" containerName="registry-server"
Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.112413 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f62335a-f07f-45c0-9db2-5fbb91ed2588" containerName="registry-server"
Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.112621 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f62335a-f07f-45c0-9db2-5fbb91ed2588" containerName="registry-server"
Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.113920 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v"
Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.118417 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1"
Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.127881 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55d7f5f657-jsw6v"]
Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.218374 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-config\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v"
Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.218757 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-openstack-cell1\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v"
Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.218875 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-ovsdbserver-sb\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v"
Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.218931 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spbf8\" (UniqueName: \"kubernetes.io/projected/7ef4d7b9-7970-4336-859e-08e2a4820524-kube-api-access-spbf8\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v"
Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.219089 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-ovsdbserver-nb\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v"
Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.219119 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-dns-svc\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v"
Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.321626 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-openstack-cell1\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v"
Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.321736 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-ovsdbserver-sb\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v"
Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.321784 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spbf8\" (UniqueName: \"kubernetes.io/projected/7ef4d7b9-7970-4336-859e-08e2a4820524-kube-api-access-spbf8\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v"
Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.321845 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-ovsdbserver-nb\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v"
Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.321879 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-dns-svc\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v"
Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.321957 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-config\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v"
Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.322755 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-dns-svc\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v"
Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.322765 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-openstack-cell1\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v"
Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.323324 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-config\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v"
Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.323481 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-ovsdbserver-sb\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v"
Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.323973 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-ovsdbserver-nb\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v"
Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.340495 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spbf8\" (UniqueName: \"kubernetes.io/projected/7ef4d7b9-7970-4336-859e-08e2a4820524-kube-api-access-spbf8\") pod \"dnsmasq-dns-55d7f5f657-jsw6v\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v"
Feb 20 08:55:29 crc kubenswrapper[5094]: I0220 08:55:29.431312 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v"
Feb 20 08:55:30 crc kubenswrapper[5094]: I0220 08:55:29.909851 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55d7f5f657-jsw6v"]
Feb 20 08:55:30 crc kubenswrapper[5094]: W0220 08:55:29.911122 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ef4d7b9_7970_4336_859e_08e2a4820524.slice/crio-366cdb507f5b3b52436869451d51fb6e8c589ab86c365a15488bd4c24a612eb6 WatchSource:0}: Error finding container 366cdb507f5b3b52436869451d51fb6e8c589ab86c365a15488bd4c24a612eb6: Status 404 returned error can't find the container with id 366cdb507f5b3b52436869451d51fb6e8c589ab86c365a15488bd4c24a612eb6
Feb 20 08:55:30 crc kubenswrapper[5094]: I0220 08:55:29.974857 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" event={"ID":"7ef4d7b9-7970-4336-859e-08e2a4820524","Type":"ContainerStarted","Data":"366cdb507f5b3b52436869451d51fb6e8c589ab86c365a15488bd4c24a612eb6"}
Feb 20 08:55:30 crc kubenswrapper[5094]: I0220 08:55:30.840468 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c"
Feb 20 08:55:30 crc kubenswrapper[5094]: E0220 08:55:30.841289 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 08:55:31 crc kubenswrapper[5094]: I0220 08:55:31.028827 5094 generic.go:334] "Generic (PLEG): container finished" podID="7ef4d7b9-7970-4336-859e-08e2a4820524"
containerID="d0d02e83461148d891a35888220f3b22b91ec3fc7a98679a44ab912f988c01ed" exitCode=0 Feb 20 08:55:31 crc kubenswrapper[5094]: I0220 08:55:31.028878 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" event={"ID":"7ef4d7b9-7970-4336-859e-08e2a4820524","Type":"ContainerDied","Data":"d0d02e83461148d891a35888220f3b22b91ec3fc7a98679a44ab912f988c01ed"} Feb 20 08:55:32 crc kubenswrapper[5094]: I0220 08:55:32.038858 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" event={"ID":"7ef4d7b9-7970-4336-859e-08e2a4820524","Type":"ContainerStarted","Data":"9d0e86403207be00e168f6826a234cf2f0c49931b532c3aeb9feb705b3e0e69d"} Feb 20 08:55:32 crc kubenswrapper[5094]: I0220 08:55:32.039205 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" Feb 20 08:55:32 crc kubenswrapper[5094]: I0220 08:55:32.064159 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" podStartSLOduration=3.064138271 podStartE2EDuration="3.064138271s" podCreationTimestamp="2026-02-20 08:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:55:32.058810733 +0000 UTC m=+7746.931437444" watchObservedRunningTime="2026-02-20 08:55:32.064138271 +0000 UTC m=+7746.936764982" Feb 20 08:55:36 crc kubenswrapper[5094]: I0220 08:55:36.059897 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cwpxs"] Feb 20 08:55:36 crc kubenswrapper[5094]: I0220 08:55:36.074187 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cwpxs"] Feb 20 08:55:37 crc kubenswrapper[5094]: I0220 08:55:37.038243 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-62kch"] Feb 20 08:55:37 crc 
kubenswrapper[5094]: I0220 08:55:37.051884 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-62kch"] Feb 20 08:55:37 crc kubenswrapper[5094]: I0220 08:55:37.861328 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="077dc649-6898-4f04-837d-b694decf612b" path="/var/lib/kubelet/pods/077dc649-6898-4f04-837d-b694decf612b/volumes" Feb 20 08:55:37 crc kubenswrapper[5094]: I0220 08:55:37.862755 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3" path="/var/lib/kubelet/pods/2b9778d8-4ef4-4dc7-b201-77a2f5e4a3b3/volumes" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.433393 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.491237 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58b99f595-hbwr4"] Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.491462 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58b99f595-hbwr4" podUID="d0e42f57-46ec-4ef5-a4f0-f262ce003602" containerName="dnsmasq-dns" containerID="cri-o://355d244659b681399735fcdacac24c858003f7b54d0a83a0b5de3821ffa260e9" gracePeriod=10 Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.664606 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d74d874c7-2ztxv"] Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.681228 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.682015 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d74d874c7-2ztxv"] Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.767113 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-openstack-cell1\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.767481 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-ovsdbserver-sb\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.767543 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-dns-svc\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.767567 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-ovsdbserver-nb\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.767623 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-sz5g6\" (UniqueName: \"kubernetes.io/projected/31655ca7-8da2-4f24-a0c6-49ba0de9d207-kube-api-access-sz5g6\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.769160 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-config\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.872405 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-config\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.872480 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-openstack-cell1\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.872517 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-ovsdbserver-sb\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.872567 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-dns-svc\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.872590 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-ovsdbserver-nb\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.872636 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sz5g6\" (UniqueName: \"kubernetes.io/projected/31655ca7-8da2-4f24-a0c6-49ba0de9d207-kube-api-access-sz5g6\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.873471 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-config\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.873491 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-openstack-cell1\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.873669 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.874169 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-ovsdbserver-nb\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.874193 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-dns-svc\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:39 crc kubenswrapper[5094]: I0220 08:55:39.916390 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sz5g6\" (UniqueName: \"kubernetes.io/projected/31655ca7-8da2-4f24-a0c6-49ba0de9d207-kube-api-access-sz5g6\") pod \"dnsmasq-dns-6d74d874c7-2ztxv\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.043778 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.124262 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58b99f595-hbwr4" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.160537 5094 generic.go:334] "Generic (PLEG): container finished" podID="d0e42f57-46ec-4ef5-a4f0-f262ce003602" containerID="355d244659b681399735fcdacac24c858003f7b54d0a83a0b5de3821ffa260e9" exitCode=0 Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.160935 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b99f595-hbwr4" event={"ID":"d0e42f57-46ec-4ef5-a4f0-f262ce003602","Type":"ContainerDied","Data":"355d244659b681399735fcdacac24c858003f7b54d0a83a0b5de3821ffa260e9"} Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.160981 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58b99f595-hbwr4" event={"ID":"d0e42f57-46ec-4ef5-a4f0-f262ce003602","Type":"ContainerDied","Data":"0a945c3e150afe6769cf7a5b87b40b125560d57829dfe7b7047e2bf325ea9a2b"} Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.161003 5094 scope.go:117] "RemoveContainer" containerID="355d244659b681399735fcdacac24c858003f7b54d0a83a0b5de3821ffa260e9" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.160956 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58b99f595-hbwr4" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.178066 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-ovsdbserver-nb\") pod \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.178200 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-dns-svc\") pod \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.178244 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-config\") pod \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.178287 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-ovsdbserver-sb\") pod \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.178357 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxvk6\" (UniqueName: \"kubernetes.io/projected/d0e42f57-46ec-4ef5-a4f0-f262ce003602-kube-api-access-lxvk6\") pod \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\" (UID: \"d0e42f57-46ec-4ef5-a4f0-f262ce003602\") " Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.183825 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/d0e42f57-46ec-4ef5-a4f0-f262ce003602-kube-api-access-lxvk6" (OuterVolumeSpecName: "kube-api-access-lxvk6") pod "d0e42f57-46ec-4ef5-a4f0-f262ce003602" (UID: "d0e42f57-46ec-4ef5-a4f0-f262ce003602"). InnerVolumeSpecName "kube-api-access-lxvk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.191070 5094 scope.go:117] "RemoveContainer" containerID="4a2f6d7a9fa1436f2f214f0a9fae369a7481837477a3433e0f286a8f3d99cdc1" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.217296 5094 scope.go:117] "RemoveContainer" containerID="355d244659b681399735fcdacac24c858003f7b54d0a83a0b5de3821ffa260e9" Feb 20 08:55:40 crc kubenswrapper[5094]: E0220 08:55:40.218741 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"355d244659b681399735fcdacac24c858003f7b54d0a83a0b5de3821ffa260e9\": container with ID starting with 355d244659b681399735fcdacac24c858003f7b54d0a83a0b5de3821ffa260e9 not found: ID does not exist" containerID="355d244659b681399735fcdacac24c858003f7b54d0a83a0b5de3821ffa260e9" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.218787 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"355d244659b681399735fcdacac24c858003f7b54d0a83a0b5de3821ffa260e9"} err="failed to get container status \"355d244659b681399735fcdacac24c858003f7b54d0a83a0b5de3821ffa260e9\": rpc error: code = NotFound desc = could not find container \"355d244659b681399735fcdacac24c858003f7b54d0a83a0b5de3821ffa260e9\": container with ID starting with 355d244659b681399735fcdacac24c858003f7b54d0a83a0b5de3821ffa260e9 not found: ID does not exist" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.218816 5094 scope.go:117] "RemoveContainer" containerID="4a2f6d7a9fa1436f2f214f0a9fae369a7481837477a3433e0f286a8f3d99cdc1" Feb 20 08:55:40 crc kubenswrapper[5094]: E0220 08:55:40.219180 5094 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a2f6d7a9fa1436f2f214f0a9fae369a7481837477a3433e0f286a8f3d99cdc1\": container with ID starting with 4a2f6d7a9fa1436f2f214f0a9fae369a7481837477a3433e0f286a8f3d99cdc1 not found: ID does not exist" containerID="4a2f6d7a9fa1436f2f214f0a9fae369a7481837477a3433e0f286a8f3d99cdc1" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.219209 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a2f6d7a9fa1436f2f214f0a9fae369a7481837477a3433e0f286a8f3d99cdc1"} err="failed to get container status \"4a2f6d7a9fa1436f2f214f0a9fae369a7481837477a3433e0f286a8f3d99cdc1\": rpc error: code = NotFound desc = could not find container \"4a2f6d7a9fa1436f2f214f0a9fae369a7481837477a3433e0f286a8f3d99cdc1\": container with ID starting with 4a2f6d7a9fa1436f2f214f0a9fae369a7481837477a3433e0f286a8f3d99cdc1 not found: ID does not exist" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.243143 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d0e42f57-46ec-4ef5-a4f0-f262ce003602" (UID: "d0e42f57-46ec-4ef5-a4f0-f262ce003602"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.248347 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d0e42f57-46ec-4ef5-a4f0-f262ce003602" (UID: "d0e42f57-46ec-4ef5-a4f0-f262ce003602"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.276606 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d0e42f57-46ec-4ef5-a4f0-f262ce003602" (UID: "d0e42f57-46ec-4ef5-a4f0-f262ce003602"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.279878 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-config" (OuterVolumeSpecName: "config") pod "d0e42f57-46ec-4ef5-a4f0-f262ce003602" (UID: "d0e42f57-46ec-4ef5-a4f0-f262ce003602"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.280124 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.280151 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.280163 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-config\") on node \"crc\" DevicePath \"\"" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.280172 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0e42f57-46ec-4ef5-a4f0-f262ce003602-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 08:55:40 crc kubenswrapper[5094]: 
I0220 08:55:40.280183 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxvk6\" (UniqueName: \"kubernetes.io/projected/d0e42f57-46ec-4ef5-a4f0-f262ce003602-kube-api-access-lxvk6\") on node \"crc\" DevicePath \"\"" Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.498832 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58b99f595-hbwr4"] Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.508507 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58b99f595-hbwr4"] Feb 20 08:55:40 crc kubenswrapper[5094]: W0220 08:55:40.527092 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31655ca7_8da2_4f24_a0c6_49ba0de9d207.slice/crio-b8b1d201496df049d7a399da8b383ad8dc5d243c40082e06193cd89fed5c406a WatchSource:0}: Error finding container b8b1d201496df049d7a399da8b383ad8dc5d243c40082e06193cd89fed5c406a: Status 404 returned error can't find the container with id b8b1d201496df049d7a399da8b383ad8dc5d243c40082e06193cd89fed5c406a Feb 20 08:55:40 crc kubenswrapper[5094]: I0220 08:55:40.529635 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d74d874c7-2ztxv"] Feb 20 08:55:41 crc kubenswrapper[5094]: I0220 08:55:41.172013 5094 generic.go:334] "Generic (PLEG): container finished" podID="31655ca7-8da2-4f24-a0c6-49ba0de9d207" containerID="50479b6866f7e99ab149ca475e268e49cdfc9bc8663eef94b4f64022a12bb80f" exitCode=0 Feb 20 08:55:41 crc kubenswrapper[5094]: I0220 08:55:41.172088 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" event={"ID":"31655ca7-8da2-4f24-a0c6-49ba0de9d207","Type":"ContainerDied","Data":"50479b6866f7e99ab149ca475e268e49cdfc9bc8663eef94b4f64022a12bb80f"} Feb 20 08:55:41 crc kubenswrapper[5094]: I0220 08:55:41.172155 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" event={"ID":"31655ca7-8da2-4f24-a0c6-49ba0de9d207","Type":"ContainerStarted","Data":"b8b1d201496df049d7a399da8b383ad8dc5d243c40082e06193cd89fed5c406a"} Feb 20 08:55:41 crc kubenswrapper[5094]: I0220 08:55:41.849951 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0e42f57-46ec-4ef5-a4f0-f262ce003602" path="/var/lib/kubelet/pods/d0e42f57-46ec-4ef5-a4f0-f262ce003602/volumes" Feb 20 08:55:41 crc kubenswrapper[5094]: I0220 08:55:41.857939 5094 scope.go:117] "RemoveContainer" containerID="a81d372de4e3c666bafe4f07d24f6f7cc4bee3b83fd66c5127214476a5ecb65e" Feb 20 08:55:41 crc kubenswrapper[5094]: I0220 08:55:41.901983 5094 scope.go:117] "RemoveContainer" containerID="20cb7da637ab28cd5dddb0edb00930b7379bd84765eae45228fa5efc43d1c866" Feb 20 08:55:41 crc kubenswrapper[5094]: I0220 08:55:41.953871 5094 scope.go:117] "RemoveContainer" containerID="18a0fc5e2df223fc2c55d1f16cc18e7c5d24a21c6534f46e5e010adfb875a921" Feb 20 08:55:41 crc kubenswrapper[5094]: I0220 08:55:41.981941 5094 scope.go:117] "RemoveContainer" containerID="cc26980783bd1f2aa717556354f0e4a6d1d7792f5dd61a8146cad63d1f649ba5" Feb 20 08:55:42 crc kubenswrapper[5094]: I0220 08:55:42.039794 5094 scope.go:117] "RemoveContainer" containerID="7d035de1d36dadcbc2b1699a2d04fbaf8dc66a5156f2934f3122a703497829c7" Feb 20 08:55:42 crc kubenswrapper[5094]: I0220 08:55:42.064918 5094 scope.go:117] "RemoveContainer" containerID="b22a4c98fab8bd430cea1082edfc23c911f8d32bd3adc55526aec0a42c5684bd" Feb 20 08:55:42 crc kubenswrapper[5094]: I0220 08:55:42.107187 5094 scope.go:117] "RemoveContainer" containerID="3f997facacb6313e0f115f2a2227ee22f54c84973cc33a1b4f4cc4cd0e2df3df" Feb 20 08:55:42 crc kubenswrapper[5094]: I0220 08:55:42.128003 5094 scope.go:117] "RemoveContainer" containerID="814c099e47197bd5868d74e553deb48d652a97c496a27496d27f367ca0750674" Feb 20 08:55:42 crc kubenswrapper[5094]: I0220 08:55:42.152245 5094 scope.go:117] "RemoveContainer" 
containerID="0d114a7c88828f83388e2e035f175ac9a3e4b92dd7429d32fee56582784e51b6" Feb 20 08:55:42 crc kubenswrapper[5094]: I0220 08:55:42.179345 5094 scope.go:117] "RemoveContainer" containerID="4c1355ff4c58f3d8da61df01ab4f55870452a5c2ce565bb148c221041dd245f6" Feb 20 08:55:42 crc kubenswrapper[5094]: I0220 08:55:42.190184 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" event={"ID":"31655ca7-8da2-4f24-a0c6-49ba0de9d207","Type":"ContainerStarted","Data":"b6ffb1c5e46eb4b0a0c85962e06f901a8bf1fc4eb6e8588f57307ad6fa4f9ad1"} Feb 20 08:55:42 crc kubenswrapper[5094]: I0220 08:55:42.190292 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:42 crc kubenswrapper[5094]: I0220 08:55:42.212269 5094 scope.go:117] "RemoveContainer" containerID="a253a0cdc59df09716156eda440d5655d328fc7850d293b9093cf8b148e34b46" Feb 20 08:55:42 crc kubenswrapper[5094]: I0220 08:55:42.215850 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" podStartSLOduration=3.215832858 podStartE2EDuration="3.215832858s" podCreationTimestamp="2026-02-20 08:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:55:42.20717235 +0000 UTC m=+7757.079799061" watchObservedRunningTime="2026-02-20 08:55:42.215832858 +0000 UTC m=+7757.088459569" Feb 20 08:55:42 crc kubenswrapper[5094]: I0220 08:55:42.840435 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:55:42 crc kubenswrapper[5094]: E0220 08:55:42.840910 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.044836 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.123751 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55d7f5f657-jsw6v"] Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.124046 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" podUID="7ef4d7b9-7970-4336-859e-08e2a4820524" containerName="dnsmasq-dns" containerID="cri-o://9d0e86403207be00e168f6826a234cf2f0c49931b532c3aeb9feb705b3e0e69d" gracePeriod=10 Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.311632 5094 generic.go:334] "Generic (PLEG): container finished" podID="7ef4d7b9-7970-4336-859e-08e2a4820524" containerID="9d0e86403207be00e168f6826a234cf2f0c49931b532c3aeb9feb705b3e0e69d" exitCode=0 Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.312355 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" event={"ID":"7ef4d7b9-7970-4336-859e-08e2a4820524","Type":"ContainerDied","Data":"9d0e86403207be00e168f6826a234cf2f0c49931b532c3aeb9feb705b3e0e69d"} Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.362559 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8479b9d65f-4mzqh"] Feb 20 08:55:50 crc kubenswrapper[5094]: E0220 08:55:50.363010 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0e42f57-46ec-4ef5-a4f0-f262ce003602" containerName="dnsmasq-dns" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.363026 5094 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d0e42f57-46ec-4ef5-a4f0-f262ce003602" containerName="dnsmasq-dns" Feb 20 08:55:50 crc kubenswrapper[5094]: E0220 08:55:50.363058 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0e42f57-46ec-4ef5-a4f0-f262ce003602" containerName="init" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.363065 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0e42f57-46ec-4ef5-a4f0-f262ce003602" containerName="init" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.363254 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0e42f57-46ec-4ef5-a4f0-f262ce003602" containerName="dnsmasq-dns" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.364435 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.368390 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-networker" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.373642 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8479b9d65f-4mzqh"] Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.429853 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-dns-svc\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.429902 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-ovsdbserver-sb\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" Feb 20 08:55:50 crc 
kubenswrapper[5094]: I0220 08:55:50.429986 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-config\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.430008 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-ovsdbserver-nb\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.430060 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-openstack-cell1\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.430095 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-openstack-networker\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.430128 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl9xc\" (UniqueName: \"kubernetes.io/projected/bee88947-a5ae-4438-9283-a3fc34fde9e4-kube-api-access-zl9xc\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " 
pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.534187 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-openstack-networker\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.534890 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl9xc\" (UniqueName: \"kubernetes.io/projected/bee88947-a5ae-4438-9283-a3fc34fde9e4-kube-api-access-zl9xc\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.535107 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-dns-svc\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.535153 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-ovsdbserver-sb\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.535345 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-config\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" Feb 20 08:55:50 
crc kubenswrapper[5094]: I0220 08:55:50.535373 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-ovsdbserver-nb\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.535586 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-openstack-cell1\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.537751 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-config\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.537763 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-dns-svc\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.537842 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-ovsdbserver-sb\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.538177 5094 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-ovsdbserver-nb\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.538617 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-networker\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-openstack-networker\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.546856 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/bee88947-a5ae-4438-9283-a3fc34fde9e4-openstack-cell1\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.567006 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl9xc\" (UniqueName: \"kubernetes.io/projected/bee88947-a5ae-4438-9283-a3fc34fde9e4-kube-api-access-zl9xc\") pod \"dnsmasq-dns-8479b9d65f-4mzqh\" (UID: \"bee88947-a5ae-4438-9283-a3fc34fde9e4\") " pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.664997 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.700499 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.738937 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spbf8\" (UniqueName: \"kubernetes.io/projected/7ef4d7b9-7970-4336-859e-08e2a4820524-kube-api-access-spbf8\") pod \"7ef4d7b9-7970-4336-859e-08e2a4820524\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.739873 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-ovsdbserver-nb\") pod \"7ef4d7b9-7970-4336-859e-08e2a4820524\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.740051 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-openstack-cell1\") pod \"7ef4d7b9-7970-4336-859e-08e2a4820524\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.740132 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-dns-svc\") pod \"7ef4d7b9-7970-4336-859e-08e2a4820524\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.740161 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-config\") pod \"7ef4d7b9-7970-4336-859e-08e2a4820524\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.740182 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-ovsdbserver-sb\") pod \"7ef4d7b9-7970-4336-859e-08e2a4820524\" (UID: \"7ef4d7b9-7970-4336-859e-08e2a4820524\") " Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.750987 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ef4d7b9-7970-4336-859e-08e2a4820524-kube-api-access-spbf8" (OuterVolumeSpecName: "kube-api-access-spbf8") pod "7ef4d7b9-7970-4336-859e-08e2a4820524" (UID: "7ef4d7b9-7970-4336-859e-08e2a4820524"). InnerVolumeSpecName "kube-api-access-spbf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.806902 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7ef4d7b9-7970-4336-859e-08e2a4820524" (UID: "7ef4d7b9-7970-4336-859e-08e2a4820524"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.809157 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7ef4d7b9-7970-4336-859e-08e2a4820524" (UID: "7ef4d7b9-7970-4336-859e-08e2a4820524"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.810407 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-config" (OuterVolumeSpecName: "config") pod "7ef4d7b9-7970-4336-859e-08e2a4820524" (UID: "7ef4d7b9-7970-4336-859e-08e2a4820524"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.826637 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7ef4d7b9-7970-4336-859e-08e2a4820524" (UID: "7ef4d7b9-7970-4336-859e-08e2a4820524"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.835390 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "7ef4d7b9-7970-4336-859e-08e2a4820524" (UID: "7ef4d7b9-7970-4336-859e-08e2a4820524"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.845243 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.845946 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-config\") on node \"crc\" DevicePath \"\"" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.845970 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.845988 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spbf8\" (UniqueName: \"kubernetes.io/projected/7ef4d7b9-7970-4336-859e-08e2a4820524-kube-api-access-spbf8\") on node \"crc\" DevicePath \"\"" 
Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.846000 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 08:55:50 crc kubenswrapper[5094]: I0220 08:55:50.846008 5094 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/7ef4d7b9-7970-4336-859e-08e2a4820524-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 08:55:51 crc kubenswrapper[5094]: I0220 08:55:51.076988 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-5cthc"] Feb 20 08:55:51 crc kubenswrapper[5094]: I0220 08:55:51.088554 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-5cthc"] Feb 20 08:55:51 crc kubenswrapper[5094]: I0220 08:55:51.232388 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8479b9d65f-4mzqh"] Feb 20 08:55:51 crc kubenswrapper[5094]: I0220 08:55:51.331405 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" event={"ID":"7ef4d7b9-7970-4336-859e-08e2a4820524","Type":"ContainerDied","Data":"366cdb507f5b3b52436869451d51fb6e8c589ab86c365a15488bd4c24a612eb6"} Feb 20 08:55:51 crc kubenswrapper[5094]: I0220 08:55:51.331460 5094 scope.go:117] "RemoveContainer" containerID="9d0e86403207be00e168f6826a234cf2f0c49931b532c3aeb9feb705b3e0e69d" Feb 20 08:55:51 crc kubenswrapper[5094]: I0220 08:55:51.331625 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55d7f5f657-jsw6v" Feb 20 08:55:51 crc kubenswrapper[5094]: I0220 08:55:51.334733 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" event={"ID":"bee88947-a5ae-4438-9283-a3fc34fde9e4","Type":"ContainerStarted","Data":"c6e5f29abf3edeb7e7f3059a5c05f5d2fb0bc1754c58a00813e1efd58de28903"} Feb 20 08:55:51 crc kubenswrapper[5094]: I0220 08:55:51.427862 5094 scope.go:117] "RemoveContainer" containerID="d0d02e83461148d891a35888220f3b22b91ec3fc7a98679a44ab912f988c01ed" Feb 20 08:55:51 crc kubenswrapper[5094]: I0220 08:55:51.463214 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55d7f5f657-jsw6v"] Feb 20 08:55:51 crc kubenswrapper[5094]: I0220 08:55:51.473291 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55d7f5f657-jsw6v"] Feb 20 08:55:51 crc kubenswrapper[5094]: I0220 08:55:51.851827 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47f4c643-fb8b-41d6-97b5-fa0c0928f370" path="/var/lib/kubelet/pods/47f4c643-fb8b-41d6-97b5-fa0c0928f370/volumes" Feb 20 08:55:51 crc kubenswrapper[5094]: I0220 08:55:51.852483 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ef4d7b9-7970-4336-859e-08e2a4820524" path="/var/lib/kubelet/pods/7ef4d7b9-7970-4336-859e-08e2a4820524/volumes" Feb 20 08:55:52 crc kubenswrapper[5094]: I0220 08:55:52.346399 5094 generic.go:334] "Generic (PLEG): container finished" podID="bee88947-a5ae-4438-9283-a3fc34fde9e4" containerID="daf047feda2b65bc85c2d0be5690fb87de0d553533044e35731527c9a067908e" exitCode=0 Feb 20 08:55:52 crc kubenswrapper[5094]: I0220 08:55:52.347063 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" event={"ID":"bee88947-a5ae-4438-9283-a3fc34fde9e4","Type":"ContainerDied","Data":"daf047feda2b65bc85c2d0be5690fb87de0d553533044e35731527c9a067908e"} Feb 20 08:55:53 crc 
kubenswrapper[5094]: I0220 08:55:53.360520 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" event={"ID":"bee88947-a5ae-4438-9283-a3fc34fde9e4","Type":"ContainerStarted","Data":"f0467c881595e5ec69b0db128e431cd7ecd6a345800ad96608f675ab91569186"} Feb 20 08:55:53 crc kubenswrapper[5094]: I0220 08:55:53.361139 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" Feb 20 08:55:53 crc kubenswrapper[5094]: I0220 08:55:53.388571 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" podStartSLOduration=3.388553284 podStartE2EDuration="3.388553284s" podCreationTimestamp="2026-02-20 08:55:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 08:55:53.382821906 +0000 UTC m=+7768.255448617" watchObservedRunningTime="2026-02-20 08:55:53.388553284 +0000 UTC m=+7768.261179995" Feb 20 08:55:56 crc kubenswrapper[5094]: I0220 08:55:56.840628 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:55:56 crc kubenswrapper[5094]: E0220 08:55:56.841842 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:56:00 crc kubenswrapper[5094]: I0220 08:56:00.702764 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8479b9d65f-4mzqh" Feb 20 08:56:00 crc kubenswrapper[5094]: I0220 08:56:00.768978 5094 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-6d74d874c7-2ztxv"] Feb 20 08:56:00 crc kubenswrapper[5094]: I0220 08:56:00.769254 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" podUID="31655ca7-8da2-4f24-a0c6-49ba0de9d207" containerName="dnsmasq-dns" containerID="cri-o://b6ffb1c5e46eb4b0a0c85962e06f901a8bf1fc4eb6e8588f57307ad6fa4f9ad1" gracePeriod=10 Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.308505 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.377152 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b"] Feb 20 08:56:01 crc kubenswrapper[5094]: E0220 08:56:01.377580 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef4d7b9-7970-4336-859e-08e2a4820524" containerName="init" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.377599 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef4d7b9-7970-4336-859e-08e2a4820524" containerName="init" Feb 20 08:56:01 crc kubenswrapper[5094]: E0220 08:56:01.377618 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31655ca7-8da2-4f24-a0c6-49ba0de9d207" containerName="dnsmasq-dns" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.377625 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="31655ca7-8da2-4f24-a0c6-49ba0de9d207" containerName="dnsmasq-dns" Feb 20 08:56:01 crc kubenswrapper[5094]: E0220 08:56:01.377639 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef4d7b9-7970-4336-859e-08e2a4820524" containerName="dnsmasq-dns" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.377646 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef4d7b9-7970-4336-859e-08e2a4820524" containerName="dnsmasq-dns" Feb 20 08:56:01 crc kubenswrapper[5094]: 
E0220 08:56:01.377683 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31655ca7-8da2-4f24-a0c6-49ba0de9d207" containerName="init" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.377688 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="31655ca7-8da2-4f24-a0c6-49ba0de9d207" containerName="init" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.377874 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ef4d7b9-7970-4336-859e-08e2a4820524" containerName="dnsmasq-dns" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.377895 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="31655ca7-8da2-4f24-a0c6-49ba0de9d207" containerName="dnsmasq-dns" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.379122 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.388401 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m"] Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.389589 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.389903 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-xf9xl" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.390177 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.390228 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.390505 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.392936 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-openstack-cell1\") pod \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.392989 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-ovsdbserver-nb\") pod \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.393029 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sz5g6\" (UniqueName: \"kubernetes.io/projected/31655ca7-8da2-4f24-a0c6-49ba0de9d207-kube-api-access-sz5g6\") pod \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.393064 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-config\") pod \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.393118 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-ovsdbserver-sb\") pod 
\"31655ca7-8da2-4f24-a0c6-49ba0de9d207\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.393224 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-dns-svc\") pod \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\" (UID: \"31655ca7-8da2-4f24-a0c6-49ba0de9d207\") " Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.404022 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.404289 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.412932 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31655ca7-8da2-4f24-a0c6-49ba0de9d207-kube-api-access-sz5g6" (OuterVolumeSpecName: "kube-api-access-sz5g6") pod "31655ca7-8da2-4f24-a0c6-49ba0de9d207" (UID: "31655ca7-8da2-4f24-a0c6-49ba0de9d207"). InnerVolumeSpecName "kube-api-access-sz5g6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.433889 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b"] Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.458895 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m"] Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.481608 5094 generic.go:334] "Generic (PLEG): container finished" podID="31655ca7-8da2-4f24-a0c6-49ba0de9d207" containerID="b6ffb1c5e46eb4b0a0c85962e06f901a8bf1fc4eb6e8588f57307ad6fa4f9ad1" exitCode=0 Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.481653 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" event={"ID":"31655ca7-8da2-4f24-a0c6-49ba0de9d207","Type":"ContainerDied","Data":"b6ffb1c5e46eb4b0a0c85962e06f901a8bf1fc4eb6e8588f57307ad6fa4f9ad1"} Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.481677 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" event={"ID":"31655ca7-8da2-4f24-a0c6-49ba0de9d207","Type":"ContainerDied","Data":"b8b1d201496df049d7a399da8b383ad8dc5d243c40082e06193cd89fed5c406a"} Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.481694 5094 scope.go:117] "RemoveContainer" containerID="b6ffb1c5e46eb4b0a0c85962e06f901a8bf1fc4eb6e8588f57307ad6fa4f9ad1" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.481881 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d74d874c7-2ztxv" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.488383 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "31655ca7-8da2-4f24-a0c6-49ba0de9d207" (UID: "31655ca7-8da2-4f24-a0c6-49ba0de9d207"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.501825 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "31655ca7-8da2-4f24-a0c6-49ba0de9d207" (UID: "31655ca7-8da2-4f24-a0c6-49ba0de9d207"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.519245 5094 scope.go:117] "RemoveContainer" containerID="50479b6866f7e99ab149ca475e268e49cdfc9bc8663eef94b4f64022a12bb80f" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.521240 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqrcj\" (UniqueName: \"kubernetes.io/projected/750a5132-7613-40c0-a360-2f1a589d2554-kube-api-access-wqrcj\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.525471 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.525673 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.525739 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.526046 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.526215 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " 
pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.526360 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qt97\" (UniqueName: \"kubernetes.io/projected/aee17d13-1b0d-49a2-a515-cc63a2f62c63-kube-api-access-6qt97\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.526429 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.526501 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-ssh-key-openstack-networker\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.526819 5094 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.526837 5094 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-sz5g6\" (UniqueName: \"kubernetes.io/projected/31655ca7-8da2-4f24-a0c6-49ba0de9d207-kube-api-access-sz5g6\") on node \"crc\" DevicePath \"\"" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.526856 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.542732 5094 scope.go:117] "RemoveContainer" containerID="b6ffb1c5e46eb4b0a0c85962e06f901a8bf1fc4eb6e8588f57307ad6fa4f9ad1" Feb 20 08:56:01 crc kubenswrapper[5094]: E0220 08:56:01.544466 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6ffb1c5e46eb4b0a0c85962e06f901a8bf1fc4eb6e8588f57307ad6fa4f9ad1\": container with ID starting with b6ffb1c5e46eb4b0a0c85962e06f901a8bf1fc4eb6e8588f57307ad6fa4f9ad1 not found: ID does not exist" containerID="b6ffb1c5e46eb4b0a0c85962e06f901a8bf1fc4eb6e8588f57307ad6fa4f9ad1" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.544503 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6ffb1c5e46eb4b0a0c85962e06f901a8bf1fc4eb6e8588f57307ad6fa4f9ad1"} err="failed to get container status \"b6ffb1c5e46eb4b0a0c85962e06f901a8bf1fc4eb6e8588f57307ad6fa4f9ad1\": rpc error: code = NotFound desc = could not find container \"b6ffb1c5e46eb4b0a0c85962e06f901a8bf1fc4eb6e8588f57307ad6fa4f9ad1\": container with ID starting with b6ffb1c5e46eb4b0a0c85962e06f901a8bf1fc4eb6e8588f57307ad6fa4f9ad1 not found: ID does not exist" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.544526 5094 scope.go:117] "RemoveContainer" containerID="50479b6866f7e99ab149ca475e268e49cdfc9bc8663eef94b4f64022a12bb80f" Feb 20 08:56:01 crc kubenswrapper[5094]: E0220 08:56:01.544796 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"50479b6866f7e99ab149ca475e268e49cdfc9bc8663eef94b4f64022a12bb80f\": container with ID starting with 50479b6866f7e99ab149ca475e268e49cdfc9bc8663eef94b4f64022a12bb80f not found: ID does not exist" containerID="50479b6866f7e99ab149ca475e268e49cdfc9bc8663eef94b4f64022a12bb80f" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.544815 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50479b6866f7e99ab149ca475e268e49cdfc9bc8663eef94b4f64022a12bb80f"} err="failed to get container status \"50479b6866f7e99ab149ca475e268e49cdfc9bc8663eef94b4f64022a12bb80f\": rpc error: code = NotFound desc = could not find container \"50479b6866f7e99ab149ca475e268e49cdfc9bc8663eef94b4f64022a12bb80f\": container with ID starting with 50479b6866f7e99ab149ca475e268e49cdfc9bc8663eef94b4f64022a12bb80f not found: ID does not exist" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.553664 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-config" (OuterVolumeSpecName: "config") pod "31655ca7-8da2-4f24-a0c6-49ba0de9d207" (UID: "31655ca7-8da2-4f24-a0c6-49ba0de9d207"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.554359 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "31655ca7-8da2-4f24-a0c6-49ba0de9d207" (UID: "31655ca7-8da2-4f24-a0c6-49ba0de9d207"). InnerVolumeSpecName "openstack-cell1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.563059 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "31655ca7-8da2-4f24-a0c6-49ba0de9d207" (UID: "31655ca7-8da2-4f24-a0c6-49ba0de9d207"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.628490 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.628536 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.628641 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.628693 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.628745 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qt97\" (UniqueName: \"kubernetes.io/projected/aee17d13-1b0d-49a2-a515-cc63a2f62c63-kube-api-access-6qt97\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.628770 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.628791 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-ssh-key-openstack-networker\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.628857 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-wqrcj\" (UniqueName: \"kubernetes.io/projected/750a5132-7613-40c0-a360-2f1a589d2554-kube-api-access-wqrcj\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.628884 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.628940 5094 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.628951 5094 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.628963 5094 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31655ca7-8da2-4f24-a0c6-49ba0de9d207-config\") on node \"crc\" DevicePath \"\"" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.632050 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 
08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.632600 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.634054 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.634101 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.634372 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.634822 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.634873 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-ssh-key-openstack-networker\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.653174 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqrcj\" (UniqueName: \"kubernetes.io/projected/750a5132-7613-40c0-a360-2f1a589d2554-kube-api-access-wqrcj\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.655954 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qt97\" (UniqueName: \"kubernetes.io/projected/aee17d13-1b0d-49a2-a515-cc63a2f62c63-kube-api-access-6qt97\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.705221 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.803820 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.832127 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d74d874c7-2ztxv"] Feb 20 08:56:01 crc kubenswrapper[5094]: I0220 08:56:01.865740 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d74d874c7-2ztxv"] Feb 20 08:56:02 crc kubenswrapper[5094]: I0220 08:56:02.290887 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b"] Feb 20 08:56:02 crc kubenswrapper[5094]: I0220 08:56:02.443918 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m"] Feb 20 08:56:02 crc kubenswrapper[5094]: W0220 08:56:02.454756 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaee17d13_1b0d_49a2_a515_cc63a2f62c63.slice/crio-a568e7a6fe65530dd28da7ae2ad6b5b1d61c7ccbd4e7e6aa0b31eb0dd34c3473 WatchSource:0}: Error finding container a568e7a6fe65530dd28da7ae2ad6b5b1d61c7ccbd4e7e6aa0b31eb0dd34c3473: Status 404 returned error can't find the container with id a568e7a6fe65530dd28da7ae2ad6b5b1d61c7ccbd4e7e6aa0b31eb0dd34c3473 Feb 20 08:56:02 crc kubenswrapper[5094]: I0220 08:56:02.493914 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" event={"ID":"aee17d13-1b0d-49a2-a515-cc63a2f62c63","Type":"ContainerStarted","Data":"a568e7a6fe65530dd28da7ae2ad6b5b1d61c7ccbd4e7e6aa0b31eb0dd34c3473"} Feb 20 08:56:02 crc kubenswrapper[5094]: I0220 
08:56:02.495225 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" event={"ID":"750a5132-7613-40c0-a360-2f1a589d2554","Type":"ContainerStarted","Data":"8214a023e8d2d42a8772770e282ea397fd96754091592c1ccdaf0cbad0e9d784"} Feb 20 08:56:03 crc kubenswrapper[5094]: I0220 08:56:03.866994 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31655ca7-8da2-4f24-a0c6-49ba0de9d207" path="/var/lib/kubelet/pods/31655ca7-8da2-4f24-a0c6-49ba0de9d207/volumes" Feb 20 08:56:10 crc kubenswrapper[5094]: I0220 08:56:10.840966 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:56:10 crc kubenswrapper[5094]: E0220 08:56:10.841729 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:56:12 crc kubenswrapper[5094]: I0220 08:56:12.601409 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" event={"ID":"aee17d13-1b0d-49a2-a515-cc63a2f62c63","Type":"ContainerStarted","Data":"3a06af21ed40f27cd0fdfa49cc6ae8b158034daa7a7d6e9dbd4c9377f0b0dbee"} Feb 20 08:56:12 crc kubenswrapper[5094]: I0220 08:56:12.603982 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" event={"ID":"750a5132-7613-40c0-a360-2f1a589d2554","Type":"ContainerStarted","Data":"b0838c70e985c9d6c06753020cfb1fedf883083444d6817034896958cb9b23f1"} Feb 20 08:56:12 crc kubenswrapper[5094]: I0220 08:56:12.629789 5094 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" podStartSLOduration=2.285655248 podStartE2EDuration="11.62976811s" podCreationTimestamp="2026-02-20 08:56:01 +0000 UTC" firstStartedPulling="2026-02-20 08:56:02.457134511 +0000 UTC m=+7777.329761232" lastFinishedPulling="2026-02-20 08:56:11.801247363 +0000 UTC m=+7786.673874094" observedRunningTime="2026-02-20 08:56:12.619922973 +0000 UTC m=+7787.492549704" watchObservedRunningTime="2026-02-20 08:56:12.62976811 +0000 UTC m=+7787.502394831" Feb 20 08:56:12 crc kubenswrapper[5094]: I0220 08:56:12.647401 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" podStartSLOduration=2.158817597 podStartE2EDuration="11.647379804s" podCreationTimestamp="2026-02-20 08:56:01 +0000 UTC" firstStartedPulling="2026-02-20 08:56:02.298576397 +0000 UTC m=+7777.171203108" lastFinishedPulling="2026-02-20 08:56:11.787138604 +0000 UTC m=+7786.659765315" observedRunningTime="2026-02-20 08:56:12.643531111 +0000 UTC m=+7787.516157822" watchObservedRunningTime="2026-02-20 08:56:12.647379804 +0000 UTC m=+7787.520006525" Feb 20 08:56:21 crc kubenswrapper[5094]: I0220 08:56:21.691496 5094 generic.go:334] "Generic (PLEG): container finished" podID="750a5132-7613-40c0-a360-2f1a589d2554" containerID="b0838c70e985c9d6c06753020cfb1fedf883083444d6817034896958cb9b23f1" exitCode=0 Feb 20 08:56:21 crc kubenswrapper[5094]: I0220 08:56:21.691610 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" event={"ID":"750a5132-7613-40c0-a360-2f1a589d2554","Type":"ContainerDied","Data":"b0838c70e985c9d6c06753020cfb1fedf883083444d6817034896958cb9b23f1"} Feb 20 08:56:22 crc kubenswrapper[5094]: I0220 08:56:22.710371 5094 generic.go:334] "Generic (PLEG): container finished" 
podID="aee17d13-1b0d-49a2-a515-cc63a2f62c63" containerID="3a06af21ed40f27cd0fdfa49cc6ae8b158034daa7a7d6e9dbd4c9377f0b0dbee" exitCode=0 Feb 20 08:56:22 crc kubenswrapper[5094]: I0220 08:56:22.710492 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" event={"ID":"aee17d13-1b0d-49a2-a515-cc63a2f62c63","Type":"ContainerDied","Data":"3a06af21ed40f27cd0fdfa49cc6ae8b158034daa7a7d6e9dbd4c9377f0b0dbee"} Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.193951 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.254743 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-inventory\") pod \"750a5132-7613-40c0-a360-2f1a589d2554\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.255125 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-ssh-key-openstack-networker\") pod \"750a5132-7613-40c0-a360-2f1a589d2554\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.255163 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-pre-adoption-validation-combined-ca-bundle\") pod \"750a5132-7613-40c0-a360-2f1a589d2554\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.255204 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-wqrcj\" (UniqueName: \"kubernetes.io/projected/750a5132-7613-40c0-a360-2f1a589d2554-kube-api-access-wqrcj\") pod \"750a5132-7613-40c0-a360-2f1a589d2554\" (UID: \"750a5132-7613-40c0-a360-2f1a589d2554\") " Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.262335 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "750a5132-7613-40c0-a360-2f1a589d2554" (UID: "750a5132-7613-40c0-a360-2f1a589d2554"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.262522 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/750a5132-7613-40c0-a360-2f1a589d2554-kube-api-access-wqrcj" (OuterVolumeSpecName: "kube-api-access-wqrcj") pod "750a5132-7613-40c0-a360-2f1a589d2554" (UID: "750a5132-7613-40c0-a360-2f1a589d2554"). InnerVolumeSpecName "kube-api-access-wqrcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.285042 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-inventory" (OuterVolumeSpecName: "inventory") pod "750a5132-7613-40c0-a360-2f1a589d2554" (UID: "750a5132-7613-40c0-a360-2f1a589d2554"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.287224 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "750a5132-7613-40c0-a360-2f1a589d2554" (UID: "750a5132-7613-40c0-a360-2f1a589d2554"). 
InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.357278 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.357331 5094 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.357348 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqrcj\" (UniqueName: \"kubernetes.io/projected/750a5132-7613-40c0-a360-2f1a589d2554-kube-api-access-wqrcj\") on node \"crc\" DevicePath \"\"" Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.357362 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/750a5132-7613-40c0-a360-2f1a589d2554-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.733804 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.738124 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b" event={"ID":"750a5132-7613-40c0-a360-2f1a589d2554","Type":"ContainerDied","Data":"8214a023e8d2d42a8772770e282ea397fd96754091592c1ccdaf0cbad0e9d784"} Feb 20 08:56:23 crc kubenswrapper[5094]: I0220 08:56:23.738266 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8214a023e8d2d42a8772770e282ea397fd96754091592c1ccdaf0cbad0e9d784" Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.191620 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.275104 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-inventory\") pod \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.275171 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-pre-adoption-validation-combined-ca-bundle\") pod \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.275270 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-ssh-key-openstack-cell1\") pod \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\" (UID: 
\"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.275502 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qt97\" (UniqueName: \"kubernetes.io/projected/aee17d13-1b0d-49a2-a515-cc63a2f62c63-kube-api-access-6qt97\") pod \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.275621 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-ceph\") pod \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\" (UID: \"aee17d13-1b0d-49a2-a515-cc63a2f62c63\") " Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.279040 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aee17d13-1b0d-49a2-a515-cc63a2f62c63-kube-api-access-6qt97" (OuterVolumeSpecName: "kube-api-access-6qt97") pod "aee17d13-1b0d-49a2-a515-cc63a2f62c63" (UID: "aee17d13-1b0d-49a2-a515-cc63a2f62c63"). InnerVolumeSpecName "kube-api-access-6qt97". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.279204 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "aee17d13-1b0d-49a2-a515-cc63a2f62c63" (UID: "aee17d13-1b0d-49a2-a515-cc63a2f62c63"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.280999 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-ceph" (OuterVolumeSpecName: "ceph") pod "aee17d13-1b0d-49a2-a515-cc63a2f62c63" (UID: "aee17d13-1b0d-49a2-a515-cc63a2f62c63"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.302085 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "aee17d13-1b0d-49a2-a515-cc63a2f62c63" (UID: "aee17d13-1b0d-49a2-a515-cc63a2f62c63"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.303472 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-inventory" (OuterVolumeSpecName: "inventory") pod "aee17d13-1b0d-49a2-a515-cc63a2f62c63" (UID: "aee17d13-1b0d-49a2-a515-cc63a2f62c63"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.378301 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.378338 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.378349 5094 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.378362 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/aee17d13-1b0d-49a2-a515-cc63a2f62c63-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.378375 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qt97\" (UniqueName: \"kubernetes.io/projected/aee17d13-1b0d-49a2-a515-cc63a2f62c63-kube-api-access-6qt97\") on node \"crc\" DevicePath \"\"" Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.742629 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" event={"ID":"aee17d13-1b0d-49a2-a515-cc63a2f62c63","Type":"ContainerDied","Data":"a568e7a6fe65530dd28da7ae2ad6b5b1d61c7ccbd4e7e6aa0b31eb0dd34c3473"} Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.742672 5094 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="a568e7a6fe65530dd28da7ae2ad6b5b1d61c7ccbd4e7e6aa0b31eb0dd34c3473" Feb 20 08:56:24 crc kubenswrapper[5094]: I0220 08:56:24.742781 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m" Feb 20 08:56:24 crc kubenswrapper[5094]: E0220 08:56:24.845290 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaee17d13_1b0d_49a2_a515_cc63a2f62c63.slice\": RecentStats: unable to find data in memory cache]" Feb 20 08:56:25 crc kubenswrapper[5094]: I0220 08:56:25.847357 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:56:25 crc kubenswrapper[5094]: E0220 08:56:25.847909 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:56:33 crc kubenswrapper[5094]: I0220 08:56:33.060272 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-4ggkd"] Feb 20 08:56:33 crc kubenswrapper[5094]: I0220 08:56:33.072295 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-4ggkd"] Feb 20 08:56:33 crc kubenswrapper[5094]: I0220 08:56:33.857308 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991" path="/var/lib/kubelet/pods/3d9bbb80-f9cc-40ef-b3d1-c5a5cea72991/volumes" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.031853 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-dc9b-account-create-update-6hd8k"] Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.040243 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-dc9b-account-create-update-6hd8k"] Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.778655 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6"] Feb 20 08:56:34 crc kubenswrapper[5094]: E0220 08:56:34.779579 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aee17d13-1b0d-49a2-a515-cc63a2f62c63" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.779614 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="aee17d13-1b0d-49a2-a515-cc63a2f62c63" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Feb 20 08:56:34 crc kubenswrapper[5094]: E0220 08:56:34.779670 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="750a5132-7613-40c0-a360-2f1a589d2554" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-networ" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.779684 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="750a5132-7613-40c0-a360-2f1a589d2554" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-networ" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.780150 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="aee17d13-1b0d-49a2-a515-cc63a2f62c63" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.780193 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="750a5132-7613-40c0-a360-2f1a589d2554" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-networ" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.781394 5094 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.784011 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.784152 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.784572 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.784821 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.794541 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f"] Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.796023 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.797941 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.798396 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-xf9xl" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.804257 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-ssh-key-openstack-networker\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.804344 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.804404 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.804429 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.804456 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.804591 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcppq\" (UniqueName: \"kubernetes.io/projected/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-kube-api-access-tcppq\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.804675 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.804729 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-svmxt\" (UniqueName: \"kubernetes.io/projected/110791b2-a067-409d-9970-9db4868f0d4d-kube-api-access-svmxt\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.804850 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.818883 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f"] Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.832128 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6"] Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.906519 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-ssh-key-openstack-networker\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.906595 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") 
" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.906683 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.906759 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.906784 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.906876 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcppq\" (UniqueName: \"kubernetes.io/projected/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-kube-api-access-tcppq\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.906989 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.907018 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svmxt\" (UniqueName: \"kubernetes.io/projected/110791b2-a067-409d-9970-9db4868f0d4d-kube-api-access-svmxt\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.907186 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.912271 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.912517 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6\" (UID: 
\"110791b2-a067-409d-9970-9db4868f0d4d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.912600 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-ssh-key-openstack-networker\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.913142 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.920202 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.920887 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.925232 5094 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tcppq\" (UniqueName: \"kubernetes.io/projected/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-kube-api-access-tcppq\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.927404 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:34 crc kubenswrapper[5094]: I0220 08:56:34.930888 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svmxt\" (UniqueName: \"kubernetes.io/projected/110791b2-a067-409d-9970-9db4868f0d4d-kube-api-access-svmxt\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:35 crc kubenswrapper[5094]: I0220 08:56:35.122168 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 08:56:35 crc kubenswrapper[5094]: I0220 08:56:35.134683 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" Feb 20 08:56:35 crc kubenswrapper[5094]: I0220 08:56:35.722163 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6"] Feb 20 08:56:35 crc kubenswrapper[5094]: I0220 08:56:35.826241 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f"] Feb 20 08:56:35 crc kubenswrapper[5094]: I0220 08:56:35.859638 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7475a056-ad82-42aa-85ee-4b5d6834434a" path="/var/lib/kubelet/pods/7475a056-ad82-42aa-85ee-4b5d6834434a/volumes" Feb 20 08:56:35 crc kubenswrapper[5094]: I0220 08:56:35.870401 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" event={"ID":"110791b2-a067-409d-9970-9db4868f0d4d","Type":"ContainerStarted","Data":"0c39006c151d238a3f167340ddd6cab9c442fbcdd2b65206e97f1302317761f1"} Feb 20 08:56:35 crc kubenswrapper[5094]: I0220 08:56:35.872285 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" event={"ID":"7894eb94-d4dd-4035-af5b-5994b4ae6d2f","Type":"ContainerStarted","Data":"ca240917156fd9c5acb4173083f283943d7a62c3a3cbea59cf29d0d548923640"} Feb 20 08:56:38 crc kubenswrapper[5094]: I0220 08:56:38.934189 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" event={"ID":"110791b2-a067-409d-9970-9db4868f0d4d","Type":"ContainerStarted","Data":"24186d99d2e091e90cea103f3ededd5ae0be73e5479d2f80e87c425b36de8252"} Feb 20 08:56:38 crc kubenswrapper[5094]: I0220 08:56:38.937312 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" 
event={"ID":"7894eb94-d4dd-4035-af5b-5994b4ae6d2f","Type":"ContainerStarted","Data":"a63a79298a636726404dd99afe9f61c35ff225d0787a9c54da454b3cf54a459f"} Feb 20 08:56:38 crc kubenswrapper[5094]: I0220 08:56:38.960291 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" podStartSLOduration=2.66317616 podStartE2EDuration="4.96027409s" podCreationTimestamp="2026-02-20 08:56:34 +0000 UTC" firstStartedPulling="2026-02-20 08:56:35.736200064 +0000 UTC m=+7810.608826795" lastFinishedPulling="2026-02-20 08:56:38.033298004 +0000 UTC m=+7812.905924725" observedRunningTime="2026-02-20 08:56:38.948516657 +0000 UTC m=+7813.821143368" watchObservedRunningTime="2026-02-20 08:56:38.96027409 +0000 UTC m=+7813.832900801" Feb 20 08:56:38 crc kubenswrapper[5094]: I0220 08:56:38.970647 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" podStartSLOduration=2.598907143 podStartE2EDuration="4.970622928s" podCreationTimestamp="2026-02-20 08:56:34 +0000 UTC" firstStartedPulling="2026-02-20 08:56:35.829920928 +0000 UTC m=+7810.702547639" lastFinishedPulling="2026-02-20 08:56:38.201636713 +0000 UTC m=+7813.074263424" observedRunningTime="2026-02-20 08:56:38.968824085 +0000 UTC m=+7813.841450796" watchObservedRunningTime="2026-02-20 08:56:38.970622928 +0000 UTC m=+7813.843249679" Feb 20 08:56:39 crc kubenswrapper[5094]: I0220 08:56:39.841815 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:56:39 crc kubenswrapper[5094]: E0220 08:56:39.842867 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:56:42 crc kubenswrapper[5094]: I0220 08:56:42.575585 5094 scope.go:117] "RemoveContainer" containerID="304a415638ed4b15154b7c41ed2540ace3b8c9ba6f1ff38016fbc2160491bb9c" Feb 20 08:56:42 crc kubenswrapper[5094]: I0220 08:56:42.615358 5094 scope.go:117] "RemoveContainer" containerID="8c8e45f1f20160f7c96239278e797286c527baed5911fbde0292c56051da6b16" Feb 20 08:56:42 crc kubenswrapper[5094]: I0220 08:56:42.700020 5094 scope.go:117] "RemoveContainer" containerID="e06fc5fd3620d2019a01f12c26721ba58935bf528cffce9cee66802b4ab5054a" Feb 20 08:56:53 crc kubenswrapper[5094]: I0220 08:56:53.842342 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:56:53 crc kubenswrapper[5094]: E0220 08:56:53.843376 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:56:59 crc kubenswrapper[5094]: I0220 08:56:59.045883 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-n78qt"] Feb 20 08:56:59 crc kubenswrapper[5094]: I0220 08:56:59.054337 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-n78qt"] Feb 20 08:56:59 crc kubenswrapper[5094]: I0220 08:56:59.852994 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cbb5a80-aef6-405d-bf92-d0d9cc872c78" path="/var/lib/kubelet/pods/0cbb5a80-aef6-405d-bf92-d0d9cc872c78/volumes" Feb 20 
08:57:05 crc kubenswrapper[5094]: I0220 08:57:05.847871 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:57:05 crc kubenswrapper[5094]: E0220 08:57:05.849114 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:57:20 crc kubenswrapper[5094]: I0220 08:57:20.840857 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:57:20 crc kubenswrapper[5094]: E0220 08:57:20.841498 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:57:34 crc kubenswrapper[5094]: I0220 08:57:34.841777 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:57:34 crc kubenswrapper[5094]: E0220 08:57:34.843959 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:57:42 crc kubenswrapper[5094]: I0220 08:57:42.826525 5094 scope.go:117] "RemoveContainer" containerID="08a5be203668aa394a410644bd2e295ff1d095189ea60c660ca9c1800575b1fd" Feb 20 08:57:48 crc kubenswrapper[5094]: I0220 08:57:48.841070 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:57:48 crc kubenswrapper[5094]: E0220 08:57:48.842190 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:58:03 crc kubenswrapper[5094]: I0220 08:58:03.841344 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:58:03 crc kubenswrapper[5094]: E0220 08:58:03.842609 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:58:17 crc kubenswrapper[5094]: I0220 08:58:17.841030 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:58:17 crc kubenswrapper[5094]: E0220 08:58:17.842459 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:58:28 crc kubenswrapper[5094]: I0220 08:58:28.841319 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:58:28 crc kubenswrapper[5094]: E0220 08:58:28.842115 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:58:29 crc kubenswrapper[5094]: I0220 08:58:29.586621 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fhdkc"] Feb 20 08:58:29 crc kubenswrapper[5094]: I0220 08:58:29.589571 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:29 crc kubenswrapper[5094]: I0220 08:58:29.619100 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fhdkc"] Feb 20 08:58:29 crc kubenswrapper[5094]: I0220 08:58:29.712653 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgqnf\" (UniqueName: \"kubernetes.io/projected/49822999-3ac2-4409-ac86-dec02d5e0e7b-kube-api-access-jgqnf\") pod \"redhat-marketplace-fhdkc\" (UID: \"49822999-3ac2-4409-ac86-dec02d5e0e7b\") " pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:29 crc kubenswrapper[5094]: I0220 08:58:29.712854 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49822999-3ac2-4409-ac86-dec02d5e0e7b-catalog-content\") pod \"redhat-marketplace-fhdkc\" (UID: \"49822999-3ac2-4409-ac86-dec02d5e0e7b\") " pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:29 crc kubenswrapper[5094]: I0220 08:58:29.713050 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49822999-3ac2-4409-ac86-dec02d5e0e7b-utilities\") pod \"redhat-marketplace-fhdkc\" (UID: \"49822999-3ac2-4409-ac86-dec02d5e0e7b\") " pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:29 crc kubenswrapper[5094]: I0220 08:58:29.815380 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgqnf\" (UniqueName: \"kubernetes.io/projected/49822999-3ac2-4409-ac86-dec02d5e0e7b-kube-api-access-jgqnf\") pod \"redhat-marketplace-fhdkc\" (UID: \"49822999-3ac2-4409-ac86-dec02d5e0e7b\") " pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:29 crc kubenswrapper[5094]: I0220 08:58:29.815489 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49822999-3ac2-4409-ac86-dec02d5e0e7b-catalog-content\") pod \"redhat-marketplace-fhdkc\" (UID: \"49822999-3ac2-4409-ac86-dec02d5e0e7b\") " pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:29 crc kubenswrapper[5094]: I0220 08:58:29.815534 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49822999-3ac2-4409-ac86-dec02d5e0e7b-utilities\") pod \"redhat-marketplace-fhdkc\" (UID: \"49822999-3ac2-4409-ac86-dec02d5e0e7b\") " pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:29 crc kubenswrapper[5094]: I0220 08:58:29.816086 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49822999-3ac2-4409-ac86-dec02d5e0e7b-catalog-content\") pod \"redhat-marketplace-fhdkc\" (UID: \"49822999-3ac2-4409-ac86-dec02d5e0e7b\") " pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:29 crc kubenswrapper[5094]: I0220 08:58:29.816129 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49822999-3ac2-4409-ac86-dec02d5e0e7b-utilities\") pod \"redhat-marketplace-fhdkc\" (UID: \"49822999-3ac2-4409-ac86-dec02d5e0e7b\") " pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:29 crc kubenswrapper[5094]: I0220 08:58:29.832292 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgqnf\" (UniqueName: \"kubernetes.io/projected/49822999-3ac2-4409-ac86-dec02d5e0e7b-kube-api-access-jgqnf\") pod \"redhat-marketplace-fhdkc\" (UID: \"49822999-3ac2-4409-ac86-dec02d5e0e7b\") " pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:29 crc kubenswrapper[5094]: I0220 08:58:29.930408 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:30 crc kubenswrapper[5094]: I0220 08:58:30.442165 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fhdkc"] Feb 20 08:58:31 crc kubenswrapper[5094]: I0220 08:58:31.170577 5094 generic.go:334] "Generic (PLEG): container finished" podID="49822999-3ac2-4409-ac86-dec02d5e0e7b" containerID="75b7b66bdfbe10a707243b683836f8920fd4552e89d519d25c2461ed4ea9267a" exitCode=0 Feb 20 08:58:31 crc kubenswrapper[5094]: I0220 08:58:31.170675 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhdkc" event={"ID":"49822999-3ac2-4409-ac86-dec02d5e0e7b","Type":"ContainerDied","Data":"75b7b66bdfbe10a707243b683836f8920fd4552e89d519d25c2461ed4ea9267a"} Feb 20 08:58:31 crc kubenswrapper[5094]: I0220 08:58:31.171007 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhdkc" event={"ID":"49822999-3ac2-4409-ac86-dec02d5e0e7b","Type":"ContainerStarted","Data":"35f7059f97b45604919176cfdf23aa8bb5a05b3d1142abcc4ccdc4a4de0f3a19"} Feb 20 08:58:32 crc kubenswrapper[5094]: I0220 08:58:32.182801 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhdkc" event={"ID":"49822999-3ac2-4409-ac86-dec02d5e0e7b","Type":"ContainerStarted","Data":"2683b63c2969a70e315e87712e434f12adf29d5d7a0c2be7d34732a038d812d0"} Feb 20 08:58:33 crc kubenswrapper[5094]: I0220 08:58:33.194577 5094 generic.go:334] "Generic (PLEG): container finished" podID="49822999-3ac2-4409-ac86-dec02d5e0e7b" containerID="2683b63c2969a70e315e87712e434f12adf29d5d7a0c2be7d34732a038d812d0" exitCode=0 Feb 20 08:58:33 crc kubenswrapper[5094]: I0220 08:58:33.194730 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhdkc" 
event={"ID":"49822999-3ac2-4409-ac86-dec02d5e0e7b","Type":"ContainerDied","Data":"2683b63c2969a70e315e87712e434f12adf29d5d7a0c2be7d34732a038d812d0"} Feb 20 08:58:33 crc kubenswrapper[5094]: I0220 08:58:33.197776 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 08:58:34 crc kubenswrapper[5094]: I0220 08:58:34.205398 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhdkc" event={"ID":"49822999-3ac2-4409-ac86-dec02d5e0e7b","Type":"ContainerStarted","Data":"a9bf6fe2a51f8f83dcab18cae27fa1917088ada43ce12a6b73b799611a135451"} Feb 20 08:58:34 crc kubenswrapper[5094]: I0220 08:58:34.231614 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fhdkc" podStartSLOduration=2.7863152429999998 podStartE2EDuration="5.231597816s" podCreationTimestamp="2026-02-20 08:58:29 +0000 UTC" firstStartedPulling="2026-02-20 08:58:31.172418227 +0000 UTC m=+7926.045044938" lastFinishedPulling="2026-02-20 08:58:33.61770079 +0000 UTC m=+7928.490327511" observedRunningTime="2026-02-20 08:58:34.22760326 +0000 UTC m=+7929.100230001" watchObservedRunningTime="2026-02-20 08:58:34.231597816 +0000 UTC m=+7929.104224537" Feb 20 08:58:39 crc kubenswrapper[5094]: I0220 08:58:39.931087 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:39 crc kubenswrapper[5094]: I0220 08:58:39.933408 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:39 crc kubenswrapper[5094]: I0220 08:58:39.987260 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:40 crc kubenswrapper[5094]: I0220 08:58:40.325754 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:40 crc kubenswrapper[5094]: I0220 08:58:40.387006 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fhdkc"] Feb 20 08:58:40 crc kubenswrapper[5094]: I0220 08:58:40.841259 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:58:40 crc kubenswrapper[5094]: E0220 08:58:40.842003 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:58:42 crc kubenswrapper[5094]: I0220 08:58:42.287969 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fhdkc" podUID="49822999-3ac2-4409-ac86-dec02d5e0e7b" containerName="registry-server" containerID="cri-o://a9bf6fe2a51f8f83dcab18cae27fa1917088ada43ce12a6b73b799611a135451" gracePeriod=2 Feb 20 08:58:42 crc kubenswrapper[5094]: I0220 08:58:42.821223 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:42 crc kubenswrapper[5094]: I0220 08:58:42.944699 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgqnf\" (UniqueName: \"kubernetes.io/projected/49822999-3ac2-4409-ac86-dec02d5e0e7b-kube-api-access-jgqnf\") pod \"49822999-3ac2-4409-ac86-dec02d5e0e7b\" (UID: \"49822999-3ac2-4409-ac86-dec02d5e0e7b\") " Feb 20 08:58:42 crc kubenswrapper[5094]: I0220 08:58:42.944847 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49822999-3ac2-4409-ac86-dec02d5e0e7b-utilities\") pod \"49822999-3ac2-4409-ac86-dec02d5e0e7b\" (UID: \"49822999-3ac2-4409-ac86-dec02d5e0e7b\") " Feb 20 08:58:42 crc kubenswrapper[5094]: I0220 08:58:42.945023 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49822999-3ac2-4409-ac86-dec02d5e0e7b-catalog-content\") pod \"49822999-3ac2-4409-ac86-dec02d5e0e7b\" (UID: \"49822999-3ac2-4409-ac86-dec02d5e0e7b\") " Feb 20 08:58:42 crc kubenswrapper[5094]: I0220 08:58:42.945724 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49822999-3ac2-4409-ac86-dec02d5e0e7b-utilities" (OuterVolumeSpecName: "utilities") pod "49822999-3ac2-4409-ac86-dec02d5e0e7b" (UID: "49822999-3ac2-4409-ac86-dec02d5e0e7b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:58:42 crc kubenswrapper[5094]: I0220 08:58:42.958967 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49822999-3ac2-4409-ac86-dec02d5e0e7b-kube-api-access-jgqnf" (OuterVolumeSpecName: "kube-api-access-jgqnf") pod "49822999-3ac2-4409-ac86-dec02d5e0e7b" (UID: "49822999-3ac2-4409-ac86-dec02d5e0e7b"). InnerVolumeSpecName "kube-api-access-jgqnf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 08:58:42 crc kubenswrapper[5094]: I0220 08:58:42.967785 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49822999-3ac2-4409-ac86-dec02d5e0e7b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49822999-3ac2-4409-ac86-dec02d5e0e7b" (UID: "49822999-3ac2-4409-ac86-dec02d5e0e7b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.047304 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgqnf\" (UniqueName: \"kubernetes.io/projected/49822999-3ac2-4409-ac86-dec02d5e0e7b-kube-api-access-jgqnf\") on node \"crc\" DevicePath \"\"" Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.047342 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49822999-3ac2-4409-ac86-dec02d5e0e7b-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.047357 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49822999-3ac2-4409-ac86-dec02d5e0e7b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.304638 5094 generic.go:334] "Generic (PLEG): container finished" podID="49822999-3ac2-4409-ac86-dec02d5e0e7b" containerID="a9bf6fe2a51f8f83dcab18cae27fa1917088ada43ce12a6b73b799611a135451" exitCode=0 Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.304685 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fhdkc" event={"ID":"49822999-3ac2-4409-ac86-dec02d5e0e7b","Type":"ContainerDied","Data":"a9bf6fe2a51f8f83dcab18cae27fa1917088ada43ce12a6b73b799611a135451"} Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.304736 5094 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-fhdkc" event={"ID":"49822999-3ac2-4409-ac86-dec02d5e0e7b","Type":"ContainerDied","Data":"35f7059f97b45604919176cfdf23aa8bb5a05b3d1142abcc4ccdc4a4de0f3a19"} Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.304757 5094 scope.go:117] "RemoveContainer" containerID="a9bf6fe2a51f8f83dcab18cae27fa1917088ada43ce12a6b73b799611a135451" Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.304886 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fhdkc" Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.360676 5094 scope.go:117] "RemoveContainer" containerID="2683b63c2969a70e315e87712e434f12adf29d5d7a0c2be7d34732a038d812d0" Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.373490 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fhdkc"] Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.388909 5094 scope.go:117] "RemoveContainer" containerID="75b7b66bdfbe10a707243b683836f8920fd4552e89d519d25c2461ed4ea9267a" Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.426513 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fhdkc"] Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.440405 5094 scope.go:117] "RemoveContainer" containerID="a9bf6fe2a51f8f83dcab18cae27fa1917088ada43ce12a6b73b799611a135451" Feb 20 08:58:43 crc kubenswrapper[5094]: E0220 08:58:43.440931 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9bf6fe2a51f8f83dcab18cae27fa1917088ada43ce12a6b73b799611a135451\": container with ID starting with a9bf6fe2a51f8f83dcab18cae27fa1917088ada43ce12a6b73b799611a135451 not found: ID does not exist" containerID="a9bf6fe2a51f8f83dcab18cae27fa1917088ada43ce12a6b73b799611a135451" Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.440964 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9bf6fe2a51f8f83dcab18cae27fa1917088ada43ce12a6b73b799611a135451"} err="failed to get container status \"a9bf6fe2a51f8f83dcab18cae27fa1917088ada43ce12a6b73b799611a135451\": rpc error: code = NotFound desc = could not find container \"a9bf6fe2a51f8f83dcab18cae27fa1917088ada43ce12a6b73b799611a135451\": container with ID starting with a9bf6fe2a51f8f83dcab18cae27fa1917088ada43ce12a6b73b799611a135451 not found: ID does not exist" Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.440984 5094 scope.go:117] "RemoveContainer" containerID="2683b63c2969a70e315e87712e434f12adf29d5d7a0c2be7d34732a038d812d0" Feb 20 08:58:43 crc kubenswrapper[5094]: E0220 08:58:43.441537 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2683b63c2969a70e315e87712e434f12adf29d5d7a0c2be7d34732a038d812d0\": container with ID starting with 2683b63c2969a70e315e87712e434f12adf29d5d7a0c2be7d34732a038d812d0 not found: ID does not exist" containerID="2683b63c2969a70e315e87712e434f12adf29d5d7a0c2be7d34732a038d812d0" Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.441572 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2683b63c2969a70e315e87712e434f12adf29d5d7a0c2be7d34732a038d812d0"} err="failed to get container status \"2683b63c2969a70e315e87712e434f12adf29d5d7a0c2be7d34732a038d812d0\": rpc error: code = NotFound desc = could not find container \"2683b63c2969a70e315e87712e434f12adf29d5d7a0c2be7d34732a038d812d0\": container with ID starting with 2683b63c2969a70e315e87712e434f12adf29d5d7a0c2be7d34732a038d812d0 not found: ID does not exist" Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.441593 5094 scope.go:117] "RemoveContainer" containerID="75b7b66bdfbe10a707243b683836f8920fd4552e89d519d25c2461ed4ea9267a" Feb 20 08:58:43 crc kubenswrapper[5094]: E0220 
08:58:43.442177 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75b7b66bdfbe10a707243b683836f8920fd4552e89d519d25c2461ed4ea9267a\": container with ID starting with 75b7b66bdfbe10a707243b683836f8920fd4552e89d519d25c2461ed4ea9267a not found: ID does not exist" containerID="75b7b66bdfbe10a707243b683836f8920fd4552e89d519d25c2461ed4ea9267a" Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.442208 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75b7b66bdfbe10a707243b683836f8920fd4552e89d519d25c2461ed4ea9267a"} err="failed to get container status \"75b7b66bdfbe10a707243b683836f8920fd4552e89d519d25c2461ed4ea9267a\": rpc error: code = NotFound desc = could not find container \"75b7b66bdfbe10a707243b683836f8920fd4552e89d519d25c2461ed4ea9267a\": container with ID starting with 75b7b66bdfbe10a707243b683836f8920fd4552e89d519d25c2461ed4ea9267a not found: ID does not exist" Feb 20 08:58:43 crc kubenswrapper[5094]: I0220 08:58:43.859653 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49822999-3ac2-4409-ac86-dec02d5e0e7b" path="/var/lib/kubelet/pods/49822999-3ac2-4409-ac86-dec02d5e0e7b/volumes" Feb 20 08:58:55 crc kubenswrapper[5094]: I0220 08:58:55.851323 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:58:55 crc kubenswrapper[5094]: E0220 08:58:55.852304 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:59:09 crc kubenswrapper[5094]: I0220 08:59:09.840730 
5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:59:09 crc kubenswrapper[5094]: E0220 08:59:09.841563 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:59:24 crc kubenswrapper[5094]: I0220 08:59:24.840653 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:59:24 crc kubenswrapper[5094]: E0220 08:59:24.841540 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:59:36 crc kubenswrapper[5094]: I0220 08:59:36.840341 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:59:36 crc kubenswrapper[5094]: E0220 08:59:36.841212 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 08:59:48 crc kubenswrapper[5094]: I0220 
08:59:48.840864 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 08:59:48 crc kubenswrapper[5094]: E0220 08:59:48.841816 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.183827 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l"] Feb 20 09:00:00 crc kubenswrapper[5094]: E0220 09:00:00.184863 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49822999-3ac2-4409-ac86-dec02d5e0e7b" containerName="extract-utilities" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.184881 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="49822999-3ac2-4409-ac86-dec02d5e0e7b" containerName="extract-utilities" Feb 20 09:00:00 crc kubenswrapper[5094]: E0220 09:00:00.184894 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49822999-3ac2-4409-ac86-dec02d5e0e7b" containerName="registry-server" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.184899 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="49822999-3ac2-4409-ac86-dec02d5e0e7b" containerName="registry-server" Feb 20 09:00:00 crc kubenswrapper[5094]: E0220 09:00:00.184920 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49822999-3ac2-4409-ac86-dec02d5e0e7b" containerName="extract-content" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.184927 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="49822999-3ac2-4409-ac86-dec02d5e0e7b" 
containerName="extract-content" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.185121 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="49822999-3ac2-4409-ac86-dec02d5e0e7b" containerName="registry-server" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.185881 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.189891 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.191871 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.193024 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l"] Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.279237 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/30508b7a-ac76-48d8-822c-65a32552ca80-secret-volume\") pod \"collect-profiles-29526300-bqm8l\" (UID: \"30508b7a-ac76-48d8-822c-65a32552ca80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.279309 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n4sz\" (UniqueName: \"kubernetes.io/projected/30508b7a-ac76-48d8-822c-65a32552ca80-kube-api-access-8n4sz\") pod \"collect-profiles-29526300-bqm8l\" (UID: \"30508b7a-ac76-48d8-822c-65a32552ca80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.279351 
5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30508b7a-ac76-48d8-822c-65a32552ca80-config-volume\") pod \"collect-profiles-29526300-bqm8l\" (UID: \"30508b7a-ac76-48d8-822c-65a32552ca80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.383186 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30508b7a-ac76-48d8-822c-65a32552ca80-config-volume\") pod \"collect-profiles-29526300-bqm8l\" (UID: \"30508b7a-ac76-48d8-822c-65a32552ca80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.383408 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/30508b7a-ac76-48d8-822c-65a32552ca80-secret-volume\") pod \"collect-profiles-29526300-bqm8l\" (UID: \"30508b7a-ac76-48d8-822c-65a32552ca80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.383468 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n4sz\" (UniqueName: \"kubernetes.io/projected/30508b7a-ac76-48d8-822c-65a32552ca80-kube-api-access-8n4sz\") pod \"collect-profiles-29526300-bqm8l\" (UID: \"30508b7a-ac76-48d8-822c-65a32552ca80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.384162 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30508b7a-ac76-48d8-822c-65a32552ca80-config-volume\") pod \"collect-profiles-29526300-bqm8l\" (UID: \"30508b7a-ac76-48d8-822c-65a32552ca80\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.394641 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/30508b7a-ac76-48d8-822c-65a32552ca80-secret-volume\") pod \"collect-profiles-29526300-bqm8l\" (UID: \"30508b7a-ac76-48d8-822c-65a32552ca80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.446576 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n4sz\" (UniqueName: \"kubernetes.io/projected/30508b7a-ac76-48d8-822c-65a32552ca80-kube-api-access-8n4sz\") pod \"collect-profiles-29526300-bqm8l\" (UID: \"30508b7a-ac76-48d8-822c-65a32552ca80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.514085 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" Feb 20 09:00:00 crc kubenswrapper[5094]: I0220 09:00:00.840695 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 09:00:00 crc kubenswrapper[5094]: E0220 09:00:00.841271 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:00:01 crc kubenswrapper[5094]: I0220 09:00:01.041940 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l"] Feb 20 09:00:01 crc kubenswrapper[5094]: I0220 09:00:01.105748 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" event={"ID":"30508b7a-ac76-48d8-822c-65a32552ca80","Type":"ContainerStarted","Data":"8a0b65f8627b8068f81e5abc2ed7bbee8aeb2ab222b31511575cdd1d3322e1f9"} Feb 20 09:00:02 crc kubenswrapper[5094]: I0220 09:00:02.116361 5094 generic.go:334] "Generic (PLEG): container finished" podID="30508b7a-ac76-48d8-822c-65a32552ca80" containerID="9dd7ec3da040b20e94b1ef4ad1e6147baa04fe89a2ea1cd7d18c7a1def8587f9" exitCode=0 Feb 20 09:00:02 crc kubenswrapper[5094]: I0220 09:00:02.116439 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" event={"ID":"30508b7a-ac76-48d8-822c-65a32552ca80","Type":"ContainerDied","Data":"9dd7ec3da040b20e94b1ef4ad1e6147baa04fe89a2ea1cd7d18c7a1def8587f9"} Feb 20 09:00:03 crc kubenswrapper[5094]: I0220 09:00:03.487033 5094 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" Feb 20 09:00:03 crc kubenswrapper[5094]: I0220 09:00:03.565950 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30508b7a-ac76-48d8-822c-65a32552ca80-config-volume\") pod \"30508b7a-ac76-48d8-822c-65a32552ca80\" (UID: \"30508b7a-ac76-48d8-822c-65a32552ca80\") " Feb 20 09:00:03 crc kubenswrapper[5094]: I0220 09:00:03.566167 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8n4sz\" (UniqueName: \"kubernetes.io/projected/30508b7a-ac76-48d8-822c-65a32552ca80-kube-api-access-8n4sz\") pod \"30508b7a-ac76-48d8-822c-65a32552ca80\" (UID: \"30508b7a-ac76-48d8-822c-65a32552ca80\") " Feb 20 09:00:03 crc kubenswrapper[5094]: I0220 09:00:03.566212 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/30508b7a-ac76-48d8-822c-65a32552ca80-secret-volume\") pod \"30508b7a-ac76-48d8-822c-65a32552ca80\" (UID: \"30508b7a-ac76-48d8-822c-65a32552ca80\") " Feb 20 09:00:03 crc kubenswrapper[5094]: I0220 09:00:03.566625 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30508b7a-ac76-48d8-822c-65a32552ca80-config-volume" (OuterVolumeSpecName: "config-volume") pod "30508b7a-ac76-48d8-822c-65a32552ca80" (UID: "30508b7a-ac76-48d8-822c-65a32552ca80"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:00:03 crc kubenswrapper[5094]: I0220 09:00:03.571195 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30508b7a-ac76-48d8-822c-65a32552ca80-kube-api-access-8n4sz" (OuterVolumeSpecName: "kube-api-access-8n4sz") pod "30508b7a-ac76-48d8-822c-65a32552ca80" (UID: "30508b7a-ac76-48d8-822c-65a32552ca80"). 
InnerVolumeSpecName "kube-api-access-8n4sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:00:03 crc kubenswrapper[5094]: I0220 09:00:03.571519 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30508b7a-ac76-48d8-822c-65a32552ca80-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "30508b7a-ac76-48d8-822c-65a32552ca80" (UID: "30508b7a-ac76-48d8-822c-65a32552ca80"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:00:03 crc kubenswrapper[5094]: I0220 09:00:03.669559 5094 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/30508b7a-ac76-48d8-822c-65a32552ca80-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 09:00:03 crc kubenswrapper[5094]: I0220 09:00:03.669614 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/30508b7a-ac76-48d8-822c-65a32552ca80-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 09:00:03 crc kubenswrapper[5094]: I0220 09:00:03.669636 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8n4sz\" (UniqueName: \"kubernetes.io/projected/30508b7a-ac76-48d8-822c-65a32552ca80-kube-api-access-8n4sz\") on node \"crc\" DevicePath \"\"" Feb 20 09:00:04 crc kubenswrapper[5094]: I0220 09:00:04.135014 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" event={"ID":"30508b7a-ac76-48d8-822c-65a32552ca80","Type":"ContainerDied","Data":"8a0b65f8627b8068f81e5abc2ed7bbee8aeb2ab222b31511575cdd1d3322e1f9"} Feb 20 09:00:04 crc kubenswrapper[5094]: I0220 09:00:04.135572 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a0b65f8627b8068f81e5abc2ed7bbee8aeb2ab222b31511575cdd1d3322e1f9" Feb 20 09:00:04 crc kubenswrapper[5094]: I0220 09:00:04.135091 5094 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l" Feb 20 09:00:04 crc kubenswrapper[5094]: I0220 09:00:04.563731 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr"] Feb 20 09:00:04 crc kubenswrapper[5094]: I0220 09:00:04.572083 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526255-vpwzr"] Feb 20 09:00:05 crc kubenswrapper[5094]: I0220 09:00:05.851424 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b1b88d4-fc9b-465d-907e-7abf6c46c919" path="/var/lib/kubelet/pods/0b1b88d4-fc9b-465d-907e-7abf6c46c919/volumes" Feb 20 09:00:14 crc kubenswrapper[5094]: I0220 09:00:14.841546 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 09:00:15 crc kubenswrapper[5094]: I0220 09:00:15.232132 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"8272b573c1c010339b8d8ae657303bad51b724d1c5f662b08fa0dc532f0ace33"} Feb 20 09:00:33 crc kubenswrapper[5094]: I0220 09:00:33.266974 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gnd5s"] Feb 20 09:00:33 crc kubenswrapper[5094]: E0220 09:00:33.268379 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30508b7a-ac76-48d8-822c-65a32552ca80" containerName="collect-profiles" Feb 20 09:00:33 crc kubenswrapper[5094]: I0220 09:00:33.268400 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="30508b7a-ac76-48d8-822c-65a32552ca80" containerName="collect-profiles" Feb 20 09:00:33 crc kubenswrapper[5094]: I0220 09:00:33.268920 5094 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="30508b7a-ac76-48d8-822c-65a32552ca80" containerName="collect-profiles" Feb 20 09:00:33 crc kubenswrapper[5094]: I0220 09:00:33.271575 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:33 crc kubenswrapper[5094]: I0220 09:00:33.276273 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gnd5s"] Feb 20 09:00:33 crc kubenswrapper[5094]: I0220 09:00:33.471448 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35bec39e-02b7-45c3-b331-ac00eb024eca-catalog-content\") pod \"redhat-operators-gnd5s\" (UID: \"35bec39e-02b7-45c3-b331-ac00eb024eca\") " pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:33 crc kubenswrapper[5094]: I0220 09:00:33.471822 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdn7q\" (UniqueName: \"kubernetes.io/projected/35bec39e-02b7-45c3-b331-ac00eb024eca-kube-api-access-bdn7q\") pod \"redhat-operators-gnd5s\" (UID: \"35bec39e-02b7-45c3-b331-ac00eb024eca\") " pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:33 crc kubenswrapper[5094]: I0220 09:00:33.471857 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35bec39e-02b7-45c3-b331-ac00eb024eca-utilities\") pod \"redhat-operators-gnd5s\" (UID: \"35bec39e-02b7-45c3-b331-ac00eb024eca\") " pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:33 crc kubenswrapper[5094]: I0220 09:00:33.574020 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35bec39e-02b7-45c3-b331-ac00eb024eca-catalog-content\") pod \"redhat-operators-gnd5s\" (UID: 
\"35bec39e-02b7-45c3-b331-ac00eb024eca\") " pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:33 crc kubenswrapper[5094]: I0220 09:00:33.574095 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdn7q\" (UniqueName: \"kubernetes.io/projected/35bec39e-02b7-45c3-b331-ac00eb024eca-kube-api-access-bdn7q\") pod \"redhat-operators-gnd5s\" (UID: \"35bec39e-02b7-45c3-b331-ac00eb024eca\") " pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:33 crc kubenswrapper[5094]: I0220 09:00:33.574121 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35bec39e-02b7-45c3-b331-ac00eb024eca-utilities\") pod \"redhat-operators-gnd5s\" (UID: \"35bec39e-02b7-45c3-b331-ac00eb024eca\") " pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:33 crc kubenswrapper[5094]: I0220 09:00:33.574546 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35bec39e-02b7-45c3-b331-ac00eb024eca-catalog-content\") pod \"redhat-operators-gnd5s\" (UID: \"35bec39e-02b7-45c3-b331-ac00eb024eca\") " pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:33 crc kubenswrapper[5094]: I0220 09:00:33.574688 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35bec39e-02b7-45c3-b331-ac00eb024eca-utilities\") pod \"redhat-operators-gnd5s\" (UID: \"35bec39e-02b7-45c3-b331-ac00eb024eca\") " pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:33 crc kubenswrapper[5094]: I0220 09:00:33.593790 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdn7q\" (UniqueName: \"kubernetes.io/projected/35bec39e-02b7-45c3-b331-ac00eb024eca-kube-api-access-bdn7q\") pod \"redhat-operators-gnd5s\" (UID: \"35bec39e-02b7-45c3-b331-ac00eb024eca\") " 
pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:33 crc kubenswrapper[5094]: I0220 09:00:33.893073 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:34 crc kubenswrapper[5094]: I0220 09:00:34.408601 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gnd5s"] Feb 20 09:00:34 crc kubenswrapper[5094]: W0220 09:00:34.414344 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35bec39e_02b7_45c3_b331_ac00eb024eca.slice/crio-db3a6e6f72d8eeea9c87bd2024f02efe99a147a1c89ab21cf3dc23b3c906cbea WatchSource:0}: Error finding container db3a6e6f72d8eeea9c87bd2024f02efe99a147a1c89ab21cf3dc23b3c906cbea: Status 404 returned error can't find the container with id db3a6e6f72d8eeea9c87bd2024f02efe99a147a1c89ab21cf3dc23b3c906cbea Feb 20 09:00:34 crc kubenswrapper[5094]: I0220 09:00:34.434177 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gnd5s" event={"ID":"35bec39e-02b7-45c3-b331-ac00eb024eca","Type":"ContainerStarted","Data":"db3a6e6f72d8eeea9c87bd2024f02efe99a147a1c89ab21cf3dc23b3c906cbea"} Feb 20 09:00:35 crc kubenswrapper[5094]: I0220 09:00:35.445924 5094 generic.go:334] "Generic (PLEG): container finished" podID="35bec39e-02b7-45c3-b331-ac00eb024eca" containerID="f84b0e896a026e8b0994b8e53246f6056d2491862161506181502cfe56c6f519" exitCode=0 Feb 20 09:00:35 crc kubenswrapper[5094]: I0220 09:00:35.445977 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gnd5s" event={"ID":"35bec39e-02b7-45c3-b331-ac00eb024eca","Type":"ContainerDied","Data":"f84b0e896a026e8b0994b8e53246f6056d2491862161506181502cfe56c6f519"} Feb 20 09:00:36 crc kubenswrapper[5094]: I0220 09:00:36.459982 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-gnd5s" event={"ID":"35bec39e-02b7-45c3-b331-ac00eb024eca","Type":"ContainerStarted","Data":"6da3f006b736a28f2fd1d6d86e5f81251619c126785be922218f3dc59a01e2eb"} Feb 20 09:00:41 crc kubenswrapper[5094]: I0220 09:00:41.510110 5094 generic.go:334] "Generic (PLEG): container finished" podID="35bec39e-02b7-45c3-b331-ac00eb024eca" containerID="6da3f006b736a28f2fd1d6d86e5f81251619c126785be922218f3dc59a01e2eb" exitCode=0 Feb 20 09:00:41 crc kubenswrapper[5094]: I0220 09:00:41.510185 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gnd5s" event={"ID":"35bec39e-02b7-45c3-b331-ac00eb024eca","Type":"ContainerDied","Data":"6da3f006b736a28f2fd1d6d86e5f81251619c126785be922218f3dc59a01e2eb"} Feb 20 09:00:42 crc kubenswrapper[5094]: I0220 09:00:42.522079 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gnd5s" event={"ID":"35bec39e-02b7-45c3-b331-ac00eb024eca","Type":"ContainerStarted","Data":"4b1a2f4e739eb6f0aa6b0eb92126ac137e0940d2ffe99cec196fa67181718885"} Feb 20 09:00:42 crc kubenswrapper[5094]: I0220 09:00:42.552661 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gnd5s" podStartSLOduration=3.060362197 podStartE2EDuration="9.552642058s" podCreationTimestamp="2026-02-20 09:00:33 +0000 UTC" firstStartedPulling="2026-02-20 09:00:35.44846695 +0000 UTC m=+8050.321093661" lastFinishedPulling="2026-02-20 09:00:41.940746811 +0000 UTC m=+8056.813373522" observedRunningTime="2026-02-20 09:00:42.545019836 +0000 UTC m=+8057.417646547" watchObservedRunningTime="2026-02-20 09:00:42.552642058 +0000 UTC m=+8057.425268769" Feb 20 09:00:42 crc kubenswrapper[5094]: I0220 09:00:42.935413 5094 scope.go:117] "RemoveContainer" containerID="df01effbb937965fa2da8671ac9d455b13bf8b90f5ca8dcd5de48223dc6408d9" Feb 20 09:00:43 crc kubenswrapper[5094]: I0220 09:00:43.893419 5094 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:43 crc kubenswrapper[5094]: I0220 09:00:43.893755 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:44 crc kubenswrapper[5094]: I0220 09:00:44.949516 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gnd5s" podUID="35bec39e-02b7-45c3-b331-ac00eb024eca" containerName="registry-server" probeResult="failure" output=< Feb 20 09:00:44 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 09:00:44 crc kubenswrapper[5094]: > Feb 20 09:00:53 crc kubenswrapper[5094]: I0220 09:00:53.974196 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:54 crc kubenswrapper[5094]: I0220 09:00:54.034571 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:54 crc kubenswrapper[5094]: I0220 09:00:54.060579 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-b8a1-account-create-update-27rkz"] Feb 20 09:00:54 crc kubenswrapper[5094]: I0220 09:00:54.091462 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-zdzg9"] Feb 20 09:00:54 crc kubenswrapper[5094]: I0220 09:00:54.111744 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-b8a1-account-create-update-27rkz"] Feb 20 09:00:54 crc kubenswrapper[5094]: I0220 09:00:54.121504 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-zdzg9"] Feb 20 09:00:54 crc kubenswrapper[5094]: I0220 09:00:54.229645 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gnd5s"] Feb 20 09:00:55 crc kubenswrapper[5094]: I0220 09:00:55.637379 5094 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gnd5s" podUID="35bec39e-02b7-45c3-b331-ac00eb024eca" containerName="registry-server" containerID="cri-o://4b1a2f4e739eb6f0aa6b0eb92126ac137e0940d2ffe99cec196fa67181718885" gracePeriod=2 Feb 20 09:00:55 crc kubenswrapper[5094]: I0220 09:00:55.855226 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1068d86d-d730-4dab-8aaf-12c5a5c62a70" path="/var/lib/kubelet/pods/1068d86d-d730-4dab-8aaf-12c5a5c62a70/volumes" Feb 20 09:00:55 crc kubenswrapper[5094]: I0220 09:00:55.856763 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="674e60ac-3253-4c4c-8e5b-7a59ed2e8989" path="/var/lib/kubelet/pods/674e60ac-3253-4c4c-8e5b-7a59ed2e8989/volumes" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.140070 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.245953 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35bec39e-02b7-45c3-b331-ac00eb024eca-catalog-content\") pod \"35bec39e-02b7-45c3-b331-ac00eb024eca\" (UID: \"35bec39e-02b7-45c3-b331-ac00eb024eca\") " Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.246100 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdn7q\" (UniqueName: \"kubernetes.io/projected/35bec39e-02b7-45c3-b331-ac00eb024eca-kube-api-access-bdn7q\") pod \"35bec39e-02b7-45c3-b331-ac00eb024eca\" (UID: \"35bec39e-02b7-45c3-b331-ac00eb024eca\") " Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.246240 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35bec39e-02b7-45c3-b331-ac00eb024eca-utilities\") pod 
\"35bec39e-02b7-45c3-b331-ac00eb024eca\" (UID: \"35bec39e-02b7-45c3-b331-ac00eb024eca\") " Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.248063 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35bec39e-02b7-45c3-b331-ac00eb024eca-utilities" (OuterVolumeSpecName: "utilities") pod "35bec39e-02b7-45c3-b331-ac00eb024eca" (UID: "35bec39e-02b7-45c3-b331-ac00eb024eca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.254834 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35bec39e-02b7-45c3-b331-ac00eb024eca-kube-api-access-bdn7q" (OuterVolumeSpecName: "kube-api-access-bdn7q") pod "35bec39e-02b7-45c3-b331-ac00eb024eca" (UID: "35bec39e-02b7-45c3-b331-ac00eb024eca"). InnerVolumeSpecName "kube-api-access-bdn7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.352791 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdn7q\" (UniqueName: \"kubernetes.io/projected/35bec39e-02b7-45c3-b331-ac00eb024eca-kube-api-access-bdn7q\") on node \"crc\" DevicePath \"\"" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.353020 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35bec39e-02b7-45c3-b331-ac00eb024eca-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.400181 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35bec39e-02b7-45c3-b331-ac00eb024eca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "35bec39e-02b7-45c3-b331-ac00eb024eca" (UID: "35bec39e-02b7-45c3-b331-ac00eb024eca"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.455078 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35bec39e-02b7-45c3-b331-ac00eb024eca-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.649695 5094 generic.go:334] "Generic (PLEG): container finished" podID="35bec39e-02b7-45c3-b331-ac00eb024eca" containerID="4b1a2f4e739eb6f0aa6b0eb92126ac137e0940d2ffe99cec196fa67181718885" exitCode=0 Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.649763 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gnd5s" event={"ID":"35bec39e-02b7-45c3-b331-ac00eb024eca","Type":"ContainerDied","Data":"4b1a2f4e739eb6f0aa6b0eb92126ac137e0940d2ffe99cec196fa67181718885"} Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.649789 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gnd5s" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.649795 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gnd5s" event={"ID":"35bec39e-02b7-45c3-b331-ac00eb024eca","Type":"ContainerDied","Data":"db3a6e6f72d8eeea9c87bd2024f02efe99a147a1c89ab21cf3dc23b3c906cbea"} Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.649830 5094 scope.go:117] "RemoveContainer" containerID="4b1a2f4e739eb6f0aa6b0eb92126ac137e0940d2ffe99cec196fa67181718885" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.695728 5094 scope.go:117] "RemoveContainer" containerID="6da3f006b736a28f2fd1d6d86e5f81251619c126785be922218f3dc59a01e2eb" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.700185 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gnd5s"] Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.709189 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gnd5s"] Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.718643 5094 scope.go:117] "RemoveContainer" containerID="f84b0e896a026e8b0994b8e53246f6056d2491862161506181502cfe56c6f519" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.786868 5094 scope.go:117] "RemoveContainer" containerID="4b1a2f4e739eb6f0aa6b0eb92126ac137e0940d2ffe99cec196fa67181718885" Feb 20 09:00:56 crc kubenswrapper[5094]: E0220 09:00:56.787456 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b1a2f4e739eb6f0aa6b0eb92126ac137e0940d2ffe99cec196fa67181718885\": container with ID starting with 4b1a2f4e739eb6f0aa6b0eb92126ac137e0940d2ffe99cec196fa67181718885 not found: ID does not exist" containerID="4b1a2f4e739eb6f0aa6b0eb92126ac137e0940d2ffe99cec196fa67181718885" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.787507 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b1a2f4e739eb6f0aa6b0eb92126ac137e0940d2ffe99cec196fa67181718885"} err="failed to get container status \"4b1a2f4e739eb6f0aa6b0eb92126ac137e0940d2ffe99cec196fa67181718885\": rpc error: code = NotFound desc = could not find container \"4b1a2f4e739eb6f0aa6b0eb92126ac137e0940d2ffe99cec196fa67181718885\": container with ID starting with 4b1a2f4e739eb6f0aa6b0eb92126ac137e0940d2ffe99cec196fa67181718885 not found: ID does not exist" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.787537 5094 scope.go:117] "RemoveContainer" containerID="6da3f006b736a28f2fd1d6d86e5f81251619c126785be922218f3dc59a01e2eb" Feb 20 09:00:56 crc kubenswrapper[5094]: E0220 09:00:56.788272 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6da3f006b736a28f2fd1d6d86e5f81251619c126785be922218f3dc59a01e2eb\": container with ID starting with 6da3f006b736a28f2fd1d6d86e5f81251619c126785be922218f3dc59a01e2eb not found: ID does not exist" containerID="6da3f006b736a28f2fd1d6d86e5f81251619c126785be922218f3dc59a01e2eb" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.788316 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6da3f006b736a28f2fd1d6d86e5f81251619c126785be922218f3dc59a01e2eb"} err="failed to get container status \"6da3f006b736a28f2fd1d6d86e5f81251619c126785be922218f3dc59a01e2eb\": rpc error: code = NotFound desc = could not find container \"6da3f006b736a28f2fd1d6d86e5f81251619c126785be922218f3dc59a01e2eb\": container with ID starting with 6da3f006b736a28f2fd1d6d86e5f81251619c126785be922218f3dc59a01e2eb not found: ID does not exist" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.788369 5094 scope.go:117] "RemoveContainer" containerID="f84b0e896a026e8b0994b8e53246f6056d2491862161506181502cfe56c6f519" Feb 20 09:00:56 crc kubenswrapper[5094]: E0220 
09:00:56.788751 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f84b0e896a026e8b0994b8e53246f6056d2491862161506181502cfe56c6f519\": container with ID starting with f84b0e896a026e8b0994b8e53246f6056d2491862161506181502cfe56c6f519 not found: ID does not exist" containerID="f84b0e896a026e8b0994b8e53246f6056d2491862161506181502cfe56c6f519" Feb 20 09:00:56 crc kubenswrapper[5094]: I0220 09:00:56.788785 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f84b0e896a026e8b0994b8e53246f6056d2491862161506181502cfe56c6f519"} err="failed to get container status \"f84b0e896a026e8b0994b8e53246f6056d2491862161506181502cfe56c6f519\": rpc error: code = NotFound desc = could not find container \"f84b0e896a026e8b0994b8e53246f6056d2491862161506181502cfe56c6f519\": container with ID starting with f84b0e896a026e8b0994b8e53246f6056d2491862161506181502cfe56c6f519 not found: ID does not exist" Feb 20 09:00:57 crc kubenswrapper[5094]: I0220 09:00:57.857121 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35bec39e-02b7-45c3-b331-ac00eb024eca" path="/var/lib/kubelet/pods/35bec39e-02b7-45c3-b331-ac00eb024eca/volumes" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.147909 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29526301-wg5rm"] Feb 20 09:01:00 crc kubenswrapper[5094]: E0220 09:01:00.148587 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35bec39e-02b7-45c3-b331-ac00eb024eca" containerName="registry-server" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.148600 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="35bec39e-02b7-45c3-b331-ac00eb024eca" containerName="registry-server" Feb 20 09:01:00 crc kubenswrapper[5094]: E0220 09:01:00.148653 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35bec39e-02b7-45c3-b331-ac00eb024eca" 
containerName="extract-utilities" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.148660 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="35bec39e-02b7-45c3-b331-ac00eb024eca" containerName="extract-utilities" Feb 20 09:01:00 crc kubenswrapper[5094]: E0220 09:01:00.148674 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35bec39e-02b7-45c3-b331-ac00eb024eca" containerName="extract-content" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.148681 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="35bec39e-02b7-45c3-b331-ac00eb024eca" containerName="extract-content" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.148944 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="35bec39e-02b7-45c3-b331-ac00eb024eca" containerName="registry-server" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.149680 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29526301-wg5rm" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.164000 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29526301-wg5rm"] Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.240848 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-config-data\") pod \"keystone-cron-29526301-wg5rm\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") " pod="openstack/keystone-cron-29526301-wg5rm" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.240912 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-fernet-keys\") pod \"keystone-cron-29526301-wg5rm\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") " pod="openstack/keystone-cron-29526301-wg5rm" Feb 20 09:01:00 crc 
kubenswrapper[5094]: I0220 09:01:00.241076 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-combined-ca-bundle\") pod \"keystone-cron-29526301-wg5rm\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") " pod="openstack/keystone-cron-29526301-wg5rm" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.241116 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfrhn\" (UniqueName: \"kubernetes.io/projected/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-kube-api-access-tfrhn\") pod \"keystone-cron-29526301-wg5rm\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") " pod="openstack/keystone-cron-29526301-wg5rm" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.342978 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-fernet-keys\") pod \"keystone-cron-29526301-wg5rm\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") " pod="openstack/keystone-cron-29526301-wg5rm" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.343123 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-combined-ca-bundle\") pod \"keystone-cron-29526301-wg5rm\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") " pod="openstack/keystone-cron-29526301-wg5rm" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.343154 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfrhn\" (UniqueName: \"kubernetes.io/projected/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-kube-api-access-tfrhn\") pod \"keystone-cron-29526301-wg5rm\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") " pod="openstack/keystone-cron-29526301-wg5rm" Feb 20 
09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.343222 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-config-data\") pod \"keystone-cron-29526301-wg5rm\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") " pod="openstack/keystone-cron-29526301-wg5rm" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.349554 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-combined-ca-bundle\") pod \"keystone-cron-29526301-wg5rm\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") " pod="openstack/keystone-cron-29526301-wg5rm" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.349920 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-config-data\") pod \"keystone-cron-29526301-wg5rm\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") " pod="openstack/keystone-cron-29526301-wg5rm" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.361280 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-fernet-keys\") pod \"keystone-cron-29526301-wg5rm\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") " pod="openstack/keystone-cron-29526301-wg5rm" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.362303 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfrhn\" (UniqueName: \"kubernetes.io/projected/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-kube-api-access-tfrhn\") pod \"keystone-cron-29526301-wg5rm\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") " pod="openstack/keystone-cron-29526301-wg5rm" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.483678 5094 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/keystone-cron-29526301-wg5rm" Feb 20 09:01:00 crc kubenswrapper[5094]: I0220 09:01:00.967163 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29526301-wg5rm"] Feb 20 09:01:00 crc kubenswrapper[5094]: W0220 09:01:00.972446 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba68c6b8_04f9_4515_8e85_3e7b4ca9615b.slice/crio-d0b5b818e33cee572381952b100d70bc745d719ac9d8e8b95a5d761c2a20a9c8 WatchSource:0}: Error finding container d0b5b818e33cee572381952b100d70bc745d719ac9d8e8b95a5d761c2a20a9c8: Status 404 returned error can't find the container with id d0b5b818e33cee572381952b100d70bc745d719ac9d8e8b95a5d761c2a20a9c8 Feb 20 09:01:01 crc kubenswrapper[5094]: I0220 09:01:01.702146 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29526301-wg5rm" event={"ID":"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b","Type":"ContainerStarted","Data":"dd6f537c0f31c70b8ddd13f131a51bcaf7ca2a867d0122b03fdb3921b3efdb82"} Feb 20 09:01:01 crc kubenswrapper[5094]: I0220 09:01:01.702728 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29526301-wg5rm" event={"ID":"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b","Type":"ContainerStarted","Data":"d0b5b818e33cee572381952b100d70bc745d719ac9d8e8b95a5d761c2a20a9c8"} Feb 20 09:01:01 crc kubenswrapper[5094]: I0220 09:01:01.728330 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29526301-wg5rm" podStartSLOduration=1.728314135 podStartE2EDuration="1.728314135s" podCreationTimestamp="2026-02-20 09:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:01:01.721993132 +0000 UTC m=+8076.594619843" watchObservedRunningTime="2026-02-20 09:01:01.728314135 +0000 UTC m=+8076.600940836" Feb 20 
09:01:03 crc kubenswrapper[5094]: I0220 09:01:03.719600 5094 generic.go:334] "Generic (PLEG): container finished" podID="ba68c6b8-04f9-4515-8e85-3e7b4ca9615b" containerID="dd6f537c0f31c70b8ddd13f131a51bcaf7ca2a867d0122b03fdb3921b3efdb82" exitCode=0 Feb 20 09:01:03 crc kubenswrapper[5094]: I0220 09:01:03.719693 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29526301-wg5rm" event={"ID":"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b","Type":"ContainerDied","Data":"dd6f537c0f31c70b8ddd13f131a51bcaf7ca2a867d0122b03fdb3921b3efdb82"} Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.159653 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29526301-wg5rm" Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.340449 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-fernet-keys\") pod \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") " Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.340966 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-config-data\") pod \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") " Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.341131 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-combined-ca-bundle\") pod \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") " Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.341673 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfrhn\" (UniqueName: 
\"kubernetes.io/projected/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-kube-api-access-tfrhn\") pod \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\" (UID: \"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b\") " Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.345880 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-kube-api-access-tfrhn" (OuterVolumeSpecName: "kube-api-access-tfrhn") pod "ba68c6b8-04f9-4515-8e85-3e7b4ca9615b" (UID: "ba68c6b8-04f9-4515-8e85-3e7b4ca9615b"). InnerVolumeSpecName "kube-api-access-tfrhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.357961 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ba68c6b8-04f9-4515-8e85-3e7b4ca9615b" (UID: "ba68c6b8-04f9-4515-8e85-3e7b4ca9615b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.366362 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba68c6b8-04f9-4515-8e85-3e7b4ca9615b" (UID: "ba68c6b8-04f9-4515-8e85-3e7b4ca9615b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.412325 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-config-data" (OuterVolumeSpecName: "config-data") pod "ba68c6b8-04f9-4515-8e85-3e7b4ca9615b" (UID: "ba68c6b8-04f9-4515-8e85-3e7b4ca9615b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.443517 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.443546 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.443556 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfrhn\" (UniqueName: \"kubernetes.io/projected/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-kube-api-access-tfrhn\") on node \"crc\" DevicePath \"\"" Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.443566 5094 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba68c6b8-04f9-4515-8e85-3e7b4ca9615b-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.740661 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29526301-wg5rm" event={"ID":"ba68c6b8-04f9-4515-8e85-3e7b4ca9615b","Type":"ContainerDied","Data":"d0b5b818e33cee572381952b100d70bc745d719ac9d8e8b95a5d761c2a20a9c8"} Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.741007 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0b5b818e33cee572381952b100d70bc745d719ac9d8e8b95a5d761c2a20a9c8" Feb 20 09:01:05 crc kubenswrapper[5094]: I0220 09:01:05.740728 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29526301-wg5rm" Feb 20 09:01:06 crc kubenswrapper[5094]: I0220 09:01:06.752641 5094 generic.go:334] "Generic (PLEG): container finished" podID="7894eb94-d4dd-4035-af5b-5994b4ae6d2f" containerID="a63a79298a636726404dd99afe9f61c35ff225d0787a9c54da454b3cf54a459f" exitCode=0 Feb 20 09:01:06 crc kubenswrapper[5094]: I0220 09:01:06.752769 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" event={"ID":"7894eb94-d4dd-4035-af5b-5994b4ae6d2f","Type":"ContainerDied","Data":"a63a79298a636726404dd99afe9f61c35ff225d0787a9c54da454b3cf54a459f"} Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.050320 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-fmrw9"] Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.059903 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-fmrw9"] Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.239150 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.401740 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-inventory\") pod \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") " Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.402061 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-ssh-key-openstack-networker\") pod \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") " Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.402135 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-tripleo-cleanup-combined-ca-bundle\") pod \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") " Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.402220 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcppq\" (UniqueName: \"kubernetes.io/projected/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-kube-api-access-tcppq\") pod \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") " Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.408874 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-kube-api-access-tcppq" (OuterVolumeSpecName: "kube-api-access-tcppq") pod "7894eb94-d4dd-4035-af5b-5994b4ae6d2f" (UID: "7894eb94-d4dd-4035-af5b-5994b4ae6d2f"). InnerVolumeSpecName "kube-api-access-tcppq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.409519 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "7894eb94-d4dd-4035-af5b-5994b4ae6d2f" (UID: "7894eb94-d4dd-4035-af5b-5994b4ae6d2f"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:01:08 crc kubenswrapper[5094]: E0220 09:01:08.431418 5094 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-inventory podName:7894eb94-d4dd-4035-af5b-5994b4ae6d2f nodeName:}" failed. No retries permitted until 2026-02-20 09:01:08.931383848 +0000 UTC m=+8083.804010559 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-inventory") pod "7894eb94-d4dd-4035-af5b-5994b4ae6d2f" (UID: "7894eb94-d4dd-4035-af5b-5994b4ae6d2f") : error deleting /var/lib/kubelet/pods/7894eb94-d4dd-4035-af5b-5994b4ae6d2f/volume-subpaths: remove /var/lib/kubelet/pods/7894eb94-d4dd-4035-af5b-5994b4ae6d2f/volume-subpaths: no such file or directory Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.435266 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "7894eb94-d4dd-4035-af5b-5994b4ae6d2f" (UID: "7894eb94-d4dd-4035-af5b-5994b4ae6d2f"). InnerVolumeSpecName "ssh-key-openstack-networker". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.505653 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.505765 5094 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.505799 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcppq\" (UniqueName: \"kubernetes.io/projected/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-kube-api-access-tcppq\") on node \"crc\" DevicePath \"\"" Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.775131 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" event={"ID":"7894eb94-d4dd-4035-af5b-5994b4ae6d2f","Type":"ContainerDied","Data":"ca240917156fd9c5acb4173083f283943d7a62c3a3cbea59cf29d0d548923640"} Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.775190 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca240917156fd9c5acb4173083f283943d7a62c3a3cbea59cf29d0d548923640" Feb 20 09:01:08 crc kubenswrapper[5094]: I0220 09:01:08.775201 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f" Feb 20 09:01:09 crc kubenswrapper[5094]: I0220 09:01:09.015979 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-inventory\") pod \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\" (UID: \"7894eb94-d4dd-4035-af5b-5994b4ae6d2f\") " Feb 20 09:01:09 crc kubenswrapper[5094]: I0220 09:01:09.021891 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-inventory" (OuterVolumeSpecName: "inventory") pod "7894eb94-d4dd-4035-af5b-5994b4ae6d2f" (UID: "7894eb94-d4dd-4035-af5b-5994b4ae6d2f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:01:09 crc kubenswrapper[5094]: I0220 09:01:09.119203 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7894eb94-d4dd-4035-af5b-5994b4ae6d2f-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:01:09 crc kubenswrapper[5094]: I0220 09:01:09.854347 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="791e2b3b-9d51-41fd-bf38-5b66849b5b77" path="/var/lib/kubelet/pods/791e2b3b-9d51-41fd-bf38-5b66849b5b77/volumes" Feb 20 09:01:43 crc kubenswrapper[5094]: I0220 09:01:43.014813 5094 scope.go:117] "RemoveContainer" containerID="54e699ae44cd59613f594b23e884598e72478881e5bbc4353851926cc53ca349" Feb 20 09:01:43 crc kubenswrapper[5094]: I0220 09:01:43.043589 5094 scope.go:117] "RemoveContainer" containerID="da9d0c1b854591282ea3fc19512d062edc11f81e6daa7e4388a1d825d28d97b3" Feb 20 09:01:43 crc kubenswrapper[5094]: I0220 09:01:43.089243 5094 scope.go:117] "RemoveContainer" containerID="f3933aedef06f045c06e2f5aeaa2d2665f532f7df1042dc925032507e64641fd" Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.040573 5094 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-marketplace/certified-operators-r8h5d"] Feb 20 09:02:08 crc kubenswrapper[5094]: E0220 09:02:08.041642 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7894eb94-d4dd-4035-af5b-5994b4ae6d2f" containerName="tripleo-cleanup-tripleo-cleanup-openstack-networker" Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.041658 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="7894eb94-d4dd-4035-af5b-5994b4ae6d2f" containerName="tripleo-cleanup-tripleo-cleanup-openstack-networker" Feb 20 09:02:08 crc kubenswrapper[5094]: E0220 09:02:08.041678 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba68c6b8-04f9-4515-8e85-3e7b4ca9615b" containerName="keystone-cron" Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.041684 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba68c6b8-04f9-4515-8e85-3e7b4ca9615b" containerName="keystone-cron" Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.041911 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba68c6b8-04f9-4515-8e85-3e7b4ca9615b" containerName="keystone-cron" Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.041927 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="7894eb94-d4dd-4035-af5b-5994b4ae6d2f" containerName="tripleo-cleanup-tripleo-cleanup-openstack-networker" Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.043389 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r8h5d" Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.052337 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r8h5d"] Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.113257 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e37664a-95ca-4fbb-8991-0af7286b7b9e-utilities\") pod \"certified-operators-r8h5d\" (UID: \"5e37664a-95ca-4fbb-8991-0af7286b7b9e\") " pod="openshift-marketplace/certified-operators-r8h5d" Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.113366 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f72qr\" (UniqueName: \"kubernetes.io/projected/5e37664a-95ca-4fbb-8991-0af7286b7b9e-kube-api-access-f72qr\") pod \"certified-operators-r8h5d\" (UID: \"5e37664a-95ca-4fbb-8991-0af7286b7b9e\") " pod="openshift-marketplace/certified-operators-r8h5d" Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.113430 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e37664a-95ca-4fbb-8991-0af7286b7b9e-catalog-content\") pod \"certified-operators-r8h5d\" (UID: \"5e37664a-95ca-4fbb-8991-0af7286b7b9e\") " pod="openshift-marketplace/certified-operators-r8h5d" Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.215632 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e37664a-95ca-4fbb-8991-0af7286b7b9e-utilities\") pod \"certified-operators-r8h5d\" (UID: \"5e37664a-95ca-4fbb-8991-0af7286b7b9e\") " pod="openshift-marketplace/certified-operators-r8h5d" Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.215792 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-f72qr\" (UniqueName: \"kubernetes.io/projected/5e37664a-95ca-4fbb-8991-0af7286b7b9e-kube-api-access-f72qr\") pod \"certified-operators-r8h5d\" (UID: \"5e37664a-95ca-4fbb-8991-0af7286b7b9e\") " pod="openshift-marketplace/certified-operators-r8h5d" Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.215880 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e37664a-95ca-4fbb-8991-0af7286b7b9e-catalog-content\") pod \"certified-operators-r8h5d\" (UID: \"5e37664a-95ca-4fbb-8991-0af7286b7b9e\") " pod="openshift-marketplace/certified-operators-r8h5d" Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.216320 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e37664a-95ca-4fbb-8991-0af7286b7b9e-utilities\") pod \"certified-operators-r8h5d\" (UID: \"5e37664a-95ca-4fbb-8991-0af7286b7b9e\") " pod="openshift-marketplace/certified-operators-r8h5d" Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.216365 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e37664a-95ca-4fbb-8991-0af7286b7b9e-catalog-content\") pod \"certified-operators-r8h5d\" (UID: \"5e37664a-95ca-4fbb-8991-0af7286b7b9e\") " pod="openshift-marketplace/certified-operators-r8h5d" Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.239534 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f72qr\" (UniqueName: \"kubernetes.io/projected/5e37664a-95ca-4fbb-8991-0af7286b7b9e-kube-api-access-f72qr\") pod \"certified-operators-r8h5d\" (UID: \"5e37664a-95ca-4fbb-8991-0af7286b7b9e\") " pod="openshift-marketplace/certified-operators-r8h5d" Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.389617 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r8h5d" Feb 20 09:02:08 crc kubenswrapper[5094]: I0220 09:02:08.881297 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r8h5d"] Feb 20 09:02:09 crc kubenswrapper[5094]: I0220 09:02:09.417001 5094 generic.go:334] "Generic (PLEG): container finished" podID="5e37664a-95ca-4fbb-8991-0af7286b7b9e" containerID="7339d0c4e8259b177e2d20693ba2348687ae2cfe39995f853491a1c79073403a" exitCode=0 Feb 20 09:02:09 crc kubenswrapper[5094]: I0220 09:02:09.417088 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8h5d" event={"ID":"5e37664a-95ca-4fbb-8991-0af7286b7b9e","Type":"ContainerDied","Data":"7339d0c4e8259b177e2d20693ba2348687ae2cfe39995f853491a1c79073403a"} Feb 20 09:02:09 crc kubenswrapper[5094]: I0220 09:02:09.417354 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8h5d" event={"ID":"5e37664a-95ca-4fbb-8991-0af7286b7b9e","Type":"ContainerStarted","Data":"9793592739c2c4402f0646adb70f87a5b67be5909b34100bae0371dc05a317d7"} Feb 20 09:02:10 crc kubenswrapper[5094]: I0220 09:02:10.431183 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8h5d" event={"ID":"5e37664a-95ca-4fbb-8991-0af7286b7b9e","Type":"ContainerStarted","Data":"86dcafb75fc2be2feea5a22122ef0934b95506ef5ae71db827d0424bcb3aa7c2"} Feb 20 09:02:12 crc kubenswrapper[5094]: I0220 09:02:12.454928 5094 generic.go:334] "Generic (PLEG): container finished" podID="5e37664a-95ca-4fbb-8991-0af7286b7b9e" containerID="86dcafb75fc2be2feea5a22122ef0934b95506ef5ae71db827d0424bcb3aa7c2" exitCode=0 Feb 20 09:02:12 crc kubenswrapper[5094]: I0220 09:02:12.454999 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8h5d" 
event={"ID":"5e37664a-95ca-4fbb-8991-0af7286b7b9e","Type":"ContainerDied","Data":"86dcafb75fc2be2feea5a22122ef0934b95506ef5ae71db827d0424bcb3aa7c2"} Feb 20 09:02:13 crc kubenswrapper[5094]: I0220 09:02:13.478411 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8h5d" event={"ID":"5e37664a-95ca-4fbb-8991-0af7286b7b9e","Type":"ContainerStarted","Data":"1ec4157db14b26360df8b0fe8411cf32663bce1a3b07f6db3446bcf86afec191"} Feb 20 09:02:13 crc kubenswrapper[5094]: I0220 09:02:13.506567 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r8h5d" podStartSLOduration=2.083010457 podStartE2EDuration="5.50651407s" podCreationTimestamp="2026-02-20 09:02:08 +0000 UTC" firstStartedPulling="2026-02-20 09:02:09.419608647 +0000 UTC m=+8144.292235368" lastFinishedPulling="2026-02-20 09:02:12.84311227 +0000 UTC m=+8147.715738981" observedRunningTime="2026-02-20 09:02:13.4973713 +0000 UTC m=+8148.369998011" watchObservedRunningTime="2026-02-20 09:02:13.50651407 +0000 UTC m=+8148.379140781" Feb 20 09:02:18 crc kubenswrapper[5094]: I0220 09:02:18.390722 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-r8h5d" Feb 20 09:02:18 crc kubenswrapper[5094]: I0220 09:02:18.391373 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r8h5d" Feb 20 09:02:18 crc kubenswrapper[5094]: I0220 09:02:18.464463 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r8h5d" Feb 20 09:02:18 crc kubenswrapper[5094]: I0220 09:02:18.601489 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r8h5d" Feb 20 09:02:18 crc kubenswrapper[5094]: I0220 09:02:18.715411 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-r8h5d"] Feb 20 09:02:20 crc kubenswrapper[5094]: I0220 09:02:20.548682 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-r8h5d" podUID="5e37664a-95ca-4fbb-8991-0af7286b7b9e" containerName="registry-server" containerID="cri-o://1ec4157db14b26360df8b0fe8411cf32663bce1a3b07f6db3446bcf86afec191" gracePeriod=2 Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.519728 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r8h5d" Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.571094 5094 generic.go:334] "Generic (PLEG): container finished" podID="5e37664a-95ca-4fbb-8991-0af7286b7b9e" containerID="1ec4157db14b26360df8b0fe8411cf32663bce1a3b07f6db3446bcf86afec191" exitCode=0 Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.571145 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8h5d" event={"ID":"5e37664a-95ca-4fbb-8991-0af7286b7b9e","Type":"ContainerDied","Data":"1ec4157db14b26360df8b0fe8411cf32663bce1a3b07f6db3446bcf86afec191"} Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.571175 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r8h5d" event={"ID":"5e37664a-95ca-4fbb-8991-0af7286b7b9e","Type":"ContainerDied","Data":"9793592739c2c4402f0646adb70f87a5b67be5909b34100bae0371dc05a317d7"} Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.571215 5094 scope.go:117] "RemoveContainer" containerID="1ec4157db14b26360df8b0fe8411cf32663bce1a3b07f6db3446bcf86afec191" Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.571302 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-r8h5d" Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.593167 5094 scope.go:117] "RemoveContainer" containerID="86dcafb75fc2be2feea5a22122ef0934b95506ef5ae71db827d0424bcb3aa7c2" Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.614950 5094 scope.go:117] "RemoveContainer" containerID="7339d0c4e8259b177e2d20693ba2348687ae2cfe39995f853491a1c79073403a" Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.618446 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e37664a-95ca-4fbb-8991-0af7286b7b9e-catalog-content\") pod \"5e37664a-95ca-4fbb-8991-0af7286b7b9e\" (UID: \"5e37664a-95ca-4fbb-8991-0af7286b7b9e\") " Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.618519 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e37664a-95ca-4fbb-8991-0af7286b7b9e-utilities\") pod \"5e37664a-95ca-4fbb-8991-0af7286b7b9e\" (UID: \"5e37664a-95ca-4fbb-8991-0af7286b7b9e\") " Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.618766 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f72qr\" (UniqueName: \"kubernetes.io/projected/5e37664a-95ca-4fbb-8991-0af7286b7b9e-kube-api-access-f72qr\") pod \"5e37664a-95ca-4fbb-8991-0af7286b7b9e\" (UID: \"5e37664a-95ca-4fbb-8991-0af7286b7b9e\") " Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.619490 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e37664a-95ca-4fbb-8991-0af7286b7b9e-utilities" (OuterVolumeSpecName: "utilities") pod "5e37664a-95ca-4fbb-8991-0af7286b7b9e" (UID: "5e37664a-95ca-4fbb-8991-0af7286b7b9e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.626530 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e37664a-95ca-4fbb-8991-0af7286b7b9e-kube-api-access-f72qr" (OuterVolumeSpecName: "kube-api-access-f72qr") pod "5e37664a-95ca-4fbb-8991-0af7286b7b9e" (UID: "5e37664a-95ca-4fbb-8991-0af7286b7b9e"). InnerVolumeSpecName "kube-api-access-f72qr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.674539 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e37664a-95ca-4fbb-8991-0af7286b7b9e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e37664a-95ca-4fbb-8991-0af7286b7b9e" (UID: "5e37664a-95ca-4fbb-8991-0af7286b7b9e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.697639 5094 scope.go:117] "RemoveContainer" containerID="1ec4157db14b26360df8b0fe8411cf32663bce1a3b07f6db3446bcf86afec191" Feb 20 09:02:21 crc kubenswrapper[5094]: E0220 09:02:21.698148 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ec4157db14b26360df8b0fe8411cf32663bce1a3b07f6db3446bcf86afec191\": container with ID starting with 1ec4157db14b26360df8b0fe8411cf32663bce1a3b07f6db3446bcf86afec191 not found: ID does not exist" containerID="1ec4157db14b26360df8b0fe8411cf32663bce1a3b07f6db3446bcf86afec191" Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.698213 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ec4157db14b26360df8b0fe8411cf32663bce1a3b07f6db3446bcf86afec191"} err="failed to get container status \"1ec4157db14b26360df8b0fe8411cf32663bce1a3b07f6db3446bcf86afec191\": rpc error: code = NotFound desc = could not find 
container \"1ec4157db14b26360df8b0fe8411cf32663bce1a3b07f6db3446bcf86afec191\": container with ID starting with 1ec4157db14b26360df8b0fe8411cf32663bce1a3b07f6db3446bcf86afec191 not found: ID does not exist" Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.698247 5094 scope.go:117] "RemoveContainer" containerID="86dcafb75fc2be2feea5a22122ef0934b95506ef5ae71db827d0424bcb3aa7c2" Feb 20 09:02:21 crc kubenswrapper[5094]: E0220 09:02:21.698643 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86dcafb75fc2be2feea5a22122ef0934b95506ef5ae71db827d0424bcb3aa7c2\": container with ID starting with 86dcafb75fc2be2feea5a22122ef0934b95506ef5ae71db827d0424bcb3aa7c2 not found: ID does not exist" containerID="86dcafb75fc2be2feea5a22122ef0934b95506ef5ae71db827d0424bcb3aa7c2" Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.698762 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86dcafb75fc2be2feea5a22122ef0934b95506ef5ae71db827d0424bcb3aa7c2"} err="failed to get container status \"86dcafb75fc2be2feea5a22122ef0934b95506ef5ae71db827d0424bcb3aa7c2\": rpc error: code = NotFound desc = could not find container \"86dcafb75fc2be2feea5a22122ef0934b95506ef5ae71db827d0424bcb3aa7c2\": container with ID starting with 86dcafb75fc2be2feea5a22122ef0934b95506ef5ae71db827d0424bcb3aa7c2 not found: ID does not exist" Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.698815 5094 scope.go:117] "RemoveContainer" containerID="7339d0c4e8259b177e2d20693ba2348687ae2cfe39995f853491a1c79073403a" Feb 20 09:02:21 crc kubenswrapper[5094]: E0220 09:02:21.699172 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7339d0c4e8259b177e2d20693ba2348687ae2cfe39995f853491a1c79073403a\": container with ID starting with 7339d0c4e8259b177e2d20693ba2348687ae2cfe39995f853491a1c79073403a not found: ID does 
not exist" containerID="7339d0c4e8259b177e2d20693ba2348687ae2cfe39995f853491a1c79073403a" Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.699199 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7339d0c4e8259b177e2d20693ba2348687ae2cfe39995f853491a1c79073403a"} err="failed to get container status \"7339d0c4e8259b177e2d20693ba2348687ae2cfe39995f853491a1c79073403a\": rpc error: code = NotFound desc = could not find container \"7339d0c4e8259b177e2d20693ba2348687ae2cfe39995f853491a1c79073403a\": container with ID starting with 7339d0c4e8259b177e2d20693ba2348687ae2cfe39995f853491a1c79073403a not found: ID does not exist" Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.721899 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e37664a-95ca-4fbb-8991-0af7286b7b9e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.721945 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e37664a-95ca-4fbb-8991-0af7286b7b9e-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.722012 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f72qr\" (UniqueName: \"kubernetes.io/projected/5e37664a-95ca-4fbb-8991-0af7286b7b9e-kube-api-access-f72qr\") on node \"crc\" DevicePath \"\"" Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.898745 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r8h5d"] Feb 20 09:02:21 crc kubenswrapper[5094]: I0220 09:02:21.907763 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r8h5d"] Feb 20 09:02:23 crc kubenswrapper[5094]: I0220 09:02:23.851354 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5e37664a-95ca-4fbb-8991-0af7286b7b9e" path="/var/lib/kubelet/pods/5e37664a-95ca-4fbb-8991-0af7286b7b9e/volumes" Feb 20 09:02:34 crc kubenswrapper[5094]: I0220 09:02:34.106924 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:02:34 crc kubenswrapper[5094]: I0220 09:02:34.107523 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:03:04 crc kubenswrapper[5094]: I0220 09:03:04.107795 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:03:04 crc kubenswrapper[5094]: I0220 09:03:04.108463 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:03:29 crc kubenswrapper[5094]: I0220 09:03:29.056465 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-drwqv"] Feb 20 09:03:29 crc kubenswrapper[5094]: I0220 09:03:29.064730 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-4eea-account-create-update-rqsxn"] Feb 20 09:03:29 crc kubenswrapper[5094]: I0220 
09:03:29.073273 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-drwqv"] Feb 20 09:03:29 crc kubenswrapper[5094]: I0220 09:03:29.081595 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-4eea-account-create-update-rqsxn"] Feb 20 09:03:29 crc kubenswrapper[5094]: I0220 09:03:29.860085 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="537981f5-8e74-406f-9199-8bac8aa60903" path="/var/lib/kubelet/pods/537981f5-8e74-406f-9199-8bac8aa60903/volumes" Feb 20 09:03:29 crc kubenswrapper[5094]: I0220 09:03:29.861294 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae290c11-18c8-4d9a-90d3-8f2219084a78" path="/var/lib/kubelet/pods/ae290c11-18c8-4d9a-90d3-8f2219084a78/volumes" Feb 20 09:03:34 crc kubenswrapper[5094]: I0220 09:03:34.107124 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:03:34 crc kubenswrapper[5094]: I0220 09:03:34.107523 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:03:34 crc kubenswrapper[5094]: I0220 09:03:34.107576 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 09:03:34 crc kubenswrapper[5094]: I0220 09:03:34.108568 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"8272b573c1c010339b8d8ae657303bad51b724d1c5f662b08fa0dc532f0ace33"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 09:03:34 crc kubenswrapper[5094]: I0220 09:03:34.108630 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://8272b573c1c010339b8d8ae657303bad51b724d1c5f662b08fa0dc532f0ace33" gracePeriod=600 Feb 20 09:03:34 crc kubenswrapper[5094]: I0220 09:03:34.371896 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="8272b573c1c010339b8d8ae657303bad51b724d1c5f662b08fa0dc532f0ace33" exitCode=0 Feb 20 09:03:34 crc kubenswrapper[5094]: I0220 09:03:34.371954 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"8272b573c1c010339b8d8ae657303bad51b724d1c5f662b08fa0dc532f0ace33"} Feb 20 09:03:34 crc kubenswrapper[5094]: I0220 09:03:34.372049 5094 scope.go:117] "RemoveContainer" containerID="610f337413344b2cdbfbbb8e5cbe685aa5f42b4737dd322b89c4a89a3919616c" Feb 20 09:03:35 crc kubenswrapper[5094]: I0220 09:03:35.381731 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60"} Feb 20 09:03:42 crc kubenswrapper[5094]: I0220 09:03:42.042842 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-bg4qh"] Feb 20 09:03:42 crc kubenswrapper[5094]: I0220 09:03:42.065907 5094 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-bg4qh"] Feb 20 09:03:43 crc kubenswrapper[5094]: I0220 09:03:43.307984 5094 scope.go:117] "RemoveContainer" containerID="d3221fcda11fc25108efa9fb80c6774c8d350491f8d20f83e1f5fae473f8e306" Feb 20 09:03:43 crc kubenswrapper[5094]: I0220 09:03:43.341264 5094 scope.go:117] "RemoveContainer" containerID="7d6c489e10960f3fa343c9c151323e484166523ad5c54d095e27280ce6e1cbd6" Feb 20 09:03:43 crc kubenswrapper[5094]: I0220 09:03:43.854680 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2950b502-5079-4a08-8aaf-f0b5d376a3f2" path="/var/lib/kubelet/pods/2950b502-5079-4a08-8aaf-f0b5d376a3f2/volumes" Feb 20 09:04:01 crc kubenswrapper[5094]: I0220 09:04:01.049956 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-mtnn8"] Feb 20 09:04:01 crc kubenswrapper[5094]: I0220 09:04:01.060938 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-560b-account-create-update-6s8n2"] Feb 20 09:04:01 crc kubenswrapper[5094]: I0220 09:04:01.069425 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-mtnn8"] Feb 20 09:04:01 crc kubenswrapper[5094]: I0220 09:04:01.080772 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-560b-account-create-update-6s8n2"] Feb 20 09:04:01 crc kubenswrapper[5094]: I0220 09:04:01.857926 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bad0291-15d9-4dc5-acd6-26bc8d8aad76" path="/var/lib/kubelet/pods/6bad0291-15d9-4dc5-acd6-26bc8d8aad76/volumes" Feb 20 09:04:01 crc kubenswrapper[5094]: I0220 09:04:01.859471 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3" path="/var/lib/kubelet/pods/c5cc3f76-dfd1-4d7c-8adb-08cbc55636a3/volumes" Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.480803 5094 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-hvxdd"] Feb 20 09:04:14 crc kubenswrapper[5094]: E0220 09:04:14.482572 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e37664a-95ca-4fbb-8991-0af7286b7b9e" containerName="extract-utilities" Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.482614 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e37664a-95ca-4fbb-8991-0af7286b7b9e" containerName="extract-utilities" Feb 20 09:04:14 crc kubenswrapper[5094]: E0220 09:04:14.482652 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e37664a-95ca-4fbb-8991-0af7286b7b9e" containerName="registry-server" Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.482670 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e37664a-95ca-4fbb-8991-0af7286b7b9e" containerName="registry-server" Feb 20 09:04:14 crc kubenswrapper[5094]: E0220 09:04:14.482745 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e37664a-95ca-4fbb-8991-0af7286b7b9e" containerName="extract-content" Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.482769 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e37664a-95ca-4fbb-8991-0af7286b7b9e" containerName="extract-content" Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.483280 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e37664a-95ca-4fbb-8991-0af7286b7b9e" containerName="registry-server" Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.486447 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.503197 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hvxdd"] Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.659153 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee43a380-78c4-4fdc-957a-a76021a27b53-utilities\") pod \"community-operators-hvxdd\" (UID: \"ee43a380-78c4-4fdc-957a-a76021a27b53\") " pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.659301 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee43a380-78c4-4fdc-957a-a76021a27b53-catalog-content\") pod \"community-operators-hvxdd\" (UID: \"ee43a380-78c4-4fdc-957a-a76021a27b53\") " pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.659353 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjk86\" (UniqueName: \"kubernetes.io/projected/ee43a380-78c4-4fdc-957a-a76021a27b53-kube-api-access-mjk86\") pod \"community-operators-hvxdd\" (UID: \"ee43a380-78c4-4fdc-957a-a76021a27b53\") " pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.761902 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee43a380-78c4-4fdc-957a-a76021a27b53-utilities\") pod \"community-operators-hvxdd\" (UID: \"ee43a380-78c4-4fdc-957a-a76021a27b53\") " pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.762087 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee43a380-78c4-4fdc-957a-a76021a27b53-catalog-content\") pod \"community-operators-hvxdd\" (UID: \"ee43a380-78c4-4fdc-957a-a76021a27b53\") " pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.762145 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjk86\" (UniqueName: \"kubernetes.io/projected/ee43a380-78c4-4fdc-957a-a76021a27b53-kube-api-access-mjk86\") pod \"community-operators-hvxdd\" (UID: \"ee43a380-78c4-4fdc-957a-a76021a27b53\") " pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.762620 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee43a380-78c4-4fdc-957a-a76021a27b53-catalog-content\") pod \"community-operators-hvxdd\" (UID: \"ee43a380-78c4-4fdc-957a-a76021a27b53\") " pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.762749 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee43a380-78c4-4fdc-957a-a76021a27b53-utilities\") pod \"community-operators-hvxdd\" (UID: \"ee43a380-78c4-4fdc-957a-a76021a27b53\") " pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.792482 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjk86\" (UniqueName: \"kubernetes.io/projected/ee43a380-78c4-4fdc-957a-a76021a27b53-kube-api-access-mjk86\") pod \"community-operators-hvxdd\" (UID: \"ee43a380-78c4-4fdc-957a-a76021a27b53\") " pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:14 crc kubenswrapper[5094]: I0220 09:04:14.831850 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:15 crc kubenswrapper[5094]: I0220 09:04:15.048849 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-87lwb"] Feb 20 09:04:15 crc kubenswrapper[5094]: I0220 09:04:15.057274 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-87lwb"] Feb 20 09:04:15 crc kubenswrapper[5094]: I0220 09:04:15.365640 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hvxdd"] Feb 20 09:04:15 crc kubenswrapper[5094]: I0220 09:04:15.799032 5094 generic.go:334] "Generic (PLEG): container finished" podID="ee43a380-78c4-4fdc-957a-a76021a27b53" containerID="f382e9bdcd3b0564a60c5db2dec447e6a631ee1061f879d955cbf44775b58b0d" exitCode=0 Feb 20 09:04:15 crc kubenswrapper[5094]: I0220 09:04:15.799166 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvxdd" event={"ID":"ee43a380-78c4-4fdc-957a-a76021a27b53","Type":"ContainerDied","Data":"f382e9bdcd3b0564a60c5db2dec447e6a631ee1061f879d955cbf44775b58b0d"} Feb 20 09:04:15 crc kubenswrapper[5094]: I0220 09:04:15.799307 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvxdd" event={"ID":"ee43a380-78c4-4fdc-957a-a76021a27b53","Type":"ContainerStarted","Data":"cf4431142b094e688304265b91f9daccea2c70afd9f108b3b0281393c36eadf9"} Feb 20 09:04:15 crc kubenswrapper[5094]: I0220 09:04:15.801212 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 09:04:15 crc kubenswrapper[5094]: I0220 09:04:15.858770 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00" path="/var/lib/kubelet/pods/8ff96d1b-1cc5-4b05-ba92-1ed64ec0cb00/volumes" Feb 20 09:04:16 crc kubenswrapper[5094]: I0220 09:04:16.826060 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-hvxdd" event={"ID":"ee43a380-78c4-4fdc-957a-a76021a27b53","Type":"ContainerStarted","Data":"91ab2a8fa1dcb2274f66124aee2414a73de9ba0489ddc5132143ddf61255c32d"} Feb 20 09:04:17 crc kubenswrapper[5094]: E0220 09:04:17.820136 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee43a380_78c4_4fdc_957a_a76021a27b53.slice/crio-conmon-91ab2a8fa1dcb2274f66124aee2414a73de9ba0489ddc5132143ddf61255c32d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee43a380_78c4_4fdc_957a_a76021a27b53.slice/crio-91ab2a8fa1dcb2274f66124aee2414a73de9ba0489ddc5132143ddf61255c32d.scope\": RecentStats: unable to find data in memory cache]" Feb 20 09:04:17 crc kubenswrapper[5094]: I0220 09:04:17.840162 5094 generic.go:334] "Generic (PLEG): container finished" podID="ee43a380-78c4-4fdc-957a-a76021a27b53" containerID="91ab2a8fa1dcb2274f66124aee2414a73de9ba0489ddc5132143ddf61255c32d" exitCode=0 Feb 20 09:04:17 crc kubenswrapper[5094]: I0220 09:04:17.852454 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvxdd" event={"ID":"ee43a380-78c4-4fdc-957a-a76021a27b53","Type":"ContainerDied","Data":"91ab2a8fa1dcb2274f66124aee2414a73de9ba0489ddc5132143ddf61255c32d"} Feb 20 09:04:18 crc kubenswrapper[5094]: I0220 09:04:18.853771 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvxdd" event={"ID":"ee43a380-78c4-4fdc-957a-a76021a27b53","Type":"ContainerStarted","Data":"d9377f47eb257f2928083cfe994b97e5d229e2b4d7e8fc8ff63e0ccf47602a97"} Feb 20 09:04:18 crc kubenswrapper[5094]: I0220 09:04:18.886042 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hvxdd" podStartSLOduration=2.396745902 
podStartE2EDuration="4.88601804s" podCreationTimestamp="2026-02-20 09:04:14 +0000 UTC" firstStartedPulling="2026-02-20 09:04:15.800990339 +0000 UTC m=+8270.673617050" lastFinishedPulling="2026-02-20 09:04:18.290262477 +0000 UTC m=+8273.162889188" observedRunningTime="2026-02-20 09:04:18.876735706 +0000 UTC m=+8273.749362417" watchObservedRunningTime="2026-02-20 09:04:18.88601804 +0000 UTC m=+8273.758644751" Feb 20 09:04:24 crc kubenswrapper[5094]: I0220 09:04:24.832312 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:24 crc kubenswrapper[5094]: I0220 09:04:24.833749 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:24 crc kubenswrapper[5094]: I0220 09:04:24.887463 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:24 crc kubenswrapper[5094]: I0220 09:04:24.953031 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:25 crc kubenswrapper[5094]: I0220 09:04:25.121538 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hvxdd"] Feb 20 09:04:26 crc kubenswrapper[5094]: I0220 09:04:26.927202 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" event={"ID":"110791b2-a067-409d-9970-9db4868f0d4d","Type":"ContainerDied","Data":"24186d99d2e091e90cea103f3ededd5ae0be73e5479d2f80e87c425b36de8252"} Feb 20 09:04:26 crc kubenswrapper[5094]: I0220 09:04:26.927128 5094 generic.go:334] "Generic (PLEG): container finished" podID="110791b2-a067-409d-9970-9db4868f0d4d" containerID="24186d99d2e091e90cea103f3ededd5ae0be73e5479d2f80e87c425b36de8252" exitCode=0 Feb 20 09:04:26 crc kubenswrapper[5094]: I0220 
09:04:26.927962 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hvxdd" podUID="ee43a380-78c4-4fdc-957a-a76021a27b53" containerName="registry-server" containerID="cri-o://d9377f47eb257f2928083cfe994b97e5d229e2b4d7e8fc8ff63e0ccf47602a97" gracePeriod=2 Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.374318 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.529204 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee43a380-78c4-4fdc-957a-a76021a27b53-utilities\") pod \"ee43a380-78c4-4fdc-957a-a76021a27b53\" (UID: \"ee43a380-78c4-4fdc-957a-a76021a27b53\") " Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.529393 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjk86\" (UniqueName: \"kubernetes.io/projected/ee43a380-78c4-4fdc-957a-a76021a27b53-kube-api-access-mjk86\") pod \"ee43a380-78c4-4fdc-957a-a76021a27b53\" (UID: \"ee43a380-78c4-4fdc-957a-a76021a27b53\") " Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.529509 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee43a380-78c4-4fdc-957a-a76021a27b53-catalog-content\") pod \"ee43a380-78c4-4fdc-957a-a76021a27b53\" (UID: \"ee43a380-78c4-4fdc-957a-a76021a27b53\") " Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.530660 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee43a380-78c4-4fdc-957a-a76021a27b53-utilities" (OuterVolumeSpecName: "utilities") pod "ee43a380-78c4-4fdc-957a-a76021a27b53" (UID: "ee43a380-78c4-4fdc-957a-a76021a27b53"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.534610 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee43a380-78c4-4fdc-957a-a76021a27b53-kube-api-access-mjk86" (OuterVolumeSpecName: "kube-api-access-mjk86") pod "ee43a380-78c4-4fdc-957a-a76021a27b53" (UID: "ee43a380-78c4-4fdc-957a-a76021a27b53"). InnerVolumeSpecName "kube-api-access-mjk86". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.581327 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee43a380-78c4-4fdc-957a-a76021a27b53-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee43a380-78c4-4fdc-957a-a76021a27b53" (UID: "ee43a380-78c4-4fdc-957a-a76021a27b53"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.632544 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee43a380-78c4-4fdc-957a-a76021a27b53-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.632585 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee43a380-78c4-4fdc-957a-a76021a27b53-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.632601 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjk86\" (UniqueName: \"kubernetes.io/projected/ee43a380-78c4-4fdc-957a-a76021a27b53-kube-api-access-mjk86\") on node \"crc\" DevicePath \"\"" Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.940932 5094 generic.go:334] "Generic (PLEG): container finished" podID="ee43a380-78c4-4fdc-957a-a76021a27b53" 
containerID="d9377f47eb257f2928083cfe994b97e5d229e2b4d7e8fc8ff63e0ccf47602a97" exitCode=0 Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.941136 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hvxdd" Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.941173 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvxdd" event={"ID":"ee43a380-78c4-4fdc-957a-a76021a27b53","Type":"ContainerDied","Data":"d9377f47eb257f2928083cfe994b97e5d229e2b4d7e8fc8ff63e0ccf47602a97"} Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.941770 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hvxdd" event={"ID":"ee43a380-78c4-4fdc-957a-a76021a27b53","Type":"ContainerDied","Data":"cf4431142b094e688304265b91f9daccea2c70afd9f108b3b0281393c36eadf9"} Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.941813 5094 scope.go:117] "RemoveContainer" containerID="d9377f47eb257f2928083cfe994b97e5d229e2b4d7e8fc8ff63e0ccf47602a97" Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.970973 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hvxdd"] Feb 20 09:04:27 crc kubenswrapper[5094]: I0220 09:04:27.989163 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hvxdd"] Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.000663 5094 scope.go:117] "RemoveContainer" containerID="91ab2a8fa1dcb2274f66124aee2414a73de9ba0489ddc5132143ddf61255c32d" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.026375 5094 scope.go:117] "RemoveContainer" containerID="f382e9bdcd3b0564a60c5db2dec447e6a631ee1061f879d955cbf44775b58b0d" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.087888 5094 scope.go:117] "RemoveContainer" containerID="d9377f47eb257f2928083cfe994b97e5d229e2b4d7e8fc8ff63e0ccf47602a97" Feb 20 
09:04:28 crc kubenswrapper[5094]: E0220 09:04:28.088466 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9377f47eb257f2928083cfe994b97e5d229e2b4d7e8fc8ff63e0ccf47602a97\": container with ID starting with d9377f47eb257f2928083cfe994b97e5d229e2b4d7e8fc8ff63e0ccf47602a97 not found: ID does not exist" containerID="d9377f47eb257f2928083cfe994b97e5d229e2b4d7e8fc8ff63e0ccf47602a97" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.088501 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9377f47eb257f2928083cfe994b97e5d229e2b4d7e8fc8ff63e0ccf47602a97"} err="failed to get container status \"d9377f47eb257f2928083cfe994b97e5d229e2b4d7e8fc8ff63e0ccf47602a97\": rpc error: code = NotFound desc = could not find container \"d9377f47eb257f2928083cfe994b97e5d229e2b4d7e8fc8ff63e0ccf47602a97\": container with ID starting with d9377f47eb257f2928083cfe994b97e5d229e2b4d7e8fc8ff63e0ccf47602a97 not found: ID does not exist" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.088523 5094 scope.go:117] "RemoveContainer" containerID="91ab2a8fa1dcb2274f66124aee2414a73de9ba0489ddc5132143ddf61255c32d" Feb 20 09:04:28 crc kubenswrapper[5094]: E0220 09:04:28.089020 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91ab2a8fa1dcb2274f66124aee2414a73de9ba0489ddc5132143ddf61255c32d\": container with ID starting with 91ab2a8fa1dcb2274f66124aee2414a73de9ba0489ddc5132143ddf61255c32d not found: ID does not exist" containerID="91ab2a8fa1dcb2274f66124aee2414a73de9ba0489ddc5132143ddf61255c32d" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.089042 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91ab2a8fa1dcb2274f66124aee2414a73de9ba0489ddc5132143ddf61255c32d"} err="failed to get container status 
\"91ab2a8fa1dcb2274f66124aee2414a73de9ba0489ddc5132143ddf61255c32d\": rpc error: code = NotFound desc = could not find container \"91ab2a8fa1dcb2274f66124aee2414a73de9ba0489ddc5132143ddf61255c32d\": container with ID starting with 91ab2a8fa1dcb2274f66124aee2414a73de9ba0489ddc5132143ddf61255c32d not found: ID does not exist" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.089055 5094 scope.go:117] "RemoveContainer" containerID="f382e9bdcd3b0564a60c5db2dec447e6a631ee1061f879d955cbf44775b58b0d" Feb 20 09:04:28 crc kubenswrapper[5094]: E0220 09:04:28.089215 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f382e9bdcd3b0564a60c5db2dec447e6a631ee1061f879d955cbf44775b58b0d\": container with ID starting with f382e9bdcd3b0564a60c5db2dec447e6a631ee1061f879d955cbf44775b58b0d not found: ID does not exist" containerID="f382e9bdcd3b0564a60c5db2dec447e6a631ee1061f879d955cbf44775b58b0d" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.089236 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f382e9bdcd3b0564a60c5db2dec447e6a631ee1061f879d955cbf44775b58b0d"} err="failed to get container status \"f382e9bdcd3b0564a60c5db2dec447e6a631ee1061f879d955cbf44775b58b0d\": rpc error: code = NotFound desc = could not find container \"f382e9bdcd3b0564a60c5db2dec447e6a631ee1061f879d955cbf44775b58b0d\": container with ID starting with f382e9bdcd3b0564a60c5db2dec447e6a631ee1061f879d955cbf44775b58b0d not found: ID does not exist" Feb 20 09:04:28 crc kubenswrapper[5094]: E0220 09:04:28.143874 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee43a380_78c4_4fdc_957a_a76021a27b53.slice/crio-cf4431142b094e688304265b91f9daccea2c70afd9f108b3b0281393c36eadf9\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee43a380_78c4_4fdc_957a_a76021a27b53.slice\": RecentStats: unable to find data in memory cache]" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.406296 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.455860 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svmxt\" (UniqueName: \"kubernetes.io/projected/110791b2-a067-409d-9970-9db4868f0d4d-kube-api-access-svmxt\") pod \"110791b2-a067-409d-9970-9db4868f0d4d\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.456202 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-ssh-key-openstack-cell1\") pod \"110791b2-a067-409d-9970-9db4868f0d4d\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.456791 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-inventory\") pod \"110791b2-a067-409d-9970-9db4868f0d4d\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.456932 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-ceph\") pod \"110791b2-a067-409d-9970-9db4868f0d4d\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.457086 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-tripleo-cleanup-combined-ca-bundle\") pod \"110791b2-a067-409d-9970-9db4868f0d4d\" (UID: \"110791b2-a067-409d-9970-9db4868f0d4d\") " Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.461808 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-ceph" (OuterVolumeSpecName: "ceph") pod "110791b2-a067-409d-9970-9db4868f0d4d" (UID: "110791b2-a067-409d-9970-9db4868f0d4d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.461999 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/110791b2-a067-409d-9970-9db4868f0d4d-kube-api-access-svmxt" (OuterVolumeSpecName: "kube-api-access-svmxt") pod "110791b2-a067-409d-9970-9db4868f0d4d" (UID: "110791b2-a067-409d-9970-9db4868f0d4d"). InnerVolumeSpecName "kube-api-access-svmxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.473169 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "110791b2-a067-409d-9970-9db4868f0d4d" (UID: "110791b2-a067-409d-9970-9db4868f0d4d"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.483908 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "110791b2-a067-409d-9970-9db4868f0d4d" (UID: "110791b2-a067-409d-9970-9db4868f0d4d"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.498684 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-inventory" (OuterVolumeSpecName: "inventory") pod "110791b2-a067-409d-9970-9db4868f0d4d" (UID: "110791b2-a067-409d-9970-9db4868f0d4d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.559188 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svmxt\" (UniqueName: \"kubernetes.io/projected/110791b2-a067-409d-9970-9db4868f0d4d-kube-api-access-svmxt\") on node \"crc\" DevicePath \"\"" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.559231 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.559243 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.559252 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.559264 5094 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/110791b2-a067-409d-9970-9db4868f0d4d-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.950876 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" event={"ID":"110791b2-a067-409d-9970-9db4868f0d4d","Type":"ContainerDied","Data":"0c39006c151d238a3f167340ddd6cab9c442fbcdd2b65206e97f1302317761f1"} Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.950897 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6" Feb 20 09:04:28 crc kubenswrapper[5094]: I0220 09:04:28.951283 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c39006c151d238a3f167340ddd6cab9c442fbcdd2b65206e97f1302317761f1" Feb 20 09:04:29 crc kubenswrapper[5094]: I0220 09:04:29.852200 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee43a380-78c4-4fdc-957a-a76021a27b53" path="/var/lib/kubelet/pods/ee43a380-78c4-4fdc-957a-a76021a27b53/volumes" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.768107 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-m7mmg"] Feb 20 09:04:30 crc kubenswrapper[5094]: E0220 09:04:30.768643 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee43a380-78c4-4fdc-957a-a76021a27b53" containerName="extract-utilities" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.768663 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee43a380-78c4-4fdc-957a-a76021a27b53" containerName="extract-utilities" Feb 20 09:04:30 crc kubenswrapper[5094]: E0220 09:04:30.768685 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee43a380-78c4-4fdc-957a-a76021a27b53" containerName="extract-content" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.768695 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee43a380-78c4-4fdc-957a-a76021a27b53" containerName="extract-content" Feb 20 09:04:30 crc kubenswrapper[5094]: E0220 09:04:30.768757 5094 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="ee43a380-78c4-4fdc-957a-a76021a27b53" containerName="registry-server" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.768773 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee43a380-78c4-4fdc-957a-a76021a27b53" containerName="registry-server" Feb 20 09:04:30 crc kubenswrapper[5094]: E0220 09:04:30.768791 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110791b2-a067-409d-9970-9db4868f0d4d" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.768803 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="110791b2-a067-409d-9970-9db4868f0d4d" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.769162 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="110791b2-a067-409d-9970-9db4868f0d4d" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.769202 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee43a380-78c4-4fdc-957a-a76021a27b53" containerName="registry-server" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.770224 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.779492 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-networker-452ln"] Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.780812 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.781647 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.781952 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.782074 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.782283 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-xf9xl" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.782780 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.784065 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.803915 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-m7mmg"] Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.811775 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-networker-452ln"] Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.906347 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-inventory\") pod \"bootstrap-openstack-openstack-cell1-m7mmg\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.906528 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc2bv\" (UniqueName: \"kubernetes.io/projected/30a55d13-2efe-4d90-bcef-14aedc741079-kube-api-access-hc2bv\") pod \"bootstrap-openstack-openstack-cell1-m7mmg\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.906598 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-ssh-key-openstack-networker\") pod \"bootstrap-openstack-openstack-networker-452ln\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.906647 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-networker-452ln\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.906682 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-m7mmg\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.906728 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-inventory\") pod \"bootstrap-openstack-openstack-networker-452ln\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.906818 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-m7mmg\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.906848 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92jdm\" (UniqueName: \"kubernetes.io/projected/e3baf01f-744b-44ed-b3c8-2ec288f77e59-kube-api-access-92jdm\") pod \"bootstrap-openstack-openstack-networker-452ln\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:04:30 crc kubenswrapper[5094]: I0220 09:04:30.906889 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-ceph\") pod \"bootstrap-openstack-openstack-cell1-m7mmg\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.008556 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-m7mmg\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " 
pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.008618 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92jdm\" (UniqueName: \"kubernetes.io/projected/e3baf01f-744b-44ed-b3c8-2ec288f77e59-kube-api-access-92jdm\") pod \"bootstrap-openstack-openstack-networker-452ln\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.008663 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-ceph\") pod \"bootstrap-openstack-openstack-cell1-m7mmg\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.008825 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-inventory\") pod \"bootstrap-openstack-openstack-cell1-m7mmg\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.008877 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc2bv\" (UniqueName: \"kubernetes.io/projected/30a55d13-2efe-4d90-bcef-14aedc741079-kube-api-access-hc2bv\") pod \"bootstrap-openstack-openstack-cell1-m7mmg\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.008914 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-ssh-key-openstack-networker\") 
pod \"bootstrap-openstack-openstack-networker-452ln\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.008957 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-networker-452ln\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.008987 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-m7mmg\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.009010 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-inventory\") pod \"bootstrap-openstack-openstack-networker-452ln\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.016517 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-inventory\") pod \"bootstrap-openstack-openstack-networker-452ln\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.016677 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-m7mmg\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.017014 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-ceph\") pod \"bootstrap-openstack-openstack-cell1-m7mmg\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.017466 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-ssh-key-openstack-networker\") pod \"bootstrap-openstack-openstack-networker-452ln\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.023113 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-m7mmg\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.026631 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-inventory\") pod \"bootstrap-openstack-openstack-cell1-m7mmg\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.034738 
5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-networker-452ln\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.036142 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92jdm\" (UniqueName: \"kubernetes.io/projected/e3baf01f-744b-44ed-b3c8-2ec288f77e59-kube-api-access-92jdm\") pod \"bootstrap-openstack-openstack-networker-452ln\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.039327 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc2bv\" (UniqueName: \"kubernetes.io/projected/30a55d13-2efe-4d90-bcef-14aedc741079-kube-api-access-hc2bv\") pod \"bootstrap-openstack-openstack-cell1-m7mmg\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.086474 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.100094 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.626242 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-m7mmg"] Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.751632 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-networker-452ln"] Feb 20 09:04:31 crc kubenswrapper[5094]: W0220 09:04:31.755085 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3baf01f_744b_44ed_b3c8_2ec288f77e59.slice/crio-c4621d2d6bab418c3d74a80e75231eb8e087a215e3a060c1143e1dc140bb3ed4 WatchSource:0}: Error finding container c4621d2d6bab418c3d74a80e75231eb8e087a215e3a060c1143e1dc140bb3ed4: Status 404 returned error can't find the container with id c4621d2d6bab418c3d74a80e75231eb8e087a215e3a060c1143e1dc140bb3ed4 Feb 20 09:04:31 crc kubenswrapper[5094]: I0220 09:04:31.999401 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" event={"ID":"30a55d13-2efe-4d90-bcef-14aedc741079","Type":"ContainerStarted","Data":"b5119a9e3d1361f0b6bbf1a075495f30b7f1bf3071d0a3ad04118e1bd1708bb2"} Feb 20 09:04:32 crc kubenswrapper[5094]: I0220 09:04:32.000413 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-networker-452ln" event={"ID":"e3baf01f-744b-44ed-b3c8-2ec288f77e59","Type":"ContainerStarted","Data":"c4621d2d6bab418c3d74a80e75231eb8e087a215e3a060c1143e1dc140bb3ed4"} Feb 20 09:04:33 crc kubenswrapper[5094]: I0220 09:04:33.025732 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-networker-452ln" event={"ID":"e3baf01f-744b-44ed-b3c8-2ec288f77e59","Type":"ContainerStarted","Data":"d7b5776600a90eebbccedbaae5eb73db115ddeee382bcc00638e0b193233eedc"} Feb 20 09:04:33 crc 
kubenswrapper[5094]: I0220 09:04:33.028116 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" event={"ID":"30a55d13-2efe-4d90-bcef-14aedc741079","Type":"ContainerStarted","Data":"28b6a1153e01e932715ae5b8469ca02823b29ac957c328d37bd8a8b4033ff372"} Feb 20 09:04:33 crc kubenswrapper[5094]: I0220 09:04:33.059143 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-networker-452ln" podStartSLOduration=2.668105491 podStartE2EDuration="3.059124068s" podCreationTimestamp="2026-02-20 09:04:30 +0000 UTC" firstStartedPulling="2026-02-20 09:04:31.758227321 +0000 UTC m=+8286.630854032" lastFinishedPulling="2026-02-20 09:04:32.149245898 +0000 UTC m=+8287.021872609" observedRunningTime="2026-02-20 09:04:33.058547944 +0000 UTC m=+8287.931174675" watchObservedRunningTime="2026-02-20 09:04:33.059124068 +0000 UTC m=+8287.931750789" Feb 20 09:04:33 crc kubenswrapper[5094]: I0220 09:04:33.109866 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" podStartSLOduration=2.432953013 podStartE2EDuration="3.109839398s" podCreationTimestamp="2026-02-20 09:04:30 +0000 UTC" firstStartedPulling="2026-02-20 09:04:31.641170854 +0000 UTC m=+8286.513797565" lastFinishedPulling="2026-02-20 09:04:32.318057239 +0000 UTC m=+8287.190683950" observedRunningTime="2026-02-20 09:04:33.08830472 +0000 UTC m=+8287.960931441" watchObservedRunningTime="2026-02-20 09:04:33.109839398 +0000 UTC m=+8287.982466119" Feb 20 09:04:43 crc kubenswrapper[5094]: I0220 09:04:43.444412 5094 scope.go:117] "RemoveContainer" containerID="40e80a7f49d2a8cd8ede69f04221413d74bc3298b5502921432bc86e342a4f7d" Feb 20 09:04:43 crc kubenswrapper[5094]: I0220 09:04:43.472150 5094 scope.go:117] "RemoveContainer" containerID="8325d32b9b097dfce89183c5462e4d5dad44f1febeae290e0597bb55bbe59825" Feb 20 09:04:43 crc kubenswrapper[5094]: I0220 
09:04:43.546406 5094 scope.go:117] "RemoveContainer" containerID="b71c497029e9f5437c06dfb05b6008f8e4f7c93cd886b8f44f838d87660036a7" Feb 20 09:04:43 crc kubenswrapper[5094]: I0220 09:04:43.614444 5094 scope.go:117] "RemoveContainer" containerID="fcec52d45c535185ac325065c4cab11c829c4a1ebad6b2123939c3a35f4b9360" Feb 20 09:05:34 crc kubenswrapper[5094]: I0220 09:05:34.106509 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:05:34 crc kubenswrapper[5094]: I0220 09:05:34.107302 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:06:04 crc kubenswrapper[5094]: I0220 09:06:04.106460 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:06:04 crc kubenswrapper[5094]: I0220 09:06:04.107513 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:06:34 crc kubenswrapper[5094]: I0220 09:06:34.106690 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:06:34 crc kubenswrapper[5094]: I0220 09:06:34.107916 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:06:34 crc kubenswrapper[5094]: I0220 09:06:34.107991 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 09:06:34 crc kubenswrapper[5094]: I0220 09:06:34.108984 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 09:06:34 crc kubenswrapper[5094]: I0220 09:06:34.109064 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" gracePeriod=600 Feb 20 09:06:34 crc kubenswrapper[5094]: E0220 09:06:34.243438 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:06:34 crc kubenswrapper[5094]: I0220 09:06:34.248954 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" exitCode=0 Feb 20 09:06:34 crc kubenswrapper[5094]: I0220 09:06:34.249008 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60"} Feb 20 09:06:34 crc kubenswrapper[5094]: I0220 09:06:34.249056 5094 scope.go:117] "RemoveContainer" containerID="8272b573c1c010339b8d8ae657303bad51b724d1c5f662b08fa0dc532f0ace33" Feb 20 09:06:35 crc kubenswrapper[5094]: I0220 09:06:35.263580 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:06:35 crc kubenswrapper[5094]: E0220 09:06:35.263874 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:06:49 crc kubenswrapper[5094]: I0220 09:06:49.841147 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:06:49 crc kubenswrapper[5094]: E0220 09:06:49.841921 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:07:02 crc kubenswrapper[5094]: I0220 09:07:02.840970 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:07:02 crc kubenswrapper[5094]: E0220 09:07:02.841857 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:07:17 crc kubenswrapper[5094]: I0220 09:07:17.841009 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:07:17 crc kubenswrapper[5094]: E0220 09:07:17.841986 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:07:21 crc kubenswrapper[5094]: I0220 09:07:21.711911 5094 generic.go:334] "Generic (PLEG): container finished" podID="30a55d13-2efe-4d90-bcef-14aedc741079" containerID="28b6a1153e01e932715ae5b8469ca02823b29ac957c328d37bd8a8b4033ff372" exitCode=0 Feb 20 09:07:21 crc kubenswrapper[5094]: I0220 09:07:21.711963 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" event={"ID":"30a55d13-2efe-4d90-bcef-14aedc741079","Type":"ContainerDied","Data":"28b6a1153e01e932715ae5b8469ca02823b29ac957c328d37bd8a8b4033ff372"} Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.155409 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.190612 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-ceph\") pod \"30a55d13-2efe-4d90-bcef-14aedc741079\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.190800 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-bootstrap-combined-ca-bundle\") pod \"30a55d13-2efe-4d90-bcef-14aedc741079\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.190892 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc2bv\" (UniqueName: \"kubernetes.io/projected/30a55d13-2efe-4d90-bcef-14aedc741079-kube-api-access-hc2bv\") pod \"30a55d13-2efe-4d90-bcef-14aedc741079\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.190979 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-ssh-key-openstack-cell1\") pod \"30a55d13-2efe-4d90-bcef-14aedc741079\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.191017 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-inventory\") pod \"30a55d13-2efe-4d90-bcef-14aedc741079\" (UID: \"30a55d13-2efe-4d90-bcef-14aedc741079\") " Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.197685 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-ceph" (OuterVolumeSpecName: "ceph") pod "30a55d13-2efe-4d90-bcef-14aedc741079" (UID: "30a55d13-2efe-4d90-bcef-14aedc741079"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.197989 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "30a55d13-2efe-4d90-bcef-14aedc741079" (UID: "30a55d13-2efe-4d90-bcef-14aedc741079"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.199939 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30a55d13-2efe-4d90-bcef-14aedc741079-kube-api-access-hc2bv" (OuterVolumeSpecName: "kube-api-access-hc2bv") pod "30a55d13-2efe-4d90-bcef-14aedc741079" (UID: "30a55d13-2efe-4d90-bcef-14aedc741079"). InnerVolumeSpecName "kube-api-access-hc2bv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.219963 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-inventory" (OuterVolumeSpecName: "inventory") pod "30a55d13-2efe-4d90-bcef-14aedc741079" (UID: "30a55d13-2efe-4d90-bcef-14aedc741079"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.231939 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "30a55d13-2efe-4d90-bcef-14aedc741079" (UID: "30a55d13-2efe-4d90-bcef-14aedc741079"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.294084 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.294111 5094 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.294125 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc2bv\" (UniqueName: \"kubernetes.io/projected/30a55d13-2efe-4d90-bcef-14aedc741079-kube-api-access-hc2bv\") on node \"crc\" DevicePath \"\"" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.294133 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.294145 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30a55d13-2efe-4d90-bcef-14aedc741079-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.729448 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" event={"ID":"30a55d13-2efe-4d90-bcef-14aedc741079","Type":"ContainerDied","Data":"b5119a9e3d1361f0b6bbf1a075495f30b7f1bf3071d0a3ad04118e1bd1708bb2"} Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.729807 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5119a9e3d1361f0b6bbf1a075495f30b7f1bf3071d0a3ad04118e1bd1708bb2" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.729500 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-m7mmg" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.838439 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-4zfxb"] Feb 20 09:07:23 crc kubenswrapper[5094]: E0220 09:07:23.839054 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30a55d13-2efe-4d90-bcef-14aedc741079" containerName="bootstrap-openstack-openstack-cell1" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.839081 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="30a55d13-2efe-4d90-bcef-14aedc741079" containerName="bootstrap-openstack-openstack-cell1" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.839302 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="30a55d13-2efe-4d90-bcef-14aedc741079" containerName="bootstrap-openstack-openstack-cell1" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.840330 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.842940 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.843120 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.893280 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-4zfxb"] Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.905054 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzdxr\" (UniqueName: \"kubernetes.io/projected/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-kube-api-access-wzdxr\") pod \"download-cache-openstack-openstack-cell1-4zfxb\" (UID: \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") " pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.905120 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-ceph\") pod \"download-cache-openstack-openstack-cell1-4zfxb\" (UID: \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") " pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" Feb 20 09:07:23 crc kubenswrapper[5094]: I0220 09:07:23.905192 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-4zfxb\" (UID: \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") " pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" Feb 20 09:07:23 crc 
kubenswrapper[5094]: I0220 09:07:23.905290 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-inventory\") pod \"download-cache-openstack-openstack-cell1-4zfxb\" (UID: \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") " pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" Feb 20 09:07:24 crc kubenswrapper[5094]: I0220 09:07:24.007597 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-ceph\") pod \"download-cache-openstack-openstack-cell1-4zfxb\" (UID: \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") " pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" Feb 20 09:07:24 crc kubenswrapper[5094]: I0220 09:07:24.007688 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-4zfxb\" (UID: \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") " pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" Feb 20 09:07:24 crc kubenswrapper[5094]: I0220 09:07:24.007804 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-inventory\") pod \"download-cache-openstack-openstack-cell1-4zfxb\" (UID: \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") " pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" Feb 20 09:07:24 crc kubenswrapper[5094]: I0220 09:07:24.007926 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzdxr\" (UniqueName: \"kubernetes.io/projected/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-kube-api-access-wzdxr\") pod \"download-cache-openstack-openstack-cell1-4zfxb\" (UID: 
\"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") " pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" Feb 20 09:07:24 crc kubenswrapper[5094]: I0220 09:07:24.013498 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-ceph\") pod \"download-cache-openstack-openstack-cell1-4zfxb\" (UID: \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") " pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" Feb 20 09:07:24 crc kubenswrapper[5094]: I0220 09:07:24.022080 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-4zfxb\" (UID: \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") " pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" Feb 20 09:07:24 crc kubenswrapper[5094]: I0220 09:07:24.022085 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-inventory\") pod \"download-cache-openstack-openstack-cell1-4zfxb\" (UID: \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") " pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" Feb 20 09:07:24 crc kubenswrapper[5094]: I0220 09:07:24.028248 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzdxr\" (UniqueName: \"kubernetes.io/projected/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-kube-api-access-wzdxr\") pod \"download-cache-openstack-openstack-cell1-4zfxb\" (UID: \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") " pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" Feb 20 09:07:24 crc kubenswrapper[5094]: I0220 09:07:24.173160 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" Feb 20 09:07:24 crc kubenswrapper[5094]: I0220 09:07:24.691995 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-4zfxb"] Feb 20 09:07:24 crc kubenswrapper[5094]: I0220 09:07:24.738942 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" event={"ID":"ee932d3e-c52d-491d-92d8-8e21f7e1adbb","Type":"ContainerStarted","Data":"1d7eecaf4ffd0c7ac6f9163f2ee12164460f0f16736a4c9bfdbe1062f525b936"} Feb 20 09:07:25 crc kubenswrapper[5094]: I0220 09:07:25.749598 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" event={"ID":"ee932d3e-c52d-491d-92d8-8e21f7e1adbb","Type":"ContainerStarted","Data":"2ccd8439a62576ce780ceaa3d66e444205d4735fa06b8de40606cb4118f1de13"} Feb 20 09:07:25 crc kubenswrapper[5094]: I0220 09:07:25.769673 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" podStartSLOduration=2.239253361 podStartE2EDuration="2.769648361s" podCreationTimestamp="2026-02-20 09:07:23 +0000 UTC" firstStartedPulling="2026-02-20 09:07:24.703119463 +0000 UTC m=+8459.575746174" lastFinishedPulling="2026-02-20 09:07:25.233514463 +0000 UTC m=+8460.106141174" observedRunningTime="2026-02-20 09:07:25.767006318 +0000 UTC m=+8460.639633029" watchObservedRunningTime="2026-02-20 09:07:25.769648361 +0000 UTC m=+8460.642275072" Feb 20 09:07:31 crc kubenswrapper[5094]: I0220 09:07:31.840917 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:07:31 crc kubenswrapper[5094]: E0220 09:07:31.842004 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:07:34 crc kubenswrapper[5094]: I0220 09:07:34.843400 5094 generic.go:334] "Generic (PLEG): container finished" podID="e3baf01f-744b-44ed-b3c8-2ec288f77e59" containerID="d7b5776600a90eebbccedbaae5eb73db115ddeee382bcc00638e0b193233eedc" exitCode=0 Feb 20 09:07:34 crc kubenswrapper[5094]: I0220 09:07:34.843508 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-networker-452ln" event={"ID":"e3baf01f-744b-44ed-b3c8-2ec288f77e59","Type":"ContainerDied","Data":"d7b5776600a90eebbccedbaae5eb73db115ddeee382bcc00638e0b193233eedc"} Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.365753 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.480011 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92jdm\" (UniqueName: \"kubernetes.io/projected/e3baf01f-744b-44ed-b3c8-2ec288f77e59-kube-api-access-92jdm\") pod \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.480060 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-ssh-key-openstack-networker\") pod \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.480092 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-bootstrap-combined-ca-bundle\") pod \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.480208 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-inventory\") pod \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\" (UID: \"e3baf01f-744b-44ed-b3c8-2ec288f77e59\") " Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.489898 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "e3baf01f-744b-44ed-b3c8-2ec288f77e59" (UID: "e3baf01f-744b-44ed-b3c8-2ec288f77e59"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.490962 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3baf01f-744b-44ed-b3c8-2ec288f77e59-kube-api-access-92jdm" (OuterVolumeSpecName: "kube-api-access-92jdm") pod "e3baf01f-744b-44ed-b3c8-2ec288f77e59" (UID: "e3baf01f-744b-44ed-b3c8-2ec288f77e59"). InnerVolumeSpecName "kube-api-access-92jdm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.509395 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "e3baf01f-744b-44ed-b3c8-2ec288f77e59" (UID: "e3baf01f-744b-44ed-b3c8-2ec288f77e59"). InnerVolumeSpecName "ssh-key-openstack-networker". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.520399 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-inventory" (OuterVolumeSpecName: "inventory") pod "e3baf01f-744b-44ed-b3c8-2ec288f77e59" (UID: "e3baf01f-744b-44ed-b3c8-2ec288f77e59"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.583197 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92jdm\" (UniqueName: \"kubernetes.io/projected/e3baf01f-744b-44ed-b3c8-2ec288f77e59-kube-api-access-92jdm\") on node \"crc\" DevicePath \"\"" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.583241 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.583253 5094 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.583265 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3baf01f-744b-44ed-b3c8-2ec288f77e59-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.867691 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-networker-452ln" event={"ID":"e3baf01f-744b-44ed-b3c8-2ec288f77e59","Type":"ContainerDied","Data":"c4621d2d6bab418c3d74a80e75231eb8e087a215e3a060c1143e1dc140bb3ed4"} Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.867752 5094 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4621d2d6bab418c3d74a80e75231eb8e087a215e3a060c1143e1dc140bb3ed4" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.867798 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-networker-452ln" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.926121 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-networker-s48qg"] Feb 20 09:07:36 crc kubenswrapper[5094]: E0220 09:07:36.926576 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3baf01f-744b-44ed-b3c8-2ec288f77e59" containerName="bootstrap-openstack-openstack-networker" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.926601 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3baf01f-744b-44ed-b3c8-2ec288f77e59" containerName="bootstrap-openstack-openstack-networker" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.926919 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3baf01f-744b-44ed-b3c8-2ec288f77e59" containerName="bootstrap-openstack-openstack-networker" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.927659 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-networker-s48qg" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.929843 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-xf9xl" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.943645 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.943951 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-networker-s48qg"] Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.991593 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m76cc\" (UniqueName: \"kubernetes.io/projected/74755119-ad5b-439b-80bc-57779ffb5161-kube-api-access-m76cc\") pod \"download-cache-openstack-openstack-networker-s48qg\" (UID: \"74755119-ad5b-439b-80bc-57779ffb5161\") " pod="openstack/download-cache-openstack-openstack-networker-s48qg" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.991682 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74755119-ad5b-439b-80bc-57779ffb5161-inventory\") pod \"download-cache-openstack-openstack-networker-s48qg\" (UID: \"74755119-ad5b-439b-80bc-57779ffb5161\") " pod="openstack/download-cache-openstack-openstack-networker-s48qg" Feb 20 09:07:36 crc kubenswrapper[5094]: I0220 09:07:36.991778 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/74755119-ad5b-439b-80bc-57779ffb5161-ssh-key-openstack-networker\") pod \"download-cache-openstack-openstack-networker-s48qg\" (UID: \"74755119-ad5b-439b-80bc-57779ffb5161\") " 
pod="openstack/download-cache-openstack-openstack-networker-s48qg" Feb 20 09:07:37 crc kubenswrapper[5094]: I0220 09:07:37.093686 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74755119-ad5b-439b-80bc-57779ffb5161-inventory\") pod \"download-cache-openstack-openstack-networker-s48qg\" (UID: \"74755119-ad5b-439b-80bc-57779ffb5161\") " pod="openstack/download-cache-openstack-openstack-networker-s48qg" Feb 20 09:07:37 crc kubenswrapper[5094]: I0220 09:07:37.093987 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/74755119-ad5b-439b-80bc-57779ffb5161-ssh-key-openstack-networker\") pod \"download-cache-openstack-openstack-networker-s48qg\" (UID: \"74755119-ad5b-439b-80bc-57779ffb5161\") " pod="openstack/download-cache-openstack-openstack-networker-s48qg" Feb 20 09:07:37 crc kubenswrapper[5094]: I0220 09:07:37.094147 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m76cc\" (UniqueName: \"kubernetes.io/projected/74755119-ad5b-439b-80bc-57779ffb5161-kube-api-access-m76cc\") pod \"download-cache-openstack-openstack-networker-s48qg\" (UID: \"74755119-ad5b-439b-80bc-57779ffb5161\") " pod="openstack/download-cache-openstack-openstack-networker-s48qg" Feb 20 09:07:37 crc kubenswrapper[5094]: I0220 09:07:37.099315 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/74755119-ad5b-439b-80bc-57779ffb5161-ssh-key-openstack-networker\") pod \"download-cache-openstack-openstack-networker-s48qg\" (UID: \"74755119-ad5b-439b-80bc-57779ffb5161\") " pod="openstack/download-cache-openstack-openstack-networker-s48qg" Feb 20 09:07:37 crc kubenswrapper[5094]: I0220 09:07:37.099586 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/74755119-ad5b-439b-80bc-57779ffb5161-inventory\") pod \"download-cache-openstack-openstack-networker-s48qg\" (UID: \"74755119-ad5b-439b-80bc-57779ffb5161\") " pod="openstack/download-cache-openstack-openstack-networker-s48qg" Feb 20 09:07:37 crc kubenswrapper[5094]: I0220 09:07:37.119445 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m76cc\" (UniqueName: \"kubernetes.io/projected/74755119-ad5b-439b-80bc-57779ffb5161-kube-api-access-m76cc\") pod \"download-cache-openstack-openstack-networker-s48qg\" (UID: \"74755119-ad5b-439b-80bc-57779ffb5161\") " pod="openstack/download-cache-openstack-openstack-networker-s48qg" Feb 20 09:07:37 crc kubenswrapper[5094]: I0220 09:07:37.249843 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-networker-s48qg" Feb 20 09:07:37 crc kubenswrapper[5094]: I0220 09:07:37.768947 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-networker-s48qg"] Feb 20 09:07:37 crc kubenswrapper[5094]: W0220 09:07:37.777581 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74755119_ad5b_439b_80bc_57779ffb5161.slice/crio-09d4d203520efc0360ec5324bd63b47e8902e7a3f5ee93502961755f99e4fdd6 WatchSource:0}: Error finding container 09d4d203520efc0360ec5324bd63b47e8902e7a3f5ee93502961755f99e4fdd6: Status 404 returned error can't find the container with id 09d4d203520efc0360ec5324bd63b47e8902e7a3f5ee93502961755f99e4fdd6 Feb 20 09:07:37 crc kubenswrapper[5094]: I0220 09:07:37.879774 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-networker-s48qg" event={"ID":"74755119-ad5b-439b-80bc-57779ffb5161","Type":"ContainerStarted","Data":"09d4d203520efc0360ec5324bd63b47e8902e7a3f5ee93502961755f99e4fdd6"} Feb 20 09:07:38 crc 
kubenswrapper[5094]: I0220 09:07:38.888640 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-networker-s48qg" event={"ID":"74755119-ad5b-439b-80bc-57779ffb5161","Type":"ContainerStarted","Data":"c060370efba02fe88f084c71b2630e0d32026fa891d4b33faa34cb53fd72fa7d"} Feb 20 09:07:38 crc kubenswrapper[5094]: I0220 09:07:38.917365 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-networker-s48qg" podStartSLOduration=2.191177328 podStartE2EDuration="2.917339389s" podCreationTimestamp="2026-02-20 09:07:36 +0000 UTC" firstStartedPulling="2026-02-20 09:07:37.780868817 +0000 UTC m=+8472.653495518" lastFinishedPulling="2026-02-20 09:07:38.507030868 +0000 UTC m=+8473.379657579" observedRunningTime="2026-02-20 09:07:38.910773251 +0000 UTC m=+8473.783399972" watchObservedRunningTime="2026-02-20 09:07:38.917339389 +0000 UTC m=+8473.789966090" Feb 20 09:07:45 crc kubenswrapper[5094]: I0220 09:07:45.851974 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:07:45 crc kubenswrapper[5094]: E0220 09:07:45.853158 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:08:00 crc kubenswrapper[5094]: I0220 09:08:00.849825 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:08:00 crc kubenswrapper[5094]: E0220 09:08:00.851288 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:08:15 crc kubenswrapper[5094]: I0220 09:08:15.849238 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:08:15 crc kubenswrapper[5094]: E0220 09:08:15.850533 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:08:28 crc kubenswrapper[5094]: I0220 09:08:28.839959 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:08:28 crc kubenswrapper[5094]: E0220 09:08:28.841278 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:08:42 crc kubenswrapper[5094]: I0220 09:08:42.840840 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:08:42 crc kubenswrapper[5094]: E0220 09:08:42.843481 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:08:49 crc kubenswrapper[5094]: I0220 09:08:49.791220 5094 generic.go:334] "Generic (PLEG): container finished" podID="74755119-ad5b-439b-80bc-57779ffb5161" containerID="c060370efba02fe88f084c71b2630e0d32026fa891d4b33faa34cb53fd72fa7d" exitCode=0 Feb 20 09:08:49 crc kubenswrapper[5094]: I0220 09:08:49.791309 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-networker-s48qg" event={"ID":"74755119-ad5b-439b-80bc-57779ffb5161","Type":"ContainerDied","Data":"c060370efba02fe88f084c71b2630e0d32026fa891d4b33faa34cb53fd72fa7d"} Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.275476 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-networker-s48qg" Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.387515 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74755119-ad5b-439b-80bc-57779ffb5161-inventory\") pod \"74755119-ad5b-439b-80bc-57779ffb5161\" (UID: \"74755119-ad5b-439b-80bc-57779ffb5161\") " Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.387596 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/74755119-ad5b-439b-80bc-57779ffb5161-ssh-key-openstack-networker\") pod \"74755119-ad5b-439b-80bc-57779ffb5161\" (UID: \"74755119-ad5b-439b-80bc-57779ffb5161\") " Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.387717 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m76cc\" (UniqueName: \"kubernetes.io/projected/74755119-ad5b-439b-80bc-57779ffb5161-kube-api-access-m76cc\") pod \"74755119-ad5b-439b-80bc-57779ffb5161\" (UID: \"74755119-ad5b-439b-80bc-57779ffb5161\") " Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.399416 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74755119-ad5b-439b-80bc-57779ffb5161-kube-api-access-m76cc" (OuterVolumeSpecName: "kube-api-access-m76cc") pod "74755119-ad5b-439b-80bc-57779ffb5161" (UID: "74755119-ad5b-439b-80bc-57779ffb5161"). InnerVolumeSpecName "kube-api-access-m76cc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.430979 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74755119-ad5b-439b-80bc-57779ffb5161-inventory" (OuterVolumeSpecName: "inventory") pod "74755119-ad5b-439b-80bc-57779ffb5161" (UID: "74755119-ad5b-439b-80bc-57779ffb5161"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.471870 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74755119-ad5b-439b-80bc-57779ffb5161-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "74755119-ad5b-439b-80bc-57779ffb5161" (UID: "74755119-ad5b-439b-80bc-57779ffb5161"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.493462 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m76cc\" (UniqueName: \"kubernetes.io/projected/74755119-ad5b-439b-80bc-57779ffb5161-kube-api-access-m76cc\") on node \"crc\" DevicePath \"\"" Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.493520 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74755119-ad5b-439b-80bc-57779ffb5161-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.493539 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/74755119-ad5b-439b-80bc-57779ffb5161-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.807234 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-networker-s48qg" event={"ID":"74755119-ad5b-439b-80bc-57779ffb5161","Type":"ContainerDied","Data":"09d4d203520efc0360ec5324bd63b47e8902e7a3f5ee93502961755f99e4fdd6"} Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.807463 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09d4d203520efc0360ec5324bd63b47e8902e7a3f5ee93502961755f99e4fdd6" Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 
09:08:51.807325 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-networker-s48qg" Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.891294 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-networker-59p5g"] Feb 20 09:08:51 crc kubenswrapper[5094]: E0220 09:08:51.892071 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74755119-ad5b-439b-80bc-57779ffb5161" containerName="download-cache-openstack-openstack-networker" Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.892164 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="74755119-ad5b-439b-80bc-57779ffb5161" containerName="download-cache-openstack-openstack-networker" Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.892531 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="74755119-ad5b-439b-80bc-57779ffb5161" containerName="download-cache-openstack-openstack-networker" Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.893571 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-networker-59p5g" Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.897520 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.898028 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-xf9xl" Feb 20 09:08:51 crc kubenswrapper[5094]: I0220 09:08:51.916168 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-networker-59p5g"] Feb 20 09:08:52 crc kubenswrapper[5094]: I0220 09:08:52.002154 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de84413a-d424-4ec1-bb6d-e91b2278b854-inventory\") pod \"configure-network-openstack-openstack-networker-59p5g\" (UID: \"de84413a-d424-4ec1-bb6d-e91b2278b854\") " pod="openstack/configure-network-openstack-openstack-networker-59p5g" Feb 20 09:08:52 crc kubenswrapper[5094]: I0220 09:08:52.002228 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/de84413a-d424-4ec1-bb6d-e91b2278b854-ssh-key-openstack-networker\") pod \"configure-network-openstack-openstack-networker-59p5g\" (UID: \"de84413a-d424-4ec1-bb6d-e91b2278b854\") " pod="openstack/configure-network-openstack-openstack-networker-59p5g" Feb 20 09:08:52 crc kubenswrapper[5094]: I0220 09:08:52.002608 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrbg4\" (UniqueName: \"kubernetes.io/projected/de84413a-d424-4ec1-bb6d-e91b2278b854-kube-api-access-mrbg4\") pod \"configure-network-openstack-openstack-networker-59p5g\" (UID: \"de84413a-d424-4ec1-bb6d-e91b2278b854\") " 
pod="openstack/configure-network-openstack-openstack-networker-59p5g" Feb 20 09:08:52 crc kubenswrapper[5094]: I0220 09:08:52.104759 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de84413a-d424-4ec1-bb6d-e91b2278b854-inventory\") pod \"configure-network-openstack-openstack-networker-59p5g\" (UID: \"de84413a-d424-4ec1-bb6d-e91b2278b854\") " pod="openstack/configure-network-openstack-openstack-networker-59p5g" Feb 20 09:08:52 crc kubenswrapper[5094]: I0220 09:08:52.104859 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/de84413a-d424-4ec1-bb6d-e91b2278b854-ssh-key-openstack-networker\") pod \"configure-network-openstack-openstack-networker-59p5g\" (UID: \"de84413a-d424-4ec1-bb6d-e91b2278b854\") " pod="openstack/configure-network-openstack-openstack-networker-59p5g" Feb 20 09:08:52 crc kubenswrapper[5094]: I0220 09:08:52.104999 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrbg4\" (UniqueName: \"kubernetes.io/projected/de84413a-d424-4ec1-bb6d-e91b2278b854-kube-api-access-mrbg4\") pod \"configure-network-openstack-openstack-networker-59p5g\" (UID: \"de84413a-d424-4ec1-bb6d-e91b2278b854\") " pod="openstack/configure-network-openstack-openstack-networker-59p5g" Feb 20 09:08:52 crc kubenswrapper[5094]: I0220 09:08:52.109671 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de84413a-d424-4ec1-bb6d-e91b2278b854-inventory\") pod \"configure-network-openstack-openstack-networker-59p5g\" (UID: \"de84413a-d424-4ec1-bb6d-e91b2278b854\") " pod="openstack/configure-network-openstack-openstack-networker-59p5g" Feb 20 09:08:52 crc kubenswrapper[5094]: I0220 09:08:52.110053 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" 
(UniqueName: \"kubernetes.io/secret/de84413a-d424-4ec1-bb6d-e91b2278b854-ssh-key-openstack-networker\") pod \"configure-network-openstack-openstack-networker-59p5g\" (UID: \"de84413a-d424-4ec1-bb6d-e91b2278b854\") " pod="openstack/configure-network-openstack-openstack-networker-59p5g" Feb 20 09:08:52 crc kubenswrapper[5094]: I0220 09:08:52.125847 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrbg4\" (UniqueName: \"kubernetes.io/projected/de84413a-d424-4ec1-bb6d-e91b2278b854-kube-api-access-mrbg4\") pod \"configure-network-openstack-openstack-networker-59p5g\" (UID: \"de84413a-d424-4ec1-bb6d-e91b2278b854\") " pod="openstack/configure-network-openstack-openstack-networker-59p5g" Feb 20 09:08:52 crc kubenswrapper[5094]: I0220 09:08:52.232366 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-networker-59p5g" Feb 20 09:08:52 crc kubenswrapper[5094]: I0220 09:08:52.812473 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-networker-59p5g"] Feb 20 09:08:52 crc kubenswrapper[5094]: I0220 09:08:52.816745 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-networker-59p5g" event={"ID":"de84413a-d424-4ec1-bb6d-e91b2278b854","Type":"ContainerStarted","Data":"ebb40d52986a5e8028a0edd2009cbe5bef1de50e2634f808b231951cf13031cc"} Feb 20 09:08:53 crc kubenswrapper[5094]: I0220 09:08:53.834251 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-networker-59p5g" event={"ID":"de84413a-d424-4ec1-bb6d-e91b2278b854","Type":"ContainerStarted","Data":"5fe5bfe654a87fe11ab6e48fba7983d6c8f9cb8b7a0de1d97571c5ae01535c98"} Feb 20 09:08:53 crc kubenswrapper[5094]: I0220 09:08:53.844162 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 
09:08:53 crc kubenswrapper[5094]: E0220 09:08:53.844412 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:08:53 crc kubenswrapper[5094]: I0220 09:08:53.850681 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-networker-59p5g" podStartSLOduration=2.319668678 podStartE2EDuration="2.850664242s" podCreationTimestamp="2026-02-20 09:08:51 +0000 UTC" firstStartedPulling="2026-02-20 09:08:52.808170552 +0000 UTC m=+8547.680797263" lastFinishedPulling="2026-02-20 09:08:53.339166126 +0000 UTC m=+8548.211792827" observedRunningTime="2026-02-20 09:08:53.849476213 +0000 UTC m=+8548.722102924" watchObservedRunningTime="2026-02-20 09:08:53.850664242 +0000 UTC m=+8548.723290953" Feb 20 09:09:01 crc kubenswrapper[5094]: I0220 09:09:01.661932 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l222c"] Feb 20 09:09:01 crc kubenswrapper[5094]: I0220 09:09:01.676670 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l222c"] Feb 20 09:09:01 crc kubenswrapper[5094]: I0220 09:09:01.676851 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l222c" Feb 20 09:09:01 crc kubenswrapper[5094]: I0220 09:09:01.737646 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-catalog-content\") pod \"redhat-marketplace-l222c\" (UID: \"ac8429a7-c662-4ac3-968f-306eb9b3ba0e\") " pod="openshift-marketplace/redhat-marketplace-l222c" Feb 20 09:09:01 crc kubenswrapper[5094]: I0220 09:09:01.737833 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p967\" (UniqueName: \"kubernetes.io/projected/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-kube-api-access-4p967\") pod \"redhat-marketplace-l222c\" (UID: \"ac8429a7-c662-4ac3-968f-306eb9b3ba0e\") " pod="openshift-marketplace/redhat-marketplace-l222c" Feb 20 09:09:01 crc kubenswrapper[5094]: I0220 09:09:01.737871 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-utilities\") pod \"redhat-marketplace-l222c\" (UID: \"ac8429a7-c662-4ac3-968f-306eb9b3ba0e\") " pod="openshift-marketplace/redhat-marketplace-l222c" Feb 20 09:09:01 crc kubenswrapper[5094]: I0220 09:09:01.839524 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-catalog-content\") pod \"redhat-marketplace-l222c\" (UID: \"ac8429a7-c662-4ac3-968f-306eb9b3ba0e\") " pod="openshift-marketplace/redhat-marketplace-l222c" Feb 20 09:09:01 crc kubenswrapper[5094]: I0220 09:09:01.839639 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p967\" (UniqueName: \"kubernetes.io/projected/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-kube-api-access-4p967\") pod 
\"redhat-marketplace-l222c\" (UID: \"ac8429a7-c662-4ac3-968f-306eb9b3ba0e\") " pod="openshift-marketplace/redhat-marketplace-l222c" Feb 20 09:09:01 crc kubenswrapper[5094]: I0220 09:09:01.839675 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-utilities\") pod \"redhat-marketplace-l222c\" (UID: \"ac8429a7-c662-4ac3-968f-306eb9b3ba0e\") " pod="openshift-marketplace/redhat-marketplace-l222c" Feb 20 09:09:01 crc kubenswrapper[5094]: I0220 09:09:01.841151 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-utilities\") pod \"redhat-marketplace-l222c\" (UID: \"ac8429a7-c662-4ac3-968f-306eb9b3ba0e\") " pod="openshift-marketplace/redhat-marketplace-l222c" Feb 20 09:09:01 crc kubenswrapper[5094]: I0220 09:09:01.841234 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-catalog-content\") pod \"redhat-marketplace-l222c\" (UID: \"ac8429a7-c662-4ac3-968f-306eb9b3ba0e\") " pod="openshift-marketplace/redhat-marketplace-l222c" Feb 20 09:09:01 crc kubenswrapper[5094]: I0220 09:09:01.859409 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p967\" (UniqueName: \"kubernetes.io/projected/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-kube-api-access-4p967\") pod \"redhat-marketplace-l222c\" (UID: \"ac8429a7-c662-4ac3-968f-306eb9b3ba0e\") " pod="openshift-marketplace/redhat-marketplace-l222c" Feb 20 09:09:01 crc kubenswrapper[5094]: I0220 09:09:01.998617 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l222c" Feb 20 09:09:02 crc kubenswrapper[5094]: I0220 09:09:02.470743 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l222c"] Feb 20 09:09:02 crc kubenswrapper[5094]: W0220 09:09:02.478259 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac8429a7_c662_4ac3_968f_306eb9b3ba0e.slice/crio-6efdaf2b79ccf0b98669133f00a3c1ce0a38fadc1e5961435ff80a8123217423 WatchSource:0}: Error finding container 6efdaf2b79ccf0b98669133f00a3c1ce0a38fadc1e5961435ff80a8123217423: Status 404 returned error can't find the container with id 6efdaf2b79ccf0b98669133f00a3c1ce0a38fadc1e5961435ff80a8123217423 Feb 20 09:09:02 crc kubenswrapper[5094]: I0220 09:09:02.937858 5094 generic.go:334] "Generic (PLEG): container finished" podID="ac8429a7-c662-4ac3-968f-306eb9b3ba0e" containerID="1a3ded7a99dde04c7ca9f20546fcc39238fb662005072e7c91e1309efa761dd4" exitCode=0 Feb 20 09:09:02 crc kubenswrapper[5094]: I0220 09:09:02.937942 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l222c" event={"ID":"ac8429a7-c662-4ac3-968f-306eb9b3ba0e","Type":"ContainerDied","Data":"1a3ded7a99dde04c7ca9f20546fcc39238fb662005072e7c91e1309efa761dd4"} Feb 20 09:09:02 crc kubenswrapper[5094]: I0220 09:09:02.938244 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l222c" event={"ID":"ac8429a7-c662-4ac3-968f-306eb9b3ba0e","Type":"ContainerStarted","Data":"6efdaf2b79ccf0b98669133f00a3c1ce0a38fadc1e5961435ff80a8123217423"} Feb 20 09:09:04 crc kubenswrapper[5094]: I0220 09:09:04.956363 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l222c" 
event={"ID":"ac8429a7-c662-4ac3-968f-306eb9b3ba0e","Type":"ContainerStarted","Data":"8862aef8bd542f54b4f796b48d4129769da77b8bcfb351000b16657e37142b9e"} Feb 20 09:09:05 crc kubenswrapper[5094]: I0220 09:09:05.969388 5094 generic.go:334] "Generic (PLEG): container finished" podID="ac8429a7-c662-4ac3-968f-306eb9b3ba0e" containerID="8862aef8bd542f54b4f796b48d4129769da77b8bcfb351000b16657e37142b9e" exitCode=0 Feb 20 09:09:05 crc kubenswrapper[5094]: I0220 09:09:05.969544 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l222c" event={"ID":"ac8429a7-c662-4ac3-968f-306eb9b3ba0e","Type":"ContainerDied","Data":"8862aef8bd542f54b4f796b48d4129769da77b8bcfb351000b16657e37142b9e"} Feb 20 09:09:06 crc kubenswrapper[5094]: I0220 09:09:06.840320 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:09:06 crc kubenswrapper[5094]: E0220 09:09:06.841232 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:09:06 crc kubenswrapper[5094]: I0220 09:09:06.981299 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l222c" event={"ID":"ac8429a7-c662-4ac3-968f-306eb9b3ba0e","Type":"ContainerStarted","Data":"0a3417a44acb9e5244cd206a80346319bb6878cf7514c126c67952a4b16ce319"} Feb 20 09:09:07 crc kubenswrapper[5094]: I0220 09:09:07.003589 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l222c" podStartSLOduration=2.57991436 podStartE2EDuration="6.003568477s" 
podCreationTimestamp="2026-02-20 09:09:01 +0000 UTC" firstStartedPulling="2026-02-20 09:09:02.940794064 +0000 UTC m=+8557.813420775" lastFinishedPulling="2026-02-20 09:09:06.364448181 +0000 UTC m=+8561.237074892" observedRunningTime="2026-02-20 09:09:07.000290868 +0000 UTC m=+8561.872917589" watchObservedRunningTime="2026-02-20 09:09:07.003568477 +0000 UTC m=+8561.876195188" Feb 20 09:09:11 crc kubenswrapper[5094]: I0220 09:09:11.999101 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l222c" Feb 20 09:09:12 crc kubenswrapper[5094]: I0220 09:09:11.999973 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l222c" Feb 20 09:09:12 crc kubenswrapper[5094]: I0220 09:09:12.061080 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l222c" Feb 20 09:09:12 crc kubenswrapper[5094]: I0220 09:09:12.149745 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l222c" Feb 20 09:09:12 crc kubenswrapper[5094]: I0220 09:09:12.307665 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l222c"] Feb 20 09:09:14 crc kubenswrapper[5094]: I0220 09:09:14.060100 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l222c" podUID="ac8429a7-c662-4ac3-968f-306eb9b3ba0e" containerName="registry-server" containerID="cri-o://0a3417a44acb9e5244cd206a80346319bb6878cf7514c126c67952a4b16ce319" gracePeriod=2 Feb 20 09:09:14 crc kubenswrapper[5094]: I0220 09:09:14.589877 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l222c" Feb 20 09:09:14 crc kubenswrapper[5094]: I0220 09:09:14.734078 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p967\" (UniqueName: \"kubernetes.io/projected/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-kube-api-access-4p967\") pod \"ac8429a7-c662-4ac3-968f-306eb9b3ba0e\" (UID: \"ac8429a7-c662-4ac3-968f-306eb9b3ba0e\") " Feb 20 09:09:14 crc kubenswrapper[5094]: I0220 09:09:14.734223 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-utilities\") pod \"ac8429a7-c662-4ac3-968f-306eb9b3ba0e\" (UID: \"ac8429a7-c662-4ac3-968f-306eb9b3ba0e\") " Feb 20 09:09:14 crc kubenswrapper[5094]: I0220 09:09:14.735073 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-catalog-content\") pod \"ac8429a7-c662-4ac3-968f-306eb9b3ba0e\" (UID: \"ac8429a7-c662-4ac3-968f-306eb9b3ba0e\") " Feb 20 09:09:14 crc kubenswrapper[5094]: I0220 09:09:14.735101 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-utilities" (OuterVolumeSpecName: "utilities") pod "ac8429a7-c662-4ac3-968f-306eb9b3ba0e" (UID: "ac8429a7-c662-4ac3-968f-306eb9b3ba0e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:09:14 crc kubenswrapper[5094]: I0220 09:09:14.737031 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:09:14 crc kubenswrapper[5094]: I0220 09:09:14.741928 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-kube-api-access-4p967" (OuterVolumeSpecName: "kube-api-access-4p967") pod "ac8429a7-c662-4ac3-968f-306eb9b3ba0e" (UID: "ac8429a7-c662-4ac3-968f-306eb9b3ba0e"). InnerVolumeSpecName "kube-api-access-4p967". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:09:14 crc kubenswrapper[5094]: I0220 09:09:14.758946 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac8429a7-c662-4ac3-968f-306eb9b3ba0e" (UID: "ac8429a7-c662-4ac3-968f-306eb9b3ba0e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:09:14 crc kubenswrapper[5094]: I0220 09:09:14.841668 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p967\" (UniqueName: \"kubernetes.io/projected/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-kube-api-access-4p967\") on node \"crc\" DevicePath \"\"" Feb 20 09:09:14 crc kubenswrapper[5094]: I0220 09:09:14.842240 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac8429a7-c662-4ac3-968f-306eb9b3ba0e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.073227 5094 generic.go:334] "Generic (PLEG): container finished" podID="ac8429a7-c662-4ac3-968f-306eb9b3ba0e" containerID="0a3417a44acb9e5244cd206a80346319bb6878cf7514c126c67952a4b16ce319" exitCode=0 Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.073261 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l222c" event={"ID":"ac8429a7-c662-4ac3-968f-306eb9b3ba0e","Type":"ContainerDied","Data":"0a3417a44acb9e5244cd206a80346319bb6878cf7514c126c67952a4b16ce319"} Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.073312 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l222c" event={"ID":"ac8429a7-c662-4ac3-968f-306eb9b3ba0e","Type":"ContainerDied","Data":"6efdaf2b79ccf0b98669133f00a3c1ce0a38fadc1e5961435ff80a8123217423"} Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.073336 5094 scope.go:117] "RemoveContainer" containerID="0a3417a44acb9e5244cd206a80346319bb6878cf7514c126c67952a4b16ce319" Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.073342 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l222c" Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.112144 5094 scope.go:117] "RemoveContainer" containerID="8862aef8bd542f54b4f796b48d4129769da77b8bcfb351000b16657e37142b9e" Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.113306 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l222c"] Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.123319 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l222c"] Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.177890 5094 scope.go:117] "RemoveContainer" containerID="1a3ded7a99dde04c7ca9f20546fcc39238fb662005072e7c91e1309efa761dd4" Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.208772 5094 scope.go:117] "RemoveContainer" containerID="0a3417a44acb9e5244cd206a80346319bb6878cf7514c126c67952a4b16ce319" Feb 20 09:09:15 crc kubenswrapper[5094]: E0220 09:09:15.209486 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a3417a44acb9e5244cd206a80346319bb6878cf7514c126c67952a4b16ce319\": container with ID starting with 0a3417a44acb9e5244cd206a80346319bb6878cf7514c126c67952a4b16ce319 not found: ID does not exist" containerID="0a3417a44acb9e5244cd206a80346319bb6878cf7514c126c67952a4b16ce319" Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.209515 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a3417a44acb9e5244cd206a80346319bb6878cf7514c126c67952a4b16ce319"} err="failed to get container status \"0a3417a44acb9e5244cd206a80346319bb6878cf7514c126c67952a4b16ce319\": rpc error: code = NotFound desc = could not find container \"0a3417a44acb9e5244cd206a80346319bb6878cf7514c126c67952a4b16ce319\": container with ID starting with 0a3417a44acb9e5244cd206a80346319bb6878cf7514c126c67952a4b16ce319 not found: 
ID does not exist" Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.209538 5094 scope.go:117] "RemoveContainer" containerID="8862aef8bd542f54b4f796b48d4129769da77b8bcfb351000b16657e37142b9e" Feb 20 09:09:15 crc kubenswrapper[5094]: E0220 09:09:15.210160 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8862aef8bd542f54b4f796b48d4129769da77b8bcfb351000b16657e37142b9e\": container with ID starting with 8862aef8bd542f54b4f796b48d4129769da77b8bcfb351000b16657e37142b9e not found: ID does not exist" containerID="8862aef8bd542f54b4f796b48d4129769da77b8bcfb351000b16657e37142b9e" Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.210247 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8862aef8bd542f54b4f796b48d4129769da77b8bcfb351000b16657e37142b9e"} err="failed to get container status \"8862aef8bd542f54b4f796b48d4129769da77b8bcfb351000b16657e37142b9e\": rpc error: code = NotFound desc = could not find container \"8862aef8bd542f54b4f796b48d4129769da77b8bcfb351000b16657e37142b9e\": container with ID starting with 8862aef8bd542f54b4f796b48d4129769da77b8bcfb351000b16657e37142b9e not found: ID does not exist" Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.210297 5094 scope.go:117] "RemoveContainer" containerID="1a3ded7a99dde04c7ca9f20546fcc39238fb662005072e7c91e1309efa761dd4" Feb 20 09:09:15 crc kubenswrapper[5094]: E0220 09:09:15.210890 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a3ded7a99dde04c7ca9f20546fcc39238fb662005072e7c91e1309efa761dd4\": container with ID starting with 1a3ded7a99dde04c7ca9f20546fcc39238fb662005072e7c91e1309efa761dd4 not found: ID does not exist" containerID="1a3ded7a99dde04c7ca9f20546fcc39238fb662005072e7c91e1309efa761dd4" Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.210932 5094 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a3ded7a99dde04c7ca9f20546fcc39238fb662005072e7c91e1309efa761dd4"} err="failed to get container status \"1a3ded7a99dde04c7ca9f20546fcc39238fb662005072e7c91e1309efa761dd4\": rpc error: code = NotFound desc = could not find container \"1a3ded7a99dde04c7ca9f20546fcc39238fb662005072e7c91e1309efa761dd4\": container with ID starting with 1a3ded7a99dde04c7ca9f20546fcc39238fb662005072e7c91e1309efa761dd4 not found: ID does not exist" Feb 20 09:09:15 crc kubenswrapper[5094]: I0220 09:09:15.851694 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac8429a7-c662-4ac3-968f-306eb9b3ba0e" path="/var/lib/kubelet/pods/ac8429a7-c662-4ac3-968f-306eb9b3ba0e/volumes" Feb 20 09:09:18 crc kubenswrapper[5094]: I0220 09:09:18.841337 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:09:18 crc kubenswrapper[5094]: E0220 09:09:18.842158 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:09:31 crc kubenswrapper[5094]: I0220 09:09:31.841094 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:09:31 crc kubenswrapper[5094]: E0220 09:09:31.842133 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:09:32 crc kubenswrapper[5094]: I0220 09:09:32.278433 5094 generic.go:334] "Generic (PLEG): container finished" podID="ee932d3e-c52d-491d-92d8-8e21f7e1adbb" containerID="2ccd8439a62576ce780ceaa3d66e444205d4735fa06b8de40606cb4118f1de13" exitCode=0 Feb 20 09:09:32 crc kubenswrapper[5094]: I0220 09:09:32.278488 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" event={"ID":"ee932d3e-c52d-491d-92d8-8e21f7e1adbb","Type":"ContainerDied","Data":"2ccd8439a62576ce780ceaa3d66e444205d4735fa06b8de40606cb4118f1de13"} Feb 20 09:09:33 crc kubenswrapper[5094]: I0220 09:09:33.766783 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" Feb 20 09:09:33 crc kubenswrapper[5094]: I0220 09:09:33.879364 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-ceph\") pod \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\" (UID: \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") " Feb 20 09:09:33 crc kubenswrapper[5094]: I0220 09:09:33.879546 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-ssh-key-openstack-cell1\") pod \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\" (UID: \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") " Feb 20 09:09:33 crc kubenswrapper[5094]: I0220 09:09:33.879619 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzdxr\" (UniqueName: \"kubernetes.io/projected/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-kube-api-access-wzdxr\") pod \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\" (UID: \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") " Feb 20 09:09:33 crc 
kubenswrapper[5094]: I0220 09:09:33.879670 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-inventory\") pod \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\" (UID: \"ee932d3e-c52d-491d-92d8-8e21f7e1adbb\") " Feb 20 09:09:33 crc kubenswrapper[5094]: I0220 09:09:33.896982 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-ceph" (OuterVolumeSpecName: "ceph") pod "ee932d3e-c52d-491d-92d8-8e21f7e1adbb" (UID: "ee932d3e-c52d-491d-92d8-8e21f7e1adbb"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:09:33 crc kubenswrapper[5094]: I0220 09:09:33.897006 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-kube-api-access-wzdxr" (OuterVolumeSpecName: "kube-api-access-wzdxr") pod "ee932d3e-c52d-491d-92d8-8e21f7e1adbb" (UID: "ee932d3e-c52d-491d-92d8-8e21f7e1adbb"). InnerVolumeSpecName "kube-api-access-wzdxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:09:33 crc kubenswrapper[5094]: I0220 09:09:33.909312 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "ee932d3e-c52d-491d-92d8-8e21f7e1adbb" (UID: "ee932d3e-c52d-491d-92d8-8e21f7e1adbb"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:09:33 crc kubenswrapper[5094]: I0220 09:09:33.911280 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-inventory" (OuterVolumeSpecName: "inventory") pod "ee932d3e-c52d-491d-92d8-8e21f7e1adbb" (UID: "ee932d3e-c52d-491d-92d8-8e21f7e1adbb"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:09:33 crc kubenswrapper[5094]: I0220 09:09:33.982108 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-ceph\") on node \"crc\" DevicePath \"\""
Feb 20 09:09:33 crc kubenswrapper[5094]: I0220 09:09:33.982181 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 20 09:09:33 crc kubenswrapper[5094]: I0220 09:09:33.982196 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzdxr\" (UniqueName: \"kubernetes.io/projected/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-kube-api-access-wzdxr\") on node \"crc\" DevicePath \"\""
Feb 20 09:09:33 crc kubenswrapper[5094]: I0220 09:09:33.982211 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ee932d3e-c52d-491d-92d8-8e21f7e1adbb-inventory\") on node \"crc\" DevicePath \"\""
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.300071 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-4zfxb" event={"ID":"ee932d3e-c52d-491d-92d8-8e21f7e1adbb","Type":"ContainerDied","Data":"1d7eecaf4ffd0c7ac6f9163f2ee12164460f0f16736a4c9bfdbe1062f525b936"}
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.300119 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d7eecaf4ffd0c7ac6f9163f2ee12164460f0f16736a4c9bfdbe1062f525b936"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.300193 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-4zfxb"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.375675 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-ggj5f"]
Feb 20 09:09:34 crc kubenswrapper[5094]: E0220 09:09:34.376318 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac8429a7-c662-4ac3-968f-306eb9b3ba0e" containerName="extract-utilities"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.376381 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac8429a7-c662-4ac3-968f-306eb9b3ba0e" containerName="extract-utilities"
Feb 20 09:09:34 crc kubenswrapper[5094]: E0220 09:09:34.376454 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac8429a7-c662-4ac3-968f-306eb9b3ba0e" containerName="registry-server"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.376509 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac8429a7-c662-4ac3-968f-306eb9b3ba0e" containerName="registry-server"
Feb 20 09:09:34 crc kubenswrapper[5094]: E0220 09:09:34.376576 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee932d3e-c52d-491d-92d8-8e21f7e1adbb" containerName="download-cache-openstack-openstack-cell1"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.376629 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee932d3e-c52d-491d-92d8-8e21f7e1adbb" containerName="download-cache-openstack-openstack-cell1"
Feb 20 09:09:34 crc kubenswrapper[5094]: E0220 09:09:34.376685 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac8429a7-c662-4ac3-968f-306eb9b3ba0e" containerName="extract-content"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.376757 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac8429a7-c662-4ac3-968f-306eb9b3ba0e" containerName="extract-content"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.376987 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee932d3e-c52d-491d-92d8-8e21f7e1adbb" containerName="download-cache-openstack-openstack-cell1"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.377063 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac8429a7-c662-4ac3-968f-306eb9b3ba0e" containerName="registry-server"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.377788 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-ggj5f"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.379948 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.380356 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.414811 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-ggj5f"]
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.491381 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgvc4\" (UniqueName: \"kubernetes.io/projected/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-kube-api-access-tgvc4\") pod \"configure-network-openstack-openstack-cell1-ggj5f\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " pod="openstack/configure-network-openstack-openstack-cell1-ggj5f"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.491456 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-inventory\") pod \"configure-network-openstack-openstack-cell1-ggj5f\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " pod="openstack/configure-network-openstack-openstack-cell1-ggj5f"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.491502 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-ceph\") pod \"configure-network-openstack-openstack-cell1-ggj5f\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " pod="openstack/configure-network-openstack-openstack-cell1-ggj5f"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.491639 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-ggj5f\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " pod="openstack/configure-network-openstack-openstack-cell1-ggj5f"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.593343 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgvc4\" (UniqueName: \"kubernetes.io/projected/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-kube-api-access-tgvc4\") pod \"configure-network-openstack-openstack-cell1-ggj5f\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " pod="openstack/configure-network-openstack-openstack-cell1-ggj5f"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.593407 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-inventory\") pod \"configure-network-openstack-openstack-cell1-ggj5f\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " pod="openstack/configure-network-openstack-openstack-cell1-ggj5f"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.593440 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-ceph\") pod \"configure-network-openstack-openstack-cell1-ggj5f\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " pod="openstack/configure-network-openstack-openstack-cell1-ggj5f"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.593536 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-ggj5f\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " pod="openstack/configure-network-openstack-openstack-cell1-ggj5f"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.596549 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-inventory\") pod \"configure-network-openstack-openstack-cell1-ggj5f\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " pod="openstack/configure-network-openstack-openstack-cell1-ggj5f"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.596677 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-ceph\") pod \"configure-network-openstack-openstack-cell1-ggj5f\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " pod="openstack/configure-network-openstack-openstack-cell1-ggj5f"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.597143 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-ggj5f\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " pod="openstack/configure-network-openstack-openstack-cell1-ggj5f"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.618406 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgvc4\" (UniqueName: \"kubernetes.io/projected/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-kube-api-access-tgvc4\") pod \"configure-network-openstack-openstack-cell1-ggj5f\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " pod="openstack/configure-network-openstack-openstack-cell1-ggj5f"
Feb 20 09:09:34 crc kubenswrapper[5094]: I0220 09:09:34.709058 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-ggj5f"
Feb 20 09:09:35 crc kubenswrapper[5094]: I0220 09:09:35.325835 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-ggj5f"]
Feb 20 09:09:35 crc kubenswrapper[5094]: I0220 09:09:35.332771 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 20 09:09:36 crc kubenswrapper[5094]: I0220 09:09:36.346200 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-ggj5f" event={"ID":"ea277d62-feb7-40a2-80a9-ad1a9d82cb13","Type":"ContainerStarted","Data":"ac9dc0ee68deb94a4ff5312e0a939b0b139cabb294d3a2db6521599467eefd6e"}
Feb 20 09:09:36 crc kubenswrapper[5094]: I0220 09:09:36.346768 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-ggj5f" event={"ID":"ea277d62-feb7-40a2-80a9-ad1a9d82cb13","Type":"ContainerStarted","Data":"77532f6a8f46591197d08ef5699cbaa0638e65ff7cf2caf3ef1ddf35fd1e6aa0"}
Feb 20 09:09:36 crc kubenswrapper[5094]: I0220 09:09:36.384025 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-ggj5f" podStartSLOduration=1.960952687 podStartE2EDuration="2.384005055s" podCreationTimestamp="2026-02-20 09:09:34 +0000 UTC" firstStartedPulling="2026-02-20 09:09:35.332499048 +0000 UTC m=+8590.205125759" lastFinishedPulling="2026-02-20 09:09:35.755551396 +0000 UTC m=+8590.628178127" observedRunningTime="2026-02-20 09:09:36.375110082 +0000 UTC m=+8591.247736793" watchObservedRunningTime="2026-02-20 09:09:36.384005055 +0000 UTC m=+8591.256631756"
Feb 20 09:09:44 crc kubenswrapper[5094]: I0220 09:09:44.842068 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60"
Feb 20 09:09:44 crc kubenswrapper[5094]: E0220 09:09:44.842828 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:09:55 crc kubenswrapper[5094]: I0220 09:09:55.545600 5094 generic.go:334] "Generic (PLEG): container finished" podID="de84413a-d424-4ec1-bb6d-e91b2278b854" containerID="5fe5bfe654a87fe11ab6e48fba7983d6c8f9cb8b7a0de1d97571c5ae01535c98" exitCode=0
Feb 20 09:09:55 crc kubenswrapper[5094]: I0220 09:09:55.546383 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-networker-59p5g" event={"ID":"de84413a-d424-4ec1-bb6d-e91b2278b854","Type":"ContainerDied","Data":"5fe5bfe654a87fe11ab6e48fba7983d6c8f9cb8b7a0de1d97571c5ae01535c98"}
Feb 20 09:09:56 crc kubenswrapper[5094]: I0220 09:09:56.841762 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60"
Feb 20 09:09:56 crc kubenswrapper[5094]: E0220 09:09:56.843252 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.131525 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-networker-59p5g"
Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.321904 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de84413a-d424-4ec1-bb6d-e91b2278b854-inventory\") pod \"de84413a-d424-4ec1-bb6d-e91b2278b854\" (UID: \"de84413a-d424-4ec1-bb6d-e91b2278b854\") "
Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.322446 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrbg4\" (UniqueName: \"kubernetes.io/projected/de84413a-d424-4ec1-bb6d-e91b2278b854-kube-api-access-mrbg4\") pod \"de84413a-d424-4ec1-bb6d-e91b2278b854\" (UID: \"de84413a-d424-4ec1-bb6d-e91b2278b854\") "
Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.322516 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/de84413a-d424-4ec1-bb6d-e91b2278b854-ssh-key-openstack-networker\") pod \"de84413a-d424-4ec1-bb6d-e91b2278b854\" (UID: \"de84413a-d424-4ec1-bb6d-e91b2278b854\") "
Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.328409 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de84413a-d424-4ec1-bb6d-e91b2278b854-kube-api-access-mrbg4" (OuterVolumeSpecName: "kube-api-access-mrbg4") pod "de84413a-d424-4ec1-bb6d-e91b2278b854" (UID: "de84413a-d424-4ec1-bb6d-e91b2278b854"). InnerVolumeSpecName "kube-api-access-mrbg4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.358026 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de84413a-d424-4ec1-bb6d-e91b2278b854-inventory" (OuterVolumeSpecName: "inventory") pod "de84413a-d424-4ec1-bb6d-e91b2278b854" (UID: "de84413a-d424-4ec1-bb6d-e91b2278b854"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.361794 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de84413a-d424-4ec1-bb6d-e91b2278b854-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "de84413a-d424-4ec1-bb6d-e91b2278b854" (UID: "de84413a-d424-4ec1-bb6d-e91b2278b854"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.425983 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de84413a-d424-4ec1-bb6d-e91b2278b854-inventory\") on node \"crc\" DevicePath \"\""
Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.426047 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrbg4\" (UniqueName: \"kubernetes.io/projected/de84413a-d424-4ec1-bb6d-e91b2278b854-kube-api-access-mrbg4\") on node \"crc\" DevicePath \"\""
Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.426076 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/de84413a-d424-4ec1-bb6d-e91b2278b854-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\""
Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.566067 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-networker-59p5g" event={"ID":"de84413a-d424-4ec1-bb6d-e91b2278b854","Type":"ContainerDied","Data":"ebb40d52986a5e8028a0edd2009cbe5bef1de50e2634f808b231951cf13031cc"}
Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.566107 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebb40d52986a5e8028a0edd2009cbe5bef1de50e2634f808b231951cf13031cc"
Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.566163 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-networker-59p5g"
Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.674466 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-networker-2pr7n"]
Feb 20 09:09:57 crc kubenswrapper[5094]: E0220 09:09:57.675062 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de84413a-d424-4ec1-bb6d-e91b2278b854" containerName="configure-network-openstack-openstack-networker"
Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.675088 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="de84413a-d424-4ec1-bb6d-e91b2278b854" containerName="configure-network-openstack-openstack-networker"
Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.675380 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="de84413a-d424-4ec1-bb6d-e91b2278b854" containerName="configure-network-openstack-openstack-networker"
Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.676326 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-networker-2pr7n"
Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.680048 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker"
Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.680107 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-xf9xl"
Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.689495 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-networker-2pr7n"]
Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.834752 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/f04426ab-50e7-4345-842b-69bfcc58207c-ssh-key-openstack-networker\") pod \"validate-network-openstack-openstack-networker-2pr7n\" (UID: \"f04426ab-50e7-4345-842b-69bfcc58207c\") " pod="openstack/validate-network-openstack-openstack-networker-2pr7n"
Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.834972 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcj8k\" (UniqueName: \"kubernetes.io/projected/f04426ab-50e7-4345-842b-69bfcc58207c-kube-api-access-mcj8k\") pod \"validate-network-openstack-openstack-networker-2pr7n\" (UID: \"f04426ab-50e7-4345-842b-69bfcc58207c\") " pod="openstack/validate-network-openstack-openstack-networker-2pr7n"
Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.835197 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f04426ab-50e7-4345-842b-69bfcc58207c-inventory\") pod \"validate-network-openstack-openstack-networker-2pr7n\" (UID: \"f04426ab-50e7-4345-842b-69bfcc58207c\") " pod="openstack/validate-network-openstack-openstack-networker-2pr7n"
Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.938023 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/f04426ab-50e7-4345-842b-69bfcc58207c-ssh-key-openstack-networker\") pod \"validate-network-openstack-openstack-networker-2pr7n\" (UID: \"f04426ab-50e7-4345-842b-69bfcc58207c\") " pod="openstack/validate-network-openstack-openstack-networker-2pr7n"
Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.938132 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcj8k\" (UniqueName: \"kubernetes.io/projected/f04426ab-50e7-4345-842b-69bfcc58207c-kube-api-access-mcj8k\") pod \"validate-network-openstack-openstack-networker-2pr7n\" (UID: \"f04426ab-50e7-4345-842b-69bfcc58207c\") " pod="openstack/validate-network-openstack-openstack-networker-2pr7n"
Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.938252 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f04426ab-50e7-4345-842b-69bfcc58207c-inventory\") pod \"validate-network-openstack-openstack-networker-2pr7n\" (UID: \"f04426ab-50e7-4345-842b-69bfcc58207c\") " pod="openstack/validate-network-openstack-openstack-networker-2pr7n"
Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.943835 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f04426ab-50e7-4345-842b-69bfcc58207c-inventory\") pod \"validate-network-openstack-openstack-networker-2pr7n\" (UID: \"f04426ab-50e7-4345-842b-69bfcc58207c\") " pod="openstack/validate-network-openstack-openstack-networker-2pr7n"
Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.946054 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/f04426ab-50e7-4345-842b-69bfcc58207c-ssh-key-openstack-networker\") pod \"validate-network-openstack-openstack-networker-2pr7n\" (UID: \"f04426ab-50e7-4345-842b-69bfcc58207c\") " pod="openstack/validate-network-openstack-openstack-networker-2pr7n"
Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.956159 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcj8k\" (UniqueName: \"kubernetes.io/projected/f04426ab-50e7-4345-842b-69bfcc58207c-kube-api-access-mcj8k\") pod \"validate-network-openstack-openstack-networker-2pr7n\" (UID: \"f04426ab-50e7-4345-842b-69bfcc58207c\") " pod="openstack/validate-network-openstack-openstack-networker-2pr7n"
Feb 20 09:09:57 crc kubenswrapper[5094]: I0220 09:09:57.993777 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-networker-2pr7n"
Feb 20 09:09:58 crc kubenswrapper[5094]: I0220 09:09:58.561900 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-networker-2pr7n"]
Feb 20 09:09:58 crc kubenswrapper[5094]: W0220 09:09:58.572271 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf04426ab_50e7_4345_842b_69bfcc58207c.slice/crio-9c352daadefca118725443df28295e4c02048fade4ca2f92eba2b761b7f9c701 WatchSource:0}: Error finding container 9c352daadefca118725443df28295e4c02048fade4ca2f92eba2b761b7f9c701: Status 404 returned error can't find the container with id 9c352daadefca118725443df28295e4c02048fade4ca2f92eba2b761b7f9c701
Feb 20 09:09:59 crc kubenswrapper[5094]: I0220 09:09:59.592039 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-networker-2pr7n" event={"ID":"f04426ab-50e7-4345-842b-69bfcc58207c","Type":"ContainerStarted","Data":"8fa8add69c953aa8c865ae40971a37100de7a516ed62f628f7fee2af28f45d53"}
Feb 20 09:09:59 crc kubenswrapper[5094]: I0220 09:09:59.592592 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-networker-2pr7n" event={"ID":"f04426ab-50e7-4345-842b-69bfcc58207c","Type":"ContainerStarted","Data":"9c352daadefca118725443df28295e4c02048fade4ca2f92eba2b761b7f9c701"}
Feb 20 09:09:59 crc kubenswrapper[5094]: I0220 09:09:59.620394 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-networker-2pr7n" podStartSLOduration=2.022348811 podStartE2EDuration="2.620375358s" podCreationTimestamp="2026-02-20 09:09:57 +0000 UTC" firstStartedPulling="2026-02-20 09:09:58.574586698 +0000 UTC m=+8613.447213409" lastFinishedPulling="2026-02-20 09:09:59.172613245 +0000 UTC m=+8614.045239956" observedRunningTime="2026-02-20 09:09:59.613914812 +0000 UTC m=+8614.486541523" watchObservedRunningTime="2026-02-20 09:09:59.620375358 +0000 UTC m=+8614.493002059"
Feb 20 09:10:04 crc kubenswrapper[5094]: I0220 09:10:04.652350 5094 generic.go:334] "Generic (PLEG): container finished" podID="f04426ab-50e7-4345-842b-69bfcc58207c" containerID="8fa8add69c953aa8c865ae40971a37100de7a516ed62f628f7fee2af28f45d53" exitCode=0
Feb 20 09:10:04 crc kubenswrapper[5094]: I0220 09:10:04.652393 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-networker-2pr7n" event={"ID":"f04426ab-50e7-4345-842b-69bfcc58207c","Type":"ContainerDied","Data":"8fa8add69c953aa8c865ae40971a37100de7a516ed62f628f7fee2af28f45d53"}
Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.171099 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-networker-2pr7n"
Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.341678 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcj8k\" (UniqueName: \"kubernetes.io/projected/f04426ab-50e7-4345-842b-69bfcc58207c-kube-api-access-mcj8k\") pod \"f04426ab-50e7-4345-842b-69bfcc58207c\" (UID: \"f04426ab-50e7-4345-842b-69bfcc58207c\") "
Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.341774 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/f04426ab-50e7-4345-842b-69bfcc58207c-ssh-key-openstack-networker\") pod \"f04426ab-50e7-4345-842b-69bfcc58207c\" (UID: \"f04426ab-50e7-4345-842b-69bfcc58207c\") "
Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.341972 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f04426ab-50e7-4345-842b-69bfcc58207c-inventory\") pod \"f04426ab-50e7-4345-842b-69bfcc58207c\" (UID: \"f04426ab-50e7-4345-842b-69bfcc58207c\") "
Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.348694 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f04426ab-50e7-4345-842b-69bfcc58207c-kube-api-access-mcj8k" (OuterVolumeSpecName: "kube-api-access-mcj8k") pod "f04426ab-50e7-4345-842b-69bfcc58207c" (UID: "f04426ab-50e7-4345-842b-69bfcc58207c"). InnerVolumeSpecName "kube-api-access-mcj8k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.378217 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f04426ab-50e7-4345-842b-69bfcc58207c-inventory" (OuterVolumeSpecName: "inventory") pod "f04426ab-50e7-4345-842b-69bfcc58207c" (UID: "f04426ab-50e7-4345-842b-69bfcc58207c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.388002 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f04426ab-50e7-4345-842b-69bfcc58207c-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "f04426ab-50e7-4345-842b-69bfcc58207c" (UID: "f04426ab-50e7-4345-842b-69bfcc58207c"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.444302 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f04426ab-50e7-4345-842b-69bfcc58207c-inventory\") on node \"crc\" DevicePath \"\""
Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.444331 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcj8k\" (UniqueName: \"kubernetes.io/projected/f04426ab-50e7-4345-842b-69bfcc58207c-kube-api-access-mcj8k\") on node \"crc\" DevicePath \"\""
Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.444343 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/f04426ab-50e7-4345-842b-69bfcc58207c-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\""
Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.678400 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-networker-2pr7n" event={"ID":"f04426ab-50e7-4345-842b-69bfcc58207c","Type":"ContainerDied","Data":"9c352daadefca118725443df28295e4c02048fade4ca2f92eba2b761b7f9c701"}
Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.678442 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c352daadefca118725443df28295e4c02048fade4ca2f92eba2b761b7f9c701"
Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.678599 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-networker-2pr7n"
Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.761498 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-networker-tqhtr"]
Feb 20 09:10:06 crc kubenswrapper[5094]: E0220 09:10:06.762055 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f04426ab-50e7-4345-842b-69bfcc58207c" containerName="validate-network-openstack-openstack-networker"
Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.762084 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f04426ab-50e7-4345-842b-69bfcc58207c" containerName="validate-network-openstack-openstack-networker"
Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.762319 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f04426ab-50e7-4345-842b-69bfcc58207c" containerName="validate-network-openstack-openstack-networker"
Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.763131 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-networker-tqhtr"
Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.766150 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-xf9xl"
Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.766436 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker"
Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.775176 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-networker-tqhtr"]
Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.954794 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-ssh-key-openstack-networker\") pod \"install-os-openstack-openstack-networker-tqhtr\" (UID: \"e890bf4c-20ec-4b45-936d-d08d3a73b5ee\") " pod="openstack/install-os-openstack-openstack-networker-tqhtr"
Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.955076 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-inventory\") pod \"install-os-openstack-openstack-networker-tqhtr\" (UID: \"e890bf4c-20ec-4b45-936d-d08d3a73b5ee\") " pod="openstack/install-os-openstack-openstack-networker-tqhtr"
Feb 20 09:10:06 crc kubenswrapper[5094]: I0220 09:10:06.955178 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4cjz\" (UniqueName: \"kubernetes.io/projected/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-kube-api-access-r4cjz\") pod \"install-os-openstack-openstack-networker-tqhtr\" (UID: \"e890bf4c-20ec-4b45-936d-d08d3a73b5ee\") " pod="openstack/install-os-openstack-openstack-networker-tqhtr"
Feb 20 09:10:07 crc kubenswrapper[5094]: I0220 09:10:07.056727 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4cjz\" (UniqueName: \"kubernetes.io/projected/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-kube-api-access-r4cjz\") pod \"install-os-openstack-openstack-networker-tqhtr\" (UID: \"e890bf4c-20ec-4b45-936d-d08d3a73b5ee\") " pod="openstack/install-os-openstack-openstack-networker-tqhtr"
Feb 20 09:10:07 crc kubenswrapper[5094]: I0220 09:10:07.057230 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-ssh-key-openstack-networker\") pod \"install-os-openstack-openstack-networker-tqhtr\" (UID: \"e890bf4c-20ec-4b45-936d-d08d3a73b5ee\") " pod="openstack/install-os-openstack-openstack-networker-tqhtr"
Feb 20 09:10:07 crc kubenswrapper[5094]: I0220 09:10:07.058112 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-inventory\") pod \"install-os-openstack-openstack-networker-tqhtr\" (UID: \"e890bf4c-20ec-4b45-936d-d08d3a73b5ee\") " pod="openstack/install-os-openstack-openstack-networker-tqhtr"
Feb 20 09:10:07 crc kubenswrapper[5094]: I0220 09:10:07.063662 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-ssh-key-openstack-networker\") pod \"install-os-openstack-openstack-networker-tqhtr\" (UID: \"e890bf4c-20ec-4b45-936d-d08d3a73b5ee\") " pod="openstack/install-os-openstack-openstack-networker-tqhtr"
Feb 20 09:10:07 crc kubenswrapper[5094]: I0220 09:10:07.064201 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-inventory\") pod \"install-os-openstack-openstack-networker-tqhtr\" (UID: \"e890bf4c-20ec-4b45-936d-d08d3a73b5ee\") " pod="openstack/install-os-openstack-openstack-networker-tqhtr"
Feb 20 09:10:07 crc kubenswrapper[5094]: I0220 09:10:07.073338 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4cjz\" (UniqueName: \"kubernetes.io/projected/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-kube-api-access-r4cjz\") pod \"install-os-openstack-openstack-networker-tqhtr\" (UID: \"e890bf4c-20ec-4b45-936d-d08d3a73b5ee\") " pod="openstack/install-os-openstack-openstack-networker-tqhtr"
Feb 20 09:10:07 crc kubenswrapper[5094]: I0220 09:10:07.082181 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-networker-tqhtr"
Feb 20 09:10:07 crc kubenswrapper[5094]: I0220 09:10:07.640322 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-networker-tqhtr"]
Feb 20 09:10:07 crc kubenswrapper[5094]: I0220 09:10:07.696046 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-networker-tqhtr" event={"ID":"e890bf4c-20ec-4b45-936d-d08d3a73b5ee","Type":"ContainerStarted","Data":"baa0af510f31ddefaf054818af5e9e587e1051876dcf44574d86835d0dea5bbc"}
Feb 20 09:10:08 crc kubenswrapper[5094]: I0220 09:10:08.710882 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-networker-tqhtr" event={"ID":"e890bf4c-20ec-4b45-936d-d08d3a73b5ee","Type":"ContainerStarted","Data":"53d37e4df512781460481b64b9d79b108f7fa1fdc52b31081e79c7d0aef5a644"}
Feb 20 09:10:08 crc kubenswrapper[5094]: I0220 09:10:08.749440 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-networker-tqhtr" podStartSLOduration=2.269640922 podStartE2EDuration="2.749409865s" podCreationTimestamp="2026-02-20 09:10:06 +0000 UTC" firstStartedPulling="2026-02-20 09:10:07.638257062 +0000 UTC m=+8622.510883783" lastFinishedPulling="2026-02-20 09:10:08.118025975 +0000 UTC m=+8622.990652726" observedRunningTime="2026-02-20 09:10:08.739992738 +0000 UTC m=+8623.612619489" watchObservedRunningTime="2026-02-20 09:10:08.749409865 +0000 UTC m=+8623.622036616"
Feb 20 09:10:10 crc kubenswrapper[5094]: I0220 09:10:10.840306 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60"
Feb 20 09:10:10 crc kubenswrapper[5094]: E0220 09:10:10.840843 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:10:21 crc kubenswrapper[5094]: I0220 09:10:21.841271 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60"
Feb 20 09:10:21 crc kubenswrapper[5094]: E0220 09:10:21.842177 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:10:33 crc kubenswrapper[5094]: I0220 09:10:33.840293 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60"
Feb 20 09:10:33 crc kubenswrapper[5094]: E0220 09:10:33.841100 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\"
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:10:34 crc kubenswrapper[5094]: I0220 09:10:34.942317 5094 generic.go:334] "Generic (PLEG): container finished" podID="ea277d62-feb7-40a2-80a9-ad1a9d82cb13" containerID="ac9dc0ee68deb94a4ff5312e0a939b0b139cabb294d3a2db6521599467eefd6e" exitCode=0 Feb 20 09:10:34 crc kubenswrapper[5094]: I0220 09:10:34.942623 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-ggj5f" event={"ID":"ea277d62-feb7-40a2-80a9-ad1a9d82cb13","Type":"ContainerDied","Data":"ac9dc0ee68deb94a4ff5312e0a939b0b139cabb294d3a2db6521599467eefd6e"} Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.395087 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-ggj5f" Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.503913 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-inventory\") pod \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.503964 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-ssh-key-openstack-cell1\") pod \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.504056 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-ceph\") pod \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.504153 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgvc4\" (UniqueName: \"kubernetes.io/projected/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-kube-api-access-tgvc4\") pod \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\" (UID: \"ea277d62-feb7-40a2-80a9-ad1a9d82cb13\") " Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.510035 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-ceph" (OuterVolumeSpecName: "ceph") pod "ea277d62-feb7-40a2-80a9-ad1a9d82cb13" (UID: "ea277d62-feb7-40a2-80a9-ad1a9d82cb13"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.510074 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-kube-api-access-tgvc4" (OuterVolumeSpecName: "kube-api-access-tgvc4") pod "ea277d62-feb7-40a2-80a9-ad1a9d82cb13" (UID: "ea277d62-feb7-40a2-80a9-ad1a9d82cb13"). InnerVolumeSpecName "kube-api-access-tgvc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.538898 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-inventory" (OuterVolumeSpecName: "inventory") pod "ea277d62-feb7-40a2-80a9-ad1a9d82cb13" (UID: "ea277d62-feb7-40a2-80a9-ad1a9d82cb13"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.542976 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "ea277d62-feb7-40a2-80a9-ad1a9d82cb13" (UID: "ea277d62-feb7-40a2-80a9-ad1a9d82cb13"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.606348 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.606406 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgvc4\" (UniqueName: \"kubernetes.io/projected/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-kube-api-access-tgvc4\") on node \"crc\" DevicePath \"\"" Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.606418 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.606427 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ea277d62-feb7-40a2-80a9-ad1a9d82cb13-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.962019 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-ggj5f" event={"ID":"ea277d62-feb7-40a2-80a9-ad1a9d82cb13","Type":"ContainerDied","Data":"77532f6a8f46591197d08ef5699cbaa0638e65ff7cf2caf3ef1ddf35fd1e6aa0"} Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.962060 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77532f6a8f46591197d08ef5699cbaa0638e65ff7cf2caf3ef1ddf35fd1e6aa0" Feb 20 09:10:36 crc kubenswrapper[5094]: I0220 09:10:36.962110 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-ggj5f" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.046611 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-k27rr"] Feb 20 09:10:37 crc kubenswrapper[5094]: E0220 09:10:37.047348 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea277d62-feb7-40a2-80a9-ad1a9d82cb13" containerName="configure-network-openstack-openstack-cell1" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.047454 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea277d62-feb7-40a2-80a9-ad1a9d82cb13" containerName="configure-network-openstack-openstack-cell1" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.047719 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea277d62-feb7-40a2-80a9-ad1a9d82cb13" containerName="configure-network-openstack-openstack-cell1" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.048507 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.051844 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.052640 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.058593 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-k27rr"] Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.220185 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktjhz\" (UniqueName: \"kubernetes.io/projected/61a917df-8faa-482f-9582-0c5737301057-kube-api-access-ktjhz\") pod \"validate-network-openstack-openstack-cell1-k27rr\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.220293 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-inventory\") pod \"validate-network-openstack-openstack-cell1-k27rr\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.221601 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-ceph\") pod \"validate-network-openstack-openstack-cell1-k27rr\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 
09:10:37.221667 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-k27rr\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.325538 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-inventory\") pod \"validate-network-openstack-openstack-cell1-k27rr\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.325859 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-ceph\") pod \"validate-network-openstack-openstack-cell1-k27rr\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.325928 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-k27rr\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.326142 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktjhz\" (UniqueName: \"kubernetes.io/projected/61a917df-8faa-482f-9582-0c5737301057-kube-api-access-ktjhz\") pod 
\"validate-network-openstack-openstack-cell1-k27rr\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.332284 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-k27rr\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.335138 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-inventory\") pod \"validate-network-openstack-openstack-cell1-k27rr\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.335529 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-ceph\") pod \"validate-network-openstack-openstack-cell1-k27rr\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.346456 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktjhz\" (UniqueName: \"kubernetes.io/projected/61a917df-8faa-482f-9582-0c5737301057-kube-api-access-ktjhz\") pod \"validate-network-openstack-openstack-cell1-k27rr\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.376224 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.900318 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-k27rr"] Feb 20 09:10:37 crc kubenswrapper[5094]: I0220 09:10:37.970643 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-k27rr" event={"ID":"61a917df-8faa-482f-9582-0c5737301057","Type":"ContainerStarted","Data":"eef06862041ecff1ebe280bbf7209ac78f7908fce316f1b44f1b738e154e8e8c"} Feb 20 09:10:38 crc kubenswrapper[5094]: I0220 09:10:38.981341 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-k27rr" event={"ID":"61a917df-8faa-482f-9582-0c5737301057","Type":"ContainerStarted","Data":"d5f2d5bed405ecbd984035db7e835305c214286b97a01996788d3dabeb541dbf"} Feb 20 09:10:39 crc kubenswrapper[5094]: I0220 09:10:39.002049 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-k27rr" podStartSLOduration=1.479114225 podStartE2EDuration="2.002028055s" podCreationTimestamp="2026-02-20 09:10:37 +0000 UTC" firstStartedPulling="2026-02-20 09:10:37.905861923 +0000 UTC m=+8652.778488634" lastFinishedPulling="2026-02-20 09:10:38.428775753 +0000 UTC m=+8653.301402464" observedRunningTime="2026-02-20 09:10:38.995348844 +0000 UTC m=+8653.867975575" watchObservedRunningTime="2026-02-20 09:10:39.002028055 +0000 UTC m=+8653.874654766" Feb 20 09:10:44 crc kubenswrapper[5094]: I0220 09:10:44.840929 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:10:44 crc kubenswrapper[5094]: E0220 09:10:44.842524 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:10:46 crc kubenswrapper[5094]: I0220 09:10:46.045373 5094 generic.go:334] "Generic (PLEG): container finished" podID="61a917df-8faa-482f-9582-0c5737301057" containerID="d5f2d5bed405ecbd984035db7e835305c214286b97a01996788d3dabeb541dbf" exitCode=0 Feb 20 09:10:46 crc kubenswrapper[5094]: I0220 09:10:46.045471 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-k27rr" event={"ID":"61a917df-8faa-482f-9582-0c5737301057","Type":"ContainerDied","Data":"d5f2d5bed405ecbd984035db7e835305c214286b97a01996788d3dabeb541dbf"} Feb 20 09:10:47 crc kubenswrapper[5094]: I0220 09:10:47.502643 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:47 crc kubenswrapper[5094]: I0220 09:10:47.651089 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-inventory\") pod \"61a917df-8faa-482f-9582-0c5737301057\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " Feb 20 09:10:47 crc kubenswrapper[5094]: I0220 09:10:47.651181 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktjhz\" (UniqueName: \"kubernetes.io/projected/61a917df-8faa-482f-9582-0c5737301057-kube-api-access-ktjhz\") pod \"61a917df-8faa-482f-9582-0c5737301057\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " Feb 20 09:10:47 crc kubenswrapper[5094]: I0220 09:10:47.651217 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-ceph\") pod \"61a917df-8faa-482f-9582-0c5737301057\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " Feb 20 09:10:47 crc kubenswrapper[5094]: I0220 09:10:47.651247 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-ssh-key-openstack-cell1\") pod \"61a917df-8faa-482f-9582-0c5737301057\" (UID: \"61a917df-8faa-482f-9582-0c5737301057\") " Feb 20 09:10:47 crc kubenswrapper[5094]: I0220 09:10:47.656485 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61a917df-8faa-482f-9582-0c5737301057-kube-api-access-ktjhz" (OuterVolumeSpecName: "kube-api-access-ktjhz") pod "61a917df-8faa-482f-9582-0c5737301057" (UID: "61a917df-8faa-482f-9582-0c5737301057"). InnerVolumeSpecName "kube-api-access-ktjhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:10:47 crc kubenswrapper[5094]: I0220 09:10:47.656874 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-ceph" (OuterVolumeSpecName: "ceph") pod "61a917df-8faa-482f-9582-0c5737301057" (UID: "61a917df-8faa-482f-9582-0c5737301057"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:10:47 crc kubenswrapper[5094]: I0220 09:10:47.678433 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-inventory" (OuterVolumeSpecName: "inventory") pod "61a917df-8faa-482f-9582-0c5737301057" (UID: "61a917df-8faa-482f-9582-0c5737301057"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:10:47 crc kubenswrapper[5094]: I0220 09:10:47.686166 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "61a917df-8faa-482f-9582-0c5737301057" (UID: "61a917df-8faa-482f-9582-0c5737301057"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:10:47 crc kubenswrapper[5094]: I0220 09:10:47.754072 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:10:47 crc kubenswrapper[5094]: I0220 09:10:47.754393 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktjhz\" (UniqueName: \"kubernetes.io/projected/61a917df-8faa-482f-9582-0c5737301057-kube-api-access-ktjhz\") on node \"crc\" DevicePath \"\"" Feb 20 09:10:47 crc kubenswrapper[5094]: I0220 09:10:47.754403 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:10:47 crc kubenswrapper[5094]: I0220 09:10:47.754412 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/61a917df-8faa-482f-9582-0c5737301057-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.065943 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-k27rr" event={"ID":"61a917df-8faa-482f-9582-0c5737301057","Type":"ContainerDied","Data":"eef06862041ecff1ebe280bbf7209ac78f7908fce316f1b44f1b738e154e8e8c"} Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.066002 5094 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eef06862041ecff1ebe280bbf7209ac78f7908fce316f1b44f1b738e154e8e8c" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.066008 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-k27rr" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.181012 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-8j6nf"] Feb 20 09:10:48 crc kubenswrapper[5094]: E0220 09:10:48.181556 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61a917df-8faa-482f-9582-0c5737301057" containerName="validate-network-openstack-openstack-cell1" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.181580 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="61a917df-8faa-482f-9582-0c5737301057" containerName="validate-network-openstack-openstack-cell1" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.181884 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="61a917df-8faa-482f-9582-0c5737301057" containerName="validate-network-openstack-openstack-cell1" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.182982 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.185499 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.192589 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.193797 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-8j6nf"] Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.269446 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-8j6nf\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.269537 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-ceph\") pod \"install-os-openstack-openstack-cell1-8j6nf\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.269581 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-inventory\") pod \"install-os-openstack-openstack-cell1-8j6nf\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.269666 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktxwn\" (UniqueName: \"kubernetes.io/projected/27e4bec3-7ef3-4f1d-897d-99909f817f5e-kube-api-access-ktxwn\") pod \"install-os-openstack-openstack-cell1-8j6nf\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.371738 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktxwn\" (UniqueName: \"kubernetes.io/projected/27e4bec3-7ef3-4f1d-897d-99909f817f5e-kube-api-access-ktxwn\") pod \"install-os-openstack-openstack-cell1-8j6nf\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.372112 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-8j6nf\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.372164 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-ceph\") pod \"install-os-openstack-openstack-cell1-8j6nf\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.372205 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-inventory\") pod \"install-os-openstack-openstack-cell1-8j6nf\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " 
pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.379324 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-ceph\") pod \"install-os-openstack-openstack-cell1-8j6nf\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.390258 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-8j6nf\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.390625 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktxwn\" (UniqueName: \"kubernetes.io/projected/27e4bec3-7ef3-4f1d-897d-99909f817f5e-kube-api-access-ktxwn\") pod \"install-os-openstack-openstack-cell1-8j6nf\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.392938 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-inventory\") pod \"install-os-openstack-openstack-cell1-8j6nf\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:10:48 crc kubenswrapper[5094]: I0220 09:10:48.550543 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:10:49 crc kubenswrapper[5094]: I0220 09:10:49.239278 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-8j6nf"] Feb 20 09:10:50 crc kubenswrapper[5094]: I0220 09:10:50.083554 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-8j6nf" event={"ID":"27e4bec3-7ef3-4f1d-897d-99909f817f5e","Type":"ContainerStarted","Data":"2213442e38aa4ed71dfd4b1cb999e22f654d497f4fc2fb09a72fef9b37908a01"} Feb 20 09:10:51 crc kubenswrapper[5094]: I0220 09:10:51.093994 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-8j6nf" event={"ID":"27e4bec3-7ef3-4f1d-897d-99909f817f5e","Type":"ContainerStarted","Data":"7453b73c41008eee230417fe9bbf595522244c350a0fb1c6a70aacaecfc7b2af"} Feb 20 09:10:51 crc kubenswrapper[5094]: I0220 09:10:51.116867 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-8j6nf" podStartSLOduration=2.22922089 podStartE2EDuration="3.116845884s" podCreationTimestamp="2026-02-20 09:10:48 +0000 UTC" firstStartedPulling="2026-02-20 09:10:49.253770543 +0000 UTC m=+8664.126397254" lastFinishedPulling="2026-02-20 09:10:50.141395527 +0000 UTC m=+8665.014022248" observedRunningTime="2026-02-20 09:10:51.112175442 +0000 UTC m=+8665.984802173" watchObservedRunningTime="2026-02-20 09:10:51.116845884 +0000 UTC m=+8665.989472605" Feb 20 09:10:54 crc kubenswrapper[5094]: I0220 09:10:54.573432 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-s8vl2"] Feb 20 09:10:54 crc kubenswrapper[5094]: I0220 09:10:54.576382 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:10:54 crc kubenswrapper[5094]: I0220 09:10:54.623066 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7bd9285-f186-43d4-a61f-181c864d71f6-catalog-content\") pod \"redhat-operators-s8vl2\" (UID: \"d7bd9285-f186-43d4-a61f-181c864d71f6\") " pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:10:54 crc kubenswrapper[5094]: I0220 09:10:54.623635 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7bd9285-f186-43d4-a61f-181c864d71f6-utilities\") pod \"redhat-operators-s8vl2\" (UID: \"d7bd9285-f186-43d4-a61f-181c864d71f6\") " pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:10:54 crc kubenswrapper[5094]: I0220 09:10:54.623682 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzbxr\" (UniqueName: \"kubernetes.io/projected/d7bd9285-f186-43d4-a61f-181c864d71f6-kube-api-access-bzbxr\") pod \"redhat-operators-s8vl2\" (UID: \"d7bd9285-f186-43d4-a61f-181c864d71f6\") " pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:10:54 crc kubenswrapper[5094]: I0220 09:10:54.633779 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s8vl2"] Feb 20 09:10:54 crc kubenswrapper[5094]: I0220 09:10:54.742992 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7bd9285-f186-43d4-a61f-181c864d71f6-utilities\") pod \"redhat-operators-s8vl2\" (UID: \"d7bd9285-f186-43d4-a61f-181c864d71f6\") " pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:10:54 crc kubenswrapper[5094]: I0220 09:10:54.743374 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-bzbxr\" (UniqueName: \"kubernetes.io/projected/d7bd9285-f186-43d4-a61f-181c864d71f6-kube-api-access-bzbxr\") pod \"redhat-operators-s8vl2\" (UID: \"d7bd9285-f186-43d4-a61f-181c864d71f6\") " pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:10:54 crc kubenswrapper[5094]: I0220 09:10:54.743578 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7bd9285-f186-43d4-a61f-181c864d71f6-catalog-content\") pod \"redhat-operators-s8vl2\" (UID: \"d7bd9285-f186-43d4-a61f-181c864d71f6\") " pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:10:54 crc kubenswrapper[5094]: I0220 09:10:54.744501 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7bd9285-f186-43d4-a61f-181c864d71f6-catalog-content\") pod \"redhat-operators-s8vl2\" (UID: \"d7bd9285-f186-43d4-a61f-181c864d71f6\") " pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:10:54 crc kubenswrapper[5094]: I0220 09:10:54.744819 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7bd9285-f186-43d4-a61f-181c864d71f6-utilities\") pod \"redhat-operators-s8vl2\" (UID: \"d7bd9285-f186-43d4-a61f-181c864d71f6\") " pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:10:54 crc kubenswrapper[5094]: I0220 09:10:54.801671 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzbxr\" (UniqueName: \"kubernetes.io/projected/d7bd9285-f186-43d4-a61f-181c864d71f6-kube-api-access-bzbxr\") pod \"redhat-operators-s8vl2\" (UID: \"d7bd9285-f186-43d4-a61f-181c864d71f6\") " pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:10:54 crc kubenswrapper[5094]: I0220 09:10:54.892928 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:10:55 crc kubenswrapper[5094]: I0220 09:10:55.374457 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-s8vl2"] Feb 20 09:10:55 crc kubenswrapper[5094]: I0220 09:10:55.846904 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:10:55 crc kubenswrapper[5094]: E0220 09:10:55.848906 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:10:56 crc kubenswrapper[5094]: I0220 09:10:56.146531 5094 generic.go:334] "Generic (PLEG): container finished" podID="d7bd9285-f186-43d4-a61f-181c864d71f6" containerID="a7de66aec64ebf91bc7e8066befb67b49d7ff32daf54e95d5a2fc05f41f9c3f8" exitCode=0 Feb 20 09:10:56 crc kubenswrapper[5094]: I0220 09:10:56.146629 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8vl2" event={"ID":"d7bd9285-f186-43d4-a61f-181c864d71f6","Type":"ContainerDied","Data":"a7de66aec64ebf91bc7e8066befb67b49d7ff32daf54e95d5a2fc05f41f9c3f8"} Feb 20 09:10:56 crc kubenswrapper[5094]: I0220 09:10:56.146966 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8vl2" event={"ID":"d7bd9285-f186-43d4-a61f-181c864d71f6","Type":"ContainerStarted","Data":"59fc4a663a25aaeb18ed744c2850733f6000d6cb2136491a9705a24d5fd293ba"} Feb 20 09:10:58 crc kubenswrapper[5094]: I0220 09:10:58.174970 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8vl2" 
event={"ID":"d7bd9285-f186-43d4-a61f-181c864d71f6","Type":"ContainerStarted","Data":"c106cc22d51f23b39a2815350e663e320ffd605e6662da151be17f5f02a81fcd"} Feb 20 09:11:00 crc kubenswrapper[5094]: I0220 09:11:00.195343 5094 generic.go:334] "Generic (PLEG): container finished" podID="d7bd9285-f186-43d4-a61f-181c864d71f6" containerID="c106cc22d51f23b39a2815350e663e320ffd605e6662da151be17f5f02a81fcd" exitCode=0 Feb 20 09:11:00 crc kubenswrapper[5094]: I0220 09:11:00.195424 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8vl2" event={"ID":"d7bd9285-f186-43d4-a61f-181c864d71f6","Type":"ContainerDied","Data":"c106cc22d51f23b39a2815350e663e320ffd605e6662da151be17f5f02a81fcd"} Feb 20 09:11:00 crc kubenswrapper[5094]: I0220 09:11:00.198190 5094 generic.go:334] "Generic (PLEG): container finished" podID="e890bf4c-20ec-4b45-936d-d08d3a73b5ee" containerID="53d37e4df512781460481b64b9d79b108f7fa1fdc52b31081e79c7d0aef5a644" exitCode=0 Feb 20 09:11:00 crc kubenswrapper[5094]: I0220 09:11:00.198220 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-networker-tqhtr" event={"ID":"e890bf4c-20ec-4b45-936d-d08d3a73b5ee","Type":"ContainerDied","Data":"53d37e4df512781460481b64b9d79b108f7fa1fdc52b31081e79c7d0aef5a644"} Feb 20 09:11:01 crc kubenswrapper[5094]: I0220 09:11:01.209391 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8vl2" event={"ID":"d7bd9285-f186-43d4-a61f-181c864d71f6","Type":"ContainerStarted","Data":"0d6c06f728ca5a8b0a916a0cfd96aec12b857fc57dad580a12deebac3cd3cfa8"} Feb 20 09:11:01 crc kubenswrapper[5094]: I0220 09:11:01.733712 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-networker-tqhtr" Feb 20 09:11:01 crc kubenswrapper[5094]: I0220 09:11:01.759918 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-s8vl2" podStartSLOduration=3.078224994 podStartE2EDuration="7.759865775s" podCreationTimestamp="2026-02-20 09:10:54 +0000 UTC" firstStartedPulling="2026-02-20 09:10:56.148472046 +0000 UTC m=+8671.021098757" lastFinishedPulling="2026-02-20 09:11:00.830112827 +0000 UTC m=+8675.702739538" observedRunningTime="2026-02-20 09:11:01.233766518 +0000 UTC m=+8676.106393229" watchObservedRunningTime="2026-02-20 09:11:01.759865775 +0000 UTC m=+8676.632492506" Feb 20 09:11:01 crc kubenswrapper[5094]: I0220 09:11:01.790716 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-ssh-key-openstack-networker\") pod \"e890bf4c-20ec-4b45-936d-d08d3a73b5ee\" (UID: \"e890bf4c-20ec-4b45-936d-d08d3a73b5ee\") " Feb 20 09:11:01 crc kubenswrapper[5094]: I0220 09:11:01.790935 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-inventory\") pod \"e890bf4c-20ec-4b45-936d-d08d3a73b5ee\" (UID: \"e890bf4c-20ec-4b45-936d-d08d3a73b5ee\") " Feb 20 09:11:01 crc kubenswrapper[5094]: I0220 09:11:01.791047 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4cjz\" (UniqueName: \"kubernetes.io/projected/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-kube-api-access-r4cjz\") pod \"e890bf4c-20ec-4b45-936d-d08d3a73b5ee\" (UID: \"e890bf4c-20ec-4b45-936d-d08d3a73b5ee\") " Feb 20 09:11:01 crc kubenswrapper[5094]: I0220 09:11:01.816037 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-kube-api-access-r4cjz" (OuterVolumeSpecName: "kube-api-access-r4cjz") pod "e890bf4c-20ec-4b45-936d-d08d3a73b5ee" (UID: "e890bf4c-20ec-4b45-936d-d08d3a73b5ee"). InnerVolumeSpecName "kube-api-access-r4cjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:11:01 crc kubenswrapper[5094]: I0220 09:11:01.819493 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-inventory" (OuterVolumeSpecName: "inventory") pod "e890bf4c-20ec-4b45-936d-d08d3a73b5ee" (UID: "e890bf4c-20ec-4b45-936d-d08d3a73b5ee"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:11:01 crc kubenswrapper[5094]: I0220 09:11:01.832284 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "e890bf4c-20ec-4b45-936d-d08d3a73b5ee" (UID: "e890bf4c-20ec-4b45-936d-d08d3a73b5ee"). InnerVolumeSpecName "ssh-key-openstack-networker". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:11:01 crc kubenswrapper[5094]: I0220 09:11:01.893209 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4cjz\" (UniqueName: \"kubernetes.io/projected/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-kube-api-access-r4cjz\") on node \"crc\" DevicePath \"\"" Feb 20 09:11:01 crc kubenswrapper[5094]: I0220 09:11:01.893244 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 20 09:11:01 crc kubenswrapper[5094]: I0220 09:11:01.893257 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e890bf4c-20ec-4b45-936d-d08d3a73b5ee-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.220371 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-networker-tqhtr" event={"ID":"e890bf4c-20ec-4b45-936d-d08d3a73b5ee","Type":"ContainerDied","Data":"baa0af510f31ddefaf054818af5e9e587e1051876dcf44574d86835d0dea5bbc"} Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.220418 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="baa0af510f31ddefaf054818af5e9e587e1051876dcf44574d86835d0dea5bbc" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.220512 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-networker-tqhtr" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.323200 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-networker-ts6jc"] Feb 20 09:11:02 crc kubenswrapper[5094]: E0220 09:11:02.323788 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e890bf4c-20ec-4b45-936d-d08d3a73b5ee" containerName="install-os-openstack-openstack-networker" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.323826 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="e890bf4c-20ec-4b45-936d-d08d3a73b5ee" containerName="install-os-openstack-openstack-networker" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.324017 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="e890bf4c-20ec-4b45-936d-d08d3a73b5ee" containerName="install-os-openstack-openstack-networker" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.324785 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-networker-ts6jc" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.330480 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-xf9xl" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.330724 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.335073 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-networker-ts6jc"] Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.404479 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/9278a86a-be7e-4e04-a187-52d0c119ccb5-ssh-key-openstack-networker\") pod \"configure-os-openstack-openstack-networker-ts6jc\" (UID: \"9278a86a-be7e-4e04-a187-52d0c119ccb5\") " pod="openstack/configure-os-openstack-openstack-networker-ts6jc" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.405008 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9278a86a-be7e-4e04-a187-52d0c119ccb5-inventory\") pod \"configure-os-openstack-openstack-networker-ts6jc\" (UID: \"9278a86a-be7e-4e04-a187-52d0c119ccb5\") " pod="openstack/configure-os-openstack-openstack-networker-ts6jc" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.405210 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2dnb\" (UniqueName: \"kubernetes.io/projected/9278a86a-be7e-4e04-a187-52d0c119ccb5-kube-api-access-m2dnb\") pod \"configure-os-openstack-openstack-networker-ts6jc\" (UID: \"9278a86a-be7e-4e04-a187-52d0c119ccb5\") " 
pod="openstack/configure-os-openstack-openstack-networker-ts6jc" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.506164 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9278a86a-be7e-4e04-a187-52d0c119ccb5-inventory\") pod \"configure-os-openstack-openstack-networker-ts6jc\" (UID: \"9278a86a-be7e-4e04-a187-52d0c119ccb5\") " pod="openstack/configure-os-openstack-openstack-networker-ts6jc" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.506292 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2dnb\" (UniqueName: \"kubernetes.io/projected/9278a86a-be7e-4e04-a187-52d0c119ccb5-kube-api-access-m2dnb\") pod \"configure-os-openstack-openstack-networker-ts6jc\" (UID: \"9278a86a-be7e-4e04-a187-52d0c119ccb5\") " pod="openstack/configure-os-openstack-openstack-networker-ts6jc" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.506369 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/9278a86a-be7e-4e04-a187-52d0c119ccb5-ssh-key-openstack-networker\") pod \"configure-os-openstack-openstack-networker-ts6jc\" (UID: \"9278a86a-be7e-4e04-a187-52d0c119ccb5\") " pod="openstack/configure-os-openstack-openstack-networker-ts6jc" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.515605 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/9278a86a-be7e-4e04-a187-52d0c119ccb5-ssh-key-openstack-networker\") pod \"configure-os-openstack-openstack-networker-ts6jc\" (UID: \"9278a86a-be7e-4e04-a187-52d0c119ccb5\") " pod="openstack/configure-os-openstack-openstack-networker-ts6jc" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.515681 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/9278a86a-be7e-4e04-a187-52d0c119ccb5-inventory\") pod \"configure-os-openstack-openstack-networker-ts6jc\" (UID: \"9278a86a-be7e-4e04-a187-52d0c119ccb5\") " pod="openstack/configure-os-openstack-openstack-networker-ts6jc" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.522985 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2dnb\" (UniqueName: \"kubernetes.io/projected/9278a86a-be7e-4e04-a187-52d0c119ccb5-kube-api-access-m2dnb\") pod \"configure-os-openstack-openstack-networker-ts6jc\" (UID: \"9278a86a-be7e-4e04-a187-52d0c119ccb5\") " pod="openstack/configure-os-openstack-openstack-networker-ts6jc" Feb 20 09:11:02 crc kubenswrapper[5094]: I0220 09:11:02.649813 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-networker-ts6jc" Feb 20 09:11:03 crc kubenswrapper[5094]: I0220 09:11:03.243438 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-networker-ts6jc"] Feb 20 09:11:03 crc kubenswrapper[5094]: W0220 09:11:03.259815 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9278a86a_be7e_4e04_a187_52d0c119ccb5.slice/crio-53482a89b4aa27a085ef4356504526ea239c0913cf6c28f257bd691f3a182a2a WatchSource:0}: Error finding container 53482a89b4aa27a085ef4356504526ea239c0913cf6c28f257bd691f3a182a2a: Status 404 returned error can't find the container with id 53482a89b4aa27a085ef4356504526ea239c0913cf6c28f257bd691f3a182a2a Feb 20 09:11:04 crc kubenswrapper[5094]: I0220 09:11:04.263885 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-networker-ts6jc" event={"ID":"9278a86a-be7e-4e04-a187-52d0c119ccb5","Type":"ContainerStarted","Data":"0ecff3a7f81df043e7dc17dcab78f646edff610b7b2c8f1e0759f2364b70587c"} Feb 20 09:11:04 crc kubenswrapper[5094]: I0220 
09:11:04.264401 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-networker-ts6jc" event={"ID":"9278a86a-be7e-4e04-a187-52d0c119ccb5","Type":"ContainerStarted","Data":"53482a89b4aa27a085ef4356504526ea239c0913cf6c28f257bd691f3a182a2a"} Feb 20 09:11:04 crc kubenswrapper[5094]: I0220 09:11:04.295813 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-networker-ts6jc" podStartSLOduration=1.802982568 podStartE2EDuration="2.295793275s" podCreationTimestamp="2026-02-20 09:11:02 +0000 UTC" firstStartedPulling="2026-02-20 09:11:03.262617228 +0000 UTC m=+8678.135243939" lastFinishedPulling="2026-02-20 09:11:03.755427935 +0000 UTC m=+8678.628054646" observedRunningTime="2026-02-20 09:11:04.282115656 +0000 UTC m=+8679.154742357" watchObservedRunningTime="2026-02-20 09:11:04.295793275 +0000 UTC m=+8679.168419986" Feb 20 09:11:04 crc kubenswrapper[5094]: I0220 09:11:04.893088 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:11:04 crc kubenswrapper[5094]: I0220 09:11:04.893386 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:11:05 crc kubenswrapper[5094]: I0220 09:11:05.941882 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-s8vl2" podUID="d7bd9285-f186-43d4-a61f-181c864d71f6" containerName="registry-server" probeResult="failure" output=< Feb 20 09:11:05 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 09:11:05 crc kubenswrapper[5094]: > Feb 20 09:11:10 crc kubenswrapper[5094]: I0220 09:11:10.840557 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:11:10 crc kubenswrapper[5094]: E0220 09:11:10.841741 5094 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:11:14 crc kubenswrapper[5094]: I0220 09:11:14.951133 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:11:15 crc kubenswrapper[5094]: I0220 09:11:15.015838 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:11:15 crc kubenswrapper[5094]: I0220 09:11:15.195975 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s8vl2"] Feb 20 09:11:16 crc kubenswrapper[5094]: I0220 09:11:16.368293 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-s8vl2" podUID="d7bd9285-f186-43d4-a61f-181c864d71f6" containerName="registry-server" containerID="cri-o://0d6c06f728ca5a8b0a916a0cfd96aec12b857fc57dad580a12deebac3cd3cfa8" gracePeriod=2 Feb 20 09:11:16 crc kubenswrapper[5094]: I0220 09:11:16.946185 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.114851 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7bd9285-f186-43d4-a61f-181c864d71f6-catalog-content\") pod \"d7bd9285-f186-43d4-a61f-181c864d71f6\" (UID: \"d7bd9285-f186-43d4-a61f-181c864d71f6\") " Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.115337 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7bd9285-f186-43d4-a61f-181c864d71f6-utilities\") pod \"d7bd9285-f186-43d4-a61f-181c864d71f6\" (UID: \"d7bd9285-f186-43d4-a61f-181c864d71f6\") " Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.115589 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzbxr\" (UniqueName: \"kubernetes.io/projected/d7bd9285-f186-43d4-a61f-181c864d71f6-kube-api-access-bzbxr\") pod \"d7bd9285-f186-43d4-a61f-181c864d71f6\" (UID: \"d7bd9285-f186-43d4-a61f-181c864d71f6\") " Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.117033 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7bd9285-f186-43d4-a61f-181c864d71f6-utilities" (OuterVolumeSpecName: "utilities") pod "d7bd9285-f186-43d4-a61f-181c864d71f6" (UID: "d7bd9285-f186-43d4-a61f-181c864d71f6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.125661 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7bd9285-f186-43d4-a61f-181c864d71f6-kube-api-access-bzbxr" (OuterVolumeSpecName: "kube-api-access-bzbxr") pod "d7bd9285-f186-43d4-a61f-181c864d71f6" (UID: "d7bd9285-f186-43d4-a61f-181c864d71f6"). InnerVolumeSpecName "kube-api-access-bzbxr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.218606 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzbxr\" (UniqueName: \"kubernetes.io/projected/d7bd9285-f186-43d4-a61f-181c864d71f6-kube-api-access-bzbxr\") on node \"crc\" DevicePath \"\"" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.218653 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7bd9285-f186-43d4-a61f-181c864d71f6-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.266439 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7bd9285-f186-43d4-a61f-181c864d71f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7bd9285-f186-43d4-a61f-181c864d71f6" (UID: "d7bd9285-f186-43d4-a61f-181c864d71f6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.321338 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7bd9285-f186-43d4-a61f-181c864d71f6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.381803 5094 generic.go:334] "Generic (PLEG): container finished" podID="d7bd9285-f186-43d4-a61f-181c864d71f6" containerID="0d6c06f728ca5a8b0a916a0cfd96aec12b857fc57dad580a12deebac3cd3cfa8" exitCode=0 Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.381857 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-s8vl2" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.381879 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8vl2" event={"ID":"d7bd9285-f186-43d4-a61f-181c864d71f6","Type":"ContainerDied","Data":"0d6c06f728ca5a8b0a916a0cfd96aec12b857fc57dad580a12deebac3cd3cfa8"} Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.382834 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-s8vl2" event={"ID":"d7bd9285-f186-43d4-a61f-181c864d71f6","Type":"ContainerDied","Data":"59fc4a663a25aaeb18ed744c2850733f6000d6cb2136491a9705a24d5fd293ba"} Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.382875 5094 scope.go:117] "RemoveContainer" containerID="0d6c06f728ca5a8b0a916a0cfd96aec12b857fc57dad580a12deebac3cd3cfa8" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.422416 5094 scope.go:117] "RemoveContainer" containerID="c106cc22d51f23b39a2815350e663e320ffd605e6662da151be17f5f02a81fcd" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.447182 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-s8vl2"] Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.464289 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-s8vl2"] Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.468394 5094 scope.go:117] "RemoveContainer" containerID="a7de66aec64ebf91bc7e8066befb67b49d7ff32daf54e95d5a2fc05f41f9c3f8" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.511870 5094 scope.go:117] "RemoveContainer" containerID="0d6c06f728ca5a8b0a916a0cfd96aec12b857fc57dad580a12deebac3cd3cfa8" Feb 20 09:11:17 crc kubenswrapper[5094]: E0220 09:11:17.512821 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0d6c06f728ca5a8b0a916a0cfd96aec12b857fc57dad580a12deebac3cd3cfa8\": container with ID starting with 0d6c06f728ca5a8b0a916a0cfd96aec12b857fc57dad580a12deebac3cd3cfa8 not found: ID does not exist" containerID="0d6c06f728ca5a8b0a916a0cfd96aec12b857fc57dad580a12deebac3cd3cfa8" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.512884 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d6c06f728ca5a8b0a916a0cfd96aec12b857fc57dad580a12deebac3cd3cfa8"} err="failed to get container status \"0d6c06f728ca5a8b0a916a0cfd96aec12b857fc57dad580a12deebac3cd3cfa8\": rpc error: code = NotFound desc = could not find container \"0d6c06f728ca5a8b0a916a0cfd96aec12b857fc57dad580a12deebac3cd3cfa8\": container with ID starting with 0d6c06f728ca5a8b0a916a0cfd96aec12b857fc57dad580a12deebac3cd3cfa8 not found: ID does not exist" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.512917 5094 scope.go:117] "RemoveContainer" containerID="c106cc22d51f23b39a2815350e663e320ffd605e6662da151be17f5f02a81fcd" Feb 20 09:11:17 crc kubenswrapper[5094]: E0220 09:11:17.513380 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c106cc22d51f23b39a2815350e663e320ffd605e6662da151be17f5f02a81fcd\": container with ID starting with c106cc22d51f23b39a2815350e663e320ffd605e6662da151be17f5f02a81fcd not found: ID does not exist" containerID="c106cc22d51f23b39a2815350e663e320ffd605e6662da151be17f5f02a81fcd" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.513483 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c106cc22d51f23b39a2815350e663e320ffd605e6662da151be17f5f02a81fcd"} err="failed to get container status \"c106cc22d51f23b39a2815350e663e320ffd605e6662da151be17f5f02a81fcd\": rpc error: code = NotFound desc = could not find container \"c106cc22d51f23b39a2815350e663e320ffd605e6662da151be17f5f02a81fcd\": container with ID 
starting with c106cc22d51f23b39a2815350e663e320ffd605e6662da151be17f5f02a81fcd not found: ID does not exist" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.513636 5094 scope.go:117] "RemoveContainer" containerID="a7de66aec64ebf91bc7e8066befb67b49d7ff32daf54e95d5a2fc05f41f9c3f8" Feb 20 09:11:17 crc kubenswrapper[5094]: E0220 09:11:17.514286 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7de66aec64ebf91bc7e8066befb67b49d7ff32daf54e95d5a2fc05f41f9c3f8\": container with ID starting with a7de66aec64ebf91bc7e8066befb67b49d7ff32daf54e95d5a2fc05f41f9c3f8 not found: ID does not exist" containerID="a7de66aec64ebf91bc7e8066befb67b49d7ff32daf54e95d5a2fc05f41f9c3f8" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.514315 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7de66aec64ebf91bc7e8066befb67b49d7ff32daf54e95d5a2fc05f41f9c3f8"} err="failed to get container status \"a7de66aec64ebf91bc7e8066befb67b49d7ff32daf54e95d5a2fc05f41f9c3f8\": rpc error: code = NotFound desc = could not find container \"a7de66aec64ebf91bc7e8066befb67b49d7ff32daf54e95d5a2fc05f41f9c3f8\": container with ID starting with a7de66aec64ebf91bc7e8066befb67b49d7ff32daf54e95d5a2fc05f41f9c3f8 not found: ID does not exist" Feb 20 09:11:17 crc kubenswrapper[5094]: I0220 09:11:17.855951 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7bd9285-f186-43d4-a61f-181c864d71f6" path="/var/lib/kubelet/pods/d7bd9285-f186-43d4-a61f-181c864d71f6/volumes" Feb 20 09:11:24 crc kubenswrapper[5094]: I0220 09:11:24.841583 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:11:24 crc kubenswrapper[5094]: E0220 09:11:24.842321 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:11:37 crc kubenswrapper[5094]: I0220 09:11:37.840927 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:11:38 crc kubenswrapper[5094]: I0220 09:11:38.604762 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"8248a8f47476a9f6ae0df25861a436e80d0332d121cab1f3067f00978bf0efaf"} Feb 20 09:11:39 crc kubenswrapper[5094]: I0220 09:11:39.615567 5094 generic.go:334] "Generic (PLEG): container finished" podID="27e4bec3-7ef3-4f1d-897d-99909f817f5e" containerID="7453b73c41008eee230417fe9bbf595522244c350a0fb1c6a70aacaecfc7b2af" exitCode=0 Feb 20 09:11:39 crc kubenswrapper[5094]: I0220 09:11:39.615600 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-8j6nf" event={"ID":"27e4bec3-7ef3-4f1d-897d-99909f817f5e","Type":"ContainerDied","Data":"7453b73c41008eee230417fe9bbf595522244c350a0fb1c6a70aacaecfc7b2af"} Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.124738 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.271611 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktxwn\" (UniqueName: \"kubernetes.io/projected/27e4bec3-7ef3-4f1d-897d-99909f817f5e-kube-api-access-ktxwn\") pod \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.271690 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-inventory\") pod \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.271888 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-ceph\") pod \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.271968 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-ssh-key-openstack-cell1\") pod \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\" (UID: \"27e4bec3-7ef3-4f1d-897d-99909f817f5e\") " Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.278894 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-ceph" (OuterVolumeSpecName: "ceph") pod "27e4bec3-7ef3-4f1d-897d-99909f817f5e" (UID: "27e4bec3-7ef3-4f1d-897d-99909f817f5e"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.279000 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27e4bec3-7ef3-4f1d-897d-99909f817f5e-kube-api-access-ktxwn" (OuterVolumeSpecName: "kube-api-access-ktxwn") pod "27e4bec3-7ef3-4f1d-897d-99909f817f5e" (UID: "27e4bec3-7ef3-4f1d-897d-99909f817f5e"). InnerVolumeSpecName "kube-api-access-ktxwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.306891 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "27e4bec3-7ef3-4f1d-897d-99909f817f5e" (UID: "27e4bec3-7ef3-4f1d-897d-99909f817f5e"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.322605 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-inventory" (OuterVolumeSpecName: "inventory") pod "27e4bec3-7ef3-4f1d-897d-99909f817f5e" (UID: "27e4bec3-7ef3-4f1d-897d-99909f817f5e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.375087 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktxwn\" (UniqueName: \"kubernetes.io/projected/27e4bec3-7ef3-4f1d-897d-99909f817f5e-kube-api-access-ktxwn\") on node \"crc\" DevicePath \"\"" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.375125 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.375140 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.375151 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/27e4bec3-7ef3-4f1d-897d-99909f817f5e-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.635579 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-8j6nf" event={"ID":"27e4bec3-7ef3-4f1d-897d-99909f817f5e","Type":"ContainerDied","Data":"2213442e38aa4ed71dfd4b1cb999e22f654d497f4fc2fb09a72fef9b37908a01"} Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.635608 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-8j6nf" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.635617 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2213442e38aa4ed71dfd4b1cb999e22f654d497f4fc2fb09a72fef9b37908a01" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.722152 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-x6wr7"] Feb 20 09:11:41 crc kubenswrapper[5094]: E0220 09:11:41.722578 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7bd9285-f186-43d4-a61f-181c864d71f6" containerName="extract-content" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.722592 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7bd9285-f186-43d4-a61f-181c864d71f6" containerName="extract-content" Feb 20 09:11:41 crc kubenswrapper[5094]: E0220 09:11:41.722623 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7bd9285-f186-43d4-a61f-181c864d71f6" containerName="extract-utilities" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.722630 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7bd9285-f186-43d4-a61f-181c864d71f6" containerName="extract-utilities" Feb 20 09:11:41 crc kubenswrapper[5094]: E0220 09:11:41.722637 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27e4bec3-7ef3-4f1d-897d-99909f817f5e" containerName="install-os-openstack-openstack-cell1" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.722644 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="27e4bec3-7ef3-4f1d-897d-99909f817f5e" containerName="install-os-openstack-openstack-cell1" Feb 20 09:11:41 crc kubenswrapper[5094]: E0220 09:11:41.722656 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7bd9285-f186-43d4-a61f-181c864d71f6" containerName="registry-server" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.722661 5094 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d7bd9285-f186-43d4-a61f-181c864d71f6" containerName="registry-server" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.722922 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="27e4bec3-7ef3-4f1d-897d-99909f817f5e" containerName="install-os-openstack-openstack-cell1" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.722951 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7bd9285-f186-43d4-a61f-181c864d71f6" containerName="registry-server" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.733397 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-x6wr7"] Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.733504 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.737445 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.741550 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.883826 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-x6wr7\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") " pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.883894 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjq54\" (UniqueName: 
\"kubernetes.io/projected/dd90b879-7bfd-480f-b25e-b7aef96a4b08-kube-api-access-qjq54\") pod \"configure-os-openstack-openstack-cell1-x6wr7\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") " pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.884068 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-inventory\") pod \"configure-os-openstack-openstack-cell1-x6wr7\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") " pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.884163 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-ceph\") pod \"configure-os-openstack-openstack-cell1-x6wr7\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") " pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.986285 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-x6wr7\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") " pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.986365 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjq54\" (UniqueName: \"kubernetes.io/projected/dd90b879-7bfd-480f-b25e-b7aef96a4b08-kube-api-access-qjq54\") pod \"configure-os-openstack-openstack-cell1-x6wr7\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") " pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" Feb 20 09:11:41 crc 
kubenswrapper[5094]: I0220 09:11:41.986430 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-inventory\") pod \"configure-os-openstack-openstack-cell1-x6wr7\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") " pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.986470 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-ceph\") pod \"configure-os-openstack-openstack-cell1-x6wr7\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") " pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.990541 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-x6wr7\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") " pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.990753 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-inventory\") pod \"configure-os-openstack-openstack-cell1-x6wr7\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") " pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" Feb 20 09:11:41 crc kubenswrapper[5094]: I0220 09:11:41.990836 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-ceph\") pod \"configure-os-openstack-openstack-cell1-x6wr7\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") " 
pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" Feb 20 09:11:42 crc kubenswrapper[5094]: I0220 09:11:42.006822 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjq54\" (UniqueName: \"kubernetes.io/projected/dd90b879-7bfd-480f-b25e-b7aef96a4b08-kube-api-access-qjq54\") pod \"configure-os-openstack-openstack-cell1-x6wr7\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") " pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" Feb 20 09:11:42 crc kubenswrapper[5094]: I0220 09:11:42.050001 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" Feb 20 09:11:42 crc kubenswrapper[5094]: I0220 09:11:42.579919 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-x6wr7"] Feb 20 09:11:42 crc kubenswrapper[5094]: I0220 09:11:42.645752 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" event={"ID":"dd90b879-7bfd-480f-b25e-b7aef96a4b08","Type":"ContainerStarted","Data":"2d2225b86ca01f28b60c92f79f63009b9f37ea550a1f62fbf59736694ec08b08"} Feb 20 09:11:43 crc kubenswrapper[5094]: I0220 09:11:43.654511 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" event={"ID":"dd90b879-7bfd-480f-b25e-b7aef96a4b08","Type":"ContainerStarted","Data":"e10f72773552a2cc40555e02e367ba4fdd371e010afefe0f9daa523b1f3f9ffa"} Feb 20 09:11:43 crc kubenswrapper[5094]: I0220 09:11:43.672403 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" podStartSLOduration=2.151355225 podStartE2EDuration="2.672388601s" podCreationTimestamp="2026-02-20 09:11:41 +0000 UTC" firstStartedPulling="2026-02-20 09:11:42.581993238 +0000 UTC m=+8717.454619949" lastFinishedPulling="2026-02-20 09:11:43.103026624 +0000 UTC 
m=+8717.975653325" observedRunningTime="2026-02-20 09:11:43.669946013 +0000 UTC m=+8718.542572724" watchObservedRunningTime="2026-02-20 09:11:43.672388601 +0000 UTC m=+8718.545015312" Feb 20 09:11:53 crc kubenswrapper[5094]: I0220 09:11:53.759898 5094 generic.go:334] "Generic (PLEG): container finished" podID="9278a86a-be7e-4e04-a187-52d0c119ccb5" containerID="0ecff3a7f81df043e7dc17dcab78f646edff610b7b2c8f1e0759f2364b70587c" exitCode=0 Feb 20 09:11:53 crc kubenswrapper[5094]: I0220 09:11:53.760483 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-networker-ts6jc" event={"ID":"9278a86a-be7e-4e04-a187-52d0c119ccb5","Type":"ContainerDied","Data":"0ecff3a7f81df043e7dc17dcab78f646edff610b7b2c8f1e0759f2364b70587c"} Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.254190 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-networker-ts6jc" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.371839 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9278a86a-be7e-4e04-a187-52d0c119ccb5-inventory\") pod \"9278a86a-be7e-4e04-a187-52d0c119ccb5\" (UID: \"9278a86a-be7e-4e04-a187-52d0c119ccb5\") " Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.372184 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2dnb\" (UniqueName: \"kubernetes.io/projected/9278a86a-be7e-4e04-a187-52d0c119ccb5-kube-api-access-m2dnb\") pod \"9278a86a-be7e-4e04-a187-52d0c119ccb5\" (UID: \"9278a86a-be7e-4e04-a187-52d0c119ccb5\") " Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.372278 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/9278a86a-be7e-4e04-a187-52d0c119ccb5-ssh-key-openstack-networker\") pod 
\"9278a86a-be7e-4e04-a187-52d0c119ccb5\" (UID: \"9278a86a-be7e-4e04-a187-52d0c119ccb5\") " Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.385761 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9278a86a-be7e-4e04-a187-52d0c119ccb5-kube-api-access-m2dnb" (OuterVolumeSpecName: "kube-api-access-m2dnb") pod "9278a86a-be7e-4e04-a187-52d0c119ccb5" (UID: "9278a86a-be7e-4e04-a187-52d0c119ccb5"). InnerVolumeSpecName "kube-api-access-m2dnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.399912 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9278a86a-be7e-4e04-a187-52d0c119ccb5-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "9278a86a-be7e-4e04-a187-52d0c119ccb5" (UID: "9278a86a-be7e-4e04-a187-52d0c119ccb5"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.400862 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9278a86a-be7e-4e04-a187-52d0c119ccb5-inventory" (OuterVolumeSpecName: "inventory") pod "9278a86a-be7e-4e04-a187-52d0c119ccb5" (UID: "9278a86a-be7e-4e04-a187-52d0c119ccb5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.475124 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9278a86a-be7e-4e04-a187-52d0c119ccb5-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.475164 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2dnb\" (UniqueName: \"kubernetes.io/projected/9278a86a-be7e-4e04-a187-52d0c119ccb5-kube-api-access-m2dnb\") on node \"crc\" DevicePath \"\"" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.475180 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/9278a86a-be7e-4e04-a187-52d0c119ccb5-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.781188 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-networker-ts6jc" event={"ID":"9278a86a-be7e-4e04-a187-52d0c119ccb5","Type":"ContainerDied","Data":"53482a89b4aa27a085ef4356504526ea239c0913cf6c28f257bd691f3a182a2a"} Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.781495 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53482a89b4aa27a085ef4356504526ea239c0913cf6c28f257bd691f3a182a2a" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.781785 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-networker-ts6jc" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.868381 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-networker-f78nb"] Feb 20 09:11:55 crc kubenswrapper[5094]: E0220 09:11:55.868968 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9278a86a-be7e-4e04-a187-52d0c119ccb5" containerName="configure-os-openstack-openstack-networker" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.868994 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="9278a86a-be7e-4e04-a187-52d0c119ccb5" containerName="configure-os-openstack-openstack-networker" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.869191 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="9278a86a-be7e-4e04-a187-52d0c119ccb5" containerName="configure-os-openstack-openstack-networker" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.869981 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-networker-f78nb" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.878964 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-xf9xl" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.879158 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.933319 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-networker-f78nb"] Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.983942 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/95f054cd-db3e-45e0-9e12-55c2da3b5a23-ssh-key-openstack-networker\") pod \"run-os-openstack-openstack-networker-f78nb\" (UID: \"95f054cd-db3e-45e0-9e12-55c2da3b5a23\") " pod="openstack/run-os-openstack-openstack-networker-f78nb" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.984028 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95f054cd-db3e-45e0-9e12-55c2da3b5a23-inventory\") pod \"run-os-openstack-openstack-networker-f78nb\" (UID: \"95f054cd-db3e-45e0-9e12-55c2da3b5a23\") " pod="openstack/run-os-openstack-openstack-networker-f78nb" Feb 20 09:11:55 crc kubenswrapper[5094]: I0220 09:11:55.984469 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t9js\" (UniqueName: \"kubernetes.io/projected/95f054cd-db3e-45e0-9e12-55c2da3b5a23-kube-api-access-5t9js\") pod \"run-os-openstack-openstack-networker-f78nb\" (UID: \"95f054cd-db3e-45e0-9e12-55c2da3b5a23\") " pod="openstack/run-os-openstack-openstack-networker-f78nb" Feb 20 09:11:56 crc kubenswrapper[5094]: 
I0220 09:11:56.086529 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/95f054cd-db3e-45e0-9e12-55c2da3b5a23-ssh-key-openstack-networker\") pod \"run-os-openstack-openstack-networker-f78nb\" (UID: \"95f054cd-db3e-45e0-9e12-55c2da3b5a23\") " pod="openstack/run-os-openstack-openstack-networker-f78nb" Feb 20 09:11:56 crc kubenswrapper[5094]: I0220 09:11:56.087788 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95f054cd-db3e-45e0-9e12-55c2da3b5a23-inventory\") pod \"run-os-openstack-openstack-networker-f78nb\" (UID: \"95f054cd-db3e-45e0-9e12-55c2da3b5a23\") " pod="openstack/run-os-openstack-openstack-networker-f78nb" Feb 20 09:11:56 crc kubenswrapper[5094]: I0220 09:11:56.088547 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t9js\" (UniqueName: \"kubernetes.io/projected/95f054cd-db3e-45e0-9e12-55c2da3b5a23-kube-api-access-5t9js\") pod \"run-os-openstack-openstack-networker-f78nb\" (UID: \"95f054cd-db3e-45e0-9e12-55c2da3b5a23\") " pod="openstack/run-os-openstack-openstack-networker-f78nb" Feb 20 09:11:56 crc kubenswrapper[5094]: I0220 09:11:56.093041 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/95f054cd-db3e-45e0-9e12-55c2da3b5a23-ssh-key-openstack-networker\") pod \"run-os-openstack-openstack-networker-f78nb\" (UID: \"95f054cd-db3e-45e0-9e12-55c2da3b5a23\") " pod="openstack/run-os-openstack-openstack-networker-f78nb" Feb 20 09:11:56 crc kubenswrapper[5094]: I0220 09:11:56.093079 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95f054cd-db3e-45e0-9e12-55c2da3b5a23-inventory\") pod \"run-os-openstack-openstack-networker-f78nb\" (UID: 
\"95f054cd-db3e-45e0-9e12-55c2da3b5a23\") " pod="openstack/run-os-openstack-openstack-networker-f78nb" Feb 20 09:11:56 crc kubenswrapper[5094]: I0220 09:11:56.105159 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t9js\" (UniqueName: \"kubernetes.io/projected/95f054cd-db3e-45e0-9e12-55c2da3b5a23-kube-api-access-5t9js\") pod \"run-os-openstack-openstack-networker-f78nb\" (UID: \"95f054cd-db3e-45e0-9e12-55c2da3b5a23\") " pod="openstack/run-os-openstack-openstack-networker-f78nb" Feb 20 09:11:56 crc kubenswrapper[5094]: I0220 09:11:56.238521 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-networker-f78nb" Feb 20 09:11:56 crc kubenswrapper[5094]: I0220 09:11:56.838166 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-networker-f78nb"] Feb 20 09:11:56 crc kubenswrapper[5094]: W0220 09:11:56.842528 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95f054cd_db3e_45e0_9e12_55c2da3b5a23.slice/crio-733bbb9a6c003e5dd19885f94d1191a64cd06c603ddd3f8a22990f44410e13d8 WatchSource:0}: Error finding container 733bbb9a6c003e5dd19885f94d1191a64cd06c603ddd3f8a22990f44410e13d8: Status 404 returned error can't find the container with id 733bbb9a6c003e5dd19885f94d1191a64cd06c603ddd3f8a22990f44410e13d8 Feb 20 09:11:57 crc kubenswrapper[5094]: I0220 09:11:57.803696 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-networker-f78nb" event={"ID":"95f054cd-db3e-45e0-9e12-55c2da3b5a23","Type":"ContainerStarted","Data":"4d50aac56336d9f1985d507ee643e6f65ac7303c55439b06997bd5c0b8b104d5"} Feb 20 09:11:57 crc kubenswrapper[5094]: I0220 09:11:57.804087 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-networker-f78nb" 
event={"ID":"95f054cd-db3e-45e0-9e12-55c2da3b5a23","Type":"ContainerStarted","Data":"733bbb9a6c003e5dd19885f94d1191a64cd06c603ddd3f8a22990f44410e13d8"} Feb 20 09:11:57 crc kubenswrapper[5094]: I0220 09:11:57.825844 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-networker-f78nb" podStartSLOduration=2.226813976 podStartE2EDuration="2.825823966s" podCreationTimestamp="2026-02-20 09:11:55 +0000 UTC" firstStartedPulling="2026-02-20 09:11:56.846906775 +0000 UTC m=+8731.719533486" lastFinishedPulling="2026-02-20 09:11:57.445916755 +0000 UTC m=+8732.318543476" observedRunningTime="2026-02-20 09:11:57.820840565 +0000 UTC m=+8732.693467276" watchObservedRunningTime="2026-02-20 09:11:57.825823966 +0000 UTC m=+8732.698450677" Feb 20 09:12:05 crc kubenswrapper[5094]: I0220 09:12:05.881641 5094 generic.go:334] "Generic (PLEG): container finished" podID="95f054cd-db3e-45e0-9e12-55c2da3b5a23" containerID="4d50aac56336d9f1985d507ee643e6f65ac7303c55439b06997bd5c0b8b104d5" exitCode=0 Feb 20 09:12:05 crc kubenswrapper[5094]: I0220 09:12:05.881779 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-networker-f78nb" event={"ID":"95f054cd-db3e-45e0-9e12-55c2da3b5a23","Type":"ContainerDied","Data":"4d50aac56336d9f1985d507ee643e6f65ac7303c55439b06997bd5c0b8b104d5"} Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.384742 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-networker-f78nb"
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.411318 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t9js\" (UniqueName: \"kubernetes.io/projected/95f054cd-db3e-45e0-9e12-55c2da3b5a23-kube-api-access-5t9js\") pod \"95f054cd-db3e-45e0-9e12-55c2da3b5a23\" (UID: \"95f054cd-db3e-45e0-9e12-55c2da3b5a23\") "
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.411389 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/95f054cd-db3e-45e0-9e12-55c2da3b5a23-ssh-key-openstack-networker\") pod \"95f054cd-db3e-45e0-9e12-55c2da3b5a23\" (UID: \"95f054cd-db3e-45e0-9e12-55c2da3b5a23\") "
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.411440 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95f054cd-db3e-45e0-9e12-55c2da3b5a23-inventory\") pod \"95f054cd-db3e-45e0-9e12-55c2da3b5a23\" (UID: \"95f054cd-db3e-45e0-9e12-55c2da3b5a23\") "
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.417251 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95f054cd-db3e-45e0-9e12-55c2da3b5a23-kube-api-access-5t9js" (OuterVolumeSpecName: "kube-api-access-5t9js") pod "95f054cd-db3e-45e0-9e12-55c2da3b5a23" (UID: "95f054cd-db3e-45e0-9e12-55c2da3b5a23"). InnerVolumeSpecName "kube-api-access-5t9js". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.444000 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95f054cd-db3e-45e0-9e12-55c2da3b5a23-inventory" (OuterVolumeSpecName: "inventory") pod "95f054cd-db3e-45e0-9e12-55c2da3b5a23" (UID: "95f054cd-db3e-45e0-9e12-55c2da3b5a23"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.445103 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95f054cd-db3e-45e0-9e12-55c2da3b5a23-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "95f054cd-db3e-45e0-9e12-55c2da3b5a23" (UID: "95f054cd-db3e-45e0-9e12-55c2da3b5a23"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.520231 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5t9js\" (UniqueName: \"kubernetes.io/projected/95f054cd-db3e-45e0-9e12-55c2da3b5a23-kube-api-access-5t9js\") on node \"crc\" DevicePath \"\""
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.520278 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/95f054cd-db3e-45e0-9e12-55c2da3b5a23-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\""
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.520288 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/95f054cd-db3e-45e0-9e12-55c2da3b5a23-inventory\") on node \"crc\" DevicePath \"\""
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.901506 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-networker-f78nb" event={"ID":"95f054cd-db3e-45e0-9e12-55c2da3b5a23","Type":"ContainerDied","Data":"733bbb9a6c003e5dd19885f94d1191a64cd06c603ddd3f8a22990f44410e13d8"}
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.901549 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="733bbb9a6c003e5dd19885f94d1191a64cd06c603ddd3f8a22990f44410e13d8"
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.901564 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-networker-f78nb"
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.968442 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-networker-x57s5"]
Feb 20 09:12:07 crc kubenswrapper[5094]: E0220 09:12:07.969101 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95f054cd-db3e-45e0-9e12-55c2da3b5a23" containerName="run-os-openstack-openstack-networker"
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.969127 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="95f054cd-db3e-45e0-9e12-55c2da3b5a23" containerName="run-os-openstack-openstack-networker"
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.969360 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="95f054cd-db3e-45e0-9e12-55c2da3b5a23" containerName="run-os-openstack-openstack-networker"
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.970312 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-networker-x57s5"
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.972850 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-xf9xl"
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.976874 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker"
Feb 20 09:12:07 crc kubenswrapper[5094]: I0220 09:12:07.977825 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-networker-x57s5"]
Feb 20 09:12:08 crc kubenswrapper[5094]: I0220 09:12:08.037109 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4pcj\" (UniqueName: \"kubernetes.io/projected/5e673130-22d3-4300-a143-c2821deb8cac-kube-api-access-d4pcj\") pod \"reboot-os-openstack-openstack-networker-x57s5\" (UID: \"5e673130-22d3-4300-a143-c2821deb8cac\") " pod="openstack/reboot-os-openstack-openstack-networker-x57s5"
Feb 20 09:12:08 crc kubenswrapper[5094]: I0220 09:12:08.037185 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/5e673130-22d3-4300-a143-c2821deb8cac-ssh-key-openstack-networker\") pod \"reboot-os-openstack-openstack-networker-x57s5\" (UID: \"5e673130-22d3-4300-a143-c2821deb8cac\") " pod="openstack/reboot-os-openstack-openstack-networker-x57s5"
Feb 20 09:12:08 crc kubenswrapper[5094]: I0220 09:12:08.037224 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e673130-22d3-4300-a143-c2821deb8cac-inventory\") pod \"reboot-os-openstack-openstack-networker-x57s5\" (UID: \"5e673130-22d3-4300-a143-c2821deb8cac\") " pod="openstack/reboot-os-openstack-openstack-networker-x57s5"
Feb 20 09:12:08 crc kubenswrapper[5094]: I0220 09:12:08.139407 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4pcj\" (UniqueName: \"kubernetes.io/projected/5e673130-22d3-4300-a143-c2821deb8cac-kube-api-access-d4pcj\") pod \"reboot-os-openstack-openstack-networker-x57s5\" (UID: \"5e673130-22d3-4300-a143-c2821deb8cac\") " pod="openstack/reboot-os-openstack-openstack-networker-x57s5"
Feb 20 09:12:08 crc kubenswrapper[5094]: I0220 09:12:08.139489 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/5e673130-22d3-4300-a143-c2821deb8cac-ssh-key-openstack-networker\") pod \"reboot-os-openstack-openstack-networker-x57s5\" (UID: \"5e673130-22d3-4300-a143-c2821deb8cac\") " pod="openstack/reboot-os-openstack-openstack-networker-x57s5"
Feb 20 09:12:08 crc kubenswrapper[5094]: I0220 09:12:08.139524 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e673130-22d3-4300-a143-c2821deb8cac-inventory\") pod \"reboot-os-openstack-openstack-networker-x57s5\" (UID: \"5e673130-22d3-4300-a143-c2821deb8cac\") " pod="openstack/reboot-os-openstack-openstack-networker-x57s5"
Feb 20 09:12:08 crc kubenswrapper[5094]: I0220 09:12:08.143445 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/5e673130-22d3-4300-a143-c2821deb8cac-ssh-key-openstack-networker\") pod \"reboot-os-openstack-openstack-networker-x57s5\" (UID: \"5e673130-22d3-4300-a143-c2821deb8cac\") " pod="openstack/reboot-os-openstack-openstack-networker-x57s5"
Feb 20 09:12:08 crc kubenswrapper[5094]: I0220 09:12:08.146287 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e673130-22d3-4300-a143-c2821deb8cac-inventory\") pod \"reboot-os-openstack-openstack-networker-x57s5\" (UID: \"5e673130-22d3-4300-a143-c2821deb8cac\") " pod="openstack/reboot-os-openstack-openstack-networker-x57s5"
Feb 20 09:12:08 crc kubenswrapper[5094]: I0220 09:12:08.154802 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4pcj\" (UniqueName: \"kubernetes.io/projected/5e673130-22d3-4300-a143-c2821deb8cac-kube-api-access-d4pcj\") pod \"reboot-os-openstack-openstack-networker-x57s5\" (UID: \"5e673130-22d3-4300-a143-c2821deb8cac\") " pod="openstack/reboot-os-openstack-openstack-networker-x57s5"
Feb 20 09:12:08 crc kubenswrapper[5094]: I0220 09:12:08.302654 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-networker-x57s5"
Feb 20 09:12:08 crc kubenswrapper[5094]: I0220 09:12:08.848794 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-networker-x57s5"]
Feb 20 09:12:08 crc kubenswrapper[5094]: W0220 09:12:08.850971 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e673130_22d3_4300_a143_c2821deb8cac.slice/crio-c883ef19578f8d353a6ef42c0db66ef99ceed01665ac9d0bfc434b57a06445b3 WatchSource:0}: Error finding container c883ef19578f8d353a6ef42c0db66ef99ceed01665ac9d0bfc434b57a06445b3: Status 404 returned error can't find the container with id c883ef19578f8d353a6ef42c0db66ef99ceed01665ac9d0bfc434b57a06445b3
Feb 20 09:12:08 crc kubenswrapper[5094]: I0220 09:12:08.912741 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-networker-x57s5" event={"ID":"5e673130-22d3-4300-a143-c2821deb8cac","Type":"ContainerStarted","Data":"c883ef19578f8d353a6ef42c0db66ef99ceed01665ac9d0bfc434b57a06445b3"}
Feb 20 09:12:09 crc kubenswrapper[5094]: I0220 09:12:09.925343 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-networker-x57s5" event={"ID":"5e673130-22d3-4300-a143-c2821deb8cac","Type":"ContainerStarted","Data":"f4999008d71561b3987b5f2c05b58a5e54c18f25547d77eaa61d1ea04c4081ea"}
Feb 20 09:12:09 crc kubenswrapper[5094]: I0220 09:12:09.947054 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-networker-x57s5" podStartSLOduration=2.172587316 podStartE2EDuration="2.947034689s" podCreationTimestamp="2026-02-20 09:12:07 +0000 UTC" firstStartedPulling="2026-02-20 09:12:08.852740932 +0000 UTC m=+8743.725367643" lastFinishedPulling="2026-02-20 09:12:09.627188275 +0000 UTC m=+8744.499815016" observedRunningTime="2026-02-20 09:12:09.939557449 +0000 UTC m=+8744.812184170" watchObservedRunningTime="2026-02-20 09:12:09.947034689 +0000 UTC m=+8744.819661400"
Feb 20 09:12:24 crc kubenswrapper[5094]: I0220 09:12:24.072885 5094 generic.go:334] "Generic (PLEG): container finished" podID="5e673130-22d3-4300-a143-c2821deb8cac" containerID="f4999008d71561b3987b5f2c05b58a5e54c18f25547d77eaa61d1ea04c4081ea" exitCode=0
Feb 20 09:12:24 crc kubenswrapper[5094]: I0220 09:12:24.072989 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-networker-x57s5" event={"ID":"5e673130-22d3-4300-a143-c2821deb8cac","Type":"ContainerDied","Data":"f4999008d71561b3987b5f2c05b58a5e54c18f25547d77eaa61d1ea04c4081ea"}
Feb 20 09:12:25 crc kubenswrapper[5094]: I0220 09:12:25.604731 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-networker-x57s5"
Feb 20 09:12:25 crc kubenswrapper[5094]: I0220 09:12:25.725723 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e673130-22d3-4300-a143-c2821deb8cac-inventory\") pod \"5e673130-22d3-4300-a143-c2821deb8cac\" (UID: \"5e673130-22d3-4300-a143-c2821deb8cac\") "
Feb 20 09:12:25 crc kubenswrapper[5094]: I0220 09:12:25.726057 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/5e673130-22d3-4300-a143-c2821deb8cac-ssh-key-openstack-networker\") pod \"5e673130-22d3-4300-a143-c2821deb8cac\" (UID: \"5e673130-22d3-4300-a143-c2821deb8cac\") "
Feb 20 09:12:25 crc kubenswrapper[5094]: I0220 09:12:25.726092 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4pcj\" (UniqueName: \"kubernetes.io/projected/5e673130-22d3-4300-a143-c2821deb8cac-kube-api-access-d4pcj\") pod \"5e673130-22d3-4300-a143-c2821deb8cac\" (UID: \"5e673130-22d3-4300-a143-c2821deb8cac\") "
Feb 20 09:12:25 crc kubenswrapper[5094]: I0220 09:12:25.734968 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e673130-22d3-4300-a143-c2821deb8cac-kube-api-access-d4pcj" (OuterVolumeSpecName: "kube-api-access-d4pcj") pod "5e673130-22d3-4300-a143-c2821deb8cac" (UID: "5e673130-22d3-4300-a143-c2821deb8cac"). InnerVolumeSpecName "kube-api-access-d4pcj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:12:25 crc kubenswrapper[5094]: I0220 09:12:25.753109 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e673130-22d3-4300-a143-c2821deb8cac-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "5e673130-22d3-4300-a143-c2821deb8cac" (UID: "5e673130-22d3-4300-a143-c2821deb8cac"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:12:25 crc kubenswrapper[5094]: I0220 09:12:25.761207 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e673130-22d3-4300-a143-c2821deb8cac-inventory" (OuterVolumeSpecName: "inventory") pod "5e673130-22d3-4300-a143-c2821deb8cac" (UID: "5e673130-22d3-4300-a143-c2821deb8cac"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:12:25 crc kubenswrapper[5094]: I0220 09:12:25.829287 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/5e673130-22d3-4300-a143-c2821deb8cac-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\""
Feb 20 09:12:25 crc kubenswrapper[5094]: I0220 09:12:25.829342 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4pcj\" (UniqueName: \"kubernetes.io/projected/5e673130-22d3-4300-a143-c2821deb8cac-kube-api-access-d4pcj\") on node \"crc\" DevicePath \"\""
Feb 20 09:12:25 crc kubenswrapper[5094]: I0220 09:12:25.829356 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e673130-22d3-4300-a143-c2821deb8cac-inventory\") on node \"crc\" DevicePath \"\""
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.093608 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-networker-x57s5" event={"ID":"5e673130-22d3-4300-a143-c2821deb8cac","Type":"ContainerDied","Data":"c883ef19578f8d353a6ef42c0db66ef99ceed01665ac9d0bfc434b57a06445b3"}
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.093651 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c883ef19578f8d353a6ef42c0db66ef99ceed01665ac9d0bfc434b57a06445b3"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.093652 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-networker-x57s5"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.200240 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-networker-xdx54"]
Feb 20 09:12:26 crc kubenswrapper[5094]: E0220 09:12:26.201051 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e673130-22d3-4300-a143-c2821deb8cac" containerName="reboot-os-openstack-openstack-networker"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.201073 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e673130-22d3-4300-a143-c2821deb8cac" containerName="reboot-os-openstack-openstack-networker"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.201333 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e673130-22d3-4300-a143-c2821deb8cac" containerName="reboot-os-openstack-openstack-networker"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.202165 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.209312 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-networker-xdx54"]
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.237419 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.237650 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-xf9xl"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.249572 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-inventory\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.249677 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxj2k\" (UniqueName: \"kubernetes.io/projected/21d27f85-64a1-4dc5-af39-89275cce2427-kube-api-access-bxj2k\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.249734 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.249826 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.249873 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.249897 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-ssh-key-openstack-networker\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.351600 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.351669 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.351690 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-ssh-key-openstack-networker\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.351778 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-inventory\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.351828 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxj2k\" (UniqueName: \"kubernetes.io/projected/21d27f85-64a1-4dc5-af39-89275cce2427-kube-api-access-bxj2k\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.351856 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.358440 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-inventory\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.358807 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.360879 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-ssh-key-openstack-networker\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.361120 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.366384 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.371872 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxj2k\" (UniqueName: \"kubernetes.io/projected/21d27f85-64a1-4dc5-af39-89275cce2427-kube-api-access-bxj2k\") pod \"install-certs-openstack-openstack-networker-xdx54\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:26 crc kubenswrapper[5094]: I0220 09:12:26.554277 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-networker-xdx54"
Feb 20 09:12:27 crc kubenswrapper[5094]: I0220 09:12:27.098237 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-networker-xdx54"]
Feb 20 09:12:28 crc kubenswrapper[5094]: I0220 09:12:28.114878 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-networker-xdx54" event={"ID":"21d27f85-64a1-4dc5-af39-89275cce2427","Type":"ContainerStarted","Data":"6c17819c117daffaa74cb33a9b4a26dc4bfdc09ad0bd109886743a58f1bd45bb"}
Feb 20 09:12:28 crc kubenswrapper[5094]: I0220 09:12:28.115163 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-networker-xdx54" event={"ID":"21d27f85-64a1-4dc5-af39-89275cce2427","Type":"ContainerStarted","Data":"c2f19ab64377ee28e9a91ca87c0aaae5c486f7814233b439c782195562b6cbca"}
Feb 20 09:12:28 crc kubenswrapper[5094]: I0220 09:12:28.147183 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-networker-xdx54" podStartSLOduration=1.7502919129999999 podStartE2EDuration="2.14715875s" podCreationTimestamp="2026-02-20 09:12:26 +0000 UTC" firstStartedPulling="2026-02-20 09:12:27.1042726 +0000 UTC m=+8761.976899311" lastFinishedPulling="2026-02-20 09:12:27.501139437 +0000 UTC m=+8762.373766148" observedRunningTime="2026-02-20 09:12:28.134181957 +0000 UTC m=+8763.006808678" watchObservedRunningTime="2026-02-20 09:12:28.14715875 +0000 UTC m=+8763.019785461"
Feb 20 09:12:32 crc kubenswrapper[5094]: I0220 09:12:32.161566 5094 generic.go:334] "Generic (PLEG): container finished" podID="dd90b879-7bfd-480f-b25e-b7aef96a4b08" containerID="e10f72773552a2cc40555e02e367ba4fdd371e010afefe0f9daa523b1f3f9ffa" exitCode=0
Feb 20 09:12:32 crc kubenswrapper[5094]: I0220 09:12:32.161649 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" event={"ID":"dd90b879-7bfd-480f-b25e-b7aef96a4b08","Type":"ContainerDied","Data":"e10f72773552a2cc40555e02e367ba4fdd371e010afefe0f9daa523b1f3f9ffa"}
Feb 20 09:12:33 crc kubenswrapper[5094]: I0220 09:12:33.685411 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-x6wr7"
Feb 20 09:12:33 crc kubenswrapper[5094]: I0220 09:12:33.814438 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjq54\" (UniqueName: \"kubernetes.io/projected/dd90b879-7bfd-480f-b25e-b7aef96a4b08-kube-api-access-qjq54\") pod \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") "
Feb 20 09:12:33 crc kubenswrapper[5094]: I0220 09:12:33.814573 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-ceph\") pod \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") "
Feb 20 09:12:33 crc kubenswrapper[5094]: I0220 09:12:33.814687 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-ssh-key-openstack-cell1\") pod \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") "
Feb 20 09:12:33 crc kubenswrapper[5094]: I0220 09:12:33.814844 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-inventory\") pod \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\" (UID: \"dd90b879-7bfd-480f-b25e-b7aef96a4b08\") "
Feb 20 09:12:33 crc kubenswrapper[5094]: I0220 09:12:33.827988 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-ceph" (OuterVolumeSpecName: "ceph") pod "dd90b879-7bfd-480f-b25e-b7aef96a4b08" (UID: "dd90b879-7bfd-480f-b25e-b7aef96a4b08"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:12:33 crc kubenswrapper[5094]: I0220 09:12:33.828065 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd90b879-7bfd-480f-b25e-b7aef96a4b08-kube-api-access-qjq54" (OuterVolumeSpecName: "kube-api-access-qjq54") pod "dd90b879-7bfd-480f-b25e-b7aef96a4b08" (UID: "dd90b879-7bfd-480f-b25e-b7aef96a4b08"). InnerVolumeSpecName "kube-api-access-qjq54". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:12:33 crc kubenswrapper[5094]: I0220 09:12:33.842097 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "dd90b879-7bfd-480f-b25e-b7aef96a4b08" (UID: "dd90b879-7bfd-480f-b25e-b7aef96a4b08"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:12:33 crc kubenswrapper[5094]: I0220 09:12:33.842889 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-inventory" (OuterVolumeSpecName: "inventory") pod "dd90b879-7bfd-480f-b25e-b7aef96a4b08" (UID: "dd90b879-7bfd-480f-b25e-b7aef96a4b08"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:12:33 crc kubenswrapper[5094]: I0220 09:12:33.916745 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-inventory\") on node \"crc\" DevicePath \"\""
Feb 20 09:12:33 crc kubenswrapper[5094]: I0220 09:12:33.916788 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjq54\" (UniqueName: \"kubernetes.io/projected/dd90b879-7bfd-480f-b25e-b7aef96a4b08-kube-api-access-qjq54\") on node \"crc\" DevicePath \"\""
Feb 20 09:12:33 crc kubenswrapper[5094]: I0220 09:12:33.916800 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-ceph\") on node \"crc\" DevicePath \"\""
Feb 20 09:12:33 crc kubenswrapper[5094]: I0220 09:12:33.916809 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dd90b879-7bfd-480f-b25e-b7aef96a4b08-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\""
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.181727 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-x6wr7" event={"ID":"dd90b879-7bfd-480f-b25e-b7aef96a4b08","Type":"ContainerDied","Data":"2d2225b86ca01f28b60c92f79f63009b9f37ea550a1f62fbf59736694ec08b08"}
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.181765 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d2225b86ca01f28b60c92f79f63009b9f37ea550a1f62fbf59736694ec08b08"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.181828 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-x6wr7"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.317560 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-zb287"]
Feb 20 09:12:34 crc kubenswrapper[5094]: E0220 09:12:34.318094 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd90b879-7bfd-480f-b25e-b7aef96a4b08" containerName="configure-os-openstack-openstack-cell1"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.318118 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd90b879-7bfd-480f-b25e-b7aef96a4b08" containerName="configure-os-openstack-openstack-cell1"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.318381 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd90b879-7bfd-480f-b25e-b7aef96a4b08" containerName="configure-os-openstack-openstack-cell1"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.332501 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-zb287"]
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.332608 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-zb287"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.337294 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.337523 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.426859 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ceph\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.426905 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.426946 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cznz8\" (UniqueName: \"kubernetes.io/projected/387ade3f-0ebb-4488-8a04-389a018fc31d-kube-api-access-cznz8\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.426996 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-inventory-1\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.427017 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-inventory-0\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.427041 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ssh-key-openstack-networker\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.528771 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ceph\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.528822 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.528861 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cznz8\" (UniqueName: \"kubernetes.io/projected/387ade3f-0ebb-4488-8a04-389a018fc31d-kube-api-access-cznz8\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.528918 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-inventory-1\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.528944 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-inventory-0\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.528969 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ssh-key-openstack-networker\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.535427 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-inventory-1\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287"
Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.536170 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ssh-key-openstack-cell1\") pod
\"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287" Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.540317 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-inventory-0\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287" Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.540804 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ssh-key-openstack-networker\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287" Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.552120 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ceph\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287" Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.552479 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cznz8\" (UniqueName: \"kubernetes.io/projected/387ade3f-0ebb-4488-8a04-389a018fc31d-kube-api-access-cznz8\") pod \"ssh-known-hosts-openstack-zb287\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " pod="openstack/ssh-known-hosts-openstack-zb287" Feb 20 09:12:34 crc kubenswrapper[5094]: I0220 09:12:34.682693 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-zb287" Feb 20 09:12:35 crc kubenswrapper[5094]: I0220 09:12:35.182420 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-zb287"] Feb 20 09:12:36 crc kubenswrapper[5094]: I0220 09:12:36.202973 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-zb287" event={"ID":"387ade3f-0ebb-4488-8a04-389a018fc31d","Type":"ContainerStarted","Data":"b79daef797bf727b856e9ca751d28441e3426324d5faa4cbdf09b3eda8c461b5"} Feb 20 09:12:36 crc kubenswrapper[5094]: I0220 09:12:36.203793 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-zb287" event={"ID":"387ade3f-0ebb-4488-8a04-389a018fc31d","Type":"ContainerStarted","Data":"2a9b407833d6a7dbd7bb656dedee626bc225b77bd13ecbd8d16ede5733462de6"} Feb 20 09:12:36 crc kubenswrapper[5094]: I0220 09:12:36.231534 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-zb287" podStartSLOduration=1.754032516 podStartE2EDuration="2.231513683s" podCreationTimestamp="2026-02-20 09:12:34 +0000 UTC" firstStartedPulling="2026-02-20 09:12:35.18519555 +0000 UTC m=+8770.057822261" lastFinishedPulling="2026-02-20 09:12:35.662676717 +0000 UTC m=+8770.535303428" observedRunningTime="2026-02-20 09:12:36.225432036 +0000 UTC m=+8771.098058757" watchObservedRunningTime="2026-02-20 09:12:36.231513683 +0000 UTC m=+8771.104140394" Feb 20 09:12:38 crc kubenswrapper[5094]: I0220 09:12:38.224556 5094 generic.go:334] "Generic (PLEG): container finished" podID="21d27f85-64a1-4dc5-af39-89275cce2427" containerID="6c17819c117daffaa74cb33a9b4a26dc4bfdc09ad0bd109886743a58f1bd45bb" exitCode=0 Feb 20 09:12:38 crc kubenswrapper[5094]: I0220 09:12:38.224623 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-networker-xdx54" 
event={"ID":"21d27f85-64a1-4dc5-af39-89275cce2427","Type":"ContainerDied","Data":"6c17819c117daffaa74cb33a9b4a26dc4bfdc09ad0bd109886743a58f1bd45bb"} Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.683956 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-networker-xdx54" Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.843310 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-ovn-combined-ca-bundle\") pod \"21d27f85-64a1-4dc5-af39-89275cce2427\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.843517 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-ssh-key-openstack-networker\") pod \"21d27f85-64a1-4dc5-af39-89275cce2427\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.843566 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-neutron-metadata-combined-ca-bundle\") pod \"21d27f85-64a1-4dc5-af39-89275cce2427\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.843612 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxj2k\" (UniqueName: \"kubernetes.io/projected/21d27f85-64a1-4dc5-af39-89275cce2427-kube-api-access-bxj2k\") pod \"21d27f85-64a1-4dc5-af39-89275cce2427\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.843775 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-inventory\") pod \"21d27f85-64a1-4dc5-af39-89275cce2427\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.843914 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-bootstrap-combined-ca-bundle\") pod \"21d27f85-64a1-4dc5-af39-89275cce2427\" (UID: \"21d27f85-64a1-4dc5-af39-89275cce2427\") " Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.849778 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "21d27f85-64a1-4dc5-af39-89275cce2427" (UID: "21d27f85-64a1-4dc5-af39-89275cce2427"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.850279 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21d27f85-64a1-4dc5-af39-89275cce2427-kube-api-access-bxj2k" (OuterVolumeSpecName: "kube-api-access-bxj2k") pod "21d27f85-64a1-4dc5-af39-89275cce2427" (UID: "21d27f85-64a1-4dc5-af39-89275cce2427"). InnerVolumeSpecName "kube-api-access-bxj2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.850799 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "21d27f85-64a1-4dc5-af39-89275cce2427" (UID: "21d27f85-64a1-4dc5-af39-89275cce2427"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.851195 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "21d27f85-64a1-4dc5-af39-89275cce2427" (UID: "21d27f85-64a1-4dc5-af39-89275cce2427"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.876145 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "21d27f85-64a1-4dc5-af39-89275cce2427" (UID: "21d27f85-64a1-4dc5-af39-89275cce2427"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.894912 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-inventory" (OuterVolumeSpecName: "inventory") pod "21d27f85-64a1-4dc5-af39-89275cce2427" (UID: "21d27f85-64a1-4dc5-af39-89275cce2427"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.948381 5094 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.948435 5094 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.948453 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.948473 5094 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.948595 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxj2k\" (UniqueName: \"kubernetes.io/projected/21d27f85-64a1-4dc5-af39-89275cce2427-kube-api-access-bxj2k\") on node \"crc\" DevicePath \"\"" Feb 20 09:12:39 crc kubenswrapper[5094]: I0220 09:12:39.948621 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/21d27f85-64a1-4dc5-af39-89275cce2427-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.245186 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-networker-xdx54" 
event={"ID":"21d27f85-64a1-4dc5-af39-89275cce2427","Type":"ContainerDied","Data":"c2f19ab64377ee28e9a91ca87c0aaae5c486f7814233b439c782195562b6cbca"} Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.245497 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2f19ab64377ee28e9a91ca87c0aaae5c486f7814233b439c782195562b6cbca" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.245250 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-networker-xdx54" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.329961 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-networker-r6vzk"] Feb 20 09:12:40 crc kubenswrapper[5094]: E0220 09:12:40.330440 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21d27f85-64a1-4dc5-af39-89275cce2427" containerName="install-certs-openstack-openstack-networker" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.330457 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="21d27f85-64a1-4dc5-af39-89275cce2427" containerName="install-certs-openstack-openstack-networker" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.330684 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="21d27f85-64a1-4dc5-af39-89275cce2427" containerName="install-certs-openstack-openstack-networker" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.331385 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.333591 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-xf9xl" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.333718 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.342942 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-networker-r6vzk"] Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.458905 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-networker-r6vzk\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.458954 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ssh-key-openstack-networker\") pod \"ovn-openstack-openstack-networker-r6vzk\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.458979 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-inventory\") pod \"ovn-openstack-openstack-networker-r6vzk\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.459328 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ovncontroller-config-0\") pod \"ovn-openstack-openstack-networker-r6vzk\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.459593 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znpgd\" (UniqueName: \"kubernetes.io/projected/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-kube-api-access-znpgd\") pod \"ovn-openstack-openstack-networker-r6vzk\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.561906 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znpgd\" (UniqueName: \"kubernetes.io/projected/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-kube-api-access-znpgd\") pod \"ovn-openstack-openstack-networker-r6vzk\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.562233 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-networker-r6vzk\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.562331 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ssh-key-openstack-networker\") pod \"ovn-openstack-openstack-networker-r6vzk\" 
(UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.562431 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-inventory\") pod \"ovn-openstack-openstack-networker-r6vzk\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.562587 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ovncontroller-config-0\") pod \"ovn-openstack-openstack-networker-r6vzk\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.564374 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ovncontroller-config-0\") pod \"ovn-openstack-openstack-networker-r6vzk\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.566020 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ssh-key-openstack-networker\") pod \"ovn-openstack-openstack-networker-r6vzk\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.572288 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-networker-r6vzk\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.573224 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-inventory\") pod \"ovn-openstack-openstack-networker-r6vzk\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.589484 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znpgd\" (UniqueName: \"kubernetes.io/projected/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-kube-api-access-znpgd\") pod \"ovn-openstack-openstack-networker-r6vzk\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:40 crc kubenswrapper[5094]: I0220 09:12:40.649888 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:12:41 crc kubenswrapper[5094]: W0220 09:12:41.211782 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ef4f2ef_92a7_4d12_94a9_e3ee55412547.slice/crio-e1588c72c435155f35fdf02546c5ae6b532e58027baa39a785ec6d828ecdf55b WatchSource:0}: Error finding container e1588c72c435155f35fdf02546c5ae6b532e58027baa39a785ec6d828ecdf55b: Status 404 returned error can't find the container with id e1588c72c435155f35fdf02546c5ae6b532e58027baa39a785ec6d828ecdf55b Feb 20 09:12:41 crc kubenswrapper[5094]: I0220 09:12:41.214746 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-networker-r6vzk"] Feb 20 09:12:41 crc kubenswrapper[5094]: I0220 09:12:41.258294 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-networker-r6vzk" event={"ID":"3ef4f2ef-92a7-4d12-94a9-e3ee55412547","Type":"ContainerStarted","Data":"e1588c72c435155f35fdf02546c5ae6b532e58027baa39a785ec6d828ecdf55b"} Feb 20 09:12:42 crc kubenswrapper[5094]: I0220 09:12:42.268505 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-networker-r6vzk" event={"ID":"3ef4f2ef-92a7-4d12-94a9-e3ee55412547","Type":"ContainerStarted","Data":"a7bba99fdce7206f2a33a9b6752071dfb931fd478046625f8eac18d1bd63f378"} Feb 20 09:12:42 crc kubenswrapper[5094]: I0220 09:12:42.287718 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-networker-r6vzk" podStartSLOduration=1.807364718 podStartE2EDuration="2.287688833s" podCreationTimestamp="2026-02-20 09:12:40 +0000 UTC" firstStartedPulling="2026-02-20 09:12:41.213780377 +0000 UTC m=+8776.086407098" lastFinishedPulling="2026-02-20 09:12:41.694104492 +0000 UTC m=+8776.566731213" observedRunningTime="2026-02-20 09:12:42.281841342 +0000 UTC 
m=+8777.154468053" watchObservedRunningTime="2026-02-20 09:12:42.287688833 +0000 UTC m=+8777.160315544" Feb 20 09:12:50 crc kubenswrapper[5094]: I0220 09:12:50.355674 5094 generic.go:334] "Generic (PLEG): container finished" podID="387ade3f-0ebb-4488-8a04-389a018fc31d" containerID="b79daef797bf727b856e9ca751d28441e3426324d5faa4cbdf09b3eda8c461b5" exitCode=0 Feb 20 09:12:50 crc kubenswrapper[5094]: I0220 09:12:50.356001 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-zb287" event={"ID":"387ade3f-0ebb-4488-8a04-389a018fc31d","Type":"ContainerDied","Data":"b79daef797bf727b856e9ca751d28441e3426324d5faa4cbdf09b3eda8c461b5"} Feb 20 09:12:51 crc kubenswrapper[5094]: I0220 09:12:51.797891 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-zb287" Feb 20 09:12:51 crc kubenswrapper[5094]: I0220 09:12:51.912569 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-inventory-1\") pod \"387ade3f-0ebb-4488-8a04-389a018fc31d\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " Feb 20 09:12:51 crc kubenswrapper[5094]: I0220 09:12:51.912655 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-inventory-0\") pod \"387ade3f-0ebb-4488-8a04-389a018fc31d\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " Feb 20 09:12:51 crc kubenswrapper[5094]: I0220 09:12:51.912829 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ceph\") pod \"387ade3f-0ebb-4488-8a04-389a018fc31d\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " Feb 20 09:12:51 crc kubenswrapper[5094]: I0220 09:12:51.912889 5094 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ssh-key-openstack-networker\") pod \"387ade3f-0ebb-4488-8a04-389a018fc31d\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " Feb 20 09:12:51 crc kubenswrapper[5094]: I0220 09:12:51.912916 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ssh-key-openstack-cell1\") pod \"387ade3f-0ebb-4488-8a04-389a018fc31d\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " Feb 20 09:12:51 crc kubenswrapper[5094]: I0220 09:12:51.912980 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cznz8\" (UniqueName: \"kubernetes.io/projected/387ade3f-0ebb-4488-8a04-389a018fc31d-kube-api-access-cznz8\") pod \"387ade3f-0ebb-4488-8a04-389a018fc31d\" (UID: \"387ade3f-0ebb-4488-8a04-389a018fc31d\") " Feb 20 09:12:51 crc kubenswrapper[5094]: I0220 09:12:51.919904 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/387ade3f-0ebb-4488-8a04-389a018fc31d-kube-api-access-cznz8" (OuterVolumeSpecName: "kube-api-access-cznz8") pod "387ade3f-0ebb-4488-8a04-389a018fc31d" (UID: "387ade3f-0ebb-4488-8a04-389a018fc31d"). InnerVolumeSpecName "kube-api-access-cznz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:12:51 crc kubenswrapper[5094]: I0220 09:12:51.920213 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ceph" (OuterVolumeSpecName: "ceph") pod "387ade3f-0ebb-4488-8a04-389a018fc31d" (UID: "387ade3f-0ebb-4488-8a04-389a018fc31d"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:12:51 crc kubenswrapper[5094]: I0220 09:12:51.946429 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "387ade3f-0ebb-4488-8a04-389a018fc31d" (UID: "387ade3f-0ebb-4488-8a04-389a018fc31d"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:12:51 crc kubenswrapper[5094]: I0220 09:12:51.947724 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-inventory-1" (OuterVolumeSpecName: "inventory-1") pod "387ade3f-0ebb-4488-8a04-389a018fc31d" (UID: "387ade3f-0ebb-4488-8a04-389a018fc31d"). InnerVolumeSpecName "inventory-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:12:51 crc kubenswrapper[5094]: I0220 09:12:51.948333 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "387ade3f-0ebb-4488-8a04-389a018fc31d" (UID: "387ade3f-0ebb-4488-8a04-389a018fc31d"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:12:51 crc kubenswrapper[5094]: I0220 09:12:51.958362 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "387ade3f-0ebb-4488-8a04-389a018fc31d" (UID: "387ade3f-0ebb-4488-8a04-389a018fc31d"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.015886 5094 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.015920 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.015932 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.015945 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.015956 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cznz8\" (UniqueName: \"kubernetes.io/projected/387ade3f-0ebb-4488-8a04-389a018fc31d-kube-api-access-cznz8\") on node \"crc\" DevicePath \"\"" Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.015968 5094 reconciler_common.go:293] "Volume detached for volume \"inventory-1\" (UniqueName: \"kubernetes.io/secret/387ade3f-0ebb-4488-8a04-389a018fc31d-inventory-1\") on node \"crc\" DevicePath \"\"" Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.377014 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-zb287" event={"ID":"387ade3f-0ebb-4488-8a04-389a018fc31d","Type":"ContainerDied","Data":"2a9b407833d6a7dbd7bb656dedee626bc225b77bd13ecbd8d16ede5733462de6"} Feb 20 
09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.377058 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a9b407833d6a7dbd7bb656dedee626bc225b77bd13ecbd8d16ede5733462de6" Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.377089 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-zb287" Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.467898 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-fwcg9"] Feb 20 09:12:52 crc kubenswrapper[5094]: E0220 09:12:52.468358 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="387ade3f-0ebb-4488-8a04-389a018fc31d" containerName="ssh-known-hosts-openstack" Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.468376 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="387ade3f-0ebb-4488-8a04-389a018fc31d" containerName="ssh-known-hosts-openstack" Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.468563 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="387ade3f-0ebb-4488-8a04-389a018fc31d" containerName="ssh-known-hosts-openstack" Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.469320 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-fwcg9" Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.471472 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.471700 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.480134 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-fwcg9"] Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.628658 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjwds\" (UniqueName: \"kubernetes.io/projected/73a5aa76-8e8f-4235-bd0d-294f718698fa-kube-api-access-hjwds\") pod \"run-os-openstack-openstack-cell1-fwcg9\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") " pod="openstack/run-os-openstack-openstack-cell1-fwcg9" Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.629008 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-ceph\") pod \"run-os-openstack-openstack-cell1-fwcg9\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") " pod="openstack/run-os-openstack-openstack-cell1-fwcg9" Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.629093 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-inventory\") pod \"run-os-openstack-openstack-cell1-fwcg9\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") " pod="openstack/run-os-openstack-openstack-cell1-fwcg9" Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.629146 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-fwcg9\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") " pod="openstack/run-os-openstack-openstack-cell1-fwcg9" Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.730683 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjwds\" (UniqueName: \"kubernetes.io/projected/73a5aa76-8e8f-4235-bd0d-294f718698fa-kube-api-access-hjwds\") pod \"run-os-openstack-openstack-cell1-fwcg9\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") " pod="openstack/run-os-openstack-openstack-cell1-fwcg9" Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.730757 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-ceph\") pod \"run-os-openstack-openstack-cell1-fwcg9\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") " pod="openstack/run-os-openstack-openstack-cell1-fwcg9" Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.730829 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-inventory\") pod \"run-os-openstack-openstack-cell1-fwcg9\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") " pod="openstack/run-os-openstack-openstack-cell1-fwcg9" Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.730897 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-fwcg9\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") " pod="openstack/run-os-openstack-openstack-cell1-fwcg9" Feb 20 09:12:52 crc 
kubenswrapper[5094]: I0220 09:12:52.736465 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-ceph\") pod \"run-os-openstack-openstack-cell1-fwcg9\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") " pod="openstack/run-os-openstack-openstack-cell1-fwcg9" Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.739167 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-fwcg9\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") " pod="openstack/run-os-openstack-openstack-cell1-fwcg9" Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.739804 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-inventory\") pod \"run-os-openstack-openstack-cell1-fwcg9\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") " pod="openstack/run-os-openstack-openstack-cell1-fwcg9" Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.757315 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjwds\" (UniqueName: \"kubernetes.io/projected/73a5aa76-8e8f-4235-bd0d-294f718698fa-kube-api-access-hjwds\") pod \"run-os-openstack-openstack-cell1-fwcg9\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") " pod="openstack/run-os-openstack-openstack-cell1-fwcg9" Feb 20 09:12:52 crc kubenswrapper[5094]: I0220 09:12:52.822293 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-fwcg9" Feb 20 09:12:53 crc kubenswrapper[5094]: I0220 09:12:53.361426 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-fwcg9"] Feb 20 09:12:53 crc kubenswrapper[5094]: I0220 09:12:53.389766 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-fwcg9" event={"ID":"73a5aa76-8e8f-4235-bd0d-294f718698fa","Type":"ContainerStarted","Data":"feb9c238dfa2d325536cf0d59da45a3a354dfd03f841db94b04d9035676d0d5c"} Feb 20 09:12:54 crc kubenswrapper[5094]: I0220 09:12:54.400119 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-fwcg9" event={"ID":"73a5aa76-8e8f-4235-bd0d-294f718698fa","Type":"ContainerStarted","Data":"4c5cfabbfd79a25aaeb041af0c03152e4aeb36ff29c53e0b94ec44ab7b88d4b4"} Feb 20 09:12:54 crc kubenswrapper[5094]: I0220 09:12:54.426672 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-fwcg9" podStartSLOduration=1.9479881190000001 podStartE2EDuration="2.426649294s" podCreationTimestamp="2026-02-20 09:12:52 +0000 UTC" firstStartedPulling="2026-02-20 09:12:53.367265407 +0000 UTC m=+8788.239892118" lastFinishedPulling="2026-02-20 09:12:53.845926582 +0000 UTC m=+8788.718553293" observedRunningTime="2026-02-20 09:12:54.414551553 +0000 UTC m=+8789.287178264" watchObservedRunningTime="2026-02-20 09:12:54.426649294 +0000 UTC m=+8789.299276005" Feb 20 09:12:58 crc kubenswrapper[5094]: I0220 09:12:58.804148 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-swkr9"] Feb 20 09:12:58 crc kubenswrapper[5094]: I0220 09:12:58.808588 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-swkr9" Feb 20 09:12:58 crc kubenswrapper[5094]: I0220 09:12:58.821249 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-swkr9"] Feb 20 09:12:58 crc kubenswrapper[5094]: I0220 09:12:58.891768 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd4d9\" (UniqueName: \"kubernetes.io/projected/18ea844c-6c90-4428-a5af-073c2b5500b6-kube-api-access-vd4d9\") pod \"certified-operators-swkr9\" (UID: \"18ea844c-6c90-4428-a5af-073c2b5500b6\") " pod="openshift-marketplace/certified-operators-swkr9" Feb 20 09:12:58 crc kubenswrapper[5094]: I0220 09:12:58.891862 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18ea844c-6c90-4428-a5af-073c2b5500b6-utilities\") pod \"certified-operators-swkr9\" (UID: \"18ea844c-6c90-4428-a5af-073c2b5500b6\") " pod="openshift-marketplace/certified-operators-swkr9" Feb 20 09:12:58 crc kubenswrapper[5094]: I0220 09:12:58.891931 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18ea844c-6c90-4428-a5af-073c2b5500b6-catalog-content\") pod \"certified-operators-swkr9\" (UID: \"18ea844c-6c90-4428-a5af-073c2b5500b6\") " pod="openshift-marketplace/certified-operators-swkr9" Feb 20 09:12:58 crc kubenswrapper[5094]: I0220 09:12:58.993471 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd4d9\" (UniqueName: \"kubernetes.io/projected/18ea844c-6c90-4428-a5af-073c2b5500b6-kube-api-access-vd4d9\") pod \"certified-operators-swkr9\" (UID: \"18ea844c-6c90-4428-a5af-073c2b5500b6\") " pod="openshift-marketplace/certified-operators-swkr9" Feb 20 09:12:58 crc kubenswrapper[5094]: I0220 09:12:58.993772 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18ea844c-6c90-4428-a5af-073c2b5500b6-utilities\") pod \"certified-operators-swkr9\" (UID: \"18ea844c-6c90-4428-a5af-073c2b5500b6\") " pod="openshift-marketplace/certified-operators-swkr9" Feb 20 09:12:58 crc kubenswrapper[5094]: I0220 09:12:58.993942 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18ea844c-6c90-4428-a5af-073c2b5500b6-catalog-content\") pod \"certified-operators-swkr9\" (UID: \"18ea844c-6c90-4428-a5af-073c2b5500b6\") " pod="openshift-marketplace/certified-operators-swkr9" Feb 20 09:12:58 crc kubenswrapper[5094]: I0220 09:12:58.994305 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18ea844c-6c90-4428-a5af-073c2b5500b6-utilities\") pod \"certified-operators-swkr9\" (UID: \"18ea844c-6c90-4428-a5af-073c2b5500b6\") " pod="openshift-marketplace/certified-operators-swkr9" Feb 20 09:12:58 crc kubenswrapper[5094]: I0220 09:12:58.994781 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18ea844c-6c90-4428-a5af-073c2b5500b6-catalog-content\") pod \"certified-operators-swkr9\" (UID: \"18ea844c-6c90-4428-a5af-073c2b5500b6\") " pod="openshift-marketplace/certified-operators-swkr9" Feb 20 09:12:59 crc kubenswrapper[5094]: I0220 09:12:59.034204 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd4d9\" (UniqueName: \"kubernetes.io/projected/18ea844c-6c90-4428-a5af-073c2b5500b6-kube-api-access-vd4d9\") pod \"certified-operators-swkr9\" (UID: \"18ea844c-6c90-4428-a5af-073c2b5500b6\") " pod="openshift-marketplace/certified-operators-swkr9" Feb 20 09:12:59 crc kubenswrapper[5094]: I0220 09:12:59.129280 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-swkr9" Feb 20 09:12:59 crc kubenswrapper[5094]: I0220 09:12:59.625601 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-swkr9"] Feb 20 09:13:00 crc kubenswrapper[5094]: I0220 09:13:00.460908 5094 generic.go:334] "Generic (PLEG): container finished" podID="18ea844c-6c90-4428-a5af-073c2b5500b6" containerID="d55a313737dc1834267e2d5132e8dc3b4582b8466b0451ffea87f75b85b148d0" exitCode=0 Feb 20 09:13:00 crc kubenswrapper[5094]: I0220 09:13:00.460990 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swkr9" event={"ID":"18ea844c-6c90-4428-a5af-073c2b5500b6","Type":"ContainerDied","Data":"d55a313737dc1834267e2d5132e8dc3b4582b8466b0451ffea87f75b85b148d0"} Feb 20 09:13:00 crc kubenswrapper[5094]: I0220 09:13:00.461616 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swkr9" event={"ID":"18ea844c-6c90-4428-a5af-073c2b5500b6","Type":"ContainerStarted","Data":"c914d995a63fd7594c0e4bdb37a076bf16b8dc6dce300cea3fef7134aad270b1"} Feb 20 09:13:01 crc kubenswrapper[5094]: I0220 09:13:01.474557 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swkr9" event={"ID":"18ea844c-6c90-4428-a5af-073c2b5500b6","Type":"ContainerStarted","Data":"cc78476448bdfbfe0b18fb3660e5595ec12eeec57060e43709c41f959f215d8d"} Feb 20 09:13:03 crc kubenswrapper[5094]: I0220 09:13:03.492528 5094 generic.go:334] "Generic (PLEG): container finished" podID="18ea844c-6c90-4428-a5af-073c2b5500b6" containerID="cc78476448bdfbfe0b18fb3660e5595ec12eeec57060e43709c41f959f215d8d" exitCode=0 Feb 20 09:13:03 crc kubenswrapper[5094]: I0220 09:13:03.492601 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swkr9" 
event={"ID":"18ea844c-6c90-4428-a5af-073c2b5500b6","Type":"ContainerDied","Data":"cc78476448bdfbfe0b18fb3660e5595ec12eeec57060e43709c41f959f215d8d"} Feb 20 09:13:04 crc kubenswrapper[5094]: I0220 09:13:04.520453 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swkr9" event={"ID":"18ea844c-6c90-4428-a5af-073c2b5500b6","Type":"ContainerStarted","Data":"9e59a5dfa0b9d46498e6efd644ef2e28a2e6a9be1df2122d4e2c68bb612de0aa"} Feb 20 09:13:04 crc kubenswrapper[5094]: I0220 09:13:04.524246 5094 generic.go:334] "Generic (PLEG): container finished" podID="73a5aa76-8e8f-4235-bd0d-294f718698fa" containerID="4c5cfabbfd79a25aaeb041af0c03152e4aeb36ff29c53e0b94ec44ab7b88d4b4" exitCode=0 Feb 20 09:13:04 crc kubenswrapper[5094]: I0220 09:13:04.524283 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-fwcg9" event={"ID":"73a5aa76-8e8f-4235-bd0d-294f718698fa","Type":"ContainerDied","Data":"4c5cfabbfd79a25aaeb041af0c03152e4aeb36ff29c53e0b94ec44ab7b88d4b4"} Feb 20 09:13:04 crc kubenswrapper[5094]: I0220 09:13:04.545853 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-swkr9" podStartSLOduration=3.146602253 podStartE2EDuration="6.545835542s" podCreationTimestamp="2026-02-20 09:12:58 +0000 UTC" firstStartedPulling="2026-02-20 09:13:00.464657817 +0000 UTC m=+8795.337284528" lastFinishedPulling="2026-02-20 09:13:03.863891096 +0000 UTC m=+8798.736517817" observedRunningTime="2026-02-20 09:13:04.541009997 +0000 UTC m=+8799.413636698" watchObservedRunningTime="2026-02-20 09:13:04.545835542 +0000 UTC m=+8799.418462253" Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.064393 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-fwcg9" Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.151607 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-ceph\") pod \"73a5aa76-8e8f-4235-bd0d-294f718698fa\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") " Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.151671 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjwds\" (UniqueName: \"kubernetes.io/projected/73a5aa76-8e8f-4235-bd0d-294f718698fa-kube-api-access-hjwds\") pod \"73a5aa76-8e8f-4235-bd0d-294f718698fa\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") " Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.151733 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-ssh-key-openstack-cell1\") pod \"73a5aa76-8e8f-4235-bd0d-294f718698fa\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") " Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.151877 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-inventory\") pod \"73a5aa76-8e8f-4235-bd0d-294f718698fa\" (UID: \"73a5aa76-8e8f-4235-bd0d-294f718698fa\") " Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.158830 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-ceph" (OuterVolumeSpecName: "ceph") pod "73a5aa76-8e8f-4235-bd0d-294f718698fa" (UID: "73a5aa76-8e8f-4235-bd0d-294f718698fa"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.159462 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73a5aa76-8e8f-4235-bd0d-294f718698fa-kube-api-access-hjwds" (OuterVolumeSpecName: "kube-api-access-hjwds") pod "73a5aa76-8e8f-4235-bd0d-294f718698fa" (UID: "73a5aa76-8e8f-4235-bd0d-294f718698fa"). InnerVolumeSpecName "kube-api-access-hjwds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.180388 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "73a5aa76-8e8f-4235-bd0d-294f718698fa" (UID: "73a5aa76-8e8f-4235-bd0d-294f718698fa"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.182621 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-inventory" (OuterVolumeSpecName: "inventory") pod "73a5aa76-8e8f-4235-bd0d-294f718698fa" (UID: "73a5aa76-8e8f-4235-bd0d-294f718698fa"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.254921 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.254967 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjwds\" (UniqueName: \"kubernetes.io/projected/73a5aa76-8e8f-4235-bd0d-294f718698fa-kube-api-access-hjwds\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.254986 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.255004 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/73a5aa76-8e8f-4235-bd0d-294f718698fa-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.544769 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-fwcg9" event={"ID":"73a5aa76-8e8f-4235-bd0d-294f718698fa","Type":"ContainerDied","Data":"feb9c238dfa2d325536cf0d59da45a3a354dfd03f841db94b04d9035676d0d5c"} Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.544832 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="feb9c238dfa2d325536cf0d59da45a3a354dfd03f841db94b04d9035676d0d5c" Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.544912 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-fwcg9" Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.678545 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-gpz7d"] Feb 20 09:13:06 crc kubenswrapper[5094]: E0220 09:13:06.679006 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73a5aa76-8e8f-4235-bd0d-294f718698fa" containerName="run-os-openstack-openstack-cell1" Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.679025 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="73a5aa76-8e8f-4235-bd0d-294f718698fa" containerName="run-os-openstack-openstack-cell1" Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.679227 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="73a5aa76-8e8f-4235-bd0d-294f718698fa" containerName="run-os-openstack-openstack-cell1" Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.679936 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d" Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.683078 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.683247 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.694161 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-gpz7d"] Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.763863 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-ceph\") pod \"reboot-os-openstack-openstack-cell1-gpz7d\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d" Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.763944 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5t7z\" (UniqueName: \"kubernetes.io/projected/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-kube-api-access-s5t7z\") pod \"reboot-os-openstack-openstack-cell1-gpz7d\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d" Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.763996 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-gpz7d\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d" Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.764385 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-inventory\") pod \"reboot-os-openstack-openstack-cell1-gpz7d\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d" Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.866672 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-ceph\") pod \"reboot-os-openstack-openstack-cell1-gpz7d\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d" Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.866805 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5t7z\" (UniqueName: \"kubernetes.io/projected/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-kube-api-access-s5t7z\") pod \"reboot-os-openstack-openstack-cell1-gpz7d\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d" Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.866855 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-gpz7d\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d" Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.866975 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-inventory\") pod \"reboot-os-openstack-openstack-cell1-gpz7d\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " 
pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d" Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.870501 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-gpz7d\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d" Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.870782 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-inventory\") pod \"reboot-os-openstack-openstack-cell1-gpz7d\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d" Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.871016 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-ceph\") pod \"reboot-os-openstack-openstack-cell1-gpz7d\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d" Feb 20 09:13:06 crc kubenswrapper[5094]: I0220 09:13:06.885193 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5t7z\" (UniqueName: \"kubernetes.io/projected/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-kube-api-access-s5t7z\") pod \"reboot-os-openstack-openstack-cell1-gpz7d\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d" Feb 20 09:13:07 crc kubenswrapper[5094]: I0220 09:13:06.999901 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d" Feb 20 09:13:07 crc kubenswrapper[5094]: I0220 09:13:07.569690 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-gpz7d"] Feb 20 09:13:08 crc kubenswrapper[5094]: I0220 09:13:08.564967 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d" event={"ID":"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e","Type":"ContainerStarted","Data":"535d1f9c00da485e84fcf5b15dfd55c8a8dda766b04ae2351a6d6b1fe7889e10"} Feb 20 09:13:09 crc kubenswrapper[5094]: I0220 09:13:09.130746 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-swkr9" Feb 20 09:13:09 crc kubenswrapper[5094]: I0220 09:13:09.130840 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-swkr9" Feb 20 09:13:09 crc kubenswrapper[5094]: I0220 09:13:09.195017 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-swkr9" Feb 20 09:13:09 crc kubenswrapper[5094]: I0220 09:13:09.576967 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d" event={"ID":"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e","Type":"ContainerStarted","Data":"ed8526b86ab62d7bd0921b55737c1c6fd30ec623adb1a48a2a35b5434303c21d"} Feb 20 09:13:09 crc kubenswrapper[5094]: I0220 09:13:09.610893 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d" podStartSLOduration=2.816515888 podStartE2EDuration="3.610872218s" podCreationTimestamp="2026-02-20 09:13:06 +0000 UTC" firstStartedPulling="2026-02-20 09:13:07.575231234 +0000 UTC m=+8802.447878836" lastFinishedPulling="2026-02-20 09:13:08.369608455 +0000 UTC m=+8803.242235166" observedRunningTime="2026-02-20 
09:13:09.594018313 +0000 UTC m=+8804.466645034" watchObservedRunningTime="2026-02-20 09:13:09.610872218 +0000 UTC m=+8804.483498939" Feb 20 09:13:09 crc kubenswrapper[5094]: I0220 09:13:09.639959 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-swkr9" Feb 20 09:13:09 crc kubenswrapper[5094]: I0220 09:13:09.704287 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-swkr9"] Feb 20 09:13:11 crc kubenswrapper[5094]: I0220 09:13:11.594795 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-swkr9" podUID="18ea844c-6c90-4428-a5af-073c2b5500b6" containerName="registry-server" containerID="cri-o://9e59a5dfa0b9d46498e6efd644ef2e28a2e6a9be1df2122d4e2c68bb612de0aa" gracePeriod=2 Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.081327 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-swkr9" Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.193887 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18ea844c-6c90-4428-a5af-073c2b5500b6-catalog-content\") pod \"18ea844c-6c90-4428-a5af-073c2b5500b6\" (UID: \"18ea844c-6c90-4428-a5af-073c2b5500b6\") " Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.194024 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18ea844c-6c90-4428-a5af-073c2b5500b6-utilities\") pod \"18ea844c-6c90-4428-a5af-073c2b5500b6\" (UID: \"18ea844c-6c90-4428-a5af-073c2b5500b6\") " Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.194226 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd4d9\" (UniqueName: 
\"kubernetes.io/projected/18ea844c-6c90-4428-a5af-073c2b5500b6-kube-api-access-vd4d9\") pod \"18ea844c-6c90-4428-a5af-073c2b5500b6\" (UID: \"18ea844c-6c90-4428-a5af-073c2b5500b6\") " Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.194754 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18ea844c-6c90-4428-a5af-073c2b5500b6-utilities" (OuterVolumeSpecName: "utilities") pod "18ea844c-6c90-4428-a5af-073c2b5500b6" (UID: "18ea844c-6c90-4428-a5af-073c2b5500b6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.204384 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18ea844c-6c90-4428-a5af-073c2b5500b6-kube-api-access-vd4d9" (OuterVolumeSpecName: "kube-api-access-vd4d9") pod "18ea844c-6c90-4428-a5af-073c2b5500b6" (UID: "18ea844c-6c90-4428-a5af-073c2b5500b6"). InnerVolumeSpecName "kube-api-access-vd4d9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.252864 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18ea844c-6c90-4428-a5af-073c2b5500b6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "18ea844c-6c90-4428-a5af-073c2b5500b6" (UID: "18ea844c-6c90-4428-a5af-073c2b5500b6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.297008 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vd4d9\" (UniqueName: \"kubernetes.io/projected/18ea844c-6c90-4428-a5af-073c2b5500b6-kube-api-access-vd4d9\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.297040 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18ea844c-6c90-4428-a5af-073c2b5500b6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.297052 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18ea844c-6c90-4428-a5af-073c2b5500b6-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.605793 5094 generic.go:334] "Generic (PLEG): container finished" podID="18ea844c-6c90-4428-a5af-073c2b5500b6" containerID="9e59a5dfa0b9d46498e6efd644ef2e28a2e6a9be1df2122d4e2c68bb612de0aa" exitCode=0 Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.606802 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swkr9" event={"ID":"18ea844c-6c90-4428-a5af-073c2b5500b6","Type":"ContainerDied","Data":"9e59a5dfa0b9d46498e6efd644ef2e28a2e6a9be1df2122d4e2c68bb612de0aa"} Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.606861 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-swkr9" event={"ID":"18ea844c-6c90-4428-a5af-073c2b5500b6","Type":"ContainerDied","Data":"c914d995a63fd7594c0e4bdb37a076bf16b8dc6dce300cea3fef7134aad270b1"} Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.606881 5094 scope.go:117] "RemoveContainer" containerID="9e59a5dfa0b9d46498e6efd644ef2e28a2e6a9be1df2122d4e2c68bb612de0aa" Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 
09:13:12.607047 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-swkr9" Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.633919 5094 scope.go:117] "RemoveContainer" containerID="cc78476448bdfbfe0b18fb3660e5595ec12eeec57060e43709c41f959f215d8d" Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.651448 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-swkr9"] Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.663561 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-swkr9"] Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.676902 5094 scope.go:117] "RemoveContainer" containerID="d55a313737dc1834267e2d5132e8dc3b4582b8466b0451ffea87f75b85b148d0" Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.721945 5094 scope.go:117] "RemoveContainer" containerID="9e59a5dfa0b9d46498e6efd644ef2e28a2e6a9be1df2122d4e2c68bb612de0aa" Feb 20 09:13:12 crc kubenswrapper[5094]: E0220 09:13:12.722307 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e59a5dfa0b9d46498e6efd644ef2e28a2e6a9be1df2122d4e2c68bb612de0aa\": container with ID starting with 9e59a5dfa0b9d46498e6efd644ef2e28a2e6a9be1df2122d4e2c68bb612de0aa not found: ID does not exist" containerID="9e59a5dfa0b9d46498e6efd644ef2e28a2e6a9be1df2122d4e2c68bb612de0aa" Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.722338 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e59a5dfa0b9d46498e6efd644ef2e28a2e6a9be1df2122d4e2c68bb612de0aa"} err="failed to get container status \"9e59a5dfa0b9d46498e6efd644ef2e28a2e6a9be1df2122d4e2c68bb612de0aa\": rpc error: code = NotFound desc = could not find container \"9e59a5dfa0b9d46498e6efd644ef2e28a2e6a9be1df2122d4e2c68bb612de0aa\": container with ID starting with 
9e59a5dfa0b9d46498e6efd644ef2e28a2e6a9be1df2122d4e2c68bb612de0aa not found: ID does not exist" Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.722360 5094 scope.go:117] "RemoveContainer" containerID="cc78476448bdfbfe0b18fb3660e5595ec12eeec57060e43709c41f959f215d8d" Feb 20 09:13:12 crc kubenswrapper[5094]: E0220 09:13:12.722557 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc78476448bdfbfe0b18fb3660e5595ec12eeec57060e43709c41f959f215d8d\": container with ID starting with cc78476448bdfbfe0b18fb3660e5595ec12eeec57060e43709c41f959f215d8d not found: ID does not exist" containerID="cc78476448bdfbfe0b18fb3660e5595ec12eeec57060e43709c41f959f215d8d" Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.722581 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc78476448bdfbfe0b18fb3660e5595ec12eeec57060e43709c41f959f215d8d"} err="failed to get container status \"cc78476448bdfbfe0b18fb3660e5595ec12eeec57060e43709c41f959f215d8d\": rpc error: code = NotFound desc = could not find container \"cc78476448bdfbfe0b18fb3660e5595ec12eeec57060e43709c41f959f215d8d\": container with ID starting with cc78476448bdfbfe0b18fb3660e5595ec12eeec57060e43709c41f959f215d8d not found: ID does not exist" Feb 20 09:13:12 crc kubenswrapper[5094]: I0220 09:13:12.722594 5094 scope.go:117] "RemoveContainer" containerID="d55a313737dc1834267e2d5132e8dc3b4582b8466b0451ffea87f75b85b148d0" Feb 20 09:13:12 crc kubenswrapper[5094]: E0220 09:13:12.722953 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d55a313737dc1834267e2d5132e8dc3b4582b8466b0451ffea87f75b85b148d0\": container with ID starting with d55a313737dc1834267e2d5132e8dc3b4582b8466b0451ffea87f75b85b148d0 not found: ID does not exist" containerID="d55a313737dc1834267e2d5132e8dc3b4582b8466b0451ffea87f75b85b148d0" Feb 20 09:13:12 crc 
kubenswrapper[5094]: I0220 09:13:12.722976 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d55a313737dc1834267e2d5132e8dc3b4582b8466b0451ffea87f75b85b148d0"} err="failed to get container status \"d55a313737dc1834267e2d5132e8dc3b4582b8466b0451ffea87f75b85b148d0\": rpc error: code = NotFound desc = could not find container \"d55a313737dc1834267e2d5132e8dc3b4582b8466b0451ffea87f75b85b148d0\": container with ID starting with d55a313737dc1834267e2d5132e8dc3b4582b8466b0451ffea87f75b85b148d0 not found: ID does not exist" Feb 20 09:13:13 crc kubenswrapper[5094]: I0220 09:13:13.852597 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18ea844c-6c90-4428-a5af-073c2b5500b6" path="/var/lib/kubelet/pods/18ea844c-6c90-4428-a5af-073c2b5500b6/volumes" Feb 20 09:13:23 crc kubenswrapper[5094]: I0220 09:13:23.756807 5094 generic.go:334] "Generic (PLEG): container finished" podID="134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e" containerID="ed8526b86ab62d7bd0921b55737c1c6fd30ec623adb1a48a2a35b5434303c21d" exitCode=0 Feb 20 09:13:23 crc kubenswrapper[5094]: I0220 09:13:23.756863 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d" event={"ID":"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e","Type":"ContainerDied","Data":"ed8526b86ab62d7bd0921b55737c1c6fd30ec623adb1a48a2a35b5434303c21d"} Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.240449 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.284231 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-inventory\") pod \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.284449 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-ceph\") pod \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.284474 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5t7z\" (UniqueName: \"kubernetes.io/projected/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-kube-api-access-s5t7z\") pod \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.284614 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-ssh-key-openstack-cell1\") pod \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\" (UID: \"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e\") " Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.290736 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-kube-api-access-s5t7z" (OuterVolumeSpecName: "kube-api-access-s5t7z") pod "134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e" (UID: "134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e"). InnerVolumeSpecName "kube-api-access-s5t7z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.294825 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-ceph" (OuterVolumeSpecName: "ceph") pod "134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e" (UID: "134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.313763 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e" (UID: "134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.315431 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-inventory" (OuterVolumeSpecName: "inventory") pod "134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e" (UID: "134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.386788 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.386840 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.386850 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.386880 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5t7z\" (UniqueName: \"kubernetes.io/projected/134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e-kube-api-access-s5t7z\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.785906 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d" event={"ID":"134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e","Type":"ContainerDied","Data":"535d1f9c00da485e84fcf5b15dfd55c8a8dda766b04ae2351a6d6b1fe7889e10"} Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.785974 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-gpz7d" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.785978 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="535d1f9c00da485e84fcf5b15dfd55c8a8dda766b04ae2351a6d6b1fe7889e10" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.906262 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-8nqcz"] Feb 20 09:13:25 crc kubenswrapper[5094]: E0220 09:13:25.906927 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e" containerName="reboot-os-openstack-openstack-cell1" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.907008 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e" containerName="reboot-os-openstack-openstack-cell1" Feb 20 09:13:25 crc kubenswrapper[5094]: E0220 09:13:25.907086 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ea844c-6c90-4428-a5af-073c2b5500b6" containerName="extract-utilities" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.907150 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ea844c-6c90-4428-a5af-073c2b5500b6" containerName="extract-utilities" Feb 20 09:13:25 crc kubenswrapper[5094]: E0220 09:13:25.907212 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ea844c-6c90-4428-a5af-073c2b5500b6" containerName="registry-server" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.907267 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="18ea844c-6c90-4428-a5af-073c2b5500b6" containerName="registry-server" Feb 20 09:13:25 crc kubenswrapper[5094]: E0220 09:13:25.907340 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18ea844c-6c90-4428-a5af-073c2b5500b6" containerName="extract-content" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.907391 5094 
state_mem.go:107] "Deleted CPUSet assignment" podUID="18ea844c-6c90-4428-a5af-073c2b5500b6" containerName="extract-content" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.907635 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="18ea844c-6c90-4428-a5af-073c2b5500b6" containerName="registry-server" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.907733 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e" containerName="reboot-os-openstack-openstack-cell1" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.908515 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.912668 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.912846 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.919879 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-8nqcz"] Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.998145 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.998227 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.998318 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.998338 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-inventory\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.998649 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2gtk\" (UniqueName: \"kubernetes.io/projected/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-kube-api-access-z2gtk\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.998741 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: 
\"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.998941 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.999055 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.999228 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.999258 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 
09:13:25.999334 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ceph\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:25 crc kubenswrapper[5094]: I0220 09:13:25.999386 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.102032 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.102098 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.102136 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-inventory\") pod 
\"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.102219 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2gtk\" (UniqueName: \"kubernetes.io/projected/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-kube-api-access-z2gtk\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.102250 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.102308 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.102347 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 
09:13:26.102400 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.102424 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.102649 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ceph\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.102676 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.102744 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-nova-combined-ca-bundle\") pod 
\"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.106225 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.108362 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.108509 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.109216 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.109327 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ceph\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " 
pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.110147 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.111580 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.112217 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.112298 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.113029 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.117356 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-inventory\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.120604 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.122081 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2gtk\" (UniqueName: \"kubernetes.io/projected/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-kube-api-access-z2gtk\") pod \"install-certs-openstack-openstack-cell1-8nqcz\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.270627 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.277793 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:26 crc kubenswrapper[5094]: I0220 09:13:26.977157 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-8nqcz"] Feb 20 09:13:26 crc kubenswrapper[5094]: W0220 09:13:26.982054 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f8dd6dc_a03c_4873_8ecb_e23bc464edff.slice/crio-b1b01afdc89e7344aaf971d11d95305dbae7f41f92c9a69461b5024f2ebf7c51 WatchSource:0}: Error finding container b1b01afdc89e7344aaf971d11d95305dbae7f41f92c9a69461b5024f2ebf7c51: Status 404 returned error can't find the container with id b1b01afdc89e7344aaf971d11d95305dbae7f41f92c9a69461b5024f2ebf7c51 Feb 20 09:13:27 crc kubenswrapper[5094]: I0220 09:13:27.811570 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" event={"ID":"8f8dd6dc-a03c-4873-8ecb-e23bc464edff","Type":"ContainerStarted","Data":"7c96506019fea93df882e932dad5dd6521b64c73702e1151d7089d024cc92c02"} Feb 20 09:13:27 crc kubenswrapper[5094]: I0220 09:13:27.811960 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" event={"ID":"8f8dd6dc-a03c-4873-8ecb-e23bc464edff","Type":"ContainerStarted","Data":"b1b01afdc89e7344aaf971d11d95305dbae7f41f92c9a69461b5024f2ebf7c51"} Feb 20 09:13:27 crc kubenswrapper[5094]: I0220 09:13:27.846229 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" podStartSLOduration=2.452895213 podStartE2EDuration="2.846203975s" podCreationTimestamp="2026-02-20 09:13:25 +0000 UTC" firstStartedPulling="2026-02-20 09:13:26.984305379 +0000 UTC m=+8821.856932120" lastFinishedPulling="2026-02-20 09:13:27.377614161 +0000 UTC m=+8822.250240882" observedRunningTime="2026-02-20 
09:13:27.830756203 +0000 UTC m=+8822.703382934" watchObservedRunningTime="2026-02-20 09:13:27.846203975 +0000 UTC m=+8822.718830726" Feb 20 09:13:47 crc kubenswrapper[5094]: I0220 09:13:47.005011 5094 generic.go:334] "Generic (PLEG): container finished" podID="8f8dd6dc-a03c-4873-8ecb-e23bc464edff" containerID="7c96506019fea93df882e932dad5dd6521b64c73702e1151d7089d024cc92c02" exitCode=0 Feb 20 09:13:47 crc kubenswrapper[5094]: I0220 09:13:47.005108 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" event={"ID":"8f8dd6dc-a03c-4873-8ecb-e23bc464edff","Type":"ContainerDied","Data":"7c96506019fea93df882e932dad5dd6521b64c73702e1151d7089d024cc92c02"} Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.477915 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.527788 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ceph\") pod \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.527863 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-dhcp-combined-ca-bundle\") pod \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.527935 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2gtk\" (UniqueName: \"kubernetes.io/projected/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-kube-api-access-z2gtk\") pod \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\" (UID: 
\"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.527972 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-sriov-combined-ca-bundle\") pod \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.528016 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-telemetry-combined-ca-bundle\") pod \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.528074 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-inventory\") pod \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.528111 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-libvirt-combined-ca-bundle\") pod \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.528146 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-nova-combined-ca-bundle\") pod \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.528190 5094 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ssh-key-openstack-cell1\") pod \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.528320 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ovn-combined-ca-bundle\") pod \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.528432 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-metadata-combined-ca-bundle\") pod \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.528514 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-bootstrap-combined-ca-bundle\") pod \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\" (UID: \"8f8dd6dc-a03c-4873-8ecb-e23bc464edff\") " Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.534423 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "8f8dd6dc-a03c-4873-8ecb-e23bc464edff" (UID: "8f8dd6dc-a03c-4873-8ecb-e23bc464edff"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.534679 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "8f8dd6dc-a03c-4873-8ecb-e23bc464edff" (UID: "8f8dd6dc-a03c-4873-8ecb-e23bc464edff"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.534807 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-kube-api-access-z2gtk" (OuterVolumeSpecName: "kube-api-access-z2gtk") pod "8f8dd6dc-a03c-4873-8ecb-e23bc464edff" (UID: "8f8dd6dc-a03c-4873-8ecb-e23bc464edff"). InnerVolumeSpecName "kube-api-access-z2gtk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.535996 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "8f8dd6dc-a03c-4873-8ecb-e23bc464edff" (UID: "8f8dd6dc-a03c-4873-8ecb-e23bc464edff"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.536506 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "8f8dd6dc-a03c-4873-8ecb-e23bc464edff" (UID: "8f8dd6dc-a03c-4873-8ecb-e23bc464edff"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.536588 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "8f8dd6dc-a03c-4873-8ecb-e23bc464edff" (UID: "8f8dd6dc-a03c-4873-8ecb-e23bc464edff"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.536890 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "8f8dd6dc-a03c-4873-8ecb-e23bc464edff" (UID: "8f8dd6dc-a03c-4873-8ecb-e23bc464edff"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.537138 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "8f8dd6dc-a03c-4873-8ecb-e23bc464edff" (UID: "8f8dd6dc-a03c-4873-8ecb-e23bc464edff"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.538873 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "8f8dd6dc-a03c-4873-8ecb-e23bc464edff" (UID: "8f8dd6dc-a03c-4873-8ecb-e23bc464edff"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.542005 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ceph" (OuterVolumeSpecName: "ceph") pod "8f8dd6dc-a03c-4873-8ecb-e23bc464edff" (UID: "8f8dd6dc-a03c-4873-8ecb-e23bc464edff"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.563084 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-inventory" (OuterVolumeSpecName: "inventory") pod "8f8dd6dc-a03c-4873-8ecb-e23bc464edff" (UID: "8f8dd6dc-a03c-4873-8ecb-e23bc464edff"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.568185 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "8f8dd6dc-a03c-4873-8ecb-e23bc464edff" (UID: "8f8dd6dc-a03c-4873-8ecb-e23bc464edff"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.635332 5094 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.635488 5094 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.635591 5094 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.635683 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.635808 5094 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.635913 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2gtk\" (UniqueName: \"kubernetes.io/projected/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-kube-api-access-z2gtk\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.635995 5094 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.636081 5094 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.636177 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.636262 5094 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.636340 5094 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:48 crc kubenswrapper[5094]: I0220 09:13:48.636430 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8f8dd6dc-a03c-4873-8ecb-e23bc464edff-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.030121 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" event={"ID":"8f8dd6dc-a03c-4873-8ecb-e23bc464edff","Type":"ContainerDied","Data":"b1b01afdc89e7344aaf971d11d95305dbae7f41f92c9a69461b5024f2ebf7c51"} Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.030477 5094 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="b1b01afdc89e7344aaf971d11d95305dbae7f41f92c9a69461b5024f2ebf7c51" Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.030217 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-8nqcz" Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.202494 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-ffl97"] Feb 20 09:13:49 crc kubenswrapper[5094]: E0220 09:13:49.203322 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f8dd6dc-a03c-4873-8ecb-e23bc464edff" containerName="install-certs-openstack-openstack-cell1" Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.203468 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f8dd6dc-a03c-4873-8ecb-e23bc464edff" containerName="install-certs-openstack-openstack-cell1" Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.203912 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f8dd6dc-a03c-4873-8ecb-e23bc464edff" containerName="install-certs-openstack-openstack-cell1" Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.204884 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-ffl97" Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.210696 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.210766 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.213818 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-ffl97"] Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.253313 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-ffl97\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " pod="openstack/ceph-client-openstack-openstack-cell1-ffl97" Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.253406 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvklc\" (UniqueName: \"kubernetes.io/projected/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-kube-api-access-mvklc\") pod \"ceph-client-openstack-openstack-cell1-ffl97\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " pod="openstack/ceph-client-openstack-openstack-cell1-ffl97" Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.253450 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-inventory\") pod \"ceph-client-openstack-openstack-cell1-ffl97\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " pod="openstack/ceph-client-openstack-openstack-cell1-ffl97" Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 
09:13:49.253538 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-ceph\") pod \"ceph-client-openstack-openstack-cell1-ffl97\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " pod="openstack/ceph-client-openstack-openstack-cell1-ffl97" Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.355286 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-ceph\") pod \"ceph-client-openstack-openstack-cell1-ffl97\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " pod="openstack/ceph-client-openstack-openstack-cell1-ffl97" Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.355419 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-ffl97\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " pod="openstack/ceph-client-openstack-openstack-cell1-ffl97" Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.355470 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvklc\" (UniqueName: \"kubernetes.io/projected/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-kube-api-access-mvklc\") pod \"ceph-client-openstack-openstack-cell1-ffl97\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " pod="openstack/ceph-client-openstack-openstack-cell1-ffl97" Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.355507 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-inventory\") pod \"ceph-client-openstack-openstack-cell1-ffl97\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " 
pod="openstack/ceph-client-openstack-openstack-cell1-ffl97" Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.360872 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-ceph\") pod \"ceph-client-openstack-openstack-cell1-ffl97\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " pod="openstack/ceph-client-openstack-openstack-cell1-ffl97" Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.360907 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-ffl97\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " pod="openstack/ceph-client-openstack-openstack-cell1-ffl97" Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.374149 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-inventory\") pod \"ceph-client-openstack-openstack-cell1-ffl97\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " pod="openstack/ceph-client-openstack-openstack-cell1-ffl97" Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.375011 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvklc\" (UniqueName: \"kubernetes.io/projected/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-kube-api-access-mvklc\") pod \"ceph-client-openstack-openstack-cell1-ffl97\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " pod="openstack/ceph-client-openstack-openstack-cell1-ffl97" Feb 20 09:13:49 crc kubenswrapper[5094]: I0220 09:13:49.522288 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-ffl97" Feb 20 09:13:50 crc kubenswrapper[5094]: I0220 09:13:50.130000 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-ffl97"] Feb 20 09:13:51 crc kubenswrapper[5094]: I0220 09:13:51.054235 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-ffl97" event={"ID":"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db","Type":"ContainerStarted","Data":"e7a67200ded8823653b016a9ba2b790f20cb85685e7f8e7f60228125c6bf1e16"} Feb 20 09:13:51 crc kubenswrapper[5094]: I0220 09:13:51.055440 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-ffl97" event={"ID":"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db","Type":"ContainerStarted","Data":"639da57cbc720668a4113926cc775bd9a7a1d7c0bf9ce364b0e700fc1cee12ad"} Feb 20 09:13:51 crc kubenswrapper[5094]: I0220 09:13:51.076529 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-openstack-openstack-cell1-ffl97" podStartSLOduration=1.669723394 podStartE2EDuration="2.076511311s" podCreationTimestamp="2026-02-20 09:13:49 +0000 UTC" firstStartedPulling="2026-02-20 09:13:50.134444097 +0000 UTC m=+8845.007070818" lastFinishedPulling="2026-02-20 09:13:50.541232024 +0000 UTC m=+8845.413858735" observedRunningTime="2026-02-20 09:13:51.072652138 +0000 UTC m=+8845.945278849" watchObservedRunningTime="2026-02-20 09:13:51.076511311 +0000 UTC m=+8845.949138022" Feb 20 09:13:52 crc kubenswrapper[5094]: I0220 09:13:52.063742 5094 generic.go:334] "Generic (PLEG): container finished" podID="3ef4f2ef-92a7-4d12-94a9-e3ee55412547" containerID="a7bba99fdce7206f2a33a9b6752071dfb931fd478046625f8eac18d1bd63f378" exitCode=0 Feb 20 09:13:52 crc kubenswrapper[5094]: I0220 09:13:52.063813 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-networker-r6vzk" 
event={"ID":"3ef4f2ef-92a7-4d12-94a9-e3ee55412547","Type":"ContainerDied","Data":"a7bba99fdce7206f2a33a9b6752071dfb931fd478046625f8eac18d1bd63f378"} Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.597410 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.751763 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-inventory\") pod \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.751893 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ovncontroller-config-0\") pod \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.751940 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znpgd\" (UniqueName: \"kubernetes.io/projected/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-kube-api-access-znpgd\") pod \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.752076 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ssh-key-openstack-networker\") pod \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.752098 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ovn-combined-ca-bundle\") pod \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\" (UID: \"3ef4f2ef-92a7-4d12-94a9-e3ee55412547\") " Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.763364 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "3ef4f2ef-92a7-4d12-94a9-e3ee55412547" (UID: "3ef4f2ef-92a7-4d12-94a9-e3ee55412547"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.767940 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-kube-api-access-znpgd" (OuterVolumeSpecName: "kube-api-access-znpgd") pod "3ef4f2ef-92a7-4d12-94a9-e3ee55412547" (UID: "3ef4f2ef-92a7-4d12-94a9-e3ee55412547"). InnerVolumeSpecName "kube-api-access-znpgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.793533 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "3ef4f2ef-92a7-4d12-94a9-e3ee55412547" (UID: "3ef4f2ef-92a7-4d12-94a9-e3ee55412547"). InnerVolumeSpecName "ssh-key-openstack-networker". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.799615 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "3ef4f2ef-92a7-4d12-94a9-e3ee55412547" (UID: "3ef4f2ef-92a7-4d12-94a9-e3ee55412547"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.809757 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-inventory" (OuterVolumeSpecName: "inventory") pod "3ef4f2ef-92a7-4d12-94a9-e3ee55412547" (UID: "3ef4f2ef-92a7-4d12-94a9-e3ee55412547"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.854839 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.854882 5094 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.854893 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znpgd\" (UniqueName: \"kubernetes.io/projected/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-kube-api-access-znpgd\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.854953 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:53 crc kubenswrapper[5094]: I0220 09:13:53.854963 5094 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef4f2ef-92a7-4d12-94a9-e3ee55412547-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.091451 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-openstack-openstack-networker-r6vzk" event={"ID":"3ef4f2ef-92a7-4d12-94a9-e3ee55412547","Type":"ContainerDied","Data":"e1588c72c435155f35fdf02546c5ae6b532e58027baa39a785ec6d828ecdf55b"} Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.091492 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1588c72c435155f35fdf02546c5ae6b532e58027baa39a785ec6d828ecdf55b" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.091531 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-networker-r6vzk" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.189955 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-networker-f5lls"] Feb 20 09:13:54 crc kubenswrapper[5094]: E0220 09:13:54.190635 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ef4f2ef-92a7-4d12-94a9-e3ee55412547" containerName="ovn-openstack-openstack-networker" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.190665 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef4f2ef-92a7-4d12-94a9-e3ee55412547" containerName="ovn-openstack-openstack-networker" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.191222 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ef4f2ef-92a7-4d12-94a9-e3ee55412547" containerName="ovn-openstack-openstack-networker" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.192426 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.195342 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-networker" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.195413 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-networker-dockercfg-xf9xl" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.197844 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.197845 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.208973 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-networker-f5lls"] Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.264442 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.264794 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcvln\" (UniqueName: \"kubernetes.io/projected/b741c1f4-f408-486b-bd44-3ae1fcadc83b-kube-api-access-lcvln\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 
crc kubenswrapper[5094]: I0220 09:13:54.265105 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-inventory\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.265259 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-ssh-key-openstack-networker\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.265347 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.265471 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.367254 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-inventory\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.367337 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-ssh-key-openstack-networker\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.367393 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.367432 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.367473 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.367511 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcvln\" (UniqueName: \"kubernetes.io/projected/b741c1f4-f408-486b-bd44-3ae1fcadc83b-kube-api-access-lcvln\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.371507 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-ssh-key-openstack-networker\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.371591 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.372305 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-neutron-ovn-metadata-agent-neutron-config-0\") pod 
\"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.373600 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-inventory\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.375920 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.390195 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcvln\" (UniqueName: \"kubernetes.io/projected/b741c1f4-f408-486b-bd44-3ae1fcadc83b-kube-api-access-lcvln\") pod \"neutron-metadata-openstack-openstack-networker-f5lls\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:54 crc kubenswrapper[5094]: I0220 09:13:54.511337 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:13:55 crc kubenswrapper[5094]: W0220 09:13:55.046931 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb741c1f4_f408_486b_bd44_3ae1fcadc83b.slice/crio-cb81625e8207eb6bdb083e10cef3efc13f9b1eb6ced3c8983766d66a97d86cd2 WatchSource:0}: Error finding container cb81625e8207eb6bdb083e10cef3efc13f9b1eb6ced3c8983766d66a97d86cd2: Status 404 returned error can't find the container with id cb81625e8207eb6bdb083e10cef3efc13f9b1eb6ced3c8983766d66a97d86cd2 Feb 20 09:13:55 crc kubenswrapper[5094]: I0220 09:13:55.053069 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-networker-f5lls"] Feb 20 09:13:55 crc kubenswrapper[5094]: I0220 09:13:55.100754 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" event={"ID":"b741c1f4-f408-486b-bd44-3ae1fcadc83b","Type":"ContainerStarted","Data":"cb81625e8207eb6bdb083e10cef3efc13f9b1eb6ced3c8983766d66a97d86cd2"} Feb 20 09:13:56 crc kubenswrapper[5094]: I0220 09:13:56.112070 5094 generic.go:334] "Generic (PLEG): container finished" podID="e01778d5-c4a7-44c6-a9e9-cf7d3cb299db" containerID="e7a67200ded8823653b016a9ba2b790f20cb85685e7f8e7f60228125c6bf1e16" exitCode=0 Feb 20 09:13:56 crc kubenswrapper[5094]: I0220 09:13:56.112190 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-ffl97" event={"ID":"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db","Type":"ContainerDied","Data":"e7a67200ded8823653b016a9ba2b790f20cb85685e7f8e7f60228125c6bf1e16"} Feb 20 09:13:56 crc kubenswrapper[5094]: I0220 09:13:56.115325 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" 
event={"ID":"b741c1f4-f408-486b-bd44-3ae1fcadc83b","Type":"ContainerStarted","Data":"60033d64494b24ef736d90599a6c3267ec69b6bcf9cf95209691bf9761ff5b97"} Feb 20 09:13:56 crc kubenswrapper[5094]: I0220 09:13:56.154722 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" podStartSLOduration=1.736795469 podStartE2EDuration="2.154687112s" podCreationTimestamp="2026-02-20 09:13:54 +0000 UTC" firstStartedPulling="2026-02-20 09:13:55.049940604 +0000 UTC m=+8849.922567335" lastFinishedPulling="2026-02-20 09:13:55.467832257 +0000 UTC m=+8850.340458978" observedRunningTime="2026-02-20 09:13:56.148222856 +0000 UTC m=+8851.020849587" watchObservedRunningTime="2026-02-20 09:13:56.154687112 +0000 UTC m=+8851.027313823" Feb 20 09:13:57 crc kubenswrapper[5094]: I0220 09:13:57.614765 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-ffl97" Feb 20 09:13:57 crc kubenswrapper[5094]: I0220 09:13:57.745821 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-ceph\") pod \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " Feb 20 09:13:57 crc kubenswrapper[5094]: I0220 09:13:57.746236 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-ssh-key-openstack-cell1\") pod \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " Feb 20 09:13:57 crc kubenswrapper[5094]: I0220 09:13:57.746402 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvklc\" (UniqueName: \"kubernetes.io/projected/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-kube-api-access-mvklc\") pod 
\"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " Feb 20 09:13:57 crc kubenswrapper[5094]: I0220 09:13:57.746490 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-inventory\") pod \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\" (UID: \"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db\") " Feb 20 09:13:57 crc kubenswrapper[5094]: I0220 09:13:57.751537 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-kube-api-access-mvklc" (OuterVolumeSpecName: "kube-api-access-mvklc") pod "e01778d5-c4a7-44c6-a9e9-cf7d3cb299db" (UID: "e01778d5-c4a7-44c6-a9e9-cf7d3cb299db"). InnerVolumeSpecName "kube-api-access-mvklc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:13:57 crc kubenswrapper[5094]: I0220 09:13:57.751956 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-ceph" (OuterVolumeSpecName: "ceph") pod "e01778d5-c4a7-44c6-a9e9-cf7d3cb299db" (UID: "e01778d5-c4a7-44c6-a9e9-cf7d3cb299db"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:13:57 crc kubenswrapper[5094]: I0220 09:13:57.785999 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-inventory" (OuterVolumeSpecName: "inventory") pod "e01778d5-c4a7-44c6-a9e9-cf7d3cb299db" (UID: "e01778d5-c4a7-44c6-a9e9-cf7d3cb299db"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:13:57 crc kubenswrapper[5094]: I0220 09:13:57.811983 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "e01778d5-c4a7-44c6-a9e9-cf7d3cb299db" (UID: "e01778d5-c4a7-44c6-a9e9-cf7d3cb299db"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:13:57 crc kubenswrapper[5094]: I0220 09:13:57.856986 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:57 crc kubenswrapper[5094]: I0220 09:13:57.857130 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:57 crc kubenswrapper[5094]: I0220 09:13:57.857150 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvklc\" (UniqueName: \"kubernetes.io/projected/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-kube-api-access-mvklc\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:57 crc kubenswrapper[5094]: I0220 09:13:57.857474 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e01778d5-c4a7-44c6-a9e9-cf7d3cb299db-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.137832 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-ffl97" event={"ID":"e01778d5-c4a7-44c6-a9e9-cf7d3cb299db","Type":"ContainerDied","Data":"639da57cbc720668a4113926cc775bd9a7a1d7c0bf9ce364b0e700fc1cee12ad"} Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.137875 5094 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="639da57cbc720668a4113926cc775bd9a7a1d7c0bf9ce364b0e700fc1cee12ad" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.137959 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-ffl97" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.283625 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-pswsx"] Feb 20 09:13:58 crc kubenswrapper[5094]: E0220 09:13:58.284260 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e01778d5-c4a7-44c6-a9e9-cf7d3cb299db" containerName="ceph-client-openstack-openstack-cell1" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.284283 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="e01778d5-c4a7-44c6-a9e9-cf7d3cb299db" containerName="ceph-client-openstack-openstack-cell1" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.284572 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="e01778d5-c4a7-44c6-a9e9-cf7d3cb299db" containerName="ceph-client-openstack-openstack-cell1" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.285489 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.288562 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.289624 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.290131 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.301521 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-pswsx"] Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.369286 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.369376 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ceph\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.369491 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " 
pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.369534 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sx9r\" (UniqueName: \"kubernetes.io/projected/3f35b6d1-3070-44cf-bdf8-6376b2434586-kube-api-access-5sx9r\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.369579 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3f35b6d1-3070-44cf-bdf8-6376b2434586-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.369627 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-inventory\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.471168 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.471232 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sx9r\" (UniqueName: 
\"kubernetes.io/projected/3f35b6d1-3070-44cf-bdf8-6376b2434586-kube-api-access-5sx9r\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.471293 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3f35b6d1-3070-44cf-bdf8-6376b2434586-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.471362 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-inventory\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.471458 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.471538 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ceph\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.472753 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3f35b6d1-3070-44cf-bdf8-6376b2434586-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.475364 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.476330 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ceph\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.476476 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-inventory\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.477026 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.492228 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5sx9r\" (UniqueName: \"kubernetes.io/projected/3f35b6d1-3070-44cf-bdf8-6376b2434586-kube-api-access-5sx9r\") pod \"ovn-openstack-openstack-cell1-pswsx\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:58 crc kubenswrapper[5094]: I0220 09:13:58.610417 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:13:59 crc kubenswrapper[5094]: I0220 09:13:59.161393 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-pswsx"] Feb 20 09:13:59 crc kubenswrapper[5094]: W0220 09:13:59.162191 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f35b6d1_3070_44cf_bdf8_6376b2434586.slice/crio-cdc3d1621bbe1a4ebe64d157d01d74cdbb9d3ff91d96fb9d22cab8643dd41f30 WatchSource:0}: Error finding container cdc3d1621bbe1a4ebe64d157d01d74cdbb9d3ff91d96fb9d22cab8643dd41f30: Status 404 returned error can't find the container with id cdc3d1621bbe1a4ebe64d157d01d74cdbb9d3ff91d96fb9d22cab8643dd41f30 Feb 20 09:14:00 crc kubenswrapper[5094]: I0220 09:14:00.163055 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-pswsx" event={"ID":"3f35b6d1-3070-44cf-bdf8-6376b2434586","Type":"ContainerStarted","Data":"3c888b2b05fec27e60745bb98c9fc87f20df720e5a76a1cf1200d076a1e8a640"} Feb 20 09:14:00 crc kubenswrapper[5094]: I0220 09:14:00.163414 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-pswsx" event={"ID":"3f35b6d1-3070-44cf-bdf8-6376b2434586","Type":"ContainerStarted","Data":"cdc3d1621bbe1a4ebe64d157d01d74cdbb9d3ff91d96fb9d22cab8643dd41f30"} Feb 20 09:14:00 crc kubenswrapper[5094]: I0220 09:14:00.186597 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovn-openstack-openstack-cell1-pswsx" podStartSLOduration=1.762382777 podStartE2EDuration="2.186576812s" podCreationTimestamp="2026-02-20 09:13:58 +0000 UTC" firstStartedPulling="2026-02-20 09:13:59.16647151 +0000 UTC m=+8854.039098231" lastFinishedPulling="2026-02-20 09:13:59.590665545 +0000 UTC m=+8854.463292266" observedRunningTime="2026-02-20 09:14:00.183079467 +0000 UTC m=+8855.055706178" watchObservedRunningTime="2026-02-20 09:14:00.186576812 +0000 UTC m=+8855.059203523" Feb 20 09:14:04 crc kubenswrapper[5094]: I0220 09:14:04.106807 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:14:04 crc kubenswrapper[5094]: I0220 09:14:04.107943 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:14:34 crc kubenswrapper[5094]: I0220 09:14:34.106881 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:14:34 crc kubenswrapper[5094]: I0220 09:14:34.107352 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 
09:14:49 crc kubenswrapper[5094]: I0220 09:14:49.653038 5094 generic.go:334] "Generic (PLEG): container finished" podID="b741c1f4-f408-486b-bd44-3ae1fcadc83b" containerID="60033d64494b24ef736d90599a6c3267ec69b6bcf9cf95209691bf9761ff5b97" exitCode=0 Feb 20 09:14:49 crc kubenswrapper[5094]: I0220 09:14:49.653100 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" event={"ID":"b741c1f4-f408-486b-bd44-3ae1fcadc83b","Type":"ContainerDied","Data":"60033d64494b24ef736d90599a6c3267ec69b6bcf9cf95209691bf9761ff5b97"} Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.120495 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.231686 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-nova-metadata-neutron-config-0\") pod \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.231752 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-neutron-metadata-combined-ca-bundle\") pod \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.231870 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-inventory\") pod \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.231887 5094 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-ssh-key-openstack-networker\") pod \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.231948 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-neutron-ovn-metadata-agent-neutron-config-0\") pod \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.232057 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcvln\" (UniqueName: \"kubernetes.io/projected/b741c1f4-f408-486b-bd44-3ae1fcadc83b-kube-api-access-lcvln\") pod \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\" (UID: \"b741c1f4-f408-486b-bd44-3ae1fcadc83b\") " Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.238914 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "b741c1f4-f408-486b-bd44-3ae1fcadc83b" (UID: "b741c1f4-f408-486b-bd44-3ae1fcadc83b"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.241523 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b741c1f4-f408-486b-bd44-3ae1fcadc83b-kube-api-access-lcvln" (OuterVolumeSpecName: "kube-api-access-lcvln") pod "b741c1f4-f408-486b-bd44-3ae1fcadc83b" (UID: "b741c1f4-f408-486b-bd44-3ae1fcadc83b"). InnerVolumeSpecName "kube-api-access-lcvln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.269646 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "b741c1f4-f408-486b-bd44-3ae1fcadc83b" (UID: "b741c1f4-f408-486b-bd44-3ae1fcadc83b"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.282935 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-inventory" (OuterVolumeSpecName: "inventory") pod "b741c1f4-f408-486b-bd44-3ae1fcadc83b" (UID: "b741c1f4-f408-486b-bd44-3ae1fcadc83b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.293829 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "b741c1f4-f408-486b-bd44-3ae1fcadc83b" (UID: "b741c1f4-f408-486b-bd44-3ae1fcadc83b"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.303772 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-ssh-key-openstack-networker" (OuterVolumeSpecName: "ssh-key-openstack-networker") pod "b741c1f4-f408-486b-bd44-3ae1fcadc83b" (UID: "b741c1f4-f408-486b-bd44-3ae1fcadc83b"). InnerVolumeSpecName "ssh-key-openstack-networker". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.334853 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcvln\" (UniqueName: \"kubernetes.io/projected/b741c1f4-f408-486b-bd44-3ae1fcadc83b-kube-api-access-lcvln\") on node \"crc\" DevicePath \"\"" Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.335144 5094 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.335250 5094 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.335337 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.335422 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-networker\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-ssh-key-openstack-networker\") on node \"crc\" DevicePath \"\"" Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.335505 5094 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b741c1f4-f408-486b-bd44-3ae1fcadc83b-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.677355 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" 
event={"ID":"b741c1f4-f408-486b-bd44-3ae1fcadc83b","Type":"ContainerDied","Data":"cb81625e8207eb6bdb083e10cef3efc13f9b1eb6ced3c8983766d66a97d86cd2"} Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.677400 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb81625e8207eb6bdb083e10cef3efc13f9b1eb6ced3c8983766d66a97d86cd2" Feb 20 09:14:51 crc kubenswrapper[5094]: I0220 09:14:51.677934 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-networker-f5lls" Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.150163 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm"] Feb 20 09:15:00 crc kubenswrapper[5094]: E0220 09:15:00.151231 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b741c1f4-f408-486b-bd44-3ae1fcadc83b" containerName="neutron-metadata-openstack-openstack-networker" Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.151250 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b741c1f4-f408-486b-bd44-3ae1fcadc83b" containerName="neutron-metadata-openstack-openstack-networker" Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.151531 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b741c1f4-f408-486b-bd44-3ae1fcadc83b" containerName="neutron-metadata-openstack-openstack-networker" Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.152926 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm" Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.154937 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.155777 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.161864 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm"] Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.221002 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-config-volume\") pod \"collect-profiles-29526315-md8jm\" (UID: \"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm" Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.221128 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-secret-volume\") pod \"collect-profiles-29526315-md8jm\" (UID: \"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm" Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.221377 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbxs9\" (UniqueName: \"kubernetes.io/projected/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-kube-api-access-pbxs9\") pod \"collect-profiles-29526315-md8jm\" (UID: \"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm" Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.323638 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-config-volume\") pod \"collect-profiles-29526315-md8jm\" (UID: \"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm" Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.323752 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-secret-volume\") pod \"collect-profiles-29526315-md8jm\" (UID: \"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm" Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.323903 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbxs9\" (UniqueName: \"kubernetes.io/projected/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-kube-api-access-pbxs9\") pod \"collect-profiles-29526315-md8jm\" (UID: \"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm" Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.325091 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-config-volume\") pod \"collect-profiles-29526315-md8jm\" (UID: \"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm" Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.330412 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-secret-volume\") pod \"collect-profiles-29526315-md8jm\" (UID: \"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm" Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.343467 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbxs9\" (UniqueName: \"kubernetes.io/projected/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-kube-api-access-pbxs9\") pod \"collect-profiles-29526315-md8jm\" (UID: \"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm" Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.505448 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm" Feb 20 09:15:00 crc kubenswrapper[5094]: I0220 09:15:00.992627 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm"] Feb 20 09:15:01 crc kubenswrapper[5094]: I0220 09:15:01.769411 5094 generic.go:334] "Generic (PLEG): container finished" podID="ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1" containerID="54cabe5f22fe8cc34888629b6ad81ec4c78f22e5ddf13beea44532e8ad37533e" exitCode=0 Feb 20 09:15:01 crc kubenswrapper[5094]: I0220 09:15:01.769473 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm" event={"ID":"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1","Type":"ContainerDied","Data":"54cabe5f22fe8cc34888629b6ad81ec4c78f22e5ddf13beea44532e8ad37533e"} Feb 20 09:15:01 crc kubenswrapper[5094]: I0220 09:15:01.769817 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm" 
event={"ID":"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1","Type":"ContainerStarted","Data":"f6e87b5b2102f7eaa839fc05ee79c8bec2aebcbb17ddd051d0ff98aa9943e799"} Feb 20 09:15:03 crc kubenswrapper[5094]: I0220 09:15:03.244091 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm" Feb 20 09:15:03 crc kubenswrapper[5094]: I0220 09:15:03.414676 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbxs9\" (UniqueName: \"kubernetes.io/projected/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-kube-api-access-pbxs9\") pod \"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1\" (UID: \"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1\") " Feb 20 09:15:03 crc kubenswrapper[5094]: I0220 09:15:03.414772 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-secret-volume\") pod \"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1\" (UID: \"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1\") " Feb 20 09:15:03 crc kubenswrapper[5094]: I0220 09:15:03.414864 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-config-volume\") pod \"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1\" (UID: \"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1\") " Feb 20 09:15:03 crc kubenswrapper[5094]: I0220 09:15:03.415443 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-config-volume" (OuterVolumeSpecName: "config-volume") pod "ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1" (UID: "ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:15:03 crc kubenswrapper[5094]: I0220 09:15:03.416099 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 09:15:03 crc kubenswrapper[5094]: I0220 09:15:03.420348 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-kube-api-access-pbxs9" (OuterVolumeSpecName: "kube-api-access-pbxs9") pod "ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1" (UID: "ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1"). InnerVolumeSpecName "kube-api-access-pbxs9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:15:03 crc kubenswrapper[5094]: I0220 09:15:03.421889 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1" (UID: "ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:15:03 crc kubenswrapper[5094]: I0220 09:15:03.517730 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbxs9\" (UniqueName: \"kubernetes.io/projected/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-kube-api-access-pbxs9\") on node \"crc\" DevicePath \"\"" Feb 20 09:15:03 crc kubenswrapper[5094]: I0220 09:15:03.517762 5094 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 09:15:03 crc kubenswrapper[5094]: I0220 09:15:03.798497 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm" event={"ID":"ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1","Type":"ContainerDied","Data":"f6e87b5b2102f7eaa839fc05ee79c8bec2aebcbb17ddd051d0ff98aa9943e799"} Feb 20 09:15:03 crc kubenswrapper[5094]: I0220 09:15:03.798545 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6e87b5b2102f7eaa839fc05ee79c8bec2aebcbb17ddd051d0ff98aa9943e799" Feb 20 09:15:03 crc kubenswrapper[5094]: I0220 09:15:03.798898 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm" Feb 20 09:15:04 crc kubenswrapper[5094]: I0220 09:15:04.106575 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:15:04 crc kubenswrapper[5094]: I0220 09:15:04.106910 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:15:04 crc kubenswrapper[5094]: I0220 09:15:04.107023 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 09:15:04 crc kubenswrapper[5094]: I0220 09:15:04.107901 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8248a8f47476a9f6ae0df25861a436e80d0332d121cab1f3067f00978bf0efaf"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 09:15:04 crc kubenswrapper[5094]: I0220 09:15:04.108047 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://8248a8f47476a9f6ae0df25861a436e80d0332d121cab1f3067f00978bf0efaf" gracePeriod=600 Feb 20 09:15:04 crc kubenswrapper[5094]: I0220 09:15:04.336193 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh"] Feb 20 09:15:04 crc kubenswrapper[5094]: I0220 09:15:04.353481 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526270-cm5bh"] Feb 20 09:15:04 crc kubenswrapper[5094]: I0220 09:15:04.829767 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="8248a8f47476a9f6ae0df25861a436e80d0332d121cab1f3067f00978bf0efaf" exitCode=0 Feb 20 09:15:04 crc kubenswrapper[5094]: I0220 09:15:04.830287 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"8248a8f47476a9f6ae0df25861a436e80d0332d121cab1f3067f00978bf0efaf"} Feb 20 09:15:04 crc kubenswrapper[5094]: I0220 09:15:04.830322 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0"} Feb 20 09:15:04 crc kubenswrapper[5094]: I0220 09:15:04.830344 5094 scope.go:117] "RemoveContainer" containerID="d9db0e55a345ff51ffcdbd143ba1e85dc599b4355444e1c3fc8b0ec029e4ca60" Feb 20 09:15:05 crc kubenswrapper[5094]: I0220 09:15:05.853912 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c7c75bd-9812-4d90-80ea-08eda0f926fc" path="/var/lib/kubelet/pods/5c7c75bd-9812-4d90-80ea-08eda0f926fc/volumes" Feb 20 09:15:09 crc kubenswrapper[5094]: I0220 09:15:09.904162 5094 generic.go:334] "Generic (PLEG): container finished" podID="3f35b6d1-3070-44cf-bdf8-6376b2434586" containerID="3c888b2b05fec27e60745bb98c9fc87f20df720e5a76a1cf1200d076a1e8a640" exitCode=0 Feb 20 09:15:09 crc kubenswrapper[5094]: I0220 09:15:09.904278 5094 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ovn-openstack-openstack-cell1-pswsx" event={"ID":"3f35b6d1-3070-44cf-bdf8-6376b2434586","Type":"ContainerDied","Data":"3c888b2b05fec27e60745bb98c9fc87f20df720e5a76a1cf1200d076a1e8a640"} Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.484754 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.635342 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ovn-combined-ca-bundle\") pod \"3f35b6d1-3070-44cf-bdf8-6376b2434586\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.635701 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ssh-key-openstack-cell1\") pod \"3f35b6d1-3070-44cf-bdf8-6376b2434586\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.635889 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3f35b6d1-3070-44cf-bdf8-6376b2434586-ovncontroller-config-0\") pod \"3f35b6d1-3070-44cf-bdf8-6376b2434586\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.635959 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-inventory\") pod \"3f35b6d1-3070-44cf-bdf8-6376b2434586\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.636020 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ceph\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ceph\") pod \"3f35b6d1-3070-44cf-bdf8-6376b2434586\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.636190 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sx9r\" (UniqueName: \"kubernetes.io/projected/3f35b6d1-3070-44cf-bdf8-6376b2434586-kube-api-access-5sx9r\") pod \"3f35b6d1-3070-44cf-bdf8-6376b2434586\" (UID: \"3f35b6d1-3070-44cf-bdf8-6376b2434586\") " Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.641561 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ceph" (OuterVolumeSpecName: "ceph") pod "3f35b6d1-3070-44cf-bdf8-6376b2434586" (UID: "3f35b6d1-3070-44cf-bdf8-6376b2434586"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.643124 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "3f35b6d1-3070-44cf-bdf8-6376b2434586" (UID: "3f35b6d1-3070-44cf-bdf8-6376b2434586"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.650243 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f35b6d1-3070-44cf-bdf8-6376b2434586-kube-api-access-5sx9r" (OuterVolumeSpecName: "kube-api-access-5sx9r") pod "3f35b6d1-3070-44cf-bdf8-6376b2434586" (UID: "3f35b6d1-3070-44cf-bdf8-6376b2434586"). InnerVolumeSpecName "kube-api-access-5sx9r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.668541 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f35b6d1-3070-44cf-bdf8-6376b2434586-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "3f35b6d1-3070-44cf-bdf8-6376b2434586" (UID: "3f35b6d1-3070-44cf-bdf8-6376b2434586"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.672045 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "3f35b6d1-3070-44cf-bdf8-6376b2434586" (UID: "3f35b6d1-3070-44cf-bdf8-6376b2434586"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.675865 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-inventory" (OuterVolumeSpecName: "inventory") pod "3f35b6d1-3070-44cf-bdf8-6376b2434586" (UID: "3f35b6d1-3070-44cf-bdf8-6376b2434586"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.739578 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sx9r\" (UniqueName: \"kubernetes.io/projected/3f35b6d1-3070-44cf-bdf8-6376b2434586-kube-api-access-5sx9r\") on node \"crc\" DevicePath \"\"" Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.739624 5094 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.739640 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.739654 5094 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3f35b6d1-3070-44cf-bdf8-6376b2434586-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.739667 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.739681 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/3f35b6d1-3070-44cf-bdf8-6376b2434586-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.929818 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-pswsx" 
event={"ID":"3f35b6d1-3070-44cf-bdf8-6376b2434586","Type":"ContainerDied","Data":"cdc3d1621bbe1a4ebe64d157d01d74cdbb9d3ff91d96fb9d22cab8643dd41f30"} Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.929862 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdc3d1621bbe1a4ebe64d157d01d74cdbb9d3ff91d96fb9d22cab8643dd41f30" Feb 20 09:15:11 crc kubenswrapper[5094]: I0220 09:15:11.929921 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-pswsx" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.014416 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-d2d4n"] Feb 20 09:15:12 crc kubenswrapper[5094]: E0220 09:15:12.014933 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1" containerName="collect-profiles" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.014956 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1" containerName="collect-profiles" Feb 20 09:15:12 crc kubenswrapper[5094]: E0220 09:15:12.014980 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f35b6d1-3070-44cf-bdf8-6376b2434586" containerName="ovn-openstack-openstack-cell1" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.014989 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f35b6d1-3070-44cf-bdf8-6376b2434586" containerName="ovn-openstack-openstack-cell1" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.015224 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f35b6d1-3070-44cf-bdf8-6376b2434586" containerName="ovn-openstack-openstack-cell1" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.015245 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1" containerName="collect-profiles" Feb 20 
09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.016079 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.019804 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.020004 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.020009 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.020668 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.020949 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.022121 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.036589 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-d2d4n"] Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.151705 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.151768 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.151808 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.151956 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.151991 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.152025 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtb5q\" 
(UniqueName: \"kubernetes.io/projected/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-kube-api-access-jtb5q\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.152050 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.253443 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.253499 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.253533 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtb5q\" (UniqueName: \"kubernetes.io/projected/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-kube-api-access-jtb5q\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " 
pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.253556 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.253624 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.253644 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.253679 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.258638 5094 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.258638 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.260046 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.261354 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.261470 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " 
pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.262033 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.274395 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtb5q\" (UniqueName: \"kubernetes.io/projected/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-kube-api-access-jtb5q\") pod \"neutron-metadata-openstack-openstack-cell1-d2d4n\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.344669 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.927453 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-d2d4n"] Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.932388 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 09:15:12 crc kubenswrapper[5094]: I0220 09:15:12.943628 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" event={"ID":"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe","Type":"ContainerStarted","Data":"eced22e9311585fc8a9e80929bb7e4e1c9a1f94226fce854695e14c779761fb4"} Feb 20 09:15:13 crc kubenswrapper[5094]: I0220 09:15:13.955626 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" event={"ID":"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe","Type":"ContainerStarted","Data":"2d9f8b5996e148deeb14413be699a051f445a16100af15b499c642d8cd36aaf1"} Feb 20 09:15:13 crc kubenswrapper[5094]: I0220 09:15:13.975679 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" podStartSLOduration=2.534631268 podStartE2EDuration="2.975664488s" podCreationTimestamp="2026-02-20 09:15:11 +0000 UTC" firstStartedPulling="2026-02-20 09:15:12.932207885 +0000 UTC m=+8927.804834596" lastFinishedPulling="2026-02-20 09:15:13.373241105 +0000 UTC m=+8928.245867816" observedRunningTime="2026-02-20 09:15:13.97368419 +0000 UTC m=+8928.846310901" watchObservedRunningTime="2026-02-20 09:15:13.975664488 +0000 UTC m=+8928.848291199" Feb 20 09:15:44 crc kubenswrapper[5094]: I0220 09:15:44.069900 5094 scope.go:117] "RemoveContainer" containerID="89a68b3798c7e61a71c5a1f766e1642edc8983858caba5c4db74959c3a8cdcec" Feb 20 09:16:09 crc kubenswrapper[5094]: I0220 09:16:09.551241 5094 
generic.go:334] "Generic (PLEG): container finished" podID="bf84fab1-aae6-4c92-982e-a4c5b1c7cefe" containerID="2d9f8b5996e148deeb14413be699a051f445a16100af15b499c642d8cd36aaf1" exitCode=0 Feb 20 09:16:09 crc kubenswrapper[5094]: I0220 09:16:09.551329 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" event={"ID":"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe","Type":"ContainerDied","Data":"2d9f8b5996e148deeb14413be699a051f445a16100af15b499c642d8cd36aaf1"} Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.088580 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.205478 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-ssh-key-openstack-cell1\") pod \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.205644 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-ceph\") pod \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.205677 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-inventory\") pod \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.205882 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-neutron-ovn-metadata-agent-neutron-config-0\") pod \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.205916 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-neutron-metadata-combined-ca-bundle\") pod \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.205949 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtb5q\" (UniqueName: \"kubernetes.io/projected/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-kube-api-access-jtb5q\") pod \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.205993 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-nova-metadata-neutron-config-0\") pod \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\" (UID: \"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe\") " Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.214854 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-ceph" (OuterVolumeSpecName: "ceph") pod "bf84fab1-aae6-4c92-982e-a4c5b1c7cefe" (UID: "bf84fab1-aae6-4c92-982e-a4c5b1c7cefe"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.214882 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "bf84fab1-aae6-4c92-982e-a4c5b1c7cefe" (UID: "bf84fab1-aae6-4c92-982e-a4c5b1c7cefe"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.214905 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-kube-api-access-jtb5q" (OuterVolumeSpecName: "kube-api-access-jtb5q") pod "bf84fab1-aae6-4c92-982e-a4c5b1c7cefe" (UID: "bf84fab1-aae6-4c92-982e-a4c5b1c7cefe"). InnerVolumeSpecName "kube-api-access-jtb5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.238423 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "bf84fab1-aae6-4c92-982e-a4c5b1c7cefe" (UID: "bf84fab1-aae6-4c92-982e-a4c5b1c7cefe"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.249914 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "bf84fab1-aae6-4c92-982e-a4c5b1c7cefe" (UID: "bf84fab1-aae6-4c92-982e-a4c5b1c7cefe"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.308123 5094 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.308167 5094 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.308179 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtb5q\" (UniqueName: \"kubernetes.io/projected/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-kube-api-access-jtb5q\") on node \"crc\" DevicePath \"\"" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.308190 5094 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.308198 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.345927 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-inventory" (OuterVolumeSpecName: "inventory") pod "bf84fab1-aae6-4c92-982e-a4c5b1c7cefe" (UID: "bf84fab1-aae6-4c92-982e-a4c5b1c7cefe"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.362687 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "bf84fab1-aae6-4c92-982e-a4c5b1c7cefe" (UID: "bf84fab1-aae6-4c92-982e-a4c5b1c7cefe"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.410523 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.410573 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bf84fab1-aae6-4c92-982e-a4c5b1c7cefe-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.571125 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" event={"ID":"bf84fab1-aae6-4c92-982e-a4c5b1c7cefe","Type":"ContainerDied","Data":"eced22e9311585fc8a9e80929bb7e4e1c9a1f94226fce854695e14c779761fb4"} Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.571602 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eced22e9311585fc8a9e80929bb7e4e1c9a1f94226fce854695e14c779761fb4" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.571202 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-d2d4n" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.684836 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-f6rmf"] Feb 20 09:16:11 crc kubenswrapper[5094]: E0220 09:16:11.685470 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf84fab1-aae6-4c92-982e-a4c5b1c7cefe" containerName="neutron-metadata-openstack-openstack-cell1" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.685506 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf84fab1-aae6-4c92-982e-a4c5b1c7cefe" containerName="neutron-metadata-openstack-openstack-cell1" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.685877 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf84fab1-aae6-4c92-982e-a4c5b1c7cefe" containerName="neutron-metadata-openstack-openstack-cell1" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.686833 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.688313 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.688312 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.688949 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.689019 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.689663 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.703924 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-f6rmf"] Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.819740 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.819801 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-inventory\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc 
kubenswrapper[5094]: I0220 09:16:11.819836 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.819970 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5bl2\" (UniqueName: \"kubernetes.io/projected/a552adeb-5834-4cfe-8ee3-56472dda5cab-kube-api-access-f5bl2\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.820044 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-ceph\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.820272 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.921684 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5bl2\" (UniqueName: \"kubernetes.io/projected/a552adeb-5834-4cfe-8ee3-56472dda5cab-kube-api-access-f5bl2\") pod 
\"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.921777 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-ceph\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.921841 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.921886 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.921929 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-inventory\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.921970 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.926984 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-inventory\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.927150 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-ceph\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.928173 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.928672 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.929139 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:11 crc kubenswrapper[5094]: I0220 09:16:11.937692 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5bl2\" (UniqueName: \"kubernetes.io/projected/a552adeb-5834-4cfe-8ee3-56472dda5cab-kube-api-access-f5bl2\") pod \"libvirt-openstack-openstack-cell1-f6rmf\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:12 crc kubenswrapper[5094]: I0220 09:16:12.012456 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:16:12 crc kubenswrapper[5094]: I0220 09:16:12.544803 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-f6rmf"] Feb 20 09:16:12 crc kubenswrapper[5094]: I0220 09:16:12.579919 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" event={"ID":"a552adeb-5834-4cfe-8ee3-56472dda5cab","Type":"ContainerStarted","Data":"27d38e30a80a5d08e54616c5028c63d34bbc4b444a5bebcc5405786abf7e2ca0"} Feb 20 09:16:13 crc kubenswrapper[5094]: I0220 09:16:13.589629 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" event={"ID":"a552adeb-5834-4cfe-8ee3-56472dda5cab","Type":"ContainerStarted","Data":"11e98b436e832bd8d5225286da85ae8eb5b7dfbad585fc8d6f918f1ffca98c48"} Feb 20 09:16:13 crc kubenswrapper[5094]: I0220 09:16:13.618586 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" podStartSLOduration=2.187018211 podStartE2EDuration="2.618538092s" 
podCreationTimestamp="2026-02-20 09:16:11 +0000 UTC" firstStartedPulling="2026-02-20 09:16:12.55231168 +0000 UTC m=+8987.424938391" lastFinishedPulling="2026-02-20 09:16:12.983831551 +0000 UTC m=+8987.856458272" observedRunningTime="2026-02-20 09:16:13.606592864 +0000 UTC m=+8988.479219575" watchObservedRunningTime="2026-02-20 09:16:13.618538092 +0000 UTC m=+8988.491164803" Feb 20 09:17:04 crc kubenswrapper[5094]: I0220 09:17:04.106550 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:17:04 crc kubenswrapper[5094]: I0220 09:17:04.107265 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:17:34 crc kubenswrapper[5094]: I0220 09:17:34.107298 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:17:34 crc kubenswrapper[5094]: I0220 09:17:34.108052 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:18:04 crc kubenswrapper[5094]: I0220 09:18:04.106668 5094 patch_prober.go:28] interesting 
pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:18:04 crc kubenswrapper[5094]: I0220 09:18:04.107387 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:18:04 crc kubenswrapper[5094]: I0220 09:18:04.107475 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 09:18:04 crc kubenswrapper[5094]: I0220 09:18:04.108605 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 09:18:04 crc kubenswrapper[5094]: I0220 09:18:04.108742 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" gracePeriod=600 Feb 20 09:18:04 crc kubenswrapper[5094]: E0220 09:18:04.252946 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:18:04 crc kubenswrapper[5094]: I0220 09:18:04.825229 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" exitCode=0 Feb 20 09:18:04 crc kubenswrapper[5094]: I0220 09:18:04.825292 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0"} Feb 20 09:18:04 crc kubenswrapper[5094]: I0220 09:18:04.825341 5094 scope.go:117] "RemoveContainer" containerID="8248a8f47476a9f6ae0df25861a436e80d0332d121cab1f3067f00978bf0efaf" Feb 20 09:18:04 crc kubenswrapper[5094]: I0220 09:18:04.831827 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:18:04 crc kubenswrapper[5094]: E0220 09:18:04.832444 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:18:19 crc kubenswrapper[5094]: I0220 09:18:19.842097 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:18:19 crc kubenswrapper[5094]: E0220 09:18:19.843192 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:18:34 crc kubenswrapper[5094]: I0220 09:18:34.840812 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:18:34 crc kubenswrapper[5094]: E0220 09:18:34.843761 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:18:38 crc kubenswrapper[5094]: I0220 09:18:38.280222 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g8njc"] Feb 20 09:18:38 crc kubenswrapper[5094]: I0220 09:18:38.284991 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:38 crc kubenswrapper[5094]: I0220 09:18:38.295232 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g8njc"] Feb 20 09:18:38 crc kubenswrapper[5094]: I0220 09:18:38.345766 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/811e4fa6-3b96-40b7-88a3-067b582b0683-catalog-content\") pod \"community-operators-g8njc\" (UID: \"811e4fa6-3b96-40b7-88a3-067b582b0683\") " pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:38 crc kubenswrapper[5094]: I0220 09:18:38.345841 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/811e4fa6-3b96-40b7-88a3-067b582b0683-utilities\") pod \"community-operators-g8njc\" (UID: \"811e4fa6-3b96-40b7-88a3-067b582b0683\") " pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:38 crc kubenswrapper[5094]: I0220 09:18:38.345918 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7tzs\" (UniqueName: \"kubernetes.io/projected/811e4fa6-3b96-40b7-88a3-067b582b0683-kube-api-access-s7tzs\") pod \"community-operators-g8njc\" (UID: \"811e4fa6-3b96-40b7-88a3-067b582b0683\") " pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:38 crc kubenswrapper[5094]: I0220 09:18:38.448243 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/811e4fa6-3b96-40b7-88a3-067b582b0683-catalog-content\") pod \"community-operators-g8njc\" (UID: \"811e4fa6-3b96-40b7-88a3-067b582b0683\") " pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:38 crc kubenswrapper[5094]: I0220 09:18:38.448861 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/811e4fa6-3b96-40b7-88a3-067b582b0683-utilities\") pod \"community-operators-g8njc\" (UID: \"811e4fa6-3b96-40b7-88a3-067b582b0683\") " pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:38 crc kubenswrapper[5094]: I0220 09:18:38.449005 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/811e4fa6-3b96-40b7-88a3-067b582b0683-catalog-content\") pod \"community-operators-g8njc\" (UID: \"811e4fa6-3b96-40b7-88a3-067b582b0683\") " pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:38 crc kubenswrapper[5094]: I0220 09:18:38.449026 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7tzs\" (UniqueName: \"kubernetes.io/projected/811e4fa6-3b96-40b7-88a3-067b582b0683-kube-api-access-s7tzs\") pod \"community-operators-g8njc\" (UID: \"811e4fa6-3b96-40b7-88a3-067b582b0683\") " pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:38 crc kubenswrapper[5094]: I0220 09:18:38.449339 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/811e4fa6-3b96-40b7-88a3-067b582b0683-utilities\") pod \"community-operators-g8njc\" (UID: \"811e4fa6-3b96-40b7-88a3-067b582b0683\") " pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:38 crc kubenswrapper[5094]: I0220 09:18:38.468582 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7tzs\" (UniqueName: \"kubernetes.io/projected/811e4fa6-3b96-40b7-88a3-067b582b0683-kube-api-access-s7tzs\") pod \"community-operators-g8njc\" (UID: \"811e4fa6-3b96-40b7-88a3-067b582b0683\") " pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:38 crc kubenswrapper[5094]: I0220 09:18:38.620472 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:39 crc kubenswrapper[5094]: I0220 09:18:39.154342 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g8njc"] Feb 20 09:18:39 crc kubenswrapper[5094]: I0220 09:18:39.508456 5094 generic.go:334] "Generic (PLEG): container finished" podID="811e4fa6-3b96-40b7-88a3-067b582b0683" containerID="fa5064b945341f85314c31fcfe63dac8cc7dacd470b3a99df5d85bd58b38d0bd" exitCode=0 Feb 20 09:18:39 crc kubenswrapper[5094]: I0220 09:18:39.508839 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8njc" event={"ID":"811e4fa6-3b96-40b7-88a3-067b582b0683","Type":"ContainerDied","Data":"fa5064b945341f85314c31fcfe63dac8cc7dacd470b3a99df5d85bd58b38d0bd"} Feb 20 09:18:39 crc kubenswrapper[5094]: I0220 09:18:39.508878 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8njc" event={"ID":"811e4fa6-3b96-40b7-88a3-067b582b0683","Type":"ContainerStarted","Data":"0137a537f9282a07e616bd1f6dc47d28ccb212e130da010ebef4074da0a74551"} Feb 20 09:18:40 crc kubenswrapper[5094]: I0220 09:18:40.527783 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8njc" event={"ID":"811e4fa6-3b96-40b7-88a3-067b582b0683","Type":"ContainerStarted","Data":"b2544b81a61b09dab625a06130adcc6de768df15aa50446f5d7958b1d23a98ed"} Feb 20 09:18:41 crc kubenswrapper[5094]: I0220 09:18:41.540719 5094 generic.go:334] "Generic (PLEG): container finished" podID="811e4fa6-3b96-40b7-88a3-067b582b0683" containerID="b2544b81a61b09dab625a06130adcc6de768df15aa50446f5d7958b1d23a98ed" exitCode=0 Feb 20 09:18:41 crc kubenswrapper[5094]: I0220 09:18:41.540925 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8njc" 
event={"ID":"811e4fa6-3b96-40b7-88a3-067b582b0683","Type":"ContainerDied","Data":"b2544b81a61b09dab625a06130adcc6de768df15aa50446f5d7958b1d23a98ed"} Feb 20 09:18:42 crc kubenswrapper[5094]: I0220 09:18:42.552506 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8njc" event={"ID":"811e4fa6-3b96-40b7-88a3-067b582b0683","Type":"ContainerStarted","Data":"daa67a570fa974d5aa23afecc3d9bee3da3432d40d38bc55b9f4adbdc13c0f5f"} Feb 20 09:18:42 crc kubenswrapper[5094]: I0220 09:18:42.576595 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g8njc" podStartSLOduration=2.087858215 podStartE2EDuration="4.576573638s" podCreationTimestamp="2026-02-20 09:18:38 +0000 UTC" firstStartedPulling="2026-02-20 09:18:39.517788129 +0000 UTC m=+9134.390414840" lastFinishedPulling="2026-02-20 09:18:42.006503522 +0000 UTC m=+9136.879130263" observedRunningTime="2026-02-20 09:18:42.566156166 +0000 UTC m=+9137.438782917" watchObservedRunningTime="2026-02-20 09:18:42.576573638 +0000 UTC m=+9137.449200349" Feb 20 09:18:45 crc kubenswrapper[5094]: I0220 09:18:45.849163 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:18:45 crc kubenswrapper[5094]: E0220 09:18:45.850147 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:18:48 crc kubenswrapper[5094]: I0220 09:18:48.621096 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:48 crc 
kubenswrapper[5094]: I0220 09:18:48.621674 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:48 crc kubenswrapper[5094]: I0220 09:18:48.710190 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:50 crc kubenswrapper[5094]: I0220 09:18:50.393852 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:50 crc kubenswrapper[5094]: I0220 09:18:50.454363 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g8njc"] Feb 20 09:18:51 crc kubenswrapper[5094]: I0220 09:18:51.663209 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-g8njc" podUID="811e4fa6-3b96-40b7-88a3-067b582b0683" containerName="registry-server" containerID="cri-o://daa67a570fa974d5aa23afecc3d9bee3da3432d40d38bc55b9f4adbdc13c0f5f" gracePeriod=2 Feb 20 09:18:52 crc kubenswrapper[5094]: I0220 09:18:52.679377 5094 generic.go:334] "Generic (PLEG): container finished" podID="811e4fa6-3b96-40b7-88a3-067b582b0683" containerID="daa67a570fa974d5aa23afecc3d9bee3da3432d40d38bc55b9f4adbdc13c0f5f" exitCode=0 Feb 20 09:18:52 crc kubenswrapper[5094]: I0220 09:18:52.680908 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8njc" event={"ID":"811e4fa6-3b96-40b7-88a3-067b582b0683","Type":"ContainerDied","Data":"daa67a570fa974d5aa23afecc3d9bee3da3432d40d38bc55b9f4adbdc13c0f5f"} Feb 20 09:18:52 crc kubenswrapper[5094]: I0220 09:18:52.773161 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:52 crc kubenswrapper[5094]: I0220 09:18:52.958813 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/811e4fa6-3b96-40b7-88a3-067b582b0683-catalog-content\") pod \"811e4fa6-3b96-40b7-88a3-067b582b0683\" (UID: \"811e4fa6-3b96-40b7-88a3-067b582b0683\") " Feb 20 09:18:52 crc kubenswrapper[5094]: I0220 09:18:52.959543 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7tzs\" (UniqueName: \"kubernetes.io/projected/811e4fa6-3b96-40b7-88a3-067b582b0683-kube-api-access-s7tzs\") pod \"811e4fa6-3b96-40b7-88a3-067b582b0683\" (UID: \"811e4fa6-3b96-40b7-88a3-067b582b0683\") " Feb 20 09:18:52 crc kubenswrapper[5094]: I0220 09:18:52.959629 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/811e4fa6-3b96-40b7-88a3-067b582b0683-utilities\") pod \"811e4fa6-3b96-40b7-88a3-067b582b0683\" (UID: \"811e4fa6-3b96-40b7-88a3-067b582b0683\") " Feb 20 09:18:52 crc kubenswrapper[5094]: I0220 09:18:52.960425 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/811e4fa6-3b96-40b7-88a3-067b582b0683-utilities" (OuterVolumeSpecName: "utilities") pod "811e4fa6-3b96-40b7-88a3-067b582b0683" (UID: "811e4fa6-3b96-40b7-88a3-067b582b0683"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:18:52 crc kubenswrapper[5094]: I0220 09:18:52.968550 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/811e4fa6-3b96-40b7-88a3-067b582b0683-kube-api-access-s7tzs" (OuterVolumeSpecName: "kube-api-access-s7tzs") pod "811e4fa6-3b96-40b7-88a3-067b582b0683" (UID: "811e4fa6-3b96-40b7-88a3-067b582b0683"). InnerVolumeSpecName "kube-api-access-s7tzs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:18:53 crc kubenswrapper[5094]: I0220 09:18:53.021347 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/811e4fa6-3b96-40b7-88a3-067b582b0683-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "811e4fa6-3b96-40b7-88a3-067b582b0683" (UID: "811e4fa6-3b96-40b7-88a3-067b582b0683"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:18:53 crc kubenswrapper[5094]: I0220 09:18:53.061086 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/811e4fa6-3b96-40b7-88a3-067b582b0683-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:18:53 crc kubenswrapper[5094]: I0220 09:18:53.061121 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7tzs\" (UniqueName: \"kubernetes.io/projected/811e4fa6-3b96-40b7-88a3-067b582b0683-kube-api-access-s7tzs\") on node \"crc\" DevicePath \"\"" Feb 20 09:18:53 crc kubenswrapper[5094]: I0220 09:18:53.061134 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/811e4fa6-3b96-40b7-88a3-067b582b0683-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:18:53 crc kubenswrapper[5094]: I0220 09:18:53.695100 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g8njc" event={"ID":"811e4fa6-3b96-40b7-88a3-067b582b0683","Type":"ContainerDied","Data":"0137a537f9282a07e616bd1f6dc47d28ccb212e130da010ebef4074da0a74551"} Feb 20 09:18:53 crc kubenswrapper[5094]: I0220 09:18:53.695186 5094 scope.go:117] "RemoveContainer" containerID="daa67a570fa974d5aa23afecc3d9bee3da3432d40d38bc55b9f4adbdc13c0f5f" Feb 20 09:18:53 crc kubenswrapper[5094]: I0220 09:18:53.695186 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g8njc" Feb 20 09:18:53 crc kubenswrapper[5094]: I0220 09:18:53.738882 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g8njc"] Feb 20 09:18:53 crc kubenswrapper[5094]: I0220 09:18:53.749765 5094 scope.go:117] "RemoveContainer" containerID="b2544b81a61b09dab625a06130adcc6de768df15aa50446f5d7958b1d23a98ed" Feb 20 09:18:53 crc kubenswrapper[5094]: I0220 09:18:53.773527 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-g8njc"] Feb 20 09:18:53 crc kubenswrapper[5094]: I0220 09:18:53.791466 5094 scope.go:117] "RemoveContainer" containerID="fa5064b945341f85314c31fcfe63dac8cc7dacd470b3a99df5d85bd58b38d0bd" Feb 20 09:18:53 crc kubenswrapper[5094]: I0220 09:18:53.854894 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="811e4fa6-3b96-40b7-88a3-067b582b0683" path="/var/lib/kubelet/pods/811e4fa6-3b96-40b7-88a3-067b582b0683/volumes" Feb 20 09:19:00 crc kubenswrapper[5094]: I0220 09:19:00.841577 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:19:00 crc kubenswrapper[5094]: E0220 09:19:00.842922 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:19:15 crc kubenswrapper[5094]: I0220 09:19:15.856109 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:19:15 crc kubenswrapper[5094]: E0220 09:19:15.857497 5094 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:19:30 crc kubenswrapper[5094]: I0220 09:19:30.840450 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:19:30 crc kubenswrapper[5094]: E0220 09:19:30.841261 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:19:44 crc kubenswrapper[5094]: I0220 09:19:44.840879 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:19:44 crc kubenswrapper[5094]: E0220 09:19:44.842629 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:19:57 crc kubenswrapper[5094]: I0220 09:19:57.840652 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:19:57 crc kubenswrapper[5094]: E0220 09:19:57.841345 5094 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:20:08 crc kubenswrapper[5094]: I0220 09:20:08.841301 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:20:08 crc kubenswrapper[5094]: E0220 09:20:08.842407 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:20:19 crc kubenswrapper[5094]: I0220 09:20:19.841836 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:20:19 crc kubenswrapper[5094]: E0220 09:20:19.842926 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:20:31 crc kubenswrapper[5094]: I0220 09:20:31.840905 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:20:31 crc kubenswrapper[5094]: E0220 09:20:31.841940 5094 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:20:44 crc kubenswrapper[5094]: I0220 09:20:44.841039 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:20:44 crc kubenswrapper[5094]: E0220 09:20:44.842191 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:20:56 crc kubenswrapper[5094]: I0220 09:20:56.841567 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:20:56 crc kubenswrapper[5094]: E0220 09:20:56.844457 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:20:59 crc kubenswrapper[5094]: I0220 09:20:59.009607 5094 generic.go:334] "Generic (PLEG): container finished" podID="a552adeb-5834-4cfe-8ee3-56472dda5cab" 
containerID="11e98b436e832bd8d5225286da85ae8eb5b7dfbad585fc8d6f918f1ffca98c48" exitCode=0 Feb 20 09:20:59 crc kubenswrapper[5094]: I0220 09:20:59.009813 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" event={"ID":"a552adeb-5834-4cfe-8ee3-56472dda5cab","Type":"ContainerDied","Data":"11e98b436e832bd8d5225286da85ae8eb5b7dfbad585fc8d6f918f1ffca98c48"} Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.434731 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.576874 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5bl2\" (UniqueName: \"kubernetes.io/projected/a552adeb-5834-4cfe-8ee3-56472dda5cab-kube-api-access-f5bl2\") pod \"a552adeb-5834-4cfe-8ee3-56472dda5cab\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.576935 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-ssh-key-openstack-cell1\") pod \"a552adeb-5834-4cfe-8ee3-56472dda5cab\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.577049 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-libvirt-combined-ca-bundle\") pod \"a552adeb-5834-4cfe-8ee3-56472dda5cab\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.577768 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-ceph\") pod 
\"a552adeb-5834-4cfe-8ee3-56472dda5cab\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.577792 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-libvirt-secret-0\") pod \"a552adeb-5834-4cfe-8ee3-56472dda5cab\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.577964 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-inventory\") pod \"a552adeb-5834-4cfe-8ee3-56472dda5cab\" (UID: \"a552adeb-5834-4cfe-8ee3-56472dda5cab\") " Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.582146 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "a552adeb-5834-4cfe-8ee3-56472dda5cab" (UID: "a552adeb-5834-4cfe-8ee3-56472dda5cab"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.585906 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-ceph" (OuterVolumeSpecName: "ceph") pod "a552adeb-5834-4cfe-8ee3-56472dda5cab" (UID: "a552adeb-5834-4cfe-8ee3-56472dda5cab"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.586572 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a552adeb-5834-4cfe-8ee3-56472dda5cab-kube-api-access-f5bl2" (OuterVolumeSpecName: "kube-api-access-f5bl2") pod "a552adeb-5834-4cfe-8ee3-56472dda5cab" (UID: "a552adeb-5834-4cfe-8ee3-56472dda5cab"). InnerVolumeSpecName "kube-api-access-f5bl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.603289 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-inventory" (OuterVolumeSpecName: "inventory") pod "a552adeb-5834-4cfe-8ee3-56472dda5cab" (UID: "a552adeb-5834-4cfe-8ee3-56472dda5cab"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.603901 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "a552adeb-5834-4cfe-8ee3-56472dda5cab" (UID: "a552adeb-5834-4cfe-8ee3-56472dda5cab"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.604256 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "a552adeb-5834-4cfe-8ee3-56472dda5cab" (UID: "a552adeb-5834-4cfe-8ee3-56472dda5cab"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.679975 5094 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.680006 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.680016 5094 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.680024 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.680033 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5bl2\" (UniqueName: \"kubernetes.io/projected/a552adeb-5834-4cfe-8ee3-56472dda5cab-kube-api-access-f5bl2\") on node \"crc\" DevicePath \"\"" Feb 20 09:21:00 crc kubenswrapper[5094]: I0220 09:21:00.680041 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a552adeb-5834-4cfe-8ee3-56472dda5cab-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.038971 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" event={"ID":"a552adeb-5834-4cfe-8ee3-56472dda5cab","Type":"ContainerDied","Data":"27d38e30a80a5d08e54616c5028c63d34bbc4b444a5bebcc5405786abf7e2ca0"} 
Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.039004 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-f6rmf" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.039009 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27d38e30a80a5d08e54616c5028c63d34bbc4b444a5bebcc5405786abf7e2ca0" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.209834 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-tclm2"] Feb 20 09:21:01 crc kubenswrapper[5094]: E0220 09:21:01.210920 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="811e4fa6-3b96-40b7-88a3-067b582b0683" containerName="extract-utilities" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.210946 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="811e4fa6-3b96-40b7-88a3-067b582b0683" containerName="extract-utilities" Feb 20 09:21:01 crc kubenswrapper[5094]: E0220 09:21:01.210976 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="811e4fa6-3b96-40b7-88a3-067b582b0683" containerName="extract-content" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.210983 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="811e4fa6-3b96-40b7-88a3-067b582b0683" containerName="extract-content" Feb 20 09:21:01 crc kubenswrapper[5094]: E0220 09:21:01.211000 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a552adeb-5834-4cfe-8ee3-56472dda5cab" containerName="libvirt-openstack-openstack-cell1" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.211006 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a552adeb-5834-4cfe-8ee3-56472dda5cab" containerName="libvirt-openstack-openstack-cell1" Feb 20 09:21:01 crc kubenswrapper[5094]: E0220 09:21:01.211016 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="811e4fa6-3b96-40b7-88a3-067b582b0683" 
containerName="registry-server" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.211023 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="811e4fa6-3b96-40b7-88a3-067b582b0683" containerName="registry-server" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.211478 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="811e4fa6-3b96-40b7-88a3-067b582b0683" containerName="registry-server" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.211521 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a552adeb-5834-4cfe-8ee3-56472dda5cab" containerName="libvirt-openstack-openstack-cell1" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.213673 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.225276 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.225628 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.225682 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.227095 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.227626 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.229093 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.242999 5094 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.245218 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-tclm2"] Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.395571 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.395615 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-inventory\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.395662 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.395682 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.395763 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-ceph\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.395975 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.396024 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjltv\" (UniqueName: \"kubernetes.io/projected/086935dd-74d5-4657-a6a1-25bd11f6455f-kube-api-access-pjltv\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.396094 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.396135 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.396191 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.396229 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.396335 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.396376 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cells-global-config-1\") pod 
\"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.498482 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.498540 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjltv\" (UniqueName: \"kubernetes.io/projected/086935dd-74d5-4657-a6a1-25bd11f6455f-kube-api-access-pjltv\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.498616 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.498652 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.498733 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.498768 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.499414 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.499446 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.499456 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: 
\"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.499523 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.499591 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-inventory\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.499656 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.499697 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.499760 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-ceph\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.500002 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.503155 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.503616 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.504251 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-ceph\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.504372 5094 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-inventory\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.504457 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.505479 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.506884 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.508134 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.510254 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.511920 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.519672 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjltv\" (UniqueName: \"kubernetes.io/projected/086935dd-74d5-4657-a6a1-25bd11f6455f-kube-api-access-pjltv\") pod \"nova-cell1-openstack-openstack-cell1-tclm2\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:01 crc kubenswrapper[5094]: I0220 09:21:01.551262 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:21:02 crc kubenswrapper[5094]: I0220 09:21:02.156943 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 09:21:02 crc kubenswrapper[5094]: I0220 09:21:02.159308 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-tclm2"] Feb 20 09:21:03 crc kubenswrapper[5094]: I0220 09:21:03.061779 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" event={"ID":"086935dd-74d5-4657-a6a1-25bd11f6455f","Type":"ContainerStarted","Data":"c623c6bcb73425cb5942caee11d7bb36d9dab38dcb8e5a059b4e2fee9f335a16"} Feb 20 09:21:03 crc kubenswrapper[5094]: I0220 09:21:03.062417 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" event={"ID":"086935dd-74d5-4657-a6a1-25bd11f6455f","Type":"ContainerStarted","Data":"b5f00a6875a9dca730468d3130ccbd421639490aee72bce64c7d96373c50cba6"} Feb 20 09:21:03 crc kubenswrapper[5094]: I0220 09:21:03.087855 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" podStartSLOduration=1.60755362 podStartE2EDuration="2.087827533s" podCreationTimestamp="2026-02-20 09:21:01 +0000 UTC" firstStartedPulling="2026-02-20 09:21:02.156751845 +0000 UTC m=+9277.029378556" lastFinishedPulling="2026-02-20 09:21:02.637025758 +0000 UTC m=+9277.509652469" observedRunningTime="2026-02-20 09:21:03.080884756 +0000 UTC m=+9277.953511477" watchObservedRunningTime="2026-02-20 09:21:03.087827533 +0000 UTC m=+9277.960454254" Feb 20 09:21:08 crc kubenswrapper[5094]: I0220 09:21:08.840421 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:21:08 crc kubenswrapper[5094]: E0220 09:21:08.841147 5094 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:21:22 crc kubenswrapper[5094]: I0220 09:21:22.840232 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:21:22 crc kubenswrapper[5094]: E0220 09:21:22.840886 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:21:28 crc kubenswrapper[5094]: I0220 09:21:28.968880 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-snghl"] Feb 20 09:21:28 crc kubenswrapper[5094]: I0220 09:21:28.971264 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-snghl" Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.066868 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzlr4\" (UniqueName: \"kubernetes.io/projected/40cbd310-80bd-43a5-aa9f-d151d8397a5e-kube-api-access-tzlr4\") pod \"redhat-marketplace-snghl\" (UID: \"40cbd310-80bd-43a5-aa9f-d151d8397a5e\") " pod="openshift-marketplace/redhat-marketplace-snghl" Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.067208 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40cbd310-80bd-43a5-aa9f-d151d8397a5e-catalog-content\") pod \"redhat-marketplace-snghl\" (UID: \"40cbd310-80bd-43a5-aa9f-d151d8397a5e\") " pod="openshift-marketplace/redhat-marketplace-snghl" Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.067318 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40cbd310-80bd-43a5-aa9f-d151d8397a5e-utilities\") pod \"redhat-marketplace-snghl\" (UID: \"40cbd310-80bd-43a5-aa9f-d151d8397a5e\") " pod="openshift-marketplace/redhat-marketplace-snghl" Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.115877 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-snghl"] Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.161768 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wkskm"] Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.164345 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wkskm" Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.169410 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzlr4\" (UniqueName: \"kubernetes.io/projected/40cbd310-80bd-43a5-aa9f-d151d8397a5e-kube-api-access-tzlr4\") pod \"redhat-marketplace-snghl\" (UID: \"40cbd310-80bd-43a5-aa9f-d151d8397a5e\") " pod="openshift-marketplace/redhat-marketplace-snghl" Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.169494 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40cbd310-80bd-43a5-aa9f-d151d8397a5e-catalog-content\") pod \"redhat-marketplace-snghl\" (UID: \"40cbd310-80bd-43a5-aa9f-d151d8397a5e\") " pod="openshift-marketplace/redhat-marketplace-snghl" Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.169591 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40cbd310-80bd-43a5-aa9f-d151d8397a5e-utilities\") pod \"redhat-marketplace-snghl\" (UID: \"40cbd310-80bd-43a5-aa9f-d151d8397a5e\") " pod="openshift-marketplace/redhat-marketplace-snghl" Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.170185 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40cbd310-80bd-43a5-aa9f-d151d8397a5e-utilities\") pod \"redhat-marketplace-snghl\" (UID: \"40cbd310-80bd-43a5-aa9f-d151d8397a5e\") " pod="openshift-marketplace/redhat-marketplace-snghl" Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.170761 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40cbd310-80bd-43a5-aa9f-d151d8397a5e-catalog-content\") pod \"redhat-marketplace-snghl\" (UID: \"40cbd310-80bd-43a5-aa9f-d151d8397a5e\") " 
pod="openshift-marketplace/redhat-marketplace-snghl" Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.182025 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wkskm"] Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.244883 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzlr4\" (UniqueName: \"kubernetes.io/projected/40cbd310-80bd-43a5-aa9f-d151d8397a5e-kube-api-access-tzlr4\") pod \"redhat-marketplace-snghl\" (UID: \"40cbd310-80bd-43a5-aa9f-d151d8397a5e\") " pod="openshift-marketplace/redhat-marketplace-snghl" Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.271880 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhz4j\" (UniqueName: \"kubernetes.io/projected/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-kube-api-access-hhz4j\") pod \"redhat-operators-wkskm\" (UID: \"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6\") " pod="openshift-marketplace/redhat-operators-wkskm" Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.271943 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-catalog-content\") pod \"redhat-operators-wkskm\" (UID: \"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6\") " pod="openshift-marketplace/redhat-operators-wkskm" Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.272362 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-utilities\") pod \"redhat-operators-wkskm\" (UID: \"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6\") " pod="openshift-marketplace/redhat-operators-wkskm" Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.294321 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-snghl" Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.374250 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-utilities\") pod \"redhat-operators-wkskm\" (UID: \"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6\") " pod="openshift-marketplace/redhat-operators-wkskm" Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.374421 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhz4j\" (UniqueName: \"kubernetes.io/projected/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-kube-api-access-hhz4j\") pod \"redhat-operators-wkskm\" (UID: \"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6\") " pod="openshift-marketplace/redhat-operators-wkskm" Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.374466 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-catalog-content\") pod \"redhat-operators-wkskm\" (UID: \"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6\") " pod="openshift-marketplace/redhat-operators-wkskm" Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.375163 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-catalog-content\") pod \"redhat-operators-wkskm\" (UID: \"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6\") " pod="openshift-marketplace/redhat-operators-wkskm" Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.375460 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-utilities\") pod \"redhat-operators-wkskm\" (UID: \"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6\") " 
pod="openshift-marketplace/redhat-operators-wkskm" Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.399716 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhz4j\" (UniqueName: \"kubernetes.io/projected/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-kube-api-access-hhz4j\") pod \"redhat-operators-wkskm\" (UID: \"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6\") " pod="openshift-marketplace/redhat-operators-wkskm" Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.484238 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wkskm" Feb 20 09:21:29 crc kubenswrapper[5094]: I0220 09:21:29.838998 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-snghl"] Feb 20 09:21:30 crc kubenswrapper[5094]: I0220 09:21:30.038200 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wkskm"] Feb 20 09:21:30 crc kubenswrapper[5094]: W0220 09:21:30.051508 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac1cb6da_bf56_4dc5_9b34_251a55e75ba6.slice/crio-5cd95cedd36a9fe6d492781127606e3898935339a83730f5e5e72eecbd818b50 WatchSource:0}: Error finding container 5cd95cedd36a9fe6d492781127606e3898935339a83730f5e5e72eecbd818b50: Status 404 returned error can't find the container with id 5cd95cedd36a9fe6d492781127606e3898935339a83730f5e5e72eecbd818b50 Feb 20 09:21:30 crc kubenswrapper[5094]: I0220 09:21:30.333420 5094 generic.go:334] "Generic (PLEG): container finished" podID="40cbd310-80bd-43a5-aa9f-d151d8397a5e" containerID="a7b657d582bfe99be21953db3aba964d80e38b73ae4bb0e5a435e16fac6c1648" exitCode=0 Feb 20 09:21:30 crc kubenswrapper[5094]: I0220 09:21:30.333486 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snghl" 
event={"ID":"40cbd310-80bd-43a5-aa9f-d151d8397a5e","Type":"ContainerDied","Data":"a7b657d582bfe99be21953db3aba964d80e38b73ae4bb0e5a435e16fac6c1648"} Feb 20 09:21:30 crc kubenswrapper[5094]: I0220 09:21:30.333552 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snghl" event={"ID":"40cbd310-80bd-43a5-aa9f-d151d8397a5e","Type":"ContainerStarted","Data":"3432a63a3e35886fc691c3469a2a6ea3d3dadcd62ce5fa72c15aaf723fb7dc0c"} Feb 20 09:21:30 crc kubenswrapper[5094]: I0220 09:21:30.335037 5094 generic.go:334] "Generic (PLEG): container finished" podID="ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" containerID="dd297ae58f3d57c6457856bfef3df4e9674a1df5468a15a1a134d0c82e3c6d19" exitCode=0 Feb 20 09:21:30 crc kubenswrapper[5094]: I0220 09:21:30.335063 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkskm" event={"ID":"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6","Type":"ContainerDied","Data":"dd297ae58f3d57c6457856bfef3df4e9674a1df5468a15a1a134d0c82e3c6d19"} Feb 20 09:21:30 crc kubenswrapper[5094]: I0220 09:21:30.335079 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkskm" event={"ID":"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6","Type":"ContainerStarted","Data":"5cd95cedd36a9fe6d492781127606e3898935339a83730f5e5e72eecbd818b50"} Feb 20 09:21:32 crc kubenswrapper[5094]: I0220 09:21:32.361041 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkskm" event={"ID":"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6","Type":"ContainerStarted","Data":"96bf390a5ccfccc71cd37fc6fdff090f60306c6fdb4810bd84e7c7db47a65ef6"} Feb 20 09:21:32 crc kubenswrapper[5094]: I0220 09:21:32.366987 5094 generic.go:334] "Generic (PLEG): container finished" podID="40cbd310-80bd-43a5-aa9f-d151d8397a5e" containerID="c6489d400015ce16b5bb2780f5bdb9ae8c2c1d96cf37f7a97d8b191553f2789a" exitCode=0 Feb 20 09:21:32 crc kubenswrapper[5094]: I0220 
09:21:32.367039 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snghl" event={"ID":"40cbd310-80bd-43a5-aa9f-d151d8397a5e","Type":"ContainerDied","Data":"c6489d400015ce16b5bb2780f5bdb9ae8c2c1d96cf37f7a97d8b191553f2789a"} Feb 20 09:21:33 crc kubenswrapper[5094]: I0220 09:21:33.380000 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snghl" event={"ID":"40cbd310-80bd-43a5-aa9f-d151d8397a5e","Type":"ContainerStarted","Data":"d16910822a977d3ffed417afd8c88c7e707411f2de2e107c93137b8c882274a9"} Feb 20 09:21:33 crc kubenswrapper[5094]: I0220 09:21:33.403867 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-snghl" podStartSLOduration=2.937889493 podStartE2EDuration="5.403837489s" podCreationTimestamp="2026-02-20 09:21:28 +0000 UTC" firstStartedPulling="2026-02-20 09:21:30.334914397 +0000 UTC m=+9305.207541108" lastFinishedPulling="2026-02-20 09:21:32.800862393 +0000 UTC m=+9307.673489104" observedRunningTime="2026-02-20 09:21:33.398510191 +0000 UTC m=+9308.271136902" watchObservedRunningTime="2026-02-20 09:21:33.403837489 +0000 UTC m=+9308.276464230" Feb 20 09:21:35 crc kubenswrapper[5094]: E0220 09:21:35.189957 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac1cb6da_bf56_4dc5_9b34_251a55e75ba6.slice/crio-96bf390a5ccfccc71cd37fc6fdff090f60306c6fdb4810bd84e7c7db47a65ef6.scope\": RecentStats: unable to find data in memory cache]" Feb 20 09:21:35 crc kubenswrapper[5094]: I0220 09:21:35.400170 5094 generic.go:334] "Generic (PLEG): container finished" podID="ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" containerID="96bf390a5ccfccc71cd37fc6fdff090f60306c6fdb4810bd84e7c7db47a65ef6" exitCode=0 Feb 20 09:21:35 crc kubenswrapper[5094]: I0220 09:21:35.400236 5094 kubelet.go:2453] "SyncLoop 
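The "Observed pod startup duration" record above carries its timings as klog key=value pairs (some bare, some double-quoted). As a minimal sketch, assuming Python 3 and using field values copied verbatim from that record, the pairs can be pulled out like this:

```python
import re

# Fragment of the pod_startup_latency_tracker record from the journal above.
entry = ('pod="openshift-marketplace/redhat-marketplace-snghl" '
         'podStartSLOduration=2.937889493 podStartE2EDuration="5.403837489s"')

# klog key=value pairs: values are either double-quoted or bare tokens.
fields = {m.group(1): m.group(2) if m.group(2) is not None else m.group(3)
          for m in re.finditer(r'(\w+)=(?:"([^"]*)"|(\S+))', entry)}

pod = fields["pod"]
slo = float(fields["podStartSLOduration"])               # SLO duration, seconds
e2e = float(fields["podStartE2EDuration"].rstrip("s"))   # end-to-end, seconds
```

The SLO duration is smaller than the end-to-end duration here because, per the record's `firstStartedPulling`/`lastFinishedPulling` fields, image pull time is excluded from the SLO figure.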
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkskm" event={"ID":"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6","Type":"ContainerDied","Data":"96bf390a5ccfccc71cd37fc6fdff090f60306c6fdb4810bd84e7c7db47a65ef6"} Feb 20 09:21:35 crc kubenswrapper[5094]: I0220 09:21:35.847201 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:21:35 crc kubenswrapper[5094]: E0220 09:21:35.847583 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:21:36 crc kubenswrapper[5094]: I0220 09:21:36.411623 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkskm" event={"ID":"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6","Type":"ContainerStarted","Data":"aefab98f9be28e8f6ee1dc784e3de109cead402994b0f4902476f1b1e66001d7"} Feb 20 09:21:36 crc kubenswrapper[5094]: I0220 09:21:36.441679 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wkskm" podStartSLOduration=2.013292667 podStartE2EDuration="7.441657283s" podCreationTimestamp="2026-02-20 09:21:29 +0000 UTC" firstStartedPulling="2026-02-20 09:21:30.336895694 +0000 UTC m=+9305.209522405" lastFinishedPulling="2026-02-20 09:21:35.76526031 +0000 UTC m=+9310.637887021" observedRunningTime="2026-02-20 09:21:36.430523665 +0000 UTC m=+9311.303150366" watchObservedRunningTime="2026-02-20 09:21:36.441657283 +0000 UTC m=+9311.314283994" Feb 20 09:21:39 crc kubenswrapper[5094]: I0220 09:21:39.295924 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-snghl" Feb 20 09:21:39 crc kubenswrapper[5094]: I0220 09:21:39.296522 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-snghl" Feb 20 09:21:39 crc kubenswrapper[5094]: I0220 09:21:39.485420 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wkskm" Feb 20 09:21:39 crc kubenswrapper[5094]: I0220 09:21:39.485736 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wkskm" Feb 20 09:21:40 crc kubenswrapper[5094]: I0220 09:21:40.357006 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-snghl" podUID="40cbd310-80bd-43a5-aa9f-d151d8397a5e" containerName="registry-server" probeResult="failure" output=< Feb 20 09:21:40 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 09:21:40 crc kubenswrapper[5094]: > Feb 20 09:21:40 crc kubenswrapper[5094]: I0220 09:21:40.533611 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wkskm" podUID="ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" containerName="registry-server" probeResult="failure" output=< Feb 20 09:21:40 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 09:21:40 crc kubenswrapper[5094]: > Feb 20 09:21:49 crc kubenswrapper[5094]: I0220 09:21:49.358122 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-snghl" Feb 20 09:21:49 crc kubenswrapper[5094]: I0220 09:21:49.793145 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-snghl" Feb 20 09:21:49 crc kubenswrapper[5094]: I0220 09:21:49.815653 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-wkskm" Feb 20 09:21:49 crc kubenswrapper[5094]: I0220 09:21:49.860760 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-snghl"] Feb 20 09:21:49 crc kubenswrapper[5094]: I0220 09:21:49.870241 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wkskm" Feb 20 09:21:50 crc kubenswrapper[5094]: I0220 09:21:50.542436 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-snghl" podUID="40cbd310-80bd-43a5-aa9f-d151d8397a5e" containerName="registry-server" containerID="cri-o://d16910822a977d3ffed417afd8c88c7e707411f2de2e107c93137b8c882274a9" gracePeriod=2 Feb 20 09:21:50 crc kubenswrapper[5094]: I0220 09:21:50.840847 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:21:50 crc kubenswrapper[5094]: E0220 09:21:50.841509 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.103361 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-snghl" Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.179899 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40cbd310-80bd-43a5-aa9f-d151d8397a5e-utilities\") pod \"40cbd310-80bd-43a5-aa9f-d151d8397a5e\" (UID: \"40cbd310-80bd-43a5-aa9f-d151d8397a5e\") " Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.179975 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40cbd310-80bd-43a5-aa9f-d151d8397a5e-catalog-content\") pod \"40cbd310-80bd-43a5-aa9f-d151d8397a5e\" (UID: \"40cbd310-80bd-43a5-aa9f-d151d8397a5e\") " Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.180046 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzlr4\" (UniqueName: \"kubernetes.io/projected/40cbd310-80bd-43a5-aa9f-d151d8397a5e-kube-api-access-tzlr4\") pod \"40cbd310-80bd-43a5-aa9f-d151d8397a5e\" (UID: \"40cbd310-80bd-43a5-aa9f-d151d8397a5e\") " Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.180617 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40cbd310-80bd-43a5-aa9f-d151d8397a5e-utilities" (OuterVolumeSpecName: "utilities") pod "40cbd310-80bd-43a5-aa9f-d151d8397a5e" (UID: "40cbd310-80bd-43a5-aa9f-d151d8397a5e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.180795 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40cbd310-80bd-43a5-aa9f-d151d8397a5e-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.212974 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40cbd310-80bd-43a5-aa9f-d151d8397a5e-kube-api-access-tzlr4" (OuterVolumeSpecName: "kube-api-access-tzlr4") pod "40cbd310-80bd-43a5-aa9f-d151d8397a5e" (UID: "40cbd310-80bd-43a5-aa9f-d151d8397a5e"). InnerVolumeSpecName "kube-api-access-tzlr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.248228 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40cbd310-80bd-43a5-aa9f-d151d8397a5e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40cbd310-80bd-43a5-aa9f-d151d8397a5e" (UID: "40cbd310-80bd-43a5-aa9f-d151d8397a5e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.282899 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40cbd310-80bd-43a5-aa9f-d151d8397a5e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.282933 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzlr4\" (UniqueName: \"kubernetes.io/projected/40cbd310-80bd-43a5-aa9f-d151d8397a5e-kube-api-access-tzlr4\") on node \"crc\" DevicePath \"\"" Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.553427 5094 generic.go:334] "Generic (PLEG): container finished" podID="40cbd310-80bd-43a5-aa9f-d151d8397a5e" containerID="d16910822a977d3ffed417afd8c88c7e707411f2de2e107c93137b8c882274a9" exitCode=0 Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.553467 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snghl" event={"ID":"40cbd310-80bd-43a5-aa9f-d151d8397a5e","Type":"ContainerDied","Data":"d16910822a977d3ffed417afd8c88c7e707411f2de2e107c93137b8c882274a9"} Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.553498 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-snghl" event={"ID":"40cbd310-80bd-43a5-aa9f-d151d8397a5e","Type":"ContainerDied","Data":"3432a63a3e35886fc691c3469a2a6ea3d3dadcd62ce5fa72c15aaf723fb7dc0c"} Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.553511 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-snghl" Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.553520 5094 scope.go:117] "RemoveContainer" containerID="d16910822a977d3ffed417afd8c88c7e707411f2de2e107c93137b8c882274a9" Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.587529 5094 scope.go:117] "RemoveContainer" containerID="c6489d400015ce16b5bb2780f5bdb9ae8c2c1d96cf37f7a97d8b191553f2789a" Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.588463 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-snghl"] Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.598467 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-snghl"] Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.627668 5094 scope.go:117] "RemoveContainer" containerID="a7b657d582bfe99be21953db3aba964d80e38b73ae4bb0e5a435e16fac6c1648" Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.671904 5094 scope.go:117] "RemoveContainer" containerID="d16910822a977d3ffed417afd8c88c7e707411f2de2e107c93137b8c882274a9" Feb 20 09:21:51 crc kubenswrapper[5094]: E0220 09:21:51.672376 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d16910822a977d3ffed417afd8c88c7e707411f2de2e107c93137b8c882274a9\": container with ID starting with d16910822a977d3ffed417afd8c88c7e707411f2de2e107c93137b8c882274a9 not found: ID does not exist" containerID="d16910822a977d3ffed417afd8c88c7e707411f2de2e107c93137b8c882274a9" Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.672427 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d16910822a977d3ffed417afd8c88c7e707411f2de2e107c93137b8c882274a9"} err="failed to get container status \"d16910822a977d3ffed417afd8c88c7e707411f2de2e107c93137b8c882274a9\": rpc error: code = NotFound desc = could not find container 
\"d16910822a977d3ffed417afd8c88c7e707411f2de2e107c93137b8c882274a9\": container with ID starting with d16910822a977d3ffed417afd8c88c7e707411f2de2e107c93137b8c882274a9 not found: ID does not exist" Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.672456 5094 scope.go:117] "RemoveContainer" containerID="c6489d400015ce16b5bb2780f5bdb9ae8c2c1d96cf37f7a97d8b191553f2789a" Feb 20 09:21:51 crc kubenswrapper[5094]: E0220 09:21:51.672845 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6489d400015ce16b5bb2780f5bdb9ae8c2c1d96cf37f7a97d8b191553f2789a\": container with ID starting with c6489d400015ce16b5bb2780f5bdb9ae8c2c1d96cf37f7a97d8b191553f2789a not found: ID does not exist" containerID="c6489d400015ce16b5bb2780f5bdb9ae8c2c1d96cf37f7a97d8b191553f2789a" Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.672890 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6489d400015ce16b5bb2780f5bdb9ae8c2c1d96cf37f7a97d8b191553f2789a"} err="failed to get container status \"c6489d400015ce16b5bb2780f5bdb9ae8c2c1d96cf37f7a97d8b191553f2789a\": rpc error: code = NotFound desc = could not find container \"c6489d400015ce16b5bb2780f5bdb9ae8c2c1d96cf37f7a97d8b191553f2789a\": container with ID starting with c6489d400015ce16b5bb2780f5bdb9ae8c2c1d96cf37f7a97d8b191553f2789a not found: ID does not exist" Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.672918 5094 scope.go:117] "RemoveContainer" containerID="a7b657d582bfe99be21953db3aba964d80e38b73ae4bb0e5a435e16fac6c1648" Feb 20 09:21:51 crc kubenswrapper[5094]: E0220 09:21:51.673332 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7b657d582bfe99be21953db3aba964d80e38b73ae4bb0e5a435e16fac6c1648\": container with ID starting with a7b657d582bfe99be21953db3aba964d80e38b73ae4bb0e5a435e16fac6c1648 not found: ID does not exist" 
containerID="a7b657d582bfe99be21953db3aba964d80e38b73ae4bb0e5a435e16fac6c1648" Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.673369 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7b657d582bfe99be21953db3aba964d80e38b73ae4bb0e5a435e16fac6c1648"} err="failed to get container status \"a7b657d582bfe99be21953db3aba964d80e38b73ae4bb0e5a435e16fac6c1648\": rpc error: code = NotFound desc = could not find container \"a7b657d582bfe99be21953db3aba964d80e38b73ae4bb0e5a435e16fac6c1648\": container with ID starting with a7b657d582bfe99be21953db3aba964d80e38b73ae4bb0e5a435e16fac6c1648 not found: ID does not exist" Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.821831 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wkskm"] Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.822611 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wkskm" podUID="ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" containerName="registry-server" containerID="cri-o://aefab98f9be28e8f6ee1dc784e3de109cead402994b0f4902476f1b1e66001d7" gracePeriod=2 Feb 20 09:21:51 crc kubenswrapper[5094]: I0220 09:21:51.860213 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40cbd310-80bd-43a5-aa9f-d151d8397a5e" path="/var/lib/kubelet/pods/40cbd310-80bd-43a5-aa9f-d151d8397a5e/volumes" Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.381566 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wkskm" Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.514104 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-catalog-content\") pod \"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6\" (UID: \"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6\") " Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.514667 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhz4j\" (UniqueName: \"kubernetes.io/projected/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-kube-api-access-hhz4j\") pod \"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6\" (UID: \"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6\") " Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.514797 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-utilities\") pod \"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6\" (UID: \"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6\") " Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.515687 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-utilities" (OuterVolumeSpecName: "utilities") pod "ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" (UID: "ac1cb6da-bf56-4dc5-9b34-251a55e75ba6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.521733 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-kube-api-access-hhz4j" (OuterVolumeSpecName: "kube-api-access-hhz4j") pod "ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" (UID: "ac1cb6da-bf56-4dc5-9b34-251a55e75ba6"). InnerVolumeSpecName "kube-api-access-hhz4j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.568463 5094 generic.go:334] "Generic (PLEG): container finished" podID="ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" containerID="aefab98f9be28e8f6ee1dc784e3de109cead402994b0f4902476f1b1e66001d7" exitCode=0 Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.568557 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wkskm" Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.570211 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkskm" event={"ID":"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6","Type":"ContainerDied","Data":"aefab98f9be28e8f6ee1dc784e3de109cead402994b0f4902476f1b1e66001d7"} Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.570355 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wkskm" event={"ID":"ac1cb6da-bf56-4dc5-9b34-251a55e75ba6","Type":"ContainerDied","Data":"5cd95cedd36a9fe6d492781127606e3898935339a83730f5e5e72eecbd818b50"} Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.570427 5094 scope.go:117] "RemoveContainer" containerID="aefab98f9be28e8f6ee1dc784e3de109cead402994b0f4902476f1b1e66001d7" Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.601862 5094 scope.go:117] "RemoveContainer" containerID="96bf390a5ccfccc71cd37fc6fdff090f60306c6fdb4810bd84e7c7db47a65ef6" Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.623312 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhz4j\" (UniqueName: \"kubernetes.io/projected/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-kube-api-access-hhz4j\") on node \"crc\" DevicePath \"\"" Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.623529 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.636359 5094 scope.go:117] "RemoveContainer" containerID="dd297ae58f3d57c6457856bfef3df4e9674a1df5468a15a1a134d0c82e3c6d19" Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.658525 5094 scope.go:117] "RemoveContainer" containerID="aefab98f9be28e8f6ee1dc784e3de109cead402994b0f4902476f1b1e66001d7" Feb 20 09:21:52 crc kubenswrapper[5094]: E0220 09:21:52.659288 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aefab98f9be28e8f6ee1dc784e3de109cead402994b0f4902476f1b1e66001d7\": container with ID starting with aefab98f9be28e8f6ee1dc784e3de109cead402994b0f4902476f1b1e66001d7 not found: ID does not exist" containerID="aefab98f9be28e8f6ee1dc784e3de109cead402994b0f4902476f1b1e66001d7" Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.659397 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aefab98f9be28e8f6ee1dc784e3de109cead402994b0f4902476f1b1e66001d7"} err="failed to get container status \"aefab98f9be28e8f6ee1dc784e3de109cead402994b0f4902476f1b1e66001d7\": rpc error: code = NotFound desc = could not find container \"aefab98f9be28e8f6ee1dc784e3de109cead402994b0f4902476f1b1e66001d7\": container with ID starting with aefab98f9be28e8f6ee1dc784e3de109cead402994b0f4902476f1b1e66001d7 not found: ID does not exist" Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.659493 5094 scope.go:117] "RemoveContainer" containerID="96bf390a5ccfccc71cd37fc6fdff090f60306c6fdb4810bd84e7c7db47a65ef6" Feb 20 09:21:52 crc kubenswrapper[5094]: E0220 09:21:52.659856 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96bf390a5ccfccc71cd37fc6fdff090f60306c6fdb4810bd84e7c7db47a65ef6\": container with ID starting with 
96bf390a5ccfccc71cd37fc6fdff090f60306c6fdb4810bd84e7c7db47a65ef6 not found: ID does not exist" containerID="96bf390a5ccfccc71cd37fc6fdff090f60306c6fdb4810bd84e7c7db47a65ef6" Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.659975 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96bf390a5ccfccc71cd37fc6fdff090f60306c6fdb4810bd84e7c7db47a65ef6"} err="failed to get container status \"96bf390a5ccfccc71cd37fc6fdff090f60306c6fdb4810bd84e7c7db47a65ef6\": rpc error: code = NotFound desc = could not find container \"96bf390a5ccfccc71cd37fc6fdff090f60306c6fdb4810bd84e7c7db47a65ef6\": container with ID starting with 96bf390a5ccfccc71cd37fc6fdff090f60306c6fdb4810bd84e7c7db47a65ef6 not found: ID does not exist" Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.660058 5094 scope.go:117] "RemoveContainer" containerID="dd297ae58f3d57c6457856bfef3df4e9674a1df5468a15a1a134d0c82e3c6d19" Feb 20 09:21:52 crc kubenswrapper[5094]: E0220 09:21:52.660398 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd297ae58f3d57c6457856bfef3df4e9674a1df5468a15a1a134d0c82e3c6d19\": container with ID starting with dd297ae58f3d57c6457856bfef3df4e9674a1df5468a15a1a134d0c82e3c6d19 not found: ID does not exist" containerID="dd297ae58f3d57c6457856bfef3df4e9674a1df5468a15a1a134d0c82e3c6d19" Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.660497 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd297ae58f3d57c6457856bfef3df4e9674a1df5468a15a1a134d0c82e3c6d19"} err="failed to get container status \"dd297ae58f3d57c6457856bfef3df4e9674a1df5468a15a1a134d0c82e3c6d19\": rpc error: code = NotFound desc = could not find container \"dd297ae58f3d57c6457856bfef3df4e9674a1df5468a15a1a134d0c82e3c6d19\": container with ID starting with dd297ae58f3d57c6457856bfef3df4e9674a1df5468a15a1a134d0c82e3c6d19 not found: ID does not 
exist" Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.676925 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" (UID: "ac1cb6da-bf56-4dc5-9b34-251a55e75ba6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.726009 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.898139 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wkskm"] Feb 20 09:21:52 crc kubenswrapper[5094]: I0220 09:21:52.908499 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wkskm"] Feb 20 09:21:53 crc kubenswrapper[5094]: I0220 09:21:53.854933 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" path="/var/lib/kubelet/pods/ac1cb6da-bf56-4dc5-9b34-251a55e75ba6/volumes" Feb 20 09:22:04 crc kubenswrapper[5094]: I0220 09:22:04.840918 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:22:04 crc kubenswrapper[5094]: E0220 09:22:04.841776 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:22:17 
crc kubenswrapper[5094]: I0220 09:22:17.842232 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:22:17 crc kubenswrapper[5094]: E0220 09:22:17.845826 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:22:31 crc kubenswrapper[5094]: I0220 09:22:31.840334 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:22:31 crc kubenswrapper[5094]: E0220 09:22:31.841071 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:22:45 crc kubenswrapper[5094]: I0220 09:22:45.849382 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:22:45 crc kubenswrapper[5094]: E0220 09:22:45.850285 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" 
Feb 20 09:22:59 crc kubenswrapper[5094]: I0220 09:22:59.839774 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:22:59 crc kubenswrapper[5094]: E0220 09:22:59.840555 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:23:14 crc kubenswrapper[5094]: I0220 09:23:14.841251 5094 scope.go:117] "RemoveContainer" containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:23:15 crc kubenswrapper[5094]: I0220 09:23:15.607892 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"94b92f3a2bced5f38491238ad47cd4799f0c32866995d1804ffd21cbf89a306f"} Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.485900 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-scdqt"] Feb 20 09:23:20 crc kubenswrapper[5094]: E0220 09:23:20.487122 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40cbd310-80bd-43a5-aa9f-d151d8397a5e" containerName="extract-content" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.487140 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="40cbd310-80bd-43a5-aa9f-d151d8397a5e" containerName="extract-content" Feb 20 09:23:20 crc kubenswrapper[5094]: E0220 09:23:20.487180 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40cbd310-80bd-43a5-aa9f-d151d8397a5e" containerName="extract-utilities" Feb 20 09:23:20 crc 
kubenswrapper[5094]: I0220 09:23:20.487190 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="40cbd310-80bd-43a5-aa9f-d151d8397a5e" containerName="extract-utilities" Feb 20 09:23:20 crc kubenswrapper[5094]: E0220 09:23:20.487208 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40cbd310-80bd-43a5-aa9f-d151d8397a5e" containerName="registry-server" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.487220 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="40cbd310-80bd-43a5-aa9f-d151d8397a5e" containerName="registry-server" Feb 20 09:23:20 crc kubenswrapper[5094]: E0220 09:23:20.487234 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" containerName="extract-utilities" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.487242 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" containerName="extract-utilities" Feb 20 09:23:20 crc kubenswrapper[5094]: E0220 09:23:20.487258 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" containerName="registry-server" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.487266 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" containerName="registry-server" Feb 20 09:23:20 crc kubenswrapper[5094]: E0220 09:23:20.487278 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" containerName="extract-content" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.487286 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" containerName="extract-content" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.487589 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="40cbd310-80bd-43a5-aa9f-d151d8397a5e" containerName="registry-server" Feb 20 09:23:20 crc 
kubenswrapper[5094]: I0220 09:23:20.487612 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac1cb6da-bf56-4dc5-9b34-251a55e75ba6" containerName="registry-server" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.489484 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.509060 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-scdqt"] Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.585422 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f05d18a6-4c8d-4876-ae31-b44332ff55ca-utilities\") pod \"certified-operators-scdqt\" (UID: \"f05d18a6-4c8d-4876-ae31-b44332ff55ca\") " pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.586401 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tds5f\" (UniqueName: \"kubernetes.io/projected/f05d18a6-4c8d-4876-ae31-b44332ff55ca-kube-api-access-tds5f\") pod \"certified-operators-scdqt\" (UID: \"f05d18a6-4c8d-4876-ae31-b44332ff55ca\") " pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.586510 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f05d18a6-4c8d-4876-ae31-b44332ff55ca-catalog-content\") pod \"certified-operators-scdqt\" (UID: \"f05d18a6-4c8d-4876-ae31-b44332ff55ca\") " pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.687361 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tds5f\" (UniqueName: 
\"kubernetes.io/projected/f05d18a6-4c8d-4876-ae31-b44332ff55ca-kube-api-access-tds5f\") pod \"certified-operators-scdqt\" (UID: \"f05d18a6-4c8d-4876-ae31-b44332ff55ca\") " pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.687424 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f05d18a6-4c8d-4876-ae31-b44332ff55ca-catalog-content\") pod \"certified-operators-scdqt\" (UID: \"f05d18a6-4c8d-4876-ae31-b44332ff55ca\") " pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.687534 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f05d18a6-4c8d-4876-ae31-b44332ff55ca-utilities\") pod \"certified-operators-scdqt\" (UID: \"f05d18a6-4c8d-4876-ae31-b44332ff55ca\") " pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.687985 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f05d18a6-4c8d-4876-ae31-b44332ff55ca-utilities\") pod \"certified-operators-scdqt\" (UID: \"f05d18a6-4c8d-4876-ae31-b44332ff55ca\") " pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.688475 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f05d18a6-4c8d-4876-ae31-b44332ff55ca-catalog-content\") pod \"certified-operators-scdqt\" (UID: \"f05d18a6-4c8d-4876-ae31-b44332ff55ca\") " pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.710673 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tds5f\" (UniqueName: 
\"kubernetes.io/projected/f05d18a6-4c8d-4876-ae31-b44332ff55ca-kube-api-access-tds5f\") pod \"certified-operators-scdqt\" (UID: \"f05d18a6-4c8d-4876-ae31-b44332ff55ca\") " pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:20 crc kubenswrapper[5094]: I0220 09:23:20.825167 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:21 crc kubenswrapper[5094]: I0220 09:23:21.327301 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-scdqt"] Feb 20 09:23:21 crc kubenswrapper[5094]: I0220 09:23:21.683612 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scdqt" event={"ID":"f05d18a6-4c8d-4876-ae31-b44332ff55ca","Type":"ContainerStarted","Data":"4b359f1a3fa7bb1a2550a1bda8b29ed69dfc413e256d39e62079e81f061afc15"} Feb 20 09:23:22 crc kubenswrapper[5094]: I0220 09:23:22.695953 5094 generic.go:334] "Generic (PLEG): container finished" podID="f05d18a6-4c8d-4876-ae31-b44332ff55ca" containerID="4b943ec4c7d1ca434acd9492654334ba91ffe6700bc8e7d3d8b8214771d14fd7" exitCode=0 Feb 20 09:23:22 crc kubenswrapper[5094]: I0220 09:23:22.696094 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scdqt" event={"ID":"f05d18a6-4c8d-4876-ae31-b44332ff55ca","Type":"ContainerDied","Data":"4b943ec4c7d1ca434acd9492654334ba91ffe6700bc8e7d3d8b8214771d14fd7"} Feb 20 09:23:24 crc kubenswrapper[5094]: I0220 09:23:24.720999 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scdqt" event={"ID":"f05d18a6-4c8d-4876-ae31-b44332ff55ca","Type":"ContainerStarted","Data":"cec40255b9c8bf55aa9d7e637e649a16a0eb5d70c662743f237ead50d855b873"} Feb 20 09:23:25 crc kubenswrapper[5094]: I0220 09:23:25.731956 5094 generic.go:334] "Generic (PLEG): container finished" podID="f05d18a6-4c8d-4876-ae31-b44332ff55ca" 
containerID="cec40255b9c8bf55aa9d7e637e649a16a0eb5d70c662743f237ead50d855b873" exitCode=0 Feb 20 09:23:25 crc kubenswrapper[5094]: I0220 09:23:25.732062 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scdqt" event={"ID":"f05d18a6-4c8d-4876-ae31-b44332ff55ca","Type":"ContainerDied","Data":"cec40255b9c8bf55aa9d7e637e649a16a0eb5d70c662743f237ead50d855b873"} Feb 20 09:23:26 crc kubenswrapper[5094]: I0220 09:23:26.743285 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scdqt" event={"ID":"f05d18a6-4c8d-4876-ae31-b44332ff55ca","Type":"ContainerStarted","Data":"c5bf28e7fd944e53ffd78146956ea3c8c30d4b1fe18bd58a21e41551769c7d0b"} Feb 20 09:23:26 crc kubenswrapper[5094]: I0220 09:23:26.763810 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-scdqt" podStartSLOduration=3.357818182 podStartE2EDuration="6.763792604s" podCreationTimestamp="2026-02-20 09:23:20 +0000 UTC" firstStartedPulling="2026-02-20 09:23:22.698142591 +0000 UTC m=+9417.570769302" lastFinishedPulling="2026-02-20 09:23:26.104117013 +0000 UTC m=+9420.976743724" observedRunningTime="2026-02-20 09:23:26.76114785 +0000 UTC m=+9421.633774651" watchObservedRunningTime="2026-02-20 09:23:26.763792604 +0000 UTC m=+9421.636419315" Feb 20 09:23:30 crc kubenswrapper[5094]: I0220 09:23:30.825742 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:30 crc kubenswrapper[5094]: I0220 09:23:30.826158 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:31 crc kubenswrapper[5094]: I0220 09:23:31.331320 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:31 crc kubenswrapper[5094]: I0220 
09:23:31.852171 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:31 crc kubenswrapper[5094]: I0220 09:23:31.902449 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-scdqt"] Feb 20 09:23:33 crc kubenswrapper[5094]: I0220 09:23:33.803131 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-scdqt" podUID="f05d18a6-4c8d-4876-ae31-b44332ff55ca" containerName="registry-server" containerID="cri-o://c5bf28e7fd944e53ffd78146956ea3c8c30d4b1fe18bd58a21e41551769c7d0b" gracePeriod=2 Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.264893 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.272488 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f05d18a6-4c8d-4876-ae31-b44332ff55ca-catalog-content\") pod \"f05d18a6-4c8d-4876-ae31-b44332ff55ca\" (UID: \"f05d18a6-4c8d-4876-ae31-b44332ff55ca\") " Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.272555 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f05d18a6-4c8d-4876-ae31-b44332ff55ca-utilities\") pod \"f05d18a6-4c8d-4876-ae31-b44332ff55ca\" (UID: \"f05d18a6-4c8d-4876-ae31-b44332ff55ca\") " Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.272687 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tds5f\" (UniqueName: \"kubernetes.io/projected/f05d18a6-4c8d-4876-ae31-b44332ff55ca-kube-api-access-tds5f\") pod \"f05d18a6-4c8d-4876-ae31-b44332ff55ca\" (UID: \"f05d18a6-4c8d-4876-ae31-b44332ff55ca\") " Feb 20 09:23:34 crc kubenswrapper[5094]: 
I0220 09:23:34.274496 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f05d18a6-4c8d-4876-ae31-b44332ff55ca-utilities" (OuterVolumeSpecName: "utilities") pod "f05d18a6-4c8d-4876-ae31-b44332ff55ca" (UID: "f05d18a6-4c8d-4876-ae31-b44332ff55ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.338624 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f05d18a6-4c8d-4876-ae31-b44332ff55ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f05d18a6-4c8d-4876-ae31-b44332ff55ca" (UID: "f05d18a6-4c8d-4876-ae31-b44332ff55ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.361979 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f05d18a6-4c8d-4876-ae31-b44332ff55ca-kube-api-access-tds5f" (OuterVolumeSpecName: "kube-api-access-tds5f") pod "f05d18a6-4c8d-4876-ae31-b44332ff55ca" (UID: "f05d18a6-4c8d-4876-ae31-b44332ff55ca"). InnerVolumeSpecName "kube-api-access-tds5f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.375559 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f05d18a6-4c8d-4876-ae31-b44332ff55ca-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.375596 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f05d18a6-4c8d-4876-ae31-b44332ff55ca-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.375607 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tds5f\" (UniqueName: \"kubernetes.io/projected/f05d18a6-4c8d-4876-ae31-b44332ff55ca-kube-api-access-tds5f\") on node \"crc\" DevicePath \"\"" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.815614 5094 generic.go:334] "Generic (PLEG): container finished" podID="f05d18a6-4c8d-4876-ae31-b44332ff55ca" containerID="c5bf28e7fd944e53ffd78146956ea3c8c30d4b1fe18bd58a21e41551769c7d0b" exitCode=0 Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.815770 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scdqt" event={"ID":"f05d18a6-4c8d-4876-ae31-b44332ff55ca","Type":"ContainerDied","Data":"c5bf28e7fd944e53ffd78146956ea3c8c30d4b1fe18bd58a21e41551769c7d0b"} Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.815857 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-scdqt" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.816029 5094 scope.go:117] "RemoveContainer" containerID="c5bf28e7fd944e53ffd78146956ea3c8c30d4b1fe18bd58a21e41551769c7d0b" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.816010 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-scdqt" event={"ID":"f05d18a6-4c8d-4876-ae31-b44332ff55ca","Type":"ContainerDied","Data":"4b359f1a3fa7bb1a2550a1bda8b29ed69dfc413e256d39e62079e81f061afc15"} Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.850097 5094 scope.go:117] "RemoveContainer" containerID="cec40255b9c8bf55aa9d7e637e649a16a0eb5d70c662743f237ead50d855b873" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.871828 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-scdqt"] Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.872248 5094 scope.go:117] "RemoveContainer" containerID="4b943ec4c7d1ca434acd9492654334ba91ffe6700bc8e7d3d8b8214771d14fd7" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.881101 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-scdqt"] Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.920576 5094 scope.go:117] "RemoveContainer" containerID="c5bf28e7fd944e53ffd78146956ea3c8c30d4b1fe18bd58a21e41551769c7d0b" Feb 20 09:23:34 crc kubenswrapper[5094]: E0220 09:23:34.921078 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5bf28e7fd944e53ffd78146956ea3c8c30d4b1fe18bd58a21e41551769c7d0b\": container with ID starting with c5bf28e7fd944e53ffd78146956ea3c8c30d4b1fe18bd58a21e41551769c7d0b not found: ID does not exist" containerID="c5bf28e7fd944e53ffd78146956ea3c8c30d4b1fe18bd58a21e41551769c7d0b" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.921119 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5bf28e7fd944e53ffd78146956ea3c8c30d4b1fe18bd58a21e41551769c7d0b"} err="failed to get container status \"c5bf28e7fd944e53ffd78146956ea3c8c30d4b1fe18bd58a21e41551769c7d0b\": rpc error: code = NotFound desc = could not find container \"c5bf28e7fd944e53ffd78146956ea3c8c30d4b1fe18bd58a21e41551769c7d0b\": container with ID starting with c5bf28e7fd944e53ffd78146956ea3c8c30d4b1fe18bd58a21e41551769c7d0b not found: ID does not exist" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.921145 5094 scope.go:117] "RemoveContainer" containerID="cec40255b9c8bf55aa9d7e637e649a16a0eb5d70c662743f237ead50d855b873" Feb 20 09:23:34 crc kubenswrapper[5094]: E0220 09:23:34.921558 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cec40255b9c8bf55aa9d7e637e649a16a0eb5d70c662743f237ead50d855b873\": container with ID starting with cec40255b9c8bf55aa9d7e637e649a16a0eb5d70c662743f237ead50d855b873 not found: ID does not exist" containerID="cec40255b9c8bf55aa9d7e637e649a16a0eb5d70c662743f237ead50d855b873" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.921586 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cec40255b9c8bf55aa9d7e637e649a16a0eb5d70c662743f237ead50d855b873"} err="failed to get container status \"cec40255b9c8bf55aa9d7e637e649a16a0eb5d70c662743f237ead50d855b873\": rpc error: code = NotFound desc = could not find container \"cec40255b9c8bf55aa9d7e637e649a16a0eb5d70c662743f237ead50d855b873\": container with ID starting with cec40255b9c8bf55aa9d7e637e649a16a0eb5d70c662743f237ead50d855b873 not found: ID does not exist" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.921603 5094 scope.go:117] "RemoveContainer" containerID="4b943ec4c7d1ca434acd9492654334ba91ffe6700bc8e7d3d8b8214771d14fd7" Feb 20 09:23:34 crc kubenswrapper[5094]: E0220 
09:23:34.921925 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b943ec4c7d1ca434acd9492654334ba91ffe6700bc8e7d3d8b8214771d14fd7\": container with ID starting with 4b943ec4c7d1ca434acd9492654334ba91ffe6700bc8e7d3d8b8214771d14fd7 not found: ID does not exist" containerID="4b943ec4c7d1ca434acd9492654334ba91ffe6700bc8e7d3d8b8214771d14fd7" Feb 20 09:23:34 crc kubenswrapper[5094]: I0220 09:23:34.921954 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b943ec4c7d1ca434acd9492654334ba91ffe6700bc8e7d3d8b8214771d14fd7"} err="failed to get container status \"4b943ec4c7d1ca434acd9492654334ba91ffe6700bc8e7d3d8b8214771d14fd7\": rpc error: code = NotFound desc = could not find container \"4b943ec4c7d1ca434acd9492654334ba91ffe6700bc8e7d3d8b8214771d14fd7\": container with ID starting with 4b943ec4c7d1ca434acd9492654334ba91ffe6700bc8e7d3d8b8214771d14fd7 not found: ID does not exist" Feb 20 09:23:35 crc kubenswrapper[5094]: I0220 09:23:35.857848 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f05d18a6-4c8d-4876-ae31-b44332ff55ca" path="/var/lib/kubelet/pods/f05d18a6-4c8d-4876-ae31-b44332ff55ca/volumes" Feb 20 09:24:13 crc kubenswrapper[5094]: I0220 09:24:13.208651 5094 generic.go:334] "Generic (PLEG): container finished" podID="086935dd-74d5-4657-a6a1-25bd11f6455f" containerID="c623c6bcb73425cb5942caee11d7bb36d9dab38dcb8e5a059b4e2fee9f335a16" exitCode=0 Feb 20 09:24:13 crc kubenswrapper[5094]: I0220 09:24:13.208738 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" event={"ID":"086935dd-74d5-4657-a6a1-25bd11f6455f","Type":"ContainerDied","Data":"c623c6bcb73425cb5942caee11d7bb36d9dab38dcb8e5a059b4e2fee9f335a16"} Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.745880 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.864025 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-migration-ssh-key-1\") pod \"086935dd-74d5-4657-a6a1-25bd11f6455f\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.864091 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-inventory\") pod \"086935dd-74d5-4657-a6a1-25bd11f6455f\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.864168 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-0\") pod \"086935dd-74d5-4657-a6a1-25bd11f6455f\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.864197 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjltv\" (UniqueName: \"kubernetes.io/projected/086935dd-74d5-4657-a6a1-25bd11f6455f-kube-api-access-pjltv\") pod \"086935dd-74d5-4657-a6a1-25bd11f6455f\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.864246 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-1\") pod \"086935dd-74d5-4657-a6a1-25bd11f6455f\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.864302 5094 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-2\") pod \"086935dd-74d5-4657-a6a1-25bd11f6455f\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.864348 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cells-global-config-1\") pod \"086935dd-74d5-4657-a6a1-25bd11f6455f\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.864390 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cells-global-config-0\") pod \"086935dd-74d5-4657-a6a1-25bd11f6455f\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.864423 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-combined-ca-bundle\") pod \"086935dd-74d5-4657-a6a1-25bd11f6455f\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.864465 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-3\") pod \"086935dd-74d5-4657-a6a1-25bd11f6455f\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.864517 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-ceph\") pod \"086935dd-74d5-4657-a6a1-25bd11f6455f\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.864587 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-migration-ssh-key-0\") pod \"086935dd-74d5-4657-a6a1-25bd11f6455f\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.864622 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-ssh-key-openstack-cell1\") pod \"086935dd-74d5-4657-a6a1-25bd11f6455f\" (UID: \"086935dd-74d5-4657-a6a1-25bd11f6455f\") " Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.871208 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/086935dd-74d5-4657-a6a1-25bd11f6455f-kube-api-access-pjltv" (OuterVolumeSpecName: "kube-api-access-pjltv") pod "086935dd-74d5-4657-a6a1-25bd11f6455f" (UID: "086935dd-74d5-4657-a6a1-25bd11f6455f"). InnerVolumeSpecName "kube-api-access-pjltv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.877226 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "086935dd-74d5-4657-a6a1-25bd11f6455f" (UID: "086935dd-74d5-4657-a6a1-25bd11f6455f"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.877355 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-ceph" (OuterVolumeSpecName: "ceph") pod "086935dd-74d5-4657-a6a1-25bd11f6455f" (UID: "086935dd-74d5-4657-a6a1-25bd11f6455f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.894403 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "086935dd-74d5-4657-a6a1-25bd11f6455f" (UID: "086935dd-74d5-4657-a6a1-25bd11f6455f"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.898888 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "086935dd-74d5-4657-a6a1-25bd11f6455f" (UID: "086935dd-74d5-4657-a6a1-25bd11f6455f"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.899311 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "086935dd-74d5-4657-a6a1-25bd11f6455f" (UID: "086935dd-74d5-4657-a6a1-25bd11f6455f"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.899345 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "086935dd-74d5-4657-a6a1-25bd11f6455f" (UID: "086935dd-74d5-4657-a6a1-25bd11f6455f"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.903092 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-inventory" (OuterVolumeSpecName: "inventory") pod "086935dd-74d5-4657-a6a1-25bd11f6455f" (UID: "086935dd-74d5-4657-a6a1-25bd11f6455f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.918053 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "086935dd-74d5-4657-a6a1-25bd11f6455f" (UID: "086935dd-74d5-4657-a6a1-25bd11f6455f"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.921014 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "086935dd-74d5-4657-a6a1-25bd11f6455f" (UID: "086935dd-74d5-4657-a6a1-25bd11f6455f"). InnerVolumeSpecName "nova-cell1-compute-config-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.921046 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "086935dd-74d5-4657-a6a1-25bd11f6455f" (UID: "086935dd-74d5-4657-a6a1-25bd11f6455f"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.933283 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "086935dd-74d5-4657-a6a1-25bd11f6455f" (UID: "086935dd-74d5-4657-a6a1-25bd11f6455f"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.944664 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "086935dd-74d5-4657-a6a1-25bd11f6455f" (UID: "086935dd-74d5-4657-a6a1-25bd11f6455f"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.967579 5094 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.967611 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.967624 5094 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.967635 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjltv\" (UniqueName: \"kubernetes.io/projected/086935dd-74d5-4657-a6a1-25bd11f6455f-kube-api-access-pjltv\") on node \"crc\" DevicePath \"\"" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.967643 5094 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.967654 5094 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.967663 5094 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: 
\"kubernetes.io/configmap/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.967671 5094 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.967679 5094 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.967688 5094 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.967696 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.967718 5094 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 20 09:24:14 crc kubenswrapper[5094]: I0220 09:24:14.967728 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/086935dd-74d5-4657-a6a1-25bd11f6455f-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.237114 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" 
event={"ID":"086935dd-74d5-4657-a6a1-25bd11f6455f","Type":"ContainerDied","Data":"b5f00a6875a9dca730468d3130ccbd421639490aee72bce64c7d96373c50cba6"} Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.237160 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5f00a6875a9dca730468d3130ccbd421639490aee72bce64c7d96373c50cba6" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.237180 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-tclm2" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.420993 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-s5mln"] Feb 20 09:24:15 crc kubenswrapper[5094]: E0220 09:24:15.421532 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f05d18a6-4c8d-4876-ae31-b44332ff55ca" containerName="extract-utilities" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.421557 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f05d18a6-4c8d-4876-ae31-b44332ff55ca" containerName="extract-utilities" Feb 20 09:24:15 crc kubenswrapper[5094]: E0220 09:24:15.421576 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f05d18a6-4c8d-4876-ae31-b44332ff55ca" containerName="registry-server" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.421586 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f05d18a6-4c8d-4876-ae31-b44332ff55ca" containerName="registry-server" Feb 20 09:24:15 crc kubenswrapper[5094]: E0220 09:24:15.421603 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f05d18a6-4c8d-4876-ae31-b44332ff55ca" containerName="extract-content" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.421611 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f05d18a6-4c8d-4876-ae31-b44332ff55ca" containerName="extract-content" Feb 20 09:24:15 crc kubenswrapper[5094]: E0220 
09:24:15.421621 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="086935dd-74d5-4657-a6a1-25bd11f6455f" containerName="nova-cell1-openstack-openstack-cell1" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.421631 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="086935dd-74d5-4657-a6a1-25bd11f6455f" containerName="nova-cell1-openstack-openstack-cell1" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.421908 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="086935dd-74d5-4657-a6a1-25bd11f6455f" containerName="nova-cell1-openstack-openstack-cell1" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.421932 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f05d18a6-4c8d-4876-ae31-b44332ff55ca" containerName="registry-server" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.422861 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.424992 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.425450 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.425469 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.425474 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.425614 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.435658 5094 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/telemetry-openstack-openstack-cell1-s5mln"] Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.579089 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.579159 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.579178 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.579293 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.579313 5094 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnfsq\" (UniqueName: \"kubernetes.io/projected/a134a8f4-8450-4f7c-9988-11686cbdcd19-kube-api-access-lnfsq\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.579438 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.579464 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-inventory\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.579510 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceph\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.681791 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-1\") pod 
\"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.681836 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-inventory\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.681886 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceph\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.681927 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.681958 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.681977 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.682011 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnfsq\" (UniqueName: \"kubernetes.io/projected/a134a8f4-8450-4f7c-9988-11686cbdcd19-kube-api-access-lnfsq\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:15 crc kubenswrapper[5094]: I0220 09:24:15.682037 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:16 crc kubenswrapper[5094]: I0220 09:24:16.144145 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-inventory\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:16 crc kubenswrapper[5094]: I0220 09:24:16.144669 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 
09:24:16 crc kubenswrapper[5094]: I0220 09:24:16.144779 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceph\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:16 crc kubenswrapper[5094]: I0220 09:24:16.145175 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:16 crc kubenswrapper[5094]: I0220 09:24:16.145212 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:16 crc kubenswrapper[5094]: I0220 09:24:16.145350 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:16 crc kubenswrapper[5094]: I0220 09:24:16.146523 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-telemetry-combined-ca-bundle\") pod 
\"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:16 crc kubenswrapper[5094]: I0220 09:24:16.146602 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnfsq\" (UniqueName: \"kubernetes.io/projected/a134a8f4-8450-4f7c-9988-11686cbdcd19-kube-api-access-lnfsq\") pod \"telemetry-openstack-openstack-cell1-s5mln\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:16 crc kubenswrapper[5094]: I0220 09:24:16.343038 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:24:16 crc kubenswrapper[5094]: I0220 09:24:16.940439 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-s5mln"] Feb 20 09:24:17 crc kubenswrapper[5094]: I0220 09:24:17.273218 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-s5mln" event={"ID":"a134a8f4-8450-4f7c-9988-11686cbdcd19","Type":"ContainerStarted","Data":"6c7d4214b5a698225ea8951f58de0f8b3b887bc05034f126dfe8b75d2e437a91"} Feb 20 09:24:19 crc kubenswrapper[5094]: I0220 09:24:19.298123 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-s5mln" event={"ID":"a134a8f4-8450-4f7c-9988-11686cbdcd19","Type":"ContainerStarted","Data":"73d8bcbbcd28ef40d7b3c365d7fd939eb7c6d42e40f91b1591867adf8ce6addb"} Feb 20 09:25:34 crc kubenswrapper[5094]: I0220 09:25:34.107511 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:25:34 crc kubenswrapper[5094]: 
I0220 09:25:34.108278 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:26:04 crc kubenswrapper[5094]: I0220 09:26:04.106611 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:26:04 crc kubenswrapper[5094]: I0220 09:26:04.107244 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:26:34 crc kubenswrapper[5094]: I0220 09:26:34.106623 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:26:34 crc kubenswrapper[5094]: I0220 09:26:34.107473 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:26:34 crc kubenswrapper[5094]: I0220 09:26:34.107543 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 09:26:34 crc kubenswrapper[5094]: I0220 09:26:34.108698 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"94b92f3a2bced5f38491238ad47cd4799f0c32866995d1804ffd21cbf89a306f"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 09:26:34 crc kubenswrapper[5094]: I0220 09:26:34.108836 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://94b92f3a2bced5f38491238ad47cd4799f0c32866995d1804ffd21cbf89a306f" gracePeriod=600 Feb 20 09:26:34 crc kubenswrapper[5094]: I0220 09:26:34.771789 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="94b92f3a2bced5f38491238ad47cd4799f0c32866995d1804ffd21cbf89a306f" exitCode=0 Feb 20 09:26:34 crc kubenswrapper[5094]: I0220 09:26:34.772102 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"94b92f3a2bced5f38491238ad47cd4799f0c32866995d1804ffd21cbf89a306f"} Feb 20 09:26:34 crc kubenswrapper[5094]: I0220 09:26:34.772138 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4"} Feb 20 09:26:34 crc kubenswrapper[5094]: I0220 09:26:34.772159 5094 scope.go:117] "RemoveContainer" 
containerID="000258bf71f81584519d0987a17d60be81dab19ecb342678f4d5b86ff80ae0a0" Feb 20 09:26:34 crc kubenswrapper[5094]: I0220 09:26:34.807364 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-s5mln" podStartSLOduration=139.413504418 podStartE2EDuration="2m19.807343383s" podCreationTimestamp="2026-02-20 09:24:15 +0000 UTC" firstStartedPulling="2026-02-20 09:24:16.948214835 +0000 UTC m=+9471.820841536" lastFinishedPulling="2026-02-20 09:24:17.34205379 +0000 UTC m=+9472.214680501" observedRunningTime="2026-02-20 09:24:19.325233692 +0000 UTC m=+9474.197860473" watchObservedRunningTime="2026-02-20 09:26:34.807343383 +0000 UTC m=+9609.679970104" Feb 20 09:28:34 crc kubenswrapper[5094]: I0220 09:28:34.116616 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:28:34 crc kubenswrapper[5094]: I0220 09:28:34.117349 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:29:04 crc kubenswrapper[5094]: I0220 09:29:04.107194 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:29:04 crc kubenswrapper[5094]: I0220 09:29:04.107984 5094 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:29:34 crc kubenswrapper[5094]: I0220 09:29:34.106905 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:29:34 crc kubenswrapper[5094]: I0220 09:29:34.107484 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:29:34 crc kubenswrapper[5094]: I0220 09:29:34.107538 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 09:29:34 crc kubenswrapper[5094]: I0220 09:29:34.108577 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 09:29:34 crc kubenswrapper[5094]: I0220 09:29:34.108647 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" 
containerID="cri-o://88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" gracePeriod=600 Feb 20 09:29:34 crc kubenswrapper[5094]: E0220 09:29:34.251376 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:29:34 crc kubenswrapper[5094]: I0220 09:29:34.794081 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" exitCode=0 Feb 20 09:29:34 crc kubenswrapper[5094]: I0220 09:29:34.794115 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4"} Feb 20 09:29:34 crc kubenswrapper[5094]: I0220 09:29:34.794189 5094 scope.go:117] "RemoveContainer" containerID="94b92f3a2bced5f38491238ad47cd4799f0c32866995d1804ffd21cbf89a306f" Feb 20 09:29:34 crc kubenswrapper[5094]: I0220 09:29:34.795043 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:29:34 crc kubenswrapper[5094]: E0220 09:29:34.795481 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:29:49 crc kubenswrapper[5094]: I0220 09:29:49.841081 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:29:49 crc kubenswrapper[5094]: E0220 09:29:49.842129 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.178317 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7"] Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.180998 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.183416 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.184414 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.192586 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7"] Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.307452 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06092367-1969-4b35-8025-09e5a52a5855-config-volume\") pod \"collect-profiles-29526330-4dpg7\" (UID: \"06092367-1969-4b35-8025-09e5a52a5855\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.307603 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhh9q\" (UniqueName: \"kubernetes.io/projected/06092367-1969-4b35-8025-09e5a52a5855-kube-api-access-dhh9q\") pod \"collect-profiles-29526330-4dpg7\" (UID: \"06092367-1969-4b35-8025-09e5a52a5855\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.307659 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06092367-1969-4b35-8025-09e5a52a5855-secret-volume\") pod \"collect-profiles-29526330-4dpg7\" (UID: \"06092367-1969-4b35-8025-09e5a52a5855\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.409824 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06092367-1969-4b35-8025-09e5a52a5855-config-volume\") pod \"collect-profiles-29526330-4dpg7\" (UID: \"06092367-1969-4b35-8025-09e5a52a5855\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.409986 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhh9q\" (UniqueName: \"kubernetes.io/projected/06092367-1969-4b35-8025-09e5a52a5855-kube-api-access-dhh9q\") pod \"collect-profiles-29526330-4dpg7\" (UID: \"06092367-1969-4b35-8025-09e5a52a5855\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.410029 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06092367-1969-4b35-8025-09e5a52a5855-secret-volume\") pod \"collect-profiles-29526330-4dpg7\" (UID: \"06092367-1969-4b35-8025-09e5a52a5855\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.411185 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06092367-1969-4b35-8025-09e5a52a5855-config-volume\") pod \"collect-profiles-29526330-4dpg7\" (UID: \"06092367-1969-4b35-8025-09e5a52a5855\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.417572 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/06092367-1969-4b35-8025-09e5a52a5855-secret-volume\") pod \"collect-profiles-29526330-4dpg7\" (UID: \"06092367-1969-4b35-8025-09e5a52a5855\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.428948 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhh9q\" (UniqueName: \"kubernetes.io/projected/06092367-1969-4b35-8025-09e5a52a5855-kube-api-access-dhh9q\") pod \"collect-profiles-29526330-4dpg7\" (UID: \"06092367-1969-4b35-8025-09e5a52a5855\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.513752 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" Feb 20 09:30:00 crc kubenswrapper[5094]: I0220 09:30:00.968747 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7"] Feb 20 09:30:01 crc kubenswrapper[5094]: I0220 09:30:01.105118 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" event={"ID":"06092367-1969-4b35-8025-09e5a52a5855","Type":"ContainerStarted","Data":"bb2abcf7fb1444dc481ad19c1b42c93bf38073d87f41fb8c968098ad31011d6d"} Feb 20 09:30:01 crc kubenswrapper[5094]: I0220 09:30:01.109067 5094 generic.go:334] "Generic (PLEG): container finished" podID="a134a8f4-8450-4f7c-9988-11686cbdcd19" containerID="73d8bcbbcd28ef40d7b3c365d7fd939eb7c6d42e40f91b1591867adf8ce6addb" exitCode=0 Feb 20 09:30:01 crc kubenswrapper[5094]: I0220 09:30:01.109151 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-s5mln" 
event={"ID":"a134a8f4-8450-4f7c-9988-11686cbdcd19","Type":"ContainerDied","Data":"73d8bcbbcd28ef40d7b3c365d7fd939eb7c6d42e40f91b1591867adf8ce6addb"} Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.121012 5094 generic.go:334] "Generic (PLEG): container finished" podID="06092367-1969-4b35-8025-09e5a52a5855" containerID="4458d3e89efbd0e5ea42a99c4b47f135cba67a66cbdaaf49efb55576b8dd1322" exitCode=0 Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.121067 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" event={"ID":"06092367-1969-4b35-8025-09e5a52a5855","Type":"ContainerDied","Data":"4458d3e89efbd0e5ea42a99c4b47f135cba67a66cbdaaf49efb55576b8dd1322"} Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.748913 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9tfpc"] Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.751614 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.768828 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9tfpc"] Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.794673 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.840843 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:30:02 crc kubenswrapper[5094]: E0220 09:30:02.841615 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.885787 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnfsq\" (UniqueName: \"kubernetes.io/projected/a134a8f4-8450-4f7c-9988-11686cbdcd19-kube-api-access-lnfsq\") pod \"a134a8f4-8450-4f7c-9988-11686cbdcd19\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.885911 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-1\") pod \"a134a8f4-8450-4f7c-9988-11686cbdcd19\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.885957 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ssh-key-openstack-cell1\") pod \"a134a8f4-8450-4f7c-9988-11686cbdcd19\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.885984 5094 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-telemetry-combined-ca-bundle\") pod \"a134a8f4-8450-4f7c-9988-11686cbdcd19\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.886083 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-inventory\") pod \"a134a8f4-8450-4f7c-9988-11686cbdcd19\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.886116 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceph\") pod \"a134a8f4-8450-4f7c-9988-11686cbdcd19\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.886143 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-0\") pod \"a134a8f4-8450-4f7c-9988-11686cbdcd19\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.886191 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-2\") pod \"a134a8f4-8450-4f7c-9988-11686cbdcd19\" (UID: \"a134a8f4-8450-4f7c-9988-11686cbdcd19\") " Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.887563 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f58baa6-934a-458f-be52-13c9185c7076-utilities\") 
pod \"community-operators-9tfpc\" (UID: \"6f58baa6-934a-458f-be52-13c9185c7076\") " pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.887638 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f58baa6-934a-458f-be52-13c9185c7076-catalog-content\") pod \"community-operators-9tfpc\" (UID: \"6f58baa6-934a-458f-be52-13c9185c7076\") " pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.888058 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swqtq\" (UniqueName: \"kubernetes.io/projected/6f58baa6-934a-458f-be52-13c9185c7076-kube-api-access-swqtq\") pod \"community-operators-9tfpc\" (UID: \"6f58baa6-934a-458f-be52-13c9185c7076\") " pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.906759 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceph" (OuterVolumeSpecName: "ceph") pod "a134a8f4-8450-4f7c-9988-11686cbdcd19" (UID: "a134a8f4-8450-4f7c-9988-11686cbdcd19"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.909054 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "a134a8f4-8450-4f7c-9988-11686cbdcd19" (UID: "a134a8f4-8450-4f7c-9988-11686cbdcd19"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.922110 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a134a8f4-8450-4f7c-9988-11686cbdcd19-kube-api-access-lnfsq" (OuterVolumeSpecName: "kube-api-access-lnfsq") pod "a134a8f4-8450-4f7c-9988-11686cbdcd19" (UID: "a134a8f4-8450-4f7c-9988-11686cbdcd19"). InnerVolumeSpecName "kube-api-access-lnfsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.925143 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "a134a8f4-8450-4f7c-9988-11686cbdcd19" (UID: "a134a8f4-8450-4f7c-9988-11686cbdcd19"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.925933 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "a134a8f4-8450-4f7c-9988-11686cbdcd19" (UID: "a134a8f4-8450-4f7c-9988-11686cbdcd19"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.925983 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-inventory" (OuterVolumeSpecName: "inventory") pod "a134a8f4-8450-4f7c-9988-11686cbdcd19" (UID: "a134a8f4-8450-4f7c-9988-11686cbdcd19"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.945816 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "a134a8f4-8450-4f7c-9988-11686cbdcd19" (UID: "a134a8f4-8450-4f7c-9988-11686cbdcd19"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.948389 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "a134a8f4-8450-4f7c-9988-11686cbdcd19" (UID: "a134a8f4-8450-4f7c-9988-11686cbdcd19"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.989676 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swqtq\" (UniqueName: \"kubernetes.io/projected/6f58baa6-934a-458f-be52-13c9185c7076-kube-api-access-swqtq\") pod \"community-operators-9tfpc\" (UID: \"6f58baa6-934a-458f-be52-13c9185c7076\") " pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.989921 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f58baa6-934a-458f-be52-13c9185c7076-utilities\") pod \"community-operators-9tfpc\" (UID: \"6f58baa6-934a-458f-be52-13c9185c7076\") " pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.989969 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6f58baa6-934a-458f-be52-13c9185c7076-catalog-content\") pod \"community-operators-9tfpc\" (UID: \"6f58baa6-934a-458f-be52-13c9185c7076\") " pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.990100 5094 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.990126 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.990139 5094 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.990153 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.990163 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.990176 5094 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.990189 5094 reconciler_common.go:293] "Volume 
detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a134a8f4-8450-4f7c-9988-11686cbdcd19-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.990202 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnfsq\" (UniqueName: \"kubernetes.io/projected/a134a8f4-8450-4f7c-9988-11686cbdcd19-kube-api-access-lnfsq\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.990667 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f58baa6-934a-458f-be52-13c9185c7076-catalog-content\") pod \"community-operators-9tfpc\" (UID: \"6f58baa6-934a-458f-be52-13c9185c7076\") " pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:02 crc kubenswrapper[5094]: I0220 09:30:02.991114 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f58baa6-934a-458f-be52-13c9185c7076-utilities\") pod \"community-operators-9tfpc\" (UID: \"6f58baa6-934a-458f-be52-13c9185c7076\") " pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.009337 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swqtq\" (UniqueName: \"kubernetes.io/projected/6f58baa6-934a-458f-be52-13c9185c7076-kube-api-access-swqtq\") pod \"community-operators-9tfpc\" (UID: \"6f58baa6-934a-458f-be52-13c9185c7076\") " pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.109038 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.133147 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-s5mln" event={"ID":"a134a8f4-8450-4f7c-9988-11686cbdcd19","Type":"ContainerDied","Data":"6c7d4214b5a698225ea8951f58de0f8b3b887bc05034f126dfe8b75d2e437a91"} Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.133235 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c7d4214b5a698225ea8951f58de0f8b3b887bc05034f126dfe8b75d2e437a91" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.133171 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-s5mln" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.248940 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-6sqzq"] Feb 20 09:30:03 crc kubenswrapper[5094]: E0220 09:30:03.250336 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a134a8f4-8450-4f7c-9988-11686cbdcd19" containerName="telemetry-openstack-openstack-cell1" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.250494 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a134a8f4-8450-4f7c-9988-11686cbdcd19" containerName="telemetry-openstack-openstack-cell1" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.251093 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a134a8f4-8450-4f7c-9988-11686cbdcd19" containerName="telemetry-openstack-openstack-cell1" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.252152 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.256326 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.257724 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.257973 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.260175 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.260391 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.274615 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-6sqzq"] Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.298886 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jpl2\" (UniqueName: \"kubernetes.io/projected/6d3bf727-1eae-408c-be3d-2df97b387704-kube-api-access-9jpl2\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.298976 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: 
\"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.298998 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.299021 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.299084 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.299105 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.405901 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.405946 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.406033 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jpl2\" (UniqueName: \"kubernetes.io/projected/6d3bf727-1eae-408c-be3d-2df97b387704-kube-api-access-9jpl2\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.406112 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.406131 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: 
\"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.406150 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.411818 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.412230 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.413097 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.413277 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.416465 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.429495 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jpl2\" (UniqueName: \"kubernetes.io/projected/6d3bf727-1eae-408c-be3d-2df97b387704-kube-api-access-9jpl2\") pod \"neutron-sriov-openstack-openstack-cell1-6sqzq\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.569284 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.600361 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.710801 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhh9q\" (UniqueName: \"kubernetes.io/projected/06092367-1969-4b35-8025-09e5a52a5855-kube-api-access-dhh9q\") pod \"06092367-1969-4b35-8025-09e5a52a5855\" (UID: \"06092367-1969-4b35-8025-09e5a52a5855\") " Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.710898 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06092367-1969-4b35-8025-09e5a52a5855-secret-volume\") pod \"06092367-1969-4b35-8025-09e5a52a5855\" (UID: \"06092367-1969-4b35-8025-09e5a52a5855\") " Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.711175 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06092367-1969-4b35-8025-09e5a52a5855-config-volume\") pod \"06092367-1969-4b35-8025-09e5a52a5855\" (UID: \"06092367-1969-4b35-8025-09e5a52a5855\") " Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.712011 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06092367-1969-4b35-8025-09e5a52a5855-config-volume" (OuterVolumeSpecName: "config-volume") pod "06092367-1969-4b35-8025-09e5a52a5855" (UID: "06092367-1969-4b35-8025-09e5a52a5855"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.715189 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06092367-1969-4b35-8025-09e5a52a5855-kube-api-access-dhh9q" (OuterVolumeSpecName: "kube-api-access-dhh9q") pod "06092367-1969-4b35-8025-09e5a52a5855" (UID: "06092367-1969-4b35-8025-09e5a52a5855"). 
InnerVolumeSpecName "kube-api-access-dhh9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.715197 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06092367-1969-4b35-8025-09e5a52a5855-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "06092367-1969-4b35-8025-09e5a52a5855" (UID: "06092367-1969-4b35-8025-09e5a52a5855"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.789541 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9tfpc"] Feb 20 09:30:03 crc kubenswrapper[5094]: W0220 09:30:03.800887 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f58baa6_934a_458f_be52_13c9185c7076.slice/crio-4a536eaac1c56dabcf722017ffa827274fa5003477db0004d60a134631bc7925 WatchSource:0}: Error finding container 4a536eaac1c56dabcf722017ffa827274fa5003477db0004d60a134631bc7925: Status 404 returned error can't find the container with id 4a536eaac1c56dabcf722017ffa827274fa5003477db0004d60a134631bc7925 Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.814527 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06092367-1969-4b35-8025-09e5a52a5855-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.814563 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhh9q\" (UniqueName: \"kubernetes.io/projected/06092367-1969-4b35-8025-09e5a52a5855-kube-api-access-dhh9q\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:03 crc kubenswrapper[5094]: I0220 09:30:03.814575 5094 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/06092367-1969-4b35-8025-09e5a52a5855-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:04 crc kubenswrapper[5094]: I0220 09:30:04.145928 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-6sqzq"] Feb 20 09:30:04 crc kubenswrapper[5094]: I0220 09:30:04.157124 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 09:30:04 crc kubenswrapper[5094]: I0220 09:30:04.157263 5094 generic.go:334] "Generic (PLEG): container finished" podID="6f58baa6-934a-458f-be52-13c9185c7076" containerID="609bbebaaf348df0c34e989e5b432e9346c1eadbd013c5cdf0375a6968705b3a" exitCode=0 Feb 20 09:30:04 crc kubenswrapper[5094]: I0220 09:30:04.157490 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tfpc" event={"ID":"6f58baa6-934a-458f-be52-13c9185c7076","Type":"ContainerDied","Data":"609bbebaaf348df0c34e989e5b432e9346c1eadbd013c5cdf0375a6968705b3a"} Feb 20 09:30:04 crc kubenswrapper[5094]: I0220 09:30:04.157524 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tfpc" event={"ID":"6f58baa6-934a-458f-be52-13c9185c7076","Type":"ContainerStarted","Data":"4a536eaac1c56dabcf722017ffa827274fa5003477db0004d60a134631bc7925"} Feb 20 09:30:04 crc kubenswrapper[5094]: I0220 09:30:04.161509 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" event={"ID":"06092367-1969-4b35-8025-09e5a52a5855","Type":"ContainerDied","Data":"bb2abcf7fb1444dc481ad19c1b42c93bf38073d87f41fb8c968098ad31011d6d"} Feb 20 09:30:04 crc kubenswrapper[5094]: I0220 09:30:04.161639 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb2abcf7fb1444dc481ad19c1b42c93bf38073d87f41fb8c968098ad31011d6d" Feb 20 09:30:04 crc kubenswrapper[5094]: I0220 09:30:04.161795 5094 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7" Feb 20 09:30:04 crc kubenswrapper[5094]: I0220 09:30:04.676129 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4"] Feb 20 09:30:04 crc kubenswrapper[5094]: I0220 09:30:04.691643 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526285-265c4"] Feb 20 09:30:05 crc kubenswrapper[5094]: I0220 09:30:05.184310 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" event={"ID":"6d3bf727-1eae-408c-be3d-2df97b387704","Type":"ContainerStarted","Data":"17fbee9c4a3852e927701f4dfea9296493a9a027cb979ab0b3ffefe9fd18a14f"} Feb 20 09:30:05 crc kubenswrapper[5094]: I0220 09:30:05.184746 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" event={"ID":"6d3bf727-1eae-408c-be3d-2df97b387704","Type":"ContainerStarted","Data":"807e3a8640eb579cfbd8f2e00fbe291277d0bdb2b420fc4b5394ff2cab037a0a"} Feb 20 09:30:05 crc kubenswrapper[5094]: I0220 09:30:05.217237 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" podStartSLOduration=1.737112878 podStartE2EDuration="2.217216429s" podCreationTimestamp="2026-02-20 09:30:03 +0000 UTC" firstStartedPulling="2026-02-20 09:30:04.156676195 +0000 UTC m=+9819.029302916" lastFinishedPulling="2026-02-20 09:30:04.636779736 +0000 UTC m=+9819.509406467" observedRunningTime="2026-02-20 09:30:05.206232655 +0000 UTC m=+9820.078859366" watchObservedRunningTime="2026-02-20 09:30:05.217216429 +0000 UTC m=+9820.089843140" Feb 20 09:30:05 crc kubenswrapper[5094]: I0220 09:30:05.859976 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4271712d-7fb9-4862-bc38-e3cfbcced425" path="/var/lib/kubelet/pods/4271712d-7fb9-4862-bc38-e3cfbcced425/volumes" Feb 20 09:30:06 crc kubenswrapper[5094]: I0220 09:30:06.198225 5094 generic.go:334] "Generic (PLEG): container finished" podID="6f58baa6-934a-458f-be52-13c9185c7076" containerID="bb435daf7ab6ba6af4984d8f667c7227c5ab824fd2097edb47d00e633cb1570b" exitCode=0 Feb 20 09:30:06 crc kubenswrapper[5094]: I0220 09:30:06.198326 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tfpc" event={"ID":"6f58baa6-934a-458f-be52-13c9185c7076","Type":"ContainerDied","Data":"bb435daf7ab6ba6af4984d8f667c7227c5ab824fd2097edb47d00e633cb1570b"} Feb 20 09:30:07 crc kubenswrapper[5094]: I0220 09:30:07.208858 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tfpc" event={"ID":"6f58baa6-934a-458f-be52-13c9185c7076","Type":"ContainerStarted","Data":"ea288d0e36a6a61918ac1052fa176505630660bcda4dc25081d87d390c85c33c"} Feb 20 09:30:13 crc kubenswrapper[5094]: I0220 09:30:13.110171 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:13 crc kubenswrapper[5094]: I0220 09:30:13.110729 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:13 crc kubenswrapper[5094]: I0220 09:30:13.154779 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:13 crc kubenswrapper[5094]: I0220 09:30:13.181150 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9tfpc" podStartSLOduration=8.733881591 podStartE2EDuration="11.181106886s" podCreationTimestamp="2026-02-20 09:30:02 +0000 UTC" firstStartedPulling="2026-02-20 09:30:04.159420501 +0000 UTC m=+9819.032047222" 
lastFinishedPulling="2026-02-20 09:30:06.606645806 +0000 UTC m=+9821.479272517" observedRunningTime="2026-02-20 09:30:07.262978266 +0000 UTC m=+9822.135604977" watchObservedRunningTime="2026-02-20 09:30:13.181106886 +0000 UTC m=+9828.053733597" Feb 20 09:30:13 crc kubenswrapper[5094]: I0220 09:30:13.403896 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:13 crc kubenswrapper[5094]: I0220 09:30:13.466100 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9tfpc"] Feb 20 09:30:15 crc kubenswrapper[5094]: I0220 09:30:15.373008 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9tfpc" podUID="6f58baa6-934a-458f-be52-13c9185c7076" containerName="registry-server" containerID="cri-o://ea288d0e36a6a61918ac1052fa176505630660bcda4dc25081d87d390c85c33c" gracePeriod=2 Feb 20 09:30:15 crc kubenswrapper[5094]: I0220 09:30:15.848686 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:30:15 crc kubenswrapper[5094]: E0220 09:30:15.849346 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:30:15 crc kubenswrapper[5094]: I0220 09:30:15.893289 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.029796 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f58baa6-934a-458f-be52-13c9185c7076-utilities\") pod \"6f58baa6-934a-458f-be52-13c9185c7076\" (UID: \"6f58baa6-934a-458f-be52-13c9185c7076\") " Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.029880 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swqtq\" (UniqueName: \"kubernetes.io/projected/6f58baa6-934a-458f-be52-13c9185c7076-kube-api-access-swqtq\") pod \"6f58baa6-934a-458f-be52-13c9185c7076\" (UID: \"6f58baa6-934a-458f-be52-13c9185c7076\") " Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.029915 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f58baa6-934a-458f-be52-13c9185c7076-catalog-content\") pod \"6f58baa6-934a-458f-be52-13c9185c7076\" (UID: \"6f58baa6-934a-458f-be52-13c9185c7076\") " Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.030841 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f58baa6-934a-458f-be52-13c9185c7076-utilities" (OuterVolumeSpecName: "utilities") pod "6f58baa6-934a-458f-be52-13c9185c7076" (UID: "6f58baa6-934a-458f-be52-13c9185c7076"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.041445 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f58baa6-934a-458f-be52-13c9185c7076-kube-api-access-swqtq" (OuterVolumeSpecName: "kube-api-access-swqtq") pod "6f58baa6-934a-458f-be52-13c9185c7076" (UID: "6f58baa6-934a-458f-be52-13c9185c7076"). InnerVolumeSpecName "kube-api-access-swqtq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.095510 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f58baa6-934a-458f-be52-13c9185c7076-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f58baa6-934a-458f-be52-13c9185c7076" (UID: "6f58baa6-934a-458f-be52-13c9185c7076"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.132791 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f58baa6-934a-458f-be52-13c9185c7076-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.132828 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swqtq\" (UniqueName: \"kubernetes.io/projected/6f58baa6-934a-458f-be52-13c9185c7076-kube-api-access-swqtq\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.133026 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f58baa6-934a-458f-be52-13c9185c7076-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.386434 5094 generic.go:334] "Generic (PLEG): container finished" podID="6f58baa6-934a-458f-be52-13c9185c7076" containerID="ea288d0e36a6a61918ac1052fa176505630660bcda4dc25081d87d390c85c33c" exitCode=0 Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.386478 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tfpc" event={"ID":"6f58baa6-934a-458f-be52-13c9185c7076","Type":"ContainerDied","Data":"ea288d0e36a6a61918ac1052fa176505630660bcda4dc25081d87d390c85c33c"} Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.386491 5094 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-9tfpc" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.386508 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9tfpc" event={"ID":"6f58baa6-934a-458f-be52-13c9185c7076","Type":"ContainerDied","Data":"4a536eaac1c56dabcf722017ffa827274fa5003477db0004d60a134631bc7925"} Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.386526 5094 scope.go:117] "RemoveContainer" containerID="ea288d0e36a6a61918ac1052fa176505630660bcda4dc25081d87d390c85c33c" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.405869 5094 scope.go:117] "RemoveContainer" containerID="bb435daf7ab6ba6af4984d8f667c7227c5ab824fd2097edb47d00e633cb1570b" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.426971 5094 scope.go:117] "RemoveContainer" containerID="609bbebaaf348df0c34e989e5b432e9346c1eadbd013c5cdf0375a6968705b3a" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.453013 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9tfpc"] Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.471502 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9tfpc"] Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.483031 5094 scope.go:117] "RemoveContainer" containerID="ea288d0e36a6a61918ac1052fa176505630660bcda4dc25081d87d390c85c33c" Feb 20 09:30:16 crc kubenswrapper[5094]: E0220 09:30:16.483666 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea288d0e36a6a61918ac1052fa176505630660bcda4dc25081d87d390c85c33c\": container with ID starting with ea288d0e36a6a61918ac1052fa176505630660bcda4dc25081d87d390c85c33c not found: ID does not exist" containerID="ea288d0e36a6a61918ac1052fa176505630660bcda4dc25081d87d390c85c33c" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.483711 
5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea288d0e36a6a61918ac1052fa176505630660bcda4dc25081d87d390c85c33c"} err="failed to get container status \"ea288d0e36a6a61918ac1052fa176505630660bcda4dc25081d87d390c85c33c\": rpc error: code = NotFound desc = could not find container \"ea288d0e36a6a61918ac1052fa176505630660bcda4dc25081d87d390c85c33c\": container with ID starting with ea288d0e36a6a61918ac1052fa176505630660bcda4dc25081d87d390c85c33c not found: ID does not exist" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.483757 5094 scope.go:117] "RemoveContainer" containerID="bb435daf7ab6ba6af4984d8f667c7227c5ab824fd2097edb47d00e633cb1570b" Feb 20 09:30:16 crc kubenswrapper[5094]: E0220 09:30:16.484150 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb435daf7ab6ba6af4984d8f667c7227c5ab824fd2097edb47d00e633cb1570b\": container with ID starting with bb435daf7ab6ba6af4984d8f667c7227c5ab824fd2097edb47d00e633cb1570b not found: ID does not exist" containerID="bb435daf7ab6ba6af4984d8f667c7227c5ab824fd2097edb47d00e633cb1570b" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.484211 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb435daf7ab6ba6af4984d8f667c7227c5ab824fd2097edb47d00e633cb1570b"} err="failed to get container status \"bb435daf7ab6ba6af4984d8f667c7227c5ab824fd2097edb47d00e633cb1570b\": rpc error: code = NotFound desc = could not find container \"bb435daf7ab6ba6af4984d8f667c7227c5ab824fd2097edb47d00e633cb1570b\": container with ID starting with bb435daf7ab6ba6af4984d8f667c7227c5ab824fd2097edb47d00e633cb1570b not found: ID does not exist" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.484241 5094 scope.go:117] "RemoveContainer" containerID="609bbebaaf348df0c34e989e5b432e9346c1eadbd013c5cdf0375a6968705b3a" Feb 20 09:30:16 crc kubenswrapper[5094]: E0220 
09:30:16.484540 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"609bbebaaf348df0c34e989e5b432e9346c1eadbd013c5cdf0375a6968705b3a\": container with ID starting with 609bbebaaf348df0c34e989e5b432e9346c1eadbd013c5cdf0375a6968705b3a not found: ID does not exist" containerID="609bbebaaf348df0c34e989e5b432e9346c1eadbd013c5cdf0375a6968705b3a" Feb 20 09:30:16 crc kubenswrapper[5094]: I0220 09:30:16.484572 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"609bbebaaf348df0c34e989e5b432e9346c1eadbd013c5cdf0375a6968705b3a"} err="failed to get container status \"609bbebaaf348df0c34e989e5b432e9346c1eadbd013c5cdf0375a6968705b3a\": rpc error: code = NotFound desc = could not find container \"609bbebaaf348df0c34e989e5b432e9346c1eadbd013c5cdf0375a6968705b3a\": container with ID starting with 609bbebaaf348df0c34e989e5b432e9346c1eadbd013c5cdf0375a6968705b3a not found: ID does not exist" Feb 20 09:30:17 crc kubenswrapper[5094]: I0220 09:30:17.851032 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f58baa6-934a-458f-be52-13c9185c7076" path="/var/lib/kubelet/pods/6f58baa6-934a-458f-be52-13c9185c7076/volumes" Feb 20 09:30:30 crc kubenswrapper[5094]: I0220 09:30:30.840122 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:30:30 crc kubenswrapper[5094]: E0220 09:30:30.840738 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:30:43 crc kubenswrapper[5094]: I0220 09:30:43.841833 
5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:30:43 crc kubenswrapper[5094]: E0220 09:30:43.842667 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:30:44 crc kubenswrapper[5094]: I0220 09:30:44.538395 5094 scope.go:117] "RemoveContainer" containerID="8db4ee156703861a72d9f8f5a2380b086eb9a7f4aadd08037037563019ebbb48" Feb 20 09:30:52 crc kubenswrapper[5094]: I0220 09:30:52.740318 5094 generic.go:334] "Generic (PLEG): container finished" podID="6d3bf727-1eae-408c-be3d-2df97b387704" containerID="17fbee9c4a3852e927701f4dfea9296493a9a027cb979ab0b3ffefe9fd18a14f" exitCode=0 Feb 20 09:30:52 crc kubenswrapper[5094]: I0220 09:30:52.740392 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" event={"ID":"6d3bf727-1eae-408c-be3d-2df97b387704","Type":"ContainerDied","Data":"17fbee9c4a3852e927701f4dfea9296493a9a027cb979ab0b3ffefe9fd18a14f"} Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.363433 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.444777 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-ssh-key-openstack-cell1\") pod \"6d3bf727-1eae-408c-be3d-2df97b387704\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.445179 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jpl2\" (UniqueName: \"kubernetes.io/projected/6d3bf727-1eae-408c-be3d-2df97b387704-kube-api-access-9jpl2\") pod \"6d3bf727-1eae-408c-be3d-2df97b387704\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.445408 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-neutron-sriov-combined-ca-bundle\") pod \"6d3bf727-1eae-408c-be3d-2df97b387704\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.445479 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-ceph\") pod \"6d3bf727-1eae-408c-be3d-2df97b387704\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.445516 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-neutron-sriov-agent-neutron-config-0\") pod \"6d3bf727-1eae-408c-be3d-2df97b387704\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.445578 
5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-inventory\") pod \"6d3bf727-1eae-408c-be3d-2df97b387704\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.542834 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d3bf727-1eae-408c-be3d-2df97b387704-kube-api-access-9jpl2" (OuterVolumeSpecName: "kube-api-access-9jpl2") pod "6d3bf727-1eae-408c-be3d-2df97b387704" (UID: "6d3bf727-1eae-408c-be3d-2df97b387704"). InnerVolumeSpecName "kube-api-access-9jpl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.542927 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "6d3bf727-1eae-408c-be3d-2df97b387704" (UID: "6d3bf727-1eae-408c-be3d-2df97b387704"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.543181 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-ceph" (OuterVolumeSpecName: "ceph") pod "6d3bf727-1eae-408c-be3d-2df97b387704" (UID: "6d3bf727-1eae-408c-be3d-2df97b387704"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.545804 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-inventory" (OuterVolumeSpecName: "inventory") pod "6d3bf727-1eae-408c-be3d-2df97b387704" (UID: "6d3bf727-1eae-408c-be3d-2df97b387704"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.547362 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "6d3bf727-1eae-408c-be3d-2df97b387704" (UID: "6d3bf727-1eae-408c-be3d-2df97b387704"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.547726 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-ssh-key-openstack-cell1\") pod \"6d3bf727-1eae-408c-be3d-2df97b387704\" (UID: \"6d3bf727-1eae-408c-be3d-2df97b387704\") " Feb 20 09:30:54 crc kubenswrapper[5094]: W0220 09:30:54.547765 5094 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/6d3bf727-1eae-408c-be3d-2df97b387704/volumes/kubernetes.io~secret/ssh-key-openstack-cell1 Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.547776 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "6d3bf727-1eae-408c-be3d-2df97b387704" (UID: "6d3bf727-1eae-408c-be3d-2df97b387704"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.548310 5094 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.548337 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.548352 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.548365 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.548377 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jpl2\" (UniqueName: \"kubernetes.io/projected/6d3bf727-1eae-408c-be3d-2df97b387704-kube-api-access-9jpl2\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.551816 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "6d3bf727-1eae-408c-be3d-2df97b387704" (UID: "6d3bf727-1eae-408c-be3d-2df97b387704"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.650445 5094 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/6d3bf727-1eae-408c-be3d-2df97b387704-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.770958 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" event={"ID":"6d3bf727-1eae-408c-be3d-2df97b387704","Type":"ContainerDied","Data":"807e3a8640eb579cfbd8f2e00fbe291277d0bdb2b420fc4b5394ff2cab037a0a"} Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.770997 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="807e3a8640eb579cfbd8f2e00fbe291277d0bdb2b420fc4b5394ff2cab037a0a" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.771030 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-6sqzq" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.887177 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9"] Feb 20 09:30:54 crc kubenswrapper[5094]: E0220 09:30:54.887795 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f58baa6-934a-458f-be52-13c9185c7076" containerName="extract-utilities" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.887810 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f58baa6-934a-458f-be52-13c9185c7076" containerName="extract-utilities" Feb 20 09:30:54 crc kubenswrapper[5094]: E0220 09:30:54.887830 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f58baa6-934a-458f-be52-13c9185c7076" containerName="extract-content" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.887839 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f58baa6-934a-458f-be52-13c9185c7076" containerName="extract-content" Feb 20 09:30:54 crc kubenswrapper[5094]: E0220 09:30:54.887858 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d3bf727-1eae-408c-be3d-2df97b387704" containerName="neutron-sriov-openstack-openstack-cell1" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.887868 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d3bf727-1eae-408c-be3d-2df97b387704" containerName="neutron-sriov-openstack-openstack-cell1" Feb 20 09:30:54 crc kubenswrapper[5094]: E0220 09:30:54.887886 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f58baa6-934a-458f-be52-13c9185c7076" containerName="registry-server" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.887894 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f58baa6-934a-458f-be52-13c9185c7076" containerName="registry-server" Feb 20 09:30:54 crc kubenswrapper[5094]: E0220 09:30:54.887913 5094 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="06092367-1969-4b35-8025-09e5a52a5855" containerName="collect-profiles" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.887922 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="06092367-1969-4b35-8025-09e5a52a5855" containerName="collect-profiles" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.888182 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f58baa6-934a-458f-be52-13c9185c7076" containerName="registry-server" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.888201 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d3bf727-1eae-408c-be3d-2df97b387704" containerName="neutron-sriov-openstack-openstack-cell1" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.888219 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="06092367-1969-4b35-8025-09e5a52a5855" containerName="collect-profiles" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.889116 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.893234 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.893471 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.893612 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.894580 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.895044 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.899508 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9"] Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.955895 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbb4f\" (UniqueName: \"kubernetes.io/projected/f58790dc-4468-40ad-ba58-bb433a926abe-kube-api-access-dbb4f\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.956046 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.956083 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.956210 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.956325 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:54 crc kubenswrapper[5094]: I0220 09:30:54.956397 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:55 crc kubenswrapper[5094]: I0220 09:30:55.058902 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dbb4f\" (UniqueName: \"kubernetes.io/projected/f58790dc-4468-40ad-ba58-bb433a926abe-kube-api-access-dbb4f\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:55 crc kubenswrapper[5094]: I0220 09:30:55.059024 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:55 crc kubenswrapper[5094]: I0220 09:30:55.059061 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:55 crc kubenswrapper[5094]: I0220 09:30:55.059120 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:55 crc kubenswrapper[5094]: I0220 09:30:55.059190 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:55 crc kubenswrapper[5094]: I0220 09:30:55.059235 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:55 crc kubenswrapper[5094]: I0220 09:30:55.065276 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:55 crc kubenswrapper[5094]: I0220 09:30:55.065312 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:55 crc kubenswrapper[5094]: I0220 09:30:55.065308 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:55 crc kubenswrapper[5094]: I0220 09:30:55.065816 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:55 crc kubenswrapper[5094]: I0220 09:30:55.067440 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:55 crc kubenswrapper[5094]: I0220 09:30:55.079741 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbb4f\" (UniqueName: \"kubernetes.io/projected/f58790dc-4468-40ad-ba58-bb433a926abe-kube-api-access-dbb4f\") pod \"neutron-dhcp-openstack-openstack-cell1-g7hl9\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:55 crc kubenswrapper[5094]: I0220 09:30:55.247689 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:30:55 crc kubenswrapper[5094]: I0220 09:30:55.795011 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9"] Feb 20 09:30:55 crc kubenswrapper[5094]: I0220 09:30:55.853330 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:30:55 crc kubenswrapper[5094]: E0220 09:30:55.853649 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:30:56 crc kubenswrapper[5094]: I0220 09:30:56.792976 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" event={"ID":"f58790dc-4468-40ad-ba58-bb433a926abe","Type":"ContainerStarted","Data":"16bb3d0c2b07a856a2200802f3afc3601693f102f136e4654b9e478e3c583b2c"} Feb 20 09:30:56 crc kubenswrapper[5094]: I0220 09:30:56.793321 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" event={"ID":"f58790dc-4468-40ad-ba58-bb433a926abe","Type":"ContainerStarted","Data":"e39920cb7c5a699ca2b8e232842e2e36ead718dc4047ce9c0e15c3362b15e939"} Feb 20 09:30:56 crc kubenswrapper[5094]: I0220 09:30:56.826898 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" podStartSLOduration=2.428919335 podStartE2EDuration="2.826878229s" podCreationTimestamp="2026-02-20 09:30:54 +0000 UTC" firstStartedPulling="2026-02-20 09:30:55.791246794 +0000 UTC m=+9870.663873505" 
lastFinishedPulling="2026-02-20 09:30:56.189205688 +0000 UTC m=+9871.061832399" observedRunningTime="2026-02-20 09:30:56.816350146 +0000 UTC m=+9871.688976857" watchObservedRunningTime="2026-02-20 09:30:56.826878229 +0000 UTC m=+9871.699504940" Feb 20 09:31:07 crc kubenswrapper[5094]: I0220 09:31:07.840515 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:31:07 crc kubenswrapper[5094]: E0220 09:31:07.841927 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:31:22 crc kubenswrapper[5094]: I0220 09:31:22.840628 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:31:22 crc kubenswrapper[5094]: E0220 09:31:22.843440 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:31:35 crc kubenswrapper[5094]: I0220 09:31:35.851431 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:31:35 crc kubenswrapper[5094]: E0220 09:31:35.852326 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:31:46 crc kubenswrapper[5094]: I0220 09:31:46.840161 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:31:46 crc kubenswrapper[5094]: E0220 09:31:46.840900 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:31:58 crc kubenswrapper[5094]: I0220 09:31:58.416589 5094 generic.go:334] "Generic (PLEG): container finished" podID="f58790dc-4468-40ad-ba58-bb433a926abe" containerID="16bb3d0c2b07a856a2200802f3afc3601693f102f136e4654b9e478e3c583b2c" exitCode=0 Feb 20 09:31:58 crc kubenswrapper[5094]: I0220 09:31:58.416775 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" event={"ID":"f58790dc-4468-40ad-ba58-bb433a926abe","Type":"ContainerDied","Data":"16bb3d0c2b07a856a2200802f3afc3601693f102f136e4654b9e478e3c583b2c"} Feb 20 09:31:59 crc kubenswrapper[5094]: I0220 09:31:59.840479 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:31:59 crc kubenswrapper[5094]: E0220 09:31:59.841102 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.028052 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.127830 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbb4f\" (UniqueName: \"kubernetes.io/projected/f58790dc-4468-40ad-ba58-bb433a926abe-kube-api-access-dbb4f\") pod \"f58790dc-4468-40ad-ba58-bb433a926abe\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.127959 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-ceph\") pod \"f58790dc-4468-40ad-ba58-bb433a926abe\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.127999 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-neutron-dhcp-combined-ca-bundle\") pod \"f58790dc-4468-40ad-ba58-bb433a926abe\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.128115 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-inventory\") pod \"f58790dc-4468-40ad-ba58-bb433a926abe\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.128226 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-neutron-dhcp-agent-neutron-config-0\") pod \"f58790dc-4468-40ad-ba58-bb433a926abe\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.128242 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-ssh-key-openstack-cell1\") pod \"f58790dc-4468-40ad-ba58-bb433a926abe\" (UID: \"f58790dc-4468-40ad-ba58-bb433a926abe\") " Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.135899 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-ceph" (OuterVolumeSpecName: "ceph") pod "f58790dc-4468-40ad-ba58-bb433a926abe" (UID: "f58790dc-4468-40ad-ba58-bb433a926abe"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.136050 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f58790dc-4468-40ad-ba58-bb433a926abe-kube-api-access-dbb4f" (OuterVolumeSpecName: "kube-api-access-dbb4f") pod "f58790dc-4468-40ad-ba58-bb433a926abe" (UID: "f58790dc-4468-40ad-ba58-bb433a926abe"). InnerVolumeSpecName "kube-api-access-dbb4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.136229 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "f58790dc-4468-40ad-ba58-bb433a926abe" (UID: "f58790dc-4468-40ad-ba58-bb433a926abe"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.158401 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "f58790dc-4468-40ad-ba58-bb433a926abe" (UID: "f58790dc-4468-40ad-ba58-bb433a926abe"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.172120 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "f58790dc-4468-40ad-ba58-bb433a926abe" (UID: "f58790dc-4468-40ad-ba58-bb433a926abe"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.176485 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-inventory" (OuterVolumeSpecName: "inventory") pod "f58790dc-4468-40ad-ba58-bb433a926abe" (UID: "f58790dc-4468-40ad-ba58-bb433a926abe"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.230527 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.230555 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.230565 5094 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.230576 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbb4f\" (UniqueName: \"kubernetes.io/projected/f58790dc-4468-40ad-ba58-bb433a926abe-kube-api-access-dbb4f\") on node \"crc\" DevicePath \"\"" Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.230586 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.230594 5094 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f58790dc-4468-40ad-ba58-bb433a926abe-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.439719 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" 
event={"ID":"f58790dc-4468-40ad-ba58-bb433a926abe","Type":"ContainerDied","Data":"e39920cb7c5a699ca2b8e232842e2e36ead718dc4047ce9c0e15c3362b15e939"} Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.439761 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e39920cb7c5a699ca2b8e232842e2e36ead718dc4047ce9c0e15c3362b15e939" Feb 20 09:32:00 crc kubenswrapper[5094]: I0220 09:32:00.439772 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-g7hl9" Feb 20 09:32:14 crc kubenswrapper[5094]: I0220 09:32:14.840259 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:32:14 crc kubenswrapper[5094]: E0220 09:32:14.841079 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:32:24 crc kubenswrapper[5094]: I0220 09:32:24.716811 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 09:32:24 crc kubenswrapper[5094]: I0220 09:32:24.717548 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="02305b70-64d3-46af-876a-f81d73f83cbf" containerName="nova-cell0-conductor-conductor" containerID="cri-o://fec92b2a8da1e2997bd3d8a328f8f1e86063ef10d1fd953772b4ae368e9d524b" gracePeriod=30 Feb 20 09:32:25 crc kubenswrapper[5094]: I0220 09:32:25.176907 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 20 09:32:25 crc kubenswrapper[5094]: I0220 09:32:25.177326 
5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="ec04fa38-0d41-4c78-99fd-56299cd1c5ac" containerName="nova-cell1-conductor-conductor" containerID="cri-o://b0e35cbe8bf849fece9161e859f69976c6d3a8bd0f88046bd05e34f439cfb7f8" gracePeriod=30 Feb 20 09:32:25 crc kubenswrapper[5094]: I0220 09:32:25.369084 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 09:32:25 crc kubenswrapper[5094]: I0220 09:32:25.369295 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="80d2f807-a13f-4a1d-93d3-293d1afd6e4c" containerName="nova-scheduler-scheduler" containerID="cri-o://10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f" gracePeriod=30 Feb 20 09:32:25 crc kubenswrapper[5094]: I0220 09:32:25.394848 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 09:32:25 crc kubenswrapper[5094]: I0220 09:32:25.395109 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4c1b5836-3f97-4ae2-a894-e42a72b29729" containerName="nova-api-log" containerID="cri-o://39a13020f662d3e6609d4881367424f2f25adb683ad3729bfa3b75921443ae45" gracePeriod=30 Feb 20 09:32:25 crc kubenswrapper[5094]: I0220 09:32:25.395230 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4c1b5836-3f97-4ae2-a894-e42a72b29729" containerName="nova-api-api" containerID="cri-o://c876bb36d6cd6351bc15c4941fbdd419137b0dffee9422b0eca06e2602e21509" gracePeriod=30 Feb 20 09:32:25 crc kubenswrapper[5094]: I0220 09:32:25.453088 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 09:32:25 crc kubenswrapper[5094]: I0220 09:32:25.453316 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="1323ed20-0605-4081-a36d-6fa8c40f26e6" containerName="nova-metadata-log" containerID="cri-o://c29b5297edcb6838d15c351b78051941a64fc614165717a744eca6ab46c45320" gracePeriod=30 Feb 20 09:32:25 crc kubenswrapper[5094]: I0220 09:32:25.453795 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1323ed20-0605-4081-a36d-6fa8c40f26e6" containerName="nova-metadata-metadata" containerID="cri-o://25dc2efe543de2f343b425911821a1eda1c851282daa752a1b28d46a6d470381" gracePeriod=30 Feb 20 09:32:25 crc kubenswrapper[5094]: I0220 09:32:25.727084 5094 generic.go:334] "Generic (PLEG): container finished" podID="1323ed20-0605-4081-a36d-6fa8c40f26e6" containerID="c29b5297edcb6838d15c351b78051941a64fc614165717a744eca6ab46c45320" exitCode=143 Feb 20 09:32:25 crc kubenswrapper[5094]: I0220 09:32:25.727168 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1323ed20-0605-4081-a36d-6fa8c40f26e6","Type":"ContainerDied","Data":"c29b5297edcb6838d15c351b78051941a64fc614165717a744eca6ab46c45320"} Feb 20 09:32:25 crc kubenswrapper[5094]: I0220 09:32:25.730049 5094 generic.go:334] "Generic (PLEG): container finished" podID="4c1b5836-3f97-4ae2-a894-e42a72b29729" containerID="39a13020f662d3e6609d4881367424f2f25adb683ad3729bfa3b75921443ae45" exitCode=143 Feb 20 09:32:25 crc kubenswrapper[5094]: I0220 09:32:25.730090 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c1b5836-3f97-4ae2-a894-e42a72b29729","Type":"ContainerDied","Data":"39a13020f662d3e6609d4881367424f2f25adb683ad3729bfa3b75921443ae45"} Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.371683 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.529033 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-combined-ca-bundle\") pod \"ec04fa38-0d41-4c78-99fd-56299cd1c5ac\" (UID: \"ec04fa38-0d41-4c78-99fd-56299cd1c5ac\") "
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.529170 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d2s8\" (UniqueName: \"kubernetes.io/projected/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-kube-api-access-2d2s8\") pod \"ec04fa38-0d41-4c78-99fd-56299cd1c5ac\" (UID: \"ec04fa38-0d41-4c78-99fd-56299cd1c5ac\") "
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.529231 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-config-data\") pod \"ec04fa38-0d41-4c78-99fd-56299cd1c5ac\" (UID: \"ec04fa38-0d41-4c78-99fd-56299cd1c5ac\") "
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.534141 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-kube-api-access-2d2s8" (OuterVolumeSpecName: "kube-api-access-2d2s8") pod "ec04fa38-0d41-4c78-99fd-56299cd1c5ac" (UID: "ec04fa38-0d41-4c78-99fd-56299cd1c5ac"). InnerVolumeSpecName "kube-api-access-2d2s8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.562910 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-config-data" (OuterVolumeSpecName: "config-data") pod "ec04fa38-0d41-4c78-99fd-56299cd1c5ac" (UID: "ec04fa38-0d41-4c78-99fd-56299cd1c5ac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.565490 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec04fa38-0d41-4c78-99fd-56299cd1c5ac" (UID: "ec04fa38-0d41-4c78-99fd-56299cd1c5ac"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.631343 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.631540 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d2s8\" (UniqueName: \"kubernetes.io/projected/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-kube-api-access-2d2s8\") on node \"crc\" DevicePath \"\""
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.631599 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec04fa38-0d41-4c78-99fd-56299cd1c5ac-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.740270 5094 generic.go:334] "Generic (PLEG): container finished" podID="ec04fa38-0d41-4c78-99fd-56299cd1c5ac" containerID="b0e35cbe8bf849fece9161e859f69976c6d3a8bd0f88046bd05e34f439cfb7f8" exitCode=0
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.740310 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ec04fa38-0d41-4c78-99fd-56299cd1c5ac","Type":"ContainerDied","Data":"b0e35cbe8bf849fece9161e859f69976c6d3a8bd0f88046bd05e34f439cfb7f8"}
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.740334 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"ec04fa38-0d41-4c78-99fd-56299cd1c5ac","Type":"ContainerDied","Data":"205874c2bd37603e15fe2e158ed7cb1e0b3a0637daa0a0e89a2d6fb0e765f899"}
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.740349 5094 scope.go:117] "RemoveContainer" containerID="b0e35cbe8bf849fece9161e859f69976c6d3a8bd0f88046bd05e34f439cfb7f8"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.740452 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.773863 5094 scope.go:117] "RemoveContainer" containerID="b0e35cbe8bf849fece9161e859f69976c6d3a8bd0f88046bd05e34f439cfb7f8"
Feb 20 09:32:26 crc kubenswrapper[5094]: E0220 09:32:26.774365 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0e35cbe8bf849fece9161e859f69976c6d3a8bd0f88046bd05e34f439cfb7f8\": container with ID starting with b0e35cbe8bf849fece9161e859f69976c6d3a8bd0f88046bd05e34f439cfb7f8 not found: ID does not exist" containerID="b0e35cbe8bf849fece9161e859f69976c6d3a8bd0f88046bd05e34f439cfb7f8"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.774417 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0e35cbe8bf849fece9161e859f69976c6d3a8bd0f88046bd05e34f439cfb7f8"} err="failed to get container status \"b0e35cbe8bf849fece9161e859f69976c6d3a8bd0f88046bd05e34f439cfb7f8\": rpc error: code = NotFound desc = could not find container \"b0e35cbe8bf849fece9161e859f69976c6d3a8bd0f88046bd05e34f439cfb7f8\": container with ID starting with b0e35cbe8bf849fece9161e859f69976c6d3a8bd0f88046bd05e34f439cfb7f8 not found: ID does not exist"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.780944 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.802134 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.818574 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 20 09:32:26 crc kubenswrapper[5094]: E0220 09:32:26.819160 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f58790dc-4468-40ad-ba58-bb433a926abe" containerName="neutron-dhcp-openstack-openstack-cell1"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.819177 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f58790dc-4468-40ad-ba58-bb433a926abe" containerName="neutron-dhcp-openstack-openstack-cell1"
Feb 20 09:32:26 crc kubenswrapper[5094]: E0220 09:32:26.819241 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec04fa38-0d41-4c78-99fd-56299cd1c5ac" containerName="nova-cell1-conductor-conductor"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.819253 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec04fa38-0d41-4c78-99fd-56299cd1c5ac" containerName="nova-cell1-conductor-conductor"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.819491 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f58790dc-4468-40ad-ba58-bb433a926abe" containerName="neutron-dhcp-openstack-openstack-cell1"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.819531 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec04fa38-0d41-4c78-99fd-56299cd1c5ac" containerName="nova-cell1-conductor-conductor"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.820421 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.827097 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.828752 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.835667 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23153570-19e2-4a29-9533-5db90a0c5d09-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"23153570-19e2-4a29-9533-5db90a0c5d09\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.836070 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23153570-19e2-4a29-9533-5db90a0c5d09-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"23153570-19e2-4a29-9533-5db90a0c5d09\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.836104 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5m6v\" (UniqueName: \"kubernetes.io/projected/23153570-19e2-4a29-9533-5db90a0c5d09-kube-api-access-v5m6v\") pod \"nova-cell1-conductor-0\" (UID: \"23153570-19e2-4a29-9533-5db90a0c5d09\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.937539 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23153570-19e2-4a29-9533-5db90a0c5d09-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"23153570-19e2-4a29-9533-5db90a0c5d09\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.937718 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23153570-19e2-4a29-9533-5db90a0c5d09-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"23153570-19e2-4a29-9533-5db90a0c5d09\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.937754 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5m6v\" (UniqueName: \"kubernetes.io/projected/23153570-19e2-4a29-9533-5db90a0c5d09-kube-api-access-v5m6v\") pod \"nova-cell1-conductor-0\" (UID: \"23153570-19e2-4a29-9533-5db90a0c5d09\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.949783 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23153570-19e2-4a29-9533-5db90a0c5d09-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"23153570-19e2-4a29-9533-5db90a0c5d09\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.958867 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23153570-19e2-4a29-9533-5db90a0c5d09-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"23153570-19e2-4a29-9533-5db90a0c5d09\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 09:32:26 crc kubenswrapper[5094]: I0220 09:32:26.972969 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5m6v\" (UniqueName: \"kubernetes.io/projected/23153570-19e2-4a29-9533-5db90a0c5d09-kube-api-access-v5m6v\") pod \"nova-cell1-conductor-0\" (UID: \"23153570-19e2-4a29-9533-5db90a0c5d09\") " pod="openstack/nova-cell1-conductor-0"
Feb 20 09:32:27 crc kubenswrapper[5094]: I0220 09:32:27.148097 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Feb 20 09:32:27 crc kubenswrapper[5094]: I0220 09:32:27.646391 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Feb 20 09:32:27 crc kubenswrapper[5094]: I0220 09:32:27.751599 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"23153570-19e2-4a29-9533-5db90a0c5d09","Type":"ContainerStarted","Data":"c3f4d685a93958bf9224bacc59e5f1b58749b345e66218d6839bb67feec0373c"}
Feb 20 09:32:27 crc kubenswrapper[5094]: I0220 09:32:27.855753 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec04fa38-0d41-4c78-99fd-56299cd1c5ac" path="/var/lib/kubelet/pods/ec04fa38-0d41-4c78-99fd-56299cd1c5ac/volumes"
Feb 20 09:32:28 crc kubenswrapper[5094]: I0220 09:32:28.787223 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"23153570-19e2-4a29-9533-5db90a0c5d09","Type":"ContainerStarted","Data":"82040ae5321ea360ff64c5dde93b028401d429a83a61f8db08baa6fd1a7397ac"}
Feb 20 09:32:28 crc kubenswrapper[5094]: I0220 09:32:28.787733 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Feb 20 09:32:28 crc kubenswrapper[5094]: I0220 09:32:28.790237 5094 generic.go:334] "Generic (PLEG): container finished" podID="02305b70-64d3-46af-876a-f81d73f83cbf" containerID="fec92b2a8da1e2997bd3d8a328f8f1e86063ef10d1fd953772b4ae368e9d524b" exitCode=0
Feb 20 09:32:28 crc kubenswrapper[5094]: I0220 09:32:28.790320 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"02305b70-64d3-46af-876a-f81d73f83cbf","Type":"ContainerDied","Data":"fec92b2a8da1e2997bd3d8a328f8f1e86063ef10d1fd953772b4ae368e9d524b"}
Feb 20 09:32:28 crc kubenswrapper[5094]: I0220 09:32:28.819278 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.81925231 podStartE2EDuration="2.81925231s" podCreationTimestamp="2026-02-20 09:32:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:32:28.801918903 +0000 UTC m=+9963.674545624" watchObservedRunningTime="2026-02-20 09:32:28.81925231 +0000 UTC m=+9963.691879031"
Feb 20 09:32:28 crc kubenswrapper[5094]: I0220 09:32:28.873142 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="1323ed20-0605-4081-a36d-6fa8c40f26e6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.87:8775/\": read tcp 10.217.0.2:34240->10.217.1.87:8775: read: connection reset by peer"
Feb 20 09:32:28 crc kubenswrapper[5094]: I0220 09:32:28.873117 5094 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="1323ed20-0605-4081-a36d-6fa8c40f26e6" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.87:8775/\": read tcp 10.217.0.2:34248->10.217.1.87:8775: read: connection reset by peer"
Feb 20 09:32:29 crc kubenswrapper[5094]: E0220 09:32:29.000024 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fec92b2a8da1e2997bd3d8a328f8f1e86063ef10d1fd953772b4ae368e9d524b is running failed: container process not found" containerID="fec92b2a8da1e2997bd3d8a328f8f1e86063ef10d1fd953772b4ae368e9d524b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 20 09:32:29 crc kubenswrapper[5094]: E0220 09:32:29.000559 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fec92b2a8da1e2997bd3d8a328f8f1e86063ef10d1fd953772b4ae368e9d524b is running failed: container process not found" containerID="fec92b2a8da1e2997bd3d8a328f8f1e86063ef10d1fd953772b4ae368e9d524b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 20 09:32:29 crc kubenswrapper[5094]: E0220 09:32:29.001005 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fec92b2a8da1e2997bd3d8a328f8f1e86063ef10d1fd953772b4ae368e9d524b is running failed: container process not found" containerID="fec92b2a8da1e2997bd3d8a328f8f1e86063ef10d1fd953772b4ae368e9d524b" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Feb 20 09:32:29 crc kubenswrapper[5094]: E0220 09:32:29.001038 5094 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of fec92b2a8da1e2997bd3d8a328f8f1e86063ef10d1fd953772b4ae368e9d524b is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="02305b70-64d3-46af-876a-f81d73f83cbf" containerName="nova-cell0-conductor-conductor"
Feb 20 09:32:29 crc kubenswrapper[5094]: I0220 09:32:29.730181 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 20 09:32:29 crc kubenswrapper[5094]: I0220 09:32:29.807241 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"02305b70-64d3-46af-876a-f81d73f83cbf","Type":"ContainerDied","Data":"b27e92493a93d647968058e4cfe443d6348c867ece948ce72e75c01521bbc434"}
Feb 20 09:32:29 crc kubenswrapper[5094]: I0220 09:32:29.807573 5094 scope.go:117] "RemoveContainer" containerID="fec92b2a8da1e2997bd3d8a328f8f1e86063ef10d1fd953772b4ae368e9d524b"
Feb 20 09:32:29 crc kubenswrapper[5094]: I0220 09:32:29.807732 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 20 09:32:29 crc kubenswrapper[5094]: I0220 09:32:29.814433 5094 generic.go:334] "Generic (PLEG): container finished" podID="1323ed20-0605-4081-a36d-6fa8c40f26e6" containerID="25dc2efe543de2f343b425911821a1eda1c851282daa752a1b28d46a6d470381" exitCode=0
Feb 20 09:32:29 crc kubenswrapper[5094]: I0220 09:32:29.814498 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1323ed20-0605-4081-a36d-6fa8c40f26e6","Type":"ContainerDied","Data":"25dc2efe543de2f343b425911821a1eda1c851282daa752a1b28d46a6d470381"}
Feb 20 09:32:29 crc kubenswrapper[5094]: I0220 09:32:29.816573 5094 generic.go:334] "Generic (PLEG): container finished" podID="4c1b5836-3f97-4ae2-a894-e42a72b29729" containerID="c876bb36d6cd6351bc15c4941fbdd419137b0dffee9422b0eca06e2602e21509" exitCode=0
Feb 20 09:32:29 crc kubenswrapper[5094]: I0220 09:32:29.817086 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c1b5836-3f97-4ae2-a894-e42a72b29729","Type":"ContainerDied","Data":"c876bb36d6cd6351bc15c4941fbdd419137b0dffee9422b0eca06e2602e21509"}
Feb 20 09:32:29 crc kubenswrapper[5094]: I0220 09:32:29.843027 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4"
Feb 20 09:32:29 crc kubenswrapper[5094]: E0220 09:32:29.846191 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d"
Feb 20 09:32:29 crc kubenswrapper[5094]: I0220 09:32:29.903314 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02305b70-64d3-46af-876a-f81d73f83cbf-combined-ca-bundle\") pod \"02305b70-64d3-46af-876a-f81d73f83cbf\" (UID: \"02305b70-64d3-46af-876a-f81d73f83cbf\") "
Feb 20 09:32:29 crc kubenswrapper[5094]: I0220 09:32:29.903400 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02305b70-64d3-46af-876a-f81d73f83cbf-config-data\") pod \"02305b70-64d3-46af-876a-f81d73f83cbf\" (UID: \"02305b70-64d3-46af-876a-f81d73f83cbf\") "
Feb 20 09:32:29 crc kubenswrapper[5094]: I0220 09:32:29.903478 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk747\" (UniqueName: \"kubernetes.io/projected/02305b70-64d3-46af-876a-f81d73f83cbf-kube-api-access-wk747\") pod \"02305b70-64d3-46af-876a-f81d73f83cbf\" (UID: \"02305b70-64d3-46af-876a-f81d73f83cbf\") "
Feb 20 09:32:29 crc kubenswrapper[5094]: I0220 09:32:29.918413 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02305b70-64d3-46af-876a-f81d73f83cbf-kube-api-access-wk747" (OuterVolumeSpecName: "kube-api-access-wk747") pod "02305b70-64d3-46af-876a-f81d73f83cbf" (UID: "02305b70-64d3-46af-876a-f81d73f83cbf"). InnerVolumeSpecName "kube-api-access-wk747". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.006267 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk747\" (UniqueName: \"kubernetes.io/projected/02305b70-64d3-46af-876a-f81d73f83cbf-kube-api-access-wk747\") on node \"crc\" DevicePath \"\""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.043551 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02305b70-64d3-46af-876a-f81d73f83cbf-config-data" (OuterVolumeSpecName: "config-data") pod "02305b70-64d3-46af-876a-f81d73f83cbf" (UID: "02305b70-64d3-46af-876a-f81d73f83cbf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.054950 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02305b70-64d3-46af-876a-f81d73f83cbf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02305b70-64d3-46af-876a-f81d73f83cbf" (UID: "02305b70-64d3-46af-876a-f81d73f83cbf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.110087 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02305b70-64d3-46af-876a-f81d73f83cbf-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.110123 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02305b70-64d3-46af-876a-f81d73f83cbf-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 09:32:30 crc kubenswrapper[5094]: E0220 09:32:30.150497 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f is running failed: container process not found" containerID="10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 20 09:32:30 crc kubenswrapper[5094]: E0220 09:32:30.151249 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f is running failed: container process not found" containerID="10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 20 09:32:30 crc kubenswrapper[5094]: E0220 09:32:30.155028 5094 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f is running failed: container process not found" containerID="10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Feb 20 09:32:30 crc kubenswrapper[5094]: E0220 09:32:30.155070 5094 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="80d2f807-a13f-4a1d-93d3-293d1afd6e4c" containerName="nova-scheduler-scheduler"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.203264 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.213451 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1323ed20-0605-4081-a36d-6fa8c40f26e6-logs\") pod \"1323ed20-0605-4081-a36d-6fa8c40f26e6\" (UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") "
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.213508 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1323ed20-0605-4081-a36d-6fa8c40f26e6-combined-ca-bundle\") pod \"1323ed20-0605-4081-a36d-6fa8c40f26e6\" (UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") "
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.213606 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1323ed20-0605-4081-a36d-6fa8c40f26e6-config-data\") pod \"1323ed20-0605-4081-a36d-6fa8c40f26e6\" (UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") "
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.213680 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcbxk\" (UniqueName: \"kubernetes.io/projected/1323ed20-0605-4081-a36d-6fa8c40f26e6-kube-api-access-dcbxk\") pod \"1323ed20-0605-4081-a36d-6fa8c40f26e6\" (UID: \"1323ed20-0605-4081-a36d-6fa8c40f26e6\") "
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.214189 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1323ed20-0605-4081-a36d-6fa8c40f26e6-logs" (OuterVolumeSpecName: "logs") pod "1323ed20-0605-4081-a36d-6fa8c40f26e6" (UID: "1323ed20-0605-4081-a36d-6fa8c40f26e6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.214786 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1323ed20-0605-4081-a36d-6fa8c40f26e6-logs\") on node \"crc\" DevicePath \"\""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.218739 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.223528 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.228262 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1323ed20-0605-4081-a36d-6fa8c40f26e6-kube-api-access-dcbxk" (OuterVolumeSpecName: "kube-api-access-dcbxk") pod "1323ed20-0605-4081-a36d-6fa8c40f26e6" (UID: "1323ed20-0605-4081-a36d-6fa8c40f26e6"). InnerVolumeSpecName "kube-api-access-dcbxk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.241120 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.262463 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 20 09:32:30 crc kubenswrapper[5094]: E0220 09:32:30.262963 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c1b5836-3f97-4ae2-a894-e42a72b29729" containerName="nova-api-log"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.262976 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c1b5836-3f97-4ae2-a894-e42a72b29729" containerName="nova-api-log"
Feb 20 09:32:30 crc kubenswrapper[5094]: E0220 09:32:30.262988 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c1b5836-3f97-4ae2-a894-e42a72b29729" containerName="nova-api-api"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.262994 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c1b5836-3f97-4ae2-a894-e42a72b29729" containerName="nova-api-api"
Feb 20 09:32:30 crc kubenswrapper[5094]: E0220 09:32:30.263014 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1323ed20-0605-4081-a36d-6fa8c40f26e6" containerName="nova-metadata-log"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.263020 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1323ed20-0605-4081-a36d-6fa8c40f26e6" containerName="nova-metadata-log"
Feb 20 09:32:30 crc kubenswrapper[5094]: E0220 09:32:30.263047 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02305b70-64d3-46af-876a-f81d73f83cbf" containerName="nova-cell0-conductor-conductor"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.263053 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="02305b70-64d3-46af-876a-f81d73f83cbf" containerName="nova-cell0-conductor-conductor"
Feb 20 09:32:30 crc kubenswrapper[5094]: E0220 09:32:30.263070 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1323ed20-0605-4081-a36d-6fa8c40f26e6" containerName="nova-metadata-metadata"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.263081 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="1323ed20-0605-4081-a36d-6fa8c40f26e6" containerName="nova-metadata-metadata"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.263290 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="1323ed20-0605-4081-a36d-6fa8c40f26e6" containerName="nova-metadata-log"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.263305 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="02305b70-64d3-46af-876a-f81d73f83cbf" containerName="nova-cell0-conductor-conductor"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.263320 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c1b5836-3f97-4ae2-a894-e42a72b29729" containerName="nova-api-log"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.263333 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="1323ed20-0605-4081-a36d-6fa8c40f26e6" containerName="nova-metadata-metadata"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.263350 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c1b5836-3f97-4ae2-a894-e42a72b29729" containerName="nova-api-api"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.264060 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.270225 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.284863 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.293500 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1323ed20-0605-4081-a36d-6fa8c40f26e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1323ed20-0605-4081-a36d-6fa8c40f26e6" (UID: "1323ed20-0605-4081-a36d-6fa8c40f26e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.307877 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1323ed20-0605-4081-a36d-6fa8c40f26e6-config-data" (OuterVolumeSpecName: "config-data") pod "1323ed20-0605-4081-a36d-6fa8c40f26e6" (UID: "1323ed20-0605-4081-a36d-6fa8c40f26e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.317427 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltthv\" (UniqueName: \"kubernetes.io/projected/4c1b5836-3f97-4ae2-a894-e42a72b29729-kube-api-access-ltthv\") pod \"4c1b5836-3f97-4ae2-a894-e42a72b29729\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") "
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.317519 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c1b5836-3f97-4ae2-a894-e42a72b29729-config-data\") pod \"4c1b5836-3f97-4ae2-a894-e42a72b29729\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") "
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.317570 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c1b5836-3f97-4ae2-a894-e42a72b29729-combined-ca-bundle\") pod \"4c1b5836-3f97-4ae2-a894-e42a72b29729\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") "
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.317601 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c1b5836-3f97-4ae2-a894-e42a72b29729-logs\") pod \"4c1b5836-3f97-4ae2-a894-e42a72b29729\" (UID: \"4c1b5836-3f97-4ae2-a894-e42a72b29729\") "
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.317978 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mktbp\" (UniqueName: \"kubernetes.io/projected/9ae582b9-3951-4670-91bf-5d044269ff1c-kube-api-access-mktbp\") pod \"nova-cell0-conductor-0\" (UID: \"9ae582b9-3951-4670-91bf-5d044269ff1c\") " pod="openstack/nova-cell0-conductor-0"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.318086 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae582b9-3951-4670-91bf-5d044269ff1c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9ae582b9-3951-4670-91bf-5d044269ff1c\") " pod="openstack/nova-cell0-conductor-0"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.318192 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ae582b9-3951-4670-91bf-5d044269ff1c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9ae582b9-3951-4670-91bf-5d044269ff1c\") " pod="openstack/nova-cell0-conductor-0"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.320298 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c1b5836-3f97-4ae2-a894-e42a72b29729-kube-api-access-ltthv" (OuterVolumeSpecName: "kube-api-access-ltthv") pod "4c1b5836-3f97-4ae2-a894-e42a72b29729" (UID: "4c1b5836-3f97-4ae2-a894-e42a72b29729"). InnerVolumeSpecName "kube-api-access-ltthv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.321115 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c1b5836-3f97-4ae2-a894-e42a72b29729-logs" (OuterVolumeSpecName: "logs") pod "4c1b5836-3f97-4ae2-a894-e42a72b29729" (UID: "4c1b5836-3f97-4ae2-a894-e42a72b29729"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.318361 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcbxk\" (UniqueName: \"kubernetes.io/projected/1323ed20-0605-4081-a36d-6fa8c40f26e6-kube-api-access-dcbxk\") on node \"crc\" DevicePath \"\""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.322763 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1323ed20-0605-4081-a36d-6fa8c40f26e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.322775 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1323ed20-0605-4081-a36d-6fa8c40f26e6-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.354342 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c1b5836-3f97-4ae2-a894-e42a72b29729-config-data" (OuterVolumeSpecName: "config-data") pod "4c1b5836-3f97-4ae2-a894-e42a72b29729" (UID: "4c1b5836-3f97-4ae2-a894-e42a72b29729"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.354366 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c1b5836-3f97-4ae2-a894-e42a72b29729-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c1b5836-3f97-4ae2-a894-e42a72b29729" (UID: "4c1b5836-3f97-4ae2-a894-e42a72b29729"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.384910 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.423604 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-combined-ca-bundle\") pod \"80d2f807-a13f-4a1d-93d3-293d1afd6e4c\" (UID: \"80d2f807-a13f-4a1d-93d3-293d1afd6e4c\") "
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.423712 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7kqr\" (UniqueName: \"kubernetes.io/projected/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-kube-api-access-t7kqr\") pod \"80d2f807-a13f-4a1d-93d3-293d1afd6e4c\" (UID: \"80d2f807-a13f-4a1d-93d3-293d1afd6e4c\") "
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.423795 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-config-data\") pod \"80d2f807-a13f-4a1d-93d3-293d1afd6e4c\" (UID: \"80d2f807-a13f-4a1d-93d3-293d1afd6e4c\") "
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.423974 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mktbp\" (UniqueName: \"kubernetes.io/projected/9ae582b9-3951-4670-91bf-5d044269ff1c-kube-api-access-mktbp\") pod \"nova-cell0-conductor-0\" (UID: \"9ae582b9-3951-4670-91bf-5d044269ff1c\") " pod="openstack/nova-cell0-conductor-0"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.424033 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae582b9-3951-4670-91bf-5d044269ff1c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9ae582b9-3951-4670-91bf-5d044269ff1c\") " pod="openstack/nova-cell0-conductor-0"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.424098 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ae582b9-3951-4670-91bf-5d044269ff1c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9ae582b9-3951-4670-91bf-5d044269ff1c\") " pod="openstack/nova-cell0-conductor-0"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.424186 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c1b5836-3f97-4ae2-a894-e42a72b29729-config-data\") on node \"crc\" DevicePath \"\""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.424196 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c1b5836-3f97-4ae2-a894-e42a72b29729-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.424206 5094 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c1b5836-3f97-4ae2-a894-e42a72b29729-logs\") on node \"crc\" DevicePath \"\""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.424215 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltthv\" (UniqueName: \"kubernetes.io/projected/4c1b5836-3f97-4ae2-a894-e42a72b29729-kube-api-access-ltthv\") on node \"crc\" DevicePath \"\""
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.430372 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ae582b9-3951-4670-91bf-5d044269ff1c-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"9ae582b9-3951-4670-91bf-5d044269ff1c\") " pod="openstack/nova-cell0-conductor-0"
Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.433999 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-kube-api-access-t7kqr" (OuterVolumeSpecName: "kube-api-access-t7kqr") pod
"80d2f807-a13f-4a1d-93d3-293d1afd6e4c" (UID: "80d2f807-a13f-4a1d-93d3-293d1afd6e4c"). InnerVolumeSpecName "kube-api-access-t7kqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.435765 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ae582b9-3951-4670-91bf-5d044269ff1c-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"9ae582b9-3951-4670-91bf-5d044269ff1c\") " pod="openstack/nova-cell0-conductor-0" Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.439974 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mktbp\" (UniqueName: \"kubernetes.io/projected/9ae582b9-3951-4670-91bf-5d044269ff1c-kube-api-access-mktbp\") pod \"nova-cell0-conductor-0\" (UID: \"9ae582b9-3951-4670-91bf-5d044269ff1c\") " pod="openstack/nova-cell0-conductor-0" Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.452270 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80d2f807-a13f-4a1d-93d3-293d1afd6e4c" (UID: "80d2f807-a13f-4a1d-93d3-293d1afd6e4c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.456844 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-config-data" (OuterVolumeSpecName: "config-data") pod "80d2f807-a13f-4a1d-93d3-293d1afd6e4c" (UID: "80d2f807-a13f-4a1d-93d3-293d1afd6e4c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.526895 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7kqr\" (UniqueName: \"kubernetes.io/projected/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-kube-api-access-t7kqr\") on node \"crc\" DevicePath \"\"" Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.526953 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.526966 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80d2f807-a13f-4a1d-93d3-293d1afd6e4c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.597224 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.835246 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1323ed20-0605-4081-a36d-6fa8c40f26e6","Type":"ContainerDied","Data":"d585671e2fde2c389818c568ec8f701d1f0c341b00acbfaa339458d079916a62"} Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.835310 5094 scope.go:117] "RemoveContainer" containerID="25dc2efe543de2f343b425911821a1eda1c851282daa752a1b28d46a6d470381" Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.835415 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.845531 5094 generic.go:334] "Generic (PLEG): container finished" podID="80d2f807-a13f-4a1d-93d3-293d1afd6e4c" containerID="10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f" exitCode=0 Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.845631 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"80d2f807-a13f-4a1d-93d3-293d1afd6e4c","Type":"ContainerDied","Data":"10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f"} Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.845687 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"80d2f807-a13f-4a1d-93d3-293d1afd6e4c","Type":"ContainerDied","Data":"364b3455721cb71cf6727cbca688257580fa376c421f75064e6d61cba30218c8"} Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.845788 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.853332 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4c1b5836-3f97-4ae2-a894-e42a72b29729","Type":"ContainerDied","Data":"867e153f6129e6c09f4b4a68b08d0ac6938b5f39543d1e08a62fd3fdae93737c"} Feb 20 09:32:30 crc kubenswrapper[5094]: I0220 09:32:30.853420 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.078511 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.403613 5094 scope.go:117] "RemoveContainer" containerID="c29b5297edcb6838d15c351b78051941a64fc614165717a744eca6ab46c45320" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.433259 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.444210 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.457858 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.482753 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.506797 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 09:32:31 crc kubenswrapper[5094]: E0220 09:32:31.507289 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80d2f807-a13f-4a1d-93d3-293d1afd6e4c" containerName="nova-scheduler-scheduler" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.507310 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="80d2f807-a13f-4a1d-93d3-293d1afd6e4c" containerName="nova-scheduler-scheduler" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.507580 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="80d2f807-a13f-4a1d-93d3-293d1afd6e4c" containerName="nova-scheduler-scheduler" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.508365 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.514782 5094 scope.go:117] "RemoveContainer" containerID="10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.518640 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.529097 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.584258 5094 scope.go:117] "RemoveContainer" containerID="10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f" Feb 20 09:32:31 crc kubenswrapper[5094]: E0220 09:32:31.586371 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f\": container with ID starting with 10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f not found: ID does not exist" containerID="10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.586421 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f"} err="failed to get container status \"10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f\": rpc error: code = NotFound desc = could not find container \"10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f\": container with ID starting with 10c1b60e5eee2c4d42b474a9b5b555e820623ece35605ababb7c372f5fd4910f not found: ID does not exist" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.586466 5094 scope.go:117] "RemoveContainer" containerID="c876bb36d6cd6351bc15c4941fbdd419137b0dffee9422b0eca06e2602e21509" Feb 20 09:32:31 crc 
kubenswrapper[5094]: I0220 09:32:31.602374 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.617171 5094 scope.go:117] "RemoveContainer" containerID="39a13020f662d3e6609d4881367424f2f25adb683ad3729bfa3b75921443ae45" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.621105 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.631758 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.633729 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.637356 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.642957 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.656020 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.657938 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.659890 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67a3bd12-be26-46a3-bd66-982bea39049a-config-data\") pod \"nova-scheduler-0\" (UID: \"67a3bd12-be26-46a3-bd66-982bea39049a\") " pod="openstack/nova-scheduler-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.659954 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67a3bd12-be26-46a3-bd66-982bea39049a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"67a3bd12-be26-46a3-bd66-982bea39049a\") " pod="openstack/nova-scheduler-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.659992 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lv2cw\" (UniqueName: \"kubernetes.io/projected/67a3bd12-be26-46a3-bd66-982bea39049a-kube-api-access-lv2cw\") pod \"nova-scheduler-0\" (UID: \"67a3bd12-be26-46a3-bd66-982bea39049a\") " pod="openstack/nova-scheduler-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.661997 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.665898 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.761382 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67a3bd12-be26-46a3-bd66-982bea39049a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"67a3bd12-be26-46a3-bd66-982bea39049a\") " pod="openstack/nova-scheduler-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.761433 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx8fk\" (UniqueName: \"kubernetes.io/projected/b6886613-4f07-498a-911f-4d77704ab4df-kube-api-access-sx8fk\") pod \"nova-api-0\" (UID: \"b6886613-4f07-498a-911f-4d77704ab4df\") " pod="openstack/nova-api-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.761470 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lv2cw\" (UniqueName: \"kubernetes.io/projected/67a3bd12-be26-46a3-bd66-982bea39049a-kube-api-access-lv2cw\") pod \"nova-scheduler-0\" (UID: \"67a3bd12-be26-46a3-bd66-982bea39049a\") " pod="openstack/nova-scheduler-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.761523 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6886613-4f07-498a-911f-4d77704ab4df-config-data\") pod \"nova-api-0\" (UID: \"b6886613-4f07-498a-911f-4d77704ab4df\") " pod="openstack/nova-api-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.761551 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f99e6f-46a8-4a46-bcae-81947aa95700-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"87f99e6f-46a8-4a46-bcae-81947aa95700\") " pod="openstack/nova-metadata-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.761583 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f99e6f-46a8-4a46-bcae-81947aa95700-config-data\") pod \"nova-metadata-0\" (UID: \"87f99e6f-46a8-4a46-bcae-81947aa95700\") " pod="openstack/nova-metadata-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.761655 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/67a3bd12-be26-46a3-bd66-982bea39049a-config-data\") pod \"nova-scheduler-0\" (UID: \"67a3bd12-be26-46a3-bd66-982bea39049a\") " pod="openstack/nova-scheduler-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.761678 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6886613-4f07-498a-911f-4d77704ab4df-logs\") pod \"nova-api-0\" (UID: \"b6886613-4f07-498a-911f-4d77704ab4df\") " pod="openstack/nova-api-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.761693 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87f99e6f-46a8-4a46-bcae-81947aa95700-logs\") pod \"nova-metadata-0\" (UID: \"87f99e6f-46a8-4a46-bcae-81947aa95700\") " pod="openstack/nova-metadata-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.761732 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6886613-4f07-498a-911f-4d77704ab4df-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b6886613-4f07-498a-911f-4d77704ab4df\") " pod="openstack/nova-api-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.761753 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxfv6\" (UniqueName: \"kubernetes.io/projected/87f99e6f-46a8-4a46-bcae-81947aa95700-kube-api-access-pxfv6\") pod \"nova-metadata-0\" (UID: \"87f99e6f-46a8-4a46-bcae-81947aa95700\") " pod="openstack/nova-metadata-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.769653 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67a3bd12-be26-46a3-bd66-982bea39049a-config-data\") pod \"nova-scheduler-0\" (UID: \"67a3bd12-be26-46a3-bd66-982bea39049a\") " 
pod="openstack/nova-scheduler-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.770278 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67a3bd12-be26-46a3-bd66-982bea39049a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"67a3bd12-be26-46a3-bd66-982bea39049a\") " pod="openstack/nova-scheduler-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.778462 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lv2cw\" (UniqueName: \"kubernetes.io/projected/67a3bd12-be26-46a3-bd66-982bea39049a-kube-api-access-lv2cw\") pod \"nova-scheduler-0\" (UID: \"67a3bd12-be26-46a3-bd66-982bea39049a\") " pod="openstack/nova-scheduler-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.837056 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.856989 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02305b70-64d3-46af-876a-f81d73f83cbf" path="/var/lib/kubelet/pods/02305b70-64d3-46af-876a-f81d73f83cbf/volumes" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.857643 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1323ed20-0605-4081-a36d-6fa8c40f26e6" path="/var/lib/kubelet/pods/1323ed20-0605-4081-a36d-6fa8c40f26e6/volumes" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.858918 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c1b5836-3f97-4ae2-a894-e42a72b29729" path="/var/lib/kubelet/pods/4c1b5836-3f97-4ae2-a894-e42a72b29729/volumes" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.860022 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80d2f807-a13f-4a1d-93d3-293d1afd6e4c" path="/var/lib/kubelet/pods/80d2f807-a13f-4a1d-93d3-293d1afd6e4c/volumes" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 
09:32:31.863063 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6886613-4f07-498a-911f-4d77704ab4df-config-data\") pod \"nova-api-0\" (UID: \"b6886613-4f07-498a-911f-4d77704ab4df\") " pod="openstack/nova-api-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.863122 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f99e6f-46a8-4a46-bcae-81947aa95700-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"87f99e6f-46a8-4a46-bcae-81947aa95700\") " pod="openstack/nova-metadata-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.863162 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f99e6f-46a8-4a46-bcae-81947aa95700-config-data\") pod \"nova-metadata-0\" (UID: \"87f99e6f-46a8-4a46-bcae-81947aa95700\") " pod="openstack/nova-metadata-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.863237 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6886613-4f07-498a-911f-4d77704ab4df-logs\") pod \"nova-api-0\" (UID: \"b6886613-4f07-498a-911f-4d77704ab4df\") " pod="openstack/nova-api-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.863256 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87f99e6f-46a8-4a46-bcae-81947aa95700-logs\") pod \"nova-metadata-0\" (UID: \"87f99e6f-46a8-4a46-bcae-81947aa95700\") " pod="openstack/nova-metadata-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.863283 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6886613-4f07-498a-911f-4d77704ab4df-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"b6886613-4f07-498a-911f-4d77704ab4df\") " pod="openstack/nova-api-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.863900 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6886613-4f07-498a-911f-4d77704ab4df-logs\") pod \"nova-api-0\" (UID: \"b6886613-4f07-498a-911f-4d77704ab4df\") " pod="openstack/nova-api-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.864088 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87f99e6f-46a8-4a46-bcae-81947aa95700-logs\") pod \"nova-metadata-0\" (UID: \"87f99e6f-46a8-4a46-bcae-81947aa95700\") " pod="openstack/nova-metadata-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.864168 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxfv6\" (UniqueName: \"kubernetes.io/projected/87f99e6f-46a8-4a46-bcae-81947aa95700-kube-api-access-pxfv6\") pod \"nova-metadata-0\" (UID: \"87f99e6f-46a8-4a46-bcae-81947aa95700\") " pod="openstack/nova-metadata-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.864193 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx8fk\" (UniqueName: \"kubernetes.io/projected/b6886613-4f07-498a-911f-4d77704ab4df-kube-api-access-sx8fk\") pod \"nova-api-0\" (UID: \"b6886613-4f07-498a-911f-4d77704ab4df\") " pod="openstack/nova-api-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.867873 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87f99e6f-46a8-4a46-bcae-81947aa95700-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"87f99e6f-46a8-4a46-bcae-81947aa95700\") " pod="openstack/nova-metadata-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.867934 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b6886613-4f07-498a-911f-4d77704ab4df-config-data\") pod \"nova-api-0\" (UID: \"b6886613-4f07-498a-911f-4d77704ab4df\") " pod="openstack/nova-api-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.867997 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6886613-4f07-498a-911f-4d77704ab4df-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b6886613-4f07-498a-911f-4d77704ab4df\") " pod="openstack/nova-api-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.873520 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87f99e6f-46a8-4a46-bcae-81947aa95700-config-data\") pod \"nova-metadata-0\" (UID: \"87f99e6f-46a8-4a46-bcae-81947aa95700\") " pod="openstack/nova-metadata-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.881362 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9ae582b9-3951-4670-91bf-5d044269ff1c","Type":"ContainerStarted","Data":"dde8a975f326a4ab0f78730b58e14cf202d320584e2e32873e4c8b3c0cbe1f90"} Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.881408 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"9ae582b9-3951-4670-91bf-5d044269ff1c","Type":"ContainerStarted","Data":"379bbcf599c0781990628e9389811419bb8aab99ec5f6dfc156931ce84ca14e9"} Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.882659 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.882691 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxfv6\" (UniqueName: \"kubernetes.io/projected/87f99e6f-46a8-4a46-bcae-81947aa95700-kube-api-access-pxfv6\") pod \"nova-metadata-0\" (UID: \"87f99e6f-46a8-4a46-bcae-81947aa95700\") " 
pod="openstack/nova-metadata-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.903075 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx8fk\" (UniqueName: \"kubernetes.io/projected/b6886613-4f07-498a-911f-4d77704ab4df-kube-api-access-sx8fk\") pod \"nova-api-0\" (UID: \"b6886613-4f07-498a-911f-4d77704ab4df\") " pod="openstack/nova-api-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.922203 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.922173071 podStartE2EDuration="1.922173071s" podCreationTimestamp="2026-02-20 09:32:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:32:31.915031939 +0000 UTC m=+9966.787658650" watchObservedRunningTime="2026-02-20 09:32:31.922173071 +0000 UTC m=+9966.794799782" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.954197 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 20 09:32:31 crc kubenswrapper[5094]: I0220 09:32:31.988972 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 20 09:32:32 crc kubenswrapper[5094]: I0220 09:32:32.186210 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 20 09:32:32 crc kubenswrapper[5094]: I0220 09:32:32.309605 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 20 09:32:32 crc kubenswrapper[5094]: W0220 09:32:32.311082 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67a3bd12_be26_46a3_bd66_982bea39049a.slice/crio-5ceac1d8107be32eeca410c91e33a4c53f7aab6c3ca4b2f06667102f548f5507 WatchSource:0}: Error finding container 5ceac1d8107be32eeca410c91e33a4c53f7aab6c3ca4b2f06667102f548f5507: Status 404 returned error can't find the container with id 5ceac1d8107be32eeca410c91e33a4c53f7aab6c3ca4b2f06667102f548f5507 Feb 20 09:32:32 crc kubenswrapper[5094]: I0220 09:32:32.431853 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 20 09:32:32 crc kubenswrapper[5094]: I0220 09:32:32.545418 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 20 09:32:32 crc kubenswrapper[5094]: W0220 09:32:32.553093 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6886613_4f07_498a_911f_4d77704ab4df.slice/crio-5790f0b4bc606143a9bcd6c51144d155191868038b03c26011c247967da74145 WatchSource:0}: Error finding container 5790f0b4bc606143a9bcd6c51144d155191868038b03c26011c247967da74145: Status 404 returned error can't find the container with id 5790f0b4bc606143a9bcd6c51144d155191868038b03c26011c247967da74145 Feb 20 09:32:32 crc kubenswrapper[5094]: I0220 09:32:32.896655 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"b6886613-4f07-498a-911f-4d77704ab4df","Type":"ContainerStarted","Data":"995a9223f6e307afd506adc22a61e1152c3ee31123de0c92dc4dac150289ab55"} Feb 20 09:32:32 crc kubenswrapper[5094]: I0220 09:32:32.896710 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b6886613-4f07-498a-911f-4d77704ab4df","Type":"ContainerStarted","Data":"5790f0b4bc606143a9bcd6c51144d155191868038b03c26011c247967da74145"} Feb 20 09:32:32 crc kubenswrapper[5094]: I0220 09:32:32.900364 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"67a3bd12-be26-46a3-bd66-982bea39049a","Type":"ContainerStarted","Data":"35716fc27f82e5b6c12e47b0adf36caf429732bd9f4df54a7314bb44da06331b"} Feb 20 09:32:32 crc kubenswrapper[5094]: I0220 09:32:32.900442 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"67a3bd12-be26-46a3-bd66-982bea39049a","Type":"ContainerStarted","Data":"5ceac1d8107be32eeca410c91e33a4c53f7aab6c3ca4b2f06667102f548f5507"} Feb 20 09:32:32 crc kubenswrapper[5094]: I0220 09:32:32.903532 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"87f99e6f-46a8-4a46-bcae-81947aa95700","Type":"ContainerStarted","Data":"9b891c6114fffd97f16c1211284ae3ba50954686c250967b8979eb57c794377d"} Feb 20 09:32:32 crc kubenswrapper[5094]: I0220 09:32:32.904315 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"87f99e6f-46a8-4a46-bcae-81947aa95700","Type":"ContainerStarted","Data":"69f5327d4e2e931dbb3e097c3ae567eac64dee32b5e10553719964bfaea7d383"} Feb 20 09:32:32 crc kubenswrapper[5094]: I0220 09:32:32.904332 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"87f99e6f-46a8-4a46-bcae-81947aa95700","Type":"ContainerStarted","Data":"a08cf42a633180684561dacd926e08fcf523aa424706ca2a8b1e45e2e4e16bc8"} Feb 20 09:32:32 crc 
kubenswrapper[5094]: I0220 09:32:32.920635 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.920618781 podStartE2EDuration="1.920618781s" podCreationTimestamp="2026-02-20 09:32:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:32:32.914614157 +0000 UTC m=+9967.787240868" watchObservedRunningTime="2026-02-20 09:32:32.920618781 +0000 UTC m=+9967.793245492" Feb 20 09:32:32 crc kubenswrapper[5094]: I0220 09:32:32.951697 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.9516762779999999 podStartE2EDuration="1.951676278s" podCreationTimestamp="2026-02-20 09:32:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:32:32.939125756 +0000 UTC m=+9967.811752467" watchObservedRunningTime="2026-02-20 09:32:32.951676278 +0000 UTC m=+9967.824303009" Feb 20 09:32:33 crc kubenswrapper[5094]: I0220 09:32:33.937308 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b6886613-4f07-498a-911f-4d77704ab4df","Type":"ContainerStarted","Data":"84e9062ecb98a3cfd9caef3e7bd887d3c643f0c5dc9d7f9a7a51d5b91793e59f"} Feb 20 09:32:33 crc kubenswrapper[5094]: I0220 09:32:33.977857 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.977834516 podStartE2EDuration="2.977834516s" podCreationTimestamp="2026-02-20 09:32:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 09:32:33.967473727 +0000 UTC m=+9968.840100458" watchObservedRunningTime="2026-02-20 09:32:33.977834516 +0000 UTC m=+9968.850461257" Feb 20 09:32:36 crc kubenswrapper[5094]: I0220 
09:32:36.838785 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 20 09:32:36 crc kubenswrapper[5094]: I0220 09:32:36.954391 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 09:32:36 crc kubenswrapper[5094]: I0220 09:32:36.954431 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 20 09:32:40 crc kubenswrapper[5094]: I0220 09:32:40.656677 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 20 09:32:41 crc kubenswrapper[5094]: I0220 09:32:41.838677 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 20 09:32:41 crc kubenswrapper[5094]: I0220 09:32:41.954884 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 20 09:32:41 crc kubenswrapper[5094]: I0220 09:32:41.955232 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 20 09:32:41 crc kubenswrapper[5094]: I0220 09:32:41.977852 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 20 09:32:41 crc kubenswrapper[5094]: I0220 09:32:41.989783 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 09:32:41 crc kubenswrapper[5094]: I0220 09:32:41.989834 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 20 09:32:42 crc kubenswrapper[5094]: I0220 09:32:42.050318 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 20 09:32:42 crc kubenswrapper[5094]: I0220 09:32:42.840587 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 
09:32:42 crc kubenswrapper[5094]: E0220 09:32:42.841152 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:32:43 crc kubenswrapper[5094]: I0220 09:32:43.121826 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b6886613-4f07-498a-911f-4d77704ab4df" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.190:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 09:32:43 crc kubenswrapper[5094]: I0220 09:32:43.121864 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="87f99e6f-46a8-4a46-bcae-81947aa95700" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.189:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 09:32:43 crc kubenswrapper[5094]: I0220 09:32:43.121813 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="87f99e6f-46a8-4a46-bcae-81947aa95700" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.189:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 09:32:43 crc kubenswrapper[5094]: I0220 09:32:43.122201 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b6886613-4f07-498a-911f-4d77704ab4df" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.190:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 20 09:32:51 crc kubenswrapper[5094]: 
I0220 09:32:51.426043 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-km8mn"] Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.428762 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.437163 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-km8mn"] Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.529415 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6ef20e3-709d-4a1f-a616-c0259cebabd5-catalog-content\") pod \"redhat-operators-km8mn\" (UID: \"a6ef20e3-709d-4a1f-a616-c0259cebabd5\") " pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.529469 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mk2b\" (UniqueName: \"kubernetes.io/projected/a6ef20e3-709d-4a1f-a616-c0259cebabd5-kube-api-access-9mk2b\") pod \"redhat-operators-km8mn\" (UID: \"a6ef20e3-709d-4a1f-a616-c0259cebabd5\") " pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.529502 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6ef20e3-709d-4a1f-a616-c0259cebabd5-utilities\") pod \"redhat-operators-km8mn\" (UID: \"a6ef20e3-709d-4a1f-a616-c0259cebabd5\") " pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.631552 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6ef20e3-709d-4a1f-a616-c0259cebabd5-catalog-content\") pod 
\"redhat-operators-km8mn\" (UID: \"a6ef20e3-709d-4a1f-a616-c0259cebabd5\") " pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.631591 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mk2b\" (UniqueName: \"kubernetes.io/projected/a6ef20e3-709d-4a1f-a616-c0259cebabd5-kube-api-access-9mk2b\") pod \"redhat-operators-km8mn\" (UID: \"a6ef20e3-709d-4a1f-a616-c0259cebabd5\") " pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.631613 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6ef20e3-709d-4a1f-a616-c0259cebabd5-utilities\") pod \"redhat-operators-km8mn\" (UID: \"a6ef20e3-709d-4a1f-a616-c0259cebabd5\") " pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.632136 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6ef20e3-709d-4a1f-a616-c0259cebabd5-catalog-content\") pod \"redhat-operators-km8mn\" (UID: \"a6ef20e3-709d-4a1f-a616-c0259cebabd5\") " pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.632139 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6ef20e3-709d-4a1f-a616-c0259cebabd5-utilities\") pod \"redhat-operators-km8mn\" (UID: \"a6ef20e3-709d-4a1f-a616-c0259cebabd5\") " pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.650308 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mk2b\" (UniqueName: \"kubernetes.io/projected/a6ef20e3-709d-4a1f-a616-c0259cebabd5-kube-api-access-9mk2b\") pod \"redhat-operators-km8mn\" (UID: 
\"a6ef20e3-709d-4a1f-a616-c0259cebabd5\") " pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.764476 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.957726 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.957995 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.960057 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 20 09:32:51 crc kubenswrapper[5094]: I0220 09:32:51.962228 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 20 09:32:52 crc kubenswrapper[5094]: I0220 09:32:52.000383 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 20 09:32:52 crc kubenswrapper[5094]: I0220 09:32:52.001576 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 20 09:32:52 crc kubenswrapper[5094]: I0220 09:32:52.012360 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 20 09:32:52 crc kubenswrapper[5094]: I0220 09:32:52.016513 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 20 09:32:52 crc kubenswrapper[5094]: I0220 09:32:52.123990 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 20 09:32:52 crc kubenswrapper[5094]: I0220 09:32:52.133367 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 20 09:32:52 crc kubenswrapper[5094]: I0220 09:32:52.263682 
5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-km8mn"] Feb 20 09:32:53 crc kubenswrapper[5094]: I0220 09:32:53.133213 5094 generic.go:334] "Generic (PLEG): container finished" podID="a6ef20e3-709d-4a1f-a616-c0259cebabd5" containerID="2ccdeeaf4b6cf33e08f15c08b13eee5499adef516f06783b99da2954f2a56ff1" exitCode=0 Feb 20 09:32:53 crc kubenswrapper[5094]: I0220 09:32:53.133404 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-km8mn" event={"ID":"a6ef20e3-709d-4a1f-a616-c0259cebabd5","Type":"ContainerDied","Data":"2ccdeeaf4b6cf33e08f15c08b13eee5499adef516f06783b99da2954f2a56ff1"} Feb 20 09:32:53 crc kubenswrapper[5094]: I0220 09:32:53.134649 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-km8mn" event={"ID":"a6ef20e3-709d-4a1f-a616-c0259cebabd5","Type":"ContainerStarted","Data":"610c159ff287bb67be257ddc8c95e160d06d6ef93bad3fd074b27d4eb4beeeda"} Feb 20 09:32:53 crc kubenswrapper[5094]: I0220 09:32:53.891933 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd"] Feb 20 09:32:53 crc kubenswrapper[5094]: I0220 09:32:53.893807 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:53 crc kubenswrapper[5094]: I0220 09:32:53.898633 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Feb 20 09:32:53 crc kubenswrapper[5094]: I0220 09:32:53.898888 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 20 09:32:53 crc kubenswrapper[5094]: I0220 09:32:53.899028 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Feb 20 09:32:53 crc kubenswrapper[5094]: I0220 09:32:53.899253 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Feb 20 09:32:53 crc kubenswrapper[5094]: I0220 09:32:53.899368 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-dj9w6" Feb 20 09:32:53 crc kubenswrapper[5094]: I0220 09:32:53.899551 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 20 09:32:53 crc kubenswrapper[5094]: I0220 09:32:53.899724 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 20 09:32:53 crc kubenswrapper[5094]: I0220 09:32:53.949390 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd"] Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.003836 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.003926 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.003954 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.003976 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.004035 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: 
\"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.004062 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.004086 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbtc5\" (UniqueName: \"kubernetes.io/projected/5b30b185-0b70-4ad8-8eca-a292b76fb410-kube-api-access-sbtc5\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.004110 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.004132 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.004151 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.004174 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.004220 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.004238 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.106751 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.107239 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.107276 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbtc5\" (UniqueName: \"kubernetes.io/projected/5b30b185-0b70-4ad8-8eca-a292b76fb410-kube-api-access-sbtc5\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.107304 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.107341 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.107361 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.107385 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.107414 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.107437 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.107464 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.107547 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.107578 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.107603 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.110540 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.110552 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.114781 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.114967 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-migration-ssh-key-1\") pod 
\"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.115085 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.115728 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.117221 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.119973 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: 
\"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.120564 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.128759 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.129745 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.131594 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.134752 5094 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbtc5\" (UniqueName: \"kubernetes.io/projected/5b30b185-0b70-4ad8-8eca-a292b76fb410-kube-api-access-sbtc5\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.152486 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-km8mn" event={"ID":"a6ef20e3-709d-4a1f-a616-c0259cebabd5","Type":"ContainerStarted","Data":"cd68c7deed63881e78ecfc4f6731d253a24fb0b5eac0e138fa34f8befa830688"} Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.267556 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:32:54 crc kubenswrapper[5094]: I0220 09:32:54.894480 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd"] Feb 20 09:32:54 crc kubenswrapper[5094]: W0220 09:32:54.900087 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b30b185_0b70_4ad8_8eca_a292b76fb410.slice/crio-135ff20bf16d42798db0e45ebf2925fe4bfb43459a1c4911adfd22e9f5907df7 WatchSource:0}: Error finding container 135ff20bf16d42798db0e45ebf2925fe4bfb43459a1c4911adfd22e9f5907df7: Status 404 returned error can't find the container with id 135ff20bf16d42798db0e45ebf2925fe4bfb43459a1c4911adfd22e9f5907df7 Feb 20 09:32:55 crc kubenswrapper[5094]: I0220 09:32:55.165914 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" 
event={"ID":"5b30b185-0b70-4ad8-8eca-a292b76fb410","Type":"ContainerStarted","Data":"135ff20bf16d42798db0e45ebf2925fe4bfb43459a1c4911adfd22e9f5907df7"} Feb 20 09:32:56 crc kubenswrapper[5094]: I0220 09:32:56.193762 5094 generic.go:334] "Generic (PLEG): container finished" podID="a6ef20e3-709d-4a1f-a616-c0259cebabd5" containerID="cd68c7deed63881e78ecfc4f6731d253a24fb0b5eac0e138fa34f8befa830688" exitCode=0 Feb 20 09:32:56 crc kubenswrapper[5094]: I0220 09:32:56.193836 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-km8mn" event={"ID":"a6ef20e3-709d-4a1f-a616-c0259cebabd5","Type":"ContainerDied","Data":"cd68c7deed63881e78ecfc4f6731d253a24fb0b5eac0e138fa34f8befa830688"} Feb 20 09:32:56 crc kubenswrapper[5094]: I0220 09:32:56.197742 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" event={"ID":"5b30b185-0b70-4ad8-8eca-a292b76fb410","Type":"ContainerStarted","Data":"40b6b847514554d6d2915d390a68f23820ddde7e93d36e6d67b4ad209729e928"} Feb 20 09:32:56 crc kubenswrapper[5094]: I0220 09:32:56.250752 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" podStartSLOduration=2.766982284 podStartE2EDuration="3.250735901s" podCreationTimestamp="2026-02-20 09:32:53 +0000 UTC" firstStartedPulling="2026-02-20 09:32:54.902455024 +0000 UTC m=+9989.775081735" lastFinishedPulling="2026-02-20 09:32:55.386208641 +0000 UTC m=+9990.258835352" observedRunningTime="2026-02-20 09:32:56.246396406 +0000 UTC m=+9991.119023117" watchObservedRunningTime="2026-02-20 09:32:56.250735901 +0000 UTC m=+9991.123362612" Feb 20 09:32:56 crc kubenswrapper[5094]: I0220 09:32:56.841052 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:32:56 crc kubenswrapper[5094]: E0220 09:32:56.841573 5094 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:32:58 crc kubenswrapper[5094]: I0220 09:32:58.219539 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-km8mn" event={"ID":"a6ef20e3-709d-4a1f-a616-c0259cebabd5","Type":"ContainerStarted","Data":"34b6d19292b35673a24307d48a3d1c8c037f0913c16f64c2be2b06c25807eb61"} Feb 20 09:32:58 crc kubenswrapper[5094]: I0220 09:32:58.249844 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-km8mn" podStartSLOduration=3.419647218 podStartE2EDuration="7.249823154s" podCreationTimestamp="2026-02-20 09:32:51 +0000 UTC" firstStartedPulling="2026-02-20 09:32:53.136659142 +0000 UTC m=+9988.009285853" lastFinishedPulling="2026-02-20 09:32:56.966835088 +0000 UTC m=+9991.839461789" observedRunningTime="2026-02-20 09:32:58.241546866 +0000 UTC m=+9993.114173597" watchObservedRunningTime="2026-02-20 09:32:58.249823154 +0000 UTC m=+9993.122449865" Feb 20 09:32:59 crc kubenswrapper[5094]: I0220 09:32:59.801624 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h6jzn"] Feb 20 09:32:59 crc kubenswrapper[5094]: I0220 09:32:59.812877 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:32:59 crc kubenswrapper[5094]: I0220 09:32:59.863109 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6jzn"] Feb 20 09:32:59 crc kubenswrapper[5094]: I0220 09:32:59.955733 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9x2v\" (UniqueName: \"kubernetes.io/projected/b9215716-a0b8-42e4-8f60-abbd516f91d6-kube-api-access-g9x2v\") pod \"redhat-marketplace-h6jzn\" (UID: \"b9215716-a0b8-42e4-8f60-abbd516f91d6\") " pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:32:59 crc kubenswrapper[5094]: I0220 09:32:59.955899 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9215716-a0b8-42e4-8f60-abbd516f91d6-utilities\") pod \"redhat-marketplace-h6jzn\" (UID: \"b9215716-a0b8-42e4-8f60-abbd516f91d6\") " pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:32:59 crc kubenswrapper[5094]: I0220 09:32:59.956256 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9215716-a0b8-42e4-8f60-abbd516f91d6-catalog-content\") pod \"redhat-marketplace-h6jzn\" (UID: \"b9215716-a0b8-42e4-8f60-abbd516f91d6\") " pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:33:00 crc kubenswrapper[5094]: I0220 09:33:00.058595 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9x2v\" (UniqueName: \"kubernetes.io/projected/b9215716-a0b8-42e4-8f60-abbd516f91d6-kube-api-access-g9x2v\") pod \"redhat-marketplace-h6jzn\" (UID: \"b9215716-a0b8-42e4-8f60-abbd516f91d6\") " pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:33:00 crc kubenswrapper[5094]: I0220 09:33:00.058776 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9215716-a0b8-42e4-8f60-abbd516f91d6-utilities\") pod \"redhat-marketplace-h6jzn\" (UID: \"b9215716-a0b8-42e4-8f60-abbd516f91d6\") " pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:33:00 crc kubenswrapper[5094]: I0220 09:33:00.058870 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9215716-a0b8-42e4-8f60-abbd516f91d6-catalog-content\") pod \"redhat-marketplace-h6jzn\" (UID: \"b9215716-a0b8-42e4-8f60-abbd516f91d6\") " pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:33:00 crc kubenswrapper[5094]: I0220 09:33:00.059385 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9215716-a0b8-42e4-8f60-abbd516f91d6-catalog-content\") pod \"redhat-marketplace-h6jzn\" (UID: \"b9215716-a0b8-42e4-8f60-abbd516f91d6\") " pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:33:00 crc kubenswrapper[5094]: I0220 09:33:00.059472 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9215716-a0b8-42e4-8f60-abbd516f91d6-utilities\") pod \"redhat-marketplace-h6jzn\" (UID: \"b9215716-a0b8-42e4-8f60-abbd516f91d6\") " pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:33:00 crc kubenswrapper[5094]: I0220 09:33:00.081363 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9x2v\" (UniqueName: \"kubernetes.io/projected/b9215716-a0b8-42e4-8f60-abbd516f91d6-kube-api-access-g9x2v\") pod \"redhat-marketplace-h6jzn\" (UID: \"b9215716-a0b8-42e4-8f60-abbd516f91d6\") " pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:33:00 crc kubenswrapper[5094]: I0220 09:33:00.141599 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:33:00 crc kubenswrapper[5094]: I0220 09:33:00.645118 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6jzn"] Feb 20 09:33:01 crc kubenswrapper[5094]: I0220 09:33:01.255105 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6jzn" event={"ID":"b9215716-a0b8-42e4-8f60-abbd516f91d6","Type":"ContainerStarted","Data":"851258ed832aa014433a1a37cc6b0d08924a30fb97018dfda95e09099f934eaf"} Feb 20 09:33:01 crc kubenswrapper[5094]: I0220 09:33:01.765433 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:33:01 crc kubenswrapper[5094]: I0220 09:33:01.765488 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:33:02 crc kubenswrapper[5094]: I0220 09:33:02.265314 5094 generic.go:334] "Generic (PLEG): container finished" podID="b9215716-a0b8-42e4-8f60-abbd516f91d6" containerID="6a950e737c27d27ac2b886f1b67fb7e0f1d51bad1dc3e086671beee3ea9e99a6" exitCode=0 Feb 20 09:33:02 crc kubenswrapper[5094]: I0220 09:33:02.265412 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6jzn" event={"ID":"b9215716-a0b8-42e4-8f60-abbd516f91d6","Type":"ContainerDied","Data":"6a950e737c27d27ac2b886f1b67fb7e0f1d51bad1dc3e086671beee3ea9e99a6"} Feb 20 09:33:02 crc kubenswrapper[5094]: I0220 09:33:02.810030 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-km8mn" podUID="a6ef20e3-709d-4a1f-a616-c0259cebabd5" containerName="registry-server" probeResult="failure" output=< Feb 20 09:33:02 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 09:33:02 crc kubenswrapper[5094]: > Feb 20 09:33:04 crc kubenswrapper[5094]: I0220 
09:33:04.297882 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6jzn" event={"ID":"b9215716-a0b8-42e4-8f60-abbd516f91d6","Type":"ContainerStarted","Data":"d6bd787107531ff3e1a19803fc9c847eccec4b85fefe5b5db8cb283afc2a9c70"} Feb 20 09:33:05 crc kubenswrapper[5094]: I0220 09:33:05.308396 5094 generic.go:334] "Generic (PLEG): container finished" podID="b9215716-a0b8-42e4-8f60-abbd516f91d6" containerID="d6bd787107531ff3e1a19803fc9c847eccec4b85fefe5b5db8cb283afc2a9c70" exitCode=0 Feb 20 09:33:05 crc kubenswrapper[5094]: I0220 09:33:05.308436 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6jzn" event={"ID":"b9215716-a0b8-42e4-8f60-abbd516f91d6","Type":"ContainerDied","Data":"d6bd787107531ff3e1a19803fc9c847eccec4b85fefe5b5db8cb283afc2a9c70"} Feb 20 09:33:06 crc kubenswrapper[5094]: I0220 09:33:06.320089 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6jzn" event={"ID":"b9215716-a0b8-42e4-8f60-abbd516f91d6","Type":"ContainerStarted","Data":"0777df51b8320dd938ae97fa80baf455fc887a581be50f90d9d3e1daa391ebef"} Feb 20 09:33:06 crc kubenswrapper[5094]: I0220 09:33:06.339627 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h6jzn" podStartSLOduration=3.9230232320000002 podStartE2EDuration="7.339606899s" podCreationTimestamp="2026-02-20 09:32:59 +0000 UTC" firstStartedPulling="2026-02-20 09:33:02.267357289 +0000 UTC m=+9997.139984000" lastFinishedPulling="2026-02-20 09:33:05.683940966 +0000 UTC m=+10000.556567667" observedRunningTime="2026-02-20 09:33:06.337896339 +0000 UTC m=+10001.210523050" watchObservedRunningTime="2026-02-20 09:33:06.339606899 +0000 UTC m=+10001.212233610" Feb 20 09:33:08 crc kubenswrapper[5094]: I0220 09:33:08.840542 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" 
Feb 20 09:33:08 crc kubenswrapper[5094]: E0220 09:33:08.841208 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:33:10 crc kubenswrapper[5094]: I0220 09:33:10.142813 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:33:10 crc kubenswrapper[5094]: I0220 09:33:10.142893 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:33:10 crc kubenswrapper[5094]: I0220 09:33:10.199338 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:33:10 crc kubenswrapper[5094]: I0220 09:33:10.407400 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:33:10 crc kubenswrapper[5094]: I0220 09:33:10.462689 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6jzn"] Feb 20 09:33:11 crc kubenswrapper[5094]: I0220 09:33:11.813076 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:33:11 crc kubenswrapper[5094]: I0220 09:33:11.872047 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:33:12 crc kubenswrapper[5094]: I0220 09:33:12.376789 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h6jzn" 
podUID="b9215716-a0b8-42e4-8f60-abbd516f91d6" containerName="registry-server" containerID="cri-o://0777df51b8320dd938ae97fa80baf455fc887a581be50f90d9d3e1daa391ebef" gracePeriod=2 Feb 20 09:33:12 crc kubenswrapper[5094]: I0220 09:33:12.851215 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-km8mn"] Feb 20 09:33:13 crc kubenswrapper[5094]: I0220 09:33:13.394930 5094 generic.go:334] "Generic (PLEG): container finished" podID="b9215716-a0b8-42e4-8f60-abbd516f91d6" containerID="0777df51b8320dd938ae97fa80baf455fc887a581be50f90d9d3e1daa391ebef" exitCode=0 Feb 20 09:33:13 crc kubenswrapper[5094]: I0220 09:33:13.395006 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6jzn" event={"ID":"b9215716-a0b8-42e4-8f60-abbd516f91d6","Type":"ContainerDied","Data":"0777df51b8320dd938ae97fa80baf455fc887a581be50f90d9d3e1daa391ebef"} Feb 20 09:33:13 crc kubenswrapper[5094]: I0220 09:33:13.395159 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-km8mn" podUID="a6ef20e3-709d-4a1f-a616-c0259cebabd5" containerName="registry-server" containerID="cri-o://34b6d19292b35673a24307d48a3d1c8c037f0913c16f64c2be2b06c25807eb61" gracePeriod=2 Feb 20 09:33:13 crc kubenswrapper[5094]: E0220 09:33:13.550867 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6ef20e3_709d_4a1f_a616_c0259cebabd5.slice/crio-34b6d19292b35673a24307d48a3d1c8c037f0913c16f64c2be2b06c25807eb61.scope\": RecentStats: unable to find data in memory cache]" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.007938 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.017098 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.160315 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9x2v\" (UniqueName: \"kubernetes.io/projected/b9215716-a0b8-42e4-8f60-abbd516f91d6-kube-api-access-g9x2v\") pod \"b9215716-a0b8-42e4-8f60-abbd516f91d6\" (UID: \"b9215716-a0b8-42e4-8f60-abbd516f91d6\") " Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.160375 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6ef20e3-709d-4a1f-a616-c0259cebabd5-catalog-content\") pod \"a6ef20e3-709d-4a1f-a616-c0259cebabd5\" (UID: \"a6ef20e3-709d-4a1f-a616-c0259cebabd5\") " Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.160494 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9215716-a0b8-42e4-8f60-abbd516f91d6-utilities\") pod \"b9215716-a0b8-42e4-8f60-abbd516f91d6\" (UID: \"b9215716-a0b8-42e4-8f60-abbd516f91d6\") " Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.160531 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6ef20e3-709d-4a1f-a616-c0259cebabd5-utilities\") pod \"a6ef20e3-709d-4a1f-a616-c0259cebabd5\" (UID: \"a6ef20e3-709d-4a1f-a616-c0259cebabd5\") " Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.160596 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9215716-a0b8-42e4-8f60-abbd516f91d6-catalog-content\") pod \"b9215716-a0b8-42e4-8f60-abbd516f91d6\" (UID: 
\"b9215716-a0b8-42e4-8f60-abbd516f91d6\") " Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.160640 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mk2b\" (UniqueName: \"kubernetes.io/projected/a6ef20e3-709d-4a1f-a616-c0259cebabd5-kube-api-access-9mk2b\") pod \"a6ef20e3-709d-4a1f-a616-c0259cebabd5\" (UID: \"a6ef20e3-709d-4a1f-a616-c0259cebabd5\") " Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.161364 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6ef20e3-709d-4a1f-a616-c0259cebabd5-utilities" (OuterVolumeSpecName: "utilities") pod "a6ef20e3-709d-4a1f-a616-c0259cebabd5" (UID: "a6ef20e3-709d-4a1f-a616-c0259cebabd5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.161494 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9215716-a0b8-42e4-8f60-abbd516f91d6-utilities" (OuterVolumeSpecName: "utilities") pod "b9215716-a0b8-42e4-8f60-abbd516f91d6" (UID: "b9215716-a0b8-42e4-8f60-abbd516f91d6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.162203 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9215716-a0b8-42e4-8f60-abbd516f91d6-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.162248 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6ef20e3-709d-4a1f-a616-c0259cebabd5-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.165803 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6ef20e3-709d-4a1f-a616-c0259cebabd5-kube-api-access-9mk2b" (OuterVolumeSpecName: "kube-api-access-9mk2b") pod "a6ef20e3-709d-4a1f-a616-c0259cebabd5" (UID: "a6ef20e3-709d-4a1f-a616-c0259cebabd5"). InnerVolumeSpecName "kube-api-access-9mk2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.174220 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9215716-a0b8-42e4-8f60-abbd516f91d6-kube-api-access-g9x2v" (OuterVolumeSpecName: "kube-api-access-g9x2v") pod "b9215716-a0b8-42e4-8f60-abbd516f91d6" (UID: "b9215716-a0b8-42e4-8f60-abbd516f91d6"). InnerVolumeSpecName "kube-api-access-g9x2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.184053 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9215716-a0b8-42e4-8f60-abbd516f91d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b9215716-a0b8-42e4-8f60-abbd516f91d6" (UID: "b9215716-a0b8-42e4-8f60-abbd516f91d6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.264510 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9215716-a0b8-42e4-8f60-abbd516f91d6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.264840 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mk2b\" (UniqueName: \"kubernetes.io/projected/a6ef20e3-709d-4a1f-a616-c0259cebabd5-kube-api-access-9mk2b\") on node \"crc\" DevicePath \"\"" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.264850 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9x2v\" (UniqueName: \"kubernetes.io/projected/b9215716-a0b8-42e4-8f60-abbd516f91d6-kube-api-access-g9x2v\") on node \"crc\" DevicePath \"\"" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.304936 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6ef20e3-709d-4a1f-a616-c0259cebabd5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a6ef20e3-709d-4a1f-a616-c0259cebabd5" (UID: "a6ef20e3-709d-4a1f-a616-c0259cebabd5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.367659 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6ef20e3-709d-4a1f-a616-c0259cebabd5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.407263 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h6jzn" event={"ID":"b9215716-a0b8-42e4-8f60-abbd516f91d6","Type":"ContainerDied","Data":"851258ed832aa014433a1a37cc6b0d08924a30fb97018dfda95e09099f934eaf"} Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.407342 5094 scope.go:117] "RemoveContainer" containerID="0777df51b8320dd938ae97fa80baf455fc887a581be50f90d9d3e1daa391ebef" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.407392 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h6jzn" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.411036 5094 generic.go:334] "Generic (PLEG): container finished" podID="a6ef20e3-709d-4a1f-a616-c0259cebabd5" containerID="34b6d19292b35673a24307d48a3d1c8c037f0913c16f64c2be2b06c25807eb61" exitCode=0 Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.411067 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-km8mn" event={"ID":"a6ef20e3-709d-4a1f-a616-c0259cebabd5","Type":"ContainerDied","Data":"34b6d19292b35673a24307d48a3d1c8c037f0913c16f64c2be2b06c25807eb61"} Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.411091 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-km8mn" event={"ID":"a6ef20e3-709d-4a1f-a616-c0259cebabd5","Type":"ContainerDied","Data":"610c159ff287bb67be257ddc8c95e160d06d6ef93bad3fd074b27d4eb4beeeda"} Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.411156 5094 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-km8mn" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.434209 5094 scope.go:117] "RemoveContainer" containerID="d6bd787107531ff3e1a19803fc9c847eccec4b85fefe5b5db8cb283afc2a9c70" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.464639 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6jzn"] Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.481820 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h6jzn"] Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.482349 5094 scope.go:117] "RemoveContainer" containerID="6a950e737c27d27ac2b886f1b67fb7e0f1d51bad1dc3e086671beee3ea9e99a6" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.492789 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-km8mn"] Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.515073 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-km8mn"] Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.515458 5094 scope.go:117] "RemoveContainer" containerID="34b6d19292b35673a24307d48a3d1c8c037f0913c16f64c2be2b06c25807eb61" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.558904 5094 scope.go:117] "RemoveContainer" containerID="cd68c7deed63881e78ecfc4f6731d253a24fb0b5eac0e138fa34f8befa830688" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.603866 5094 scope.go:117] "RemoveContainer" containerID="2ccdeeaf4b6cf33e08f15c08b13eee5499adef516f06783b99da2954f2a56ff1" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.628722 5094 scope.go:117] "RemoveContainer" containerID="34b6d19292b35673a24307d48a3d1c8c037f0913c16f64c2be2b06c25807eb61" Feb 20 09:33:14 crc kubenswrapper[5094]: E0220 09:33:14.629329 5094 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"34b6d19292b35673a24307d48a3d1c8c037f0913c16f64c2be2b06c25807eb61\": container with ID starting with 34b6d19292b35673a24307d48a3d1c8c037f0913c16f64c2be2b06c25807eb61 not found: ID does not exist" containerID="34b6d19292b35673a24307d48a3d1c8c037f0913c16f64c2be2b06c25807eb61" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.629390 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34b6d19292b35673a24307d48a3d1c8c037f0913c16f64c2be2b06c25807eb61"} err="failed to get container status \"34b6d19292b35673a24307d48a3d1c8c037f0913c16f64c2be2b06c25807eb61\": rpc error: code = NotFound desc = could not find container \"34b6d19292b35673a24307d48a3d1c8c037f0913c16f64c2be2b06c25807eb61\": container with ID starting with 34b6d19292b35673a24307d48a3d1c8c037f0913c16f64c2be2b06c25807eb61 not found: ID does not exist" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.629417 5094 scope.go:117] "RemoveContainer" containerID="cd68c7deed63881e78ecfc4f6731d253a24fb0b5eac0e138fa34f8befa830688" Feb 20 09:33:14 crc kubenswrapper[5094]: E0220 09:33:14.629997 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd68c7deed63881e78ecfc4f6731d253a24fb0b5eac0e138fa34f8befa830688\": container with ID starting with cd68c7deed63881e78ecfc4f6731d253a24fb0b5eac0e138fa34f8befa830688 not found: ID does not exist" containerID="cd68c7deed63881e78ecfc4f6731d253a24fb0b5eac0e138fa34f8befa830688" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.630024 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd68c7deed63881e78ecfc4f6731d253a24fb0b5eac0e138fa34f8befa830688"} err="failed to get container status \"cd68c7deed63881e78ecfc4f6731d253a24fb0b5eac0e138fa34f8befa830688\": rpc error: code = NotFound desc = could not find container 
\"cd68c7deed63881e78ecfc4f6731d253a24fb0b5eac0e138fa34f8befa830688\": container with ID starting with cd68c7deed63881e78ecfc4f6731d253a24fb0b5eac0e138fa34f8befa830688 not found: ID does not exist" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.630047 5094 scope.go:117] "RemoveContainer" containerID="2ccdeeaf4b6cf33e08f15c08b13eee5499adef516f06783b99da2954f2a56ff1" Feb 20 09:33:14 crc kubenswrapper[5094]: E0220 09:33:14.630423 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ccdeeaf4b6cf33e08f15c08b13eee5499adef516f06783b99da2954f2a56ff1\": container with ID starting with 2ccdeeaf4b6cf33e08f15c08b13eee5499adef516f06783b99da2954f2a56ff1 not found: ID does not exist" containerID="2ccdeeaf4b6cf33e08f15c08b13eee5499adef516f06783b99da2954f2a56ff1" Feb 20 09:33:14 crc kubenswrapper[5094]: I0220 09:33:14.630443 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ccdeeaf4b6cf33e08f15c08b13eee5499adef516f06783b99da2954f2a56ff1"} err="failed to get container status \"2ccdeeaf4b6cf33e08f15c08b13eee5499adef516f06783b99da2954f2a56ff1\": rpc error: code = NotFound desc = could not find container \"2ccdeeaf4b6cf33e08f15c08b13eee5499adef516f06783b99da2954f2a56ff1\": container with ID starting with 2ccdeeaf4b6cf33e08f15c08b13eee5499adef516f06783b99da2954f2a56ff1 not found: ID does not exist" Feb 20 09:33:15 crc kubenswrapper[5094]: I0220 09:33:15.854671 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6ef20e3-709d-4a1f-a616-c0259cebabd5" path="/var/lib/kubelet/pods/a6ef20e3-709d-4a1f-a616-c0259cebabd5/volumes" Feb 20 09:33:15 crc kubenswrapper[5094]: I0220 09:33:15.855781 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9215716-a0b8-42e4-8f60-abbd516f91d6" path="/var/lib/kubelet/pods/b9215716-a0b8-42e4-8f60-abbd516f91d6/volumes" Feb 20 09:33:19 crc kubenswrapper[5094]: I0220 09:33:19.840501 
5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:33:19 crc kubenswrapper[5094]: E0220 09:33:19.841184 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:33:31 crc kubenswrapper[5094]: I0220 09:33:31.840279 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:33:31 crc kubenswrapper[5094]: E0220 09:33:31.841059 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:33:44 crc kubenswrapper[5094]: I0220 09:33:44.840625 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:33:44 crc kubenswrapper[5094]: E0220 09:33:44.841517 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 
09:33:52.491718 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qk79m"] Feb 20 09:33:52 crc kubenswrapper[5094]: E0220 09:33:52.492827 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ef20e3-709d-4a1f-a616-c0259cebabd5" containerName="extract-content" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.492845 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ef20e3-709d-4a1f-a616-c0259cebabd5" containerName="extract-content" Feb 20 09:33:52 crc kubenswrapper[5094]: E0220 09:33:52.492862 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ef20e3-709d-4a1f-a616-c0259cebabd5" containerName="extract-utilities" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.492871 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ef20e3-709d-4a1f-a616-c0259cebabd5" containerName="extract-utilities" Feb 20 09:33:52 crc kubenswrapper[5094]: E0220 09:33:52.492901 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9215716-a0b8-42e4-8f60-abbd516f91d6" containerName="extract-content" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.492910 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9215716-a0b8-42e4-8f60-abbd516f91d6" containerName="extract-content" Feb 20 09:33:52 crc kubenswrapper[5094]: E0220 09:33:52.492924 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9215716-a0b8-42e4-8f60-abbd516f91d6" containerName="registry-server" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.492932 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9215716-a0b8-42e4-8f60-abbd516f91d6" containerName="registry-server" Feb 20 09:33:52 crc kubenswrapper[5094]: E0220 09:33:52.492966 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9215716-a0b8-42e4-8f60-abbd516f91d6" containerName="extract-utilities" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.492974 5094 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b9215716-a0b8-42e4-8f60-abbd516f91d6" containerName="extract-utilities" Feb 20 09:33:52 crc kubenswrapper[5094]: E0220 09:33:52.492992 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ef20e3-709d-4a1f-a616-c0259cebabd5" containerName="registry-server" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.493002 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ef20e3-709d-4a1f-a616-c0259cebabd5" containerName="registry-server" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.493266 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6ef20e3-709d-4a1f-a616-c0259cebabd5" containerName="registry-server" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.493292 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9215716-a0b8-42e4-8f60-abbd516f91d6" containerName="registry-server" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.495398 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qk79m" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.515999 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qk79m"] Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.597831 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67d20448-086a-4d76-b547-768f68c018f2-utilities\") pod \"certified-operators-qk79m\" (UID: \"67d20448-086a-4d76-b547-768f68c018f2\") " pod="openshift-marketplace/certified-operators-qk79m" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.597900 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbfwz\" (UniqueName: \"kubernetes.io/projected/67d20448-086a-4d76-b547-768f68c018f2-kube-api-access-pbfwz\") pod \"certified-operators-qk79m\" (UID: \"67d20448-086a-4d76-b547-768f68c018f2\") " pod="openshift-marketplace/certified-operators-qk79m" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.598035 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67d20448-086a-4d76-b547-768f68c018f2-catalog-content\") pod \"certified-operators-qk79m\" (UID: \"67d20448-086a-4d76-b547-768f68c018f2\") " pod="openshift-marketplace/certified-operators-qk79m" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.700012 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67d20448-086a-4d76-b547-768f68c018f2-catalog-content\") pod \"certified-operators-qk79m\" (UID: \"67d20448-086a-4d76-b547-768f68c018f2\") " pod="openshift-marketplace/certified-operators-qk79m" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.700239 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67d20448-086a-4d76-b547-768f68c018f2-utilities\") pod \"certified-operators-qk79m\" (UID: \"67d20448-086a-4d76-b547-768f68c018f2\") " pod="openshift-marketplace/certified-operators-qk79m" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.700284 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbfwz\" (UniqueName: \"kubernetes.io/projected/67d20448-086a-4d76-b547-768f68c018f2-kube-api-access-pbfwz\") pod \"certified-operators-qk79m\" (UID: \"67d20448-086a-4d76-b547-768f68c018f2\") " pod="openshift-marketplace/certified-operators-qk79m" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.700786 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67d20448-086a-4d76-b547-768f68c018f2-catalog-content\") pod \"certified-operators-qk79m\" (UID: \"67d20448-086a-4d76-b547-768f68c018f2\") " pod="openshift-marketplace/certified-operators-qk79m" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.701405 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67d20448-086a-4d76-b547-768f68c018f2-utilities\") pod \"certified-operators-qk79m\" (UID: \"67d20448-086a-4d76-b547-768f68c018f2\") " pod="openshift-marketplace/certified-operators-qk79m" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.734259 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbfwz\" (UniqueName: \"kubernetes.io/projected/67d20448-086a-4d76-b547-768f68c018f2-kube-api-access-pbfwz\") pod \"certified-operators-qk79m\" (UID: \"67d20448-086a-4d76-b547-768f68c018f2\") " pod="openshift-marketplace/certified-operators-qk79m" Feb 20 09:33:52 crc kubenswrapper[5094]: I0220 09:33:52.825044 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qk79m" Feb 20 09:33:53 crc kubenswrapper[5094]: I0220 09:33:53.277177 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qk79m"] Feb 20 09:33:53 crc kubenswrapper[5094]: I0220 09:33:53.835797 5094 generic.go:334] "Generic (PLEG): container finished" podID="67d20448-086a-4d76-b547-768f68c018f2" containerID="643c447e66c80ac8e409efce76bf116e44220b0806a0ef9462d288f6b9525e21" exitCode=0 Feb 20 09:33:53 crc kubenswrapper[5094]: I0220 09:33:53.835856 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qk79m" event={"ID":"67d20448-086a-4d76-b547-768f68c018f2","Type":"ContainerDied","Data":"643c447e66c80ac8e409efce76bf116e44220b0806a0ef9462d288f6b9525e21"} Feb 20 09:33:53 crc kubenswrapper[5094]: I0220 09:33:53.835898 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qk79m" event={"ID":"67d20448-086a-4d76-b547-768f68c018f2","Type":"ContainerStarted","Data":"636fac638a1c3adb897df51c5637d7ea4e67ced089cceb5902118444998e4673"} Feb 20 09:33:55 crc kubenswrapper[5094]: I0220 09:33:55.848679 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:33:55 crc kubenswrapper[5094]: E0220 09:33:55.849648 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:33:58 crc kubenswrapper[5094]: I0220 09:33:58.892224 5094 generic.go:334] "Generic (PLEG): container finished" podID="67d20448-086a-4d76-b547-768f68c018f2" 
containerID="88b2d7edfd9ae4aea620ef3f8c76de6d29966d07827651cec7c7fba1d9b759c9" exitCode=0 Feb 20 09:33:58 crc kubenswrapper[5094]: I0220 09:33:58.892289 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qk79m" event={"ID":"67d20448-086a-4d76-b547-768f68c018f2","Type":"ContainerDied","Data":"88b2d7edfd9ae4aea620ef3f8c76de6d29966d07827651cec7c7fba1d9b759c9"} Feb 20 09:33:59 crc kubenswrapper[5094]: I0220 09:33:59.906267 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qk79m" event={"ID":"67d20448-086a-4d76-b547-768f68c018f2","Type":"ContainerStarted","Data":"6ce609b949dea07c6e6cae1376723e7c800390d20d3eaa646283825557899eac"} Feb 20 09:33:59 crc kubenswrapper[5094]: I0220 09:33:59.922343 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qk79m" podStartSLOduration=2.490177808 podStartE2EDuration="7.922321996s" podCreationTimestamp="2026-02-20 09:33:52 +0000 UTC" firstStartedPulling="2026-02-20 09:33:53.837884455 +0000 UTC m=+10048.710511166" lastFinishedPulling="2026-02-20 09:33:59.270028643 +0000 UTC m=+10054.142655354" observedRunningTime="2026-02-20 09:33:59.921461805 +0000 UTC m=+10054.794088536" watchObservedRunningTime="2026-02-20 09:33:59.922321996 +0000 UTC m=+10054.794948707" Feb 20 09:34:02 crc kubenswrapper[5094]: I0220 09:34:02.825568 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qk79m" Feb 20 09:34:02 crc kubenswrapper[5094]: I0220 09:34:02.826078 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qk79m" Feb 20 09:34:02 crc kubenswrapper[5094]: I0220 09:34:02.869508 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qk79m" Feb 20 09:34:10 crc kubenswrapper[5094]: I0220 
09:34:10.840872 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:34:10 crc kubenswrapper[5094]: E0220 09:34:10.841667 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:34:12 crc kubenswrapper[5094]: I0220 09:34:12.879502 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qk79m" Feb 20 09:34:12 crc kubenswrapper[5094]: I0220 09:34:12.946677 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qk79m"] Feb 20 09:34:13 crc kubenswrapper[5094]: I0220 09:34:13.000751 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tzpc7"] Feb 20 09:34:13 crc kubenswrapper[5094]: I0220 09:34:13.001041 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tzpc7" podUID="c6db9ece-1aa7-4ea4-b800-b710a760edf6" containerName="registry-server" containerID="cri-o://e09d2889270f03a1a2710e1eedb583da8696096d2d0903240186438a06d8fabb" gracePeriod=2 Feb 20 09:34:13 crc kubenswrapper[5094]: I0220 09:34:13.516816 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 09:34:13 crc kubenswrapper[5094]: I0220 09:34:13.663059 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjhpt\" (UniqueName: \"kubernetes.io/projected/c6db9ece-1aa7-4ea4-b800-b710a760edf6-kube-api-access-qjhpt\") pod \"c6db9ece-1aa7-4ea4-b800-b710a760edf6\" (UID: \"c6db9ece-1aa7-4ea4-b800-b710a760edf6\") " Feb 20 09:34:13 crc kubenswrapper[5094]: I0220 09:34:13.663199 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6db9ece-1aa7-4ea4-b800-b710a760edf6-utilities\") pod \"c6db9ece-1aa7-4ea4-b800-b710a760edf6\" (UID: \"c6db9ece-1aa7-4ea4-b800-b710a760edf6\") " Feb 20 09:34:13 crc kubenswrapper[5094]: I0220 09:34:13.663272 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6db9ece-1aa7-4ea4-b800-b710a760edf6-catalog-content\") pod \"c6db9ece-1aa7-4ea4-b800-b710a760edf6\" (UID: \"c6db9ece-1aa7-4ea4-b800-b710a760edf6\") " Feb 20 09:34:13 crc kubenswrapper[5094]: I0220 09:34:13.666268 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6db9ece-1aa7-4ea4-b800-b710a760edf6-utilities" (OuterVolumeSpecName: "utilities") pod "c6db9ece-1aa7-4ea4-b800-b710a760edf6" (UID: "c6db9ece-1aa7-4ea4-b800-b710a760edf6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:34:13 crc kubenswrapper[5094]: I0220 09:34:13.671782 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6db9ece-1aa7-4ea4-b800-b710a760edf6-kube-api-access-qjhpt" (OuterVolumeSpecName: "kube-api-access-qjhpt") pod "c6db9ece-1aa7-4ea4-b800-b710a760edf6" (UID: "c6db9ece-1aa7-4ea4-b800-b710a760edf6"). InnerVolumeSpecName "kube-api-access-qjhpt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:34:13 crc kubenswrapper[5094]: I0220 09:34:13.732242 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6db9ece-1aa7-4ea4-b800-b710a760edf6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c6db9ece-1aa7-4ea4-b800-b710a760edf6" (UID: "c6db9ece-1aa7-4ea4-b800-b710a760edf6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:34:13 crc kubenswrapper[5094]: I0220 09:34:13.765451 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6db9ece-1aa7-4ea4-b800-b710a760edf6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:34:13 crc kubenswrapper[5094]: I0220 09:34:13.765483 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjhpt\" (UniqueName: \"kubernetes.io/projected/c6db9ece-1aa7-4ea4-b800-b710a760edf6-kube-api-access-qjhpt\") on node \"crc\" DevicePath \"\"" Feb 20 09:34:13 crc kubenswrapper[5094]: I0220 09:34:13.765494 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6db9ece-1aa7-4ea4-b800-b710a760edf6-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:34:14 crc kubenswrapper[5094]: I0220 09:34:14.051375 5094 generic.go:334] "Generic (PLEG): container finished" podID="c6db9ece-1aa7-4ea4-b800-b710a760edf6" containerID="e09d2889270f03a1a2710e1eedb583da8696096d2d0903240186438a06d8fabb" exitCode=0 Feb 20 09:34:14 crc kubenswrapper[5094]: I0220 09:34:14.051426 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzpc7" event={"ID":"c6db9ece-1aa7-4ea4-b800-b710a760edf6","Type":"ContainerDied","Data":"e09d2889270f03a1a2710e1eedb583da8696096d2d0903240186438a06d8fabb"} Feb 20 09:34:14 crc kubenswrapper[5094]: I0220 09:34:14.051454 5094 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tzpc7" Feb 20 09:34:14 crc kubenswrapper[5094]: I0220 09:34:14.051477 5094 scope.go:117] "RemoveContainer" containerID="e09d2889270f03a1a2710e1eedb583da8696096d2d0903240186438a06d8fabb" Feb 20 09:34:14 crc kubenswrapper[5094]: I0220 09:34:14.051464 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tzpc7" event={"ID":"c6db9ece-1aa7-4ea4-b800-b710a760edf6","Type":"ContainerDied","Data":"b005b27266dcf7c793cadc5ec806c3eba13a4c965c0c0b82e485488383c3f93f"} Feb 20 09:34:14 crc kubenswrapper[5094]: I0220 09:34:14.084365 5094 scope.go:117] "RemoveContainer" containerID="b699a1c50072acbdab88a18e4209ddc7b6f1665995200477921d9b14cf4874bc" Feb 20 09:34:14 crc kubenswrapper[5094]: I0220 09:34:14.091426 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tzpc7"] Feb 20 09:34:14 crc kubenswrapper[5094]: I0220 09:34:14.108534 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tzpc7"] Feb 20 09:34:14 crc kubenswrapper[5094]: I0220 09:34:14.150763 5094 scope.go:117] "RemoveContainer" containerID="0e5f06dce56f85bd90954f924e3c901b30cb5173e84b14dd5d12f82eb9ebf4a0" Feb 20 09:34:14 crc kubenswrapper[5094]: I0220 09:34:14.189991 5094 scope.go:117] "RemoveContainer" containerID="e09d2889270f03a1a2710e1eedb583da8696096d2d0903240186438a06d8fabb" Feb 20 09:34:14 crc kubenswrapper[5094]: E0220 09:34:14.191061 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e09d2889270f03a1a2710e1eedb583da8696096d2d0903240186438a06d8fabb\": container with ID starting with e09d2889270f03a1a2710e1eedb583da8696096d2d0903240186438a06d8fabb not found: ID does not exist" containerID="e09d2889270f03a1a2710e1eedb583da8696096d2d0903240186438a06d8fabb" Feb 20 09:34:14 crc kubenswrapper[5094]: I0220 09:34:14.191092 
5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e09d2889270f03a1a2710e1eedb583da8696096d2d0903240186438a06d8fabb"} err="failed to get container status \"e09d2889270f03a1a2710e1eedb583da8696096d2d0903240186438a06d8fabb\": rpc error: code = NotFound desc = could not find container \"e09d2889270f03a1a2710e1eedb583da8696096d2d0903240186438a06d8fabb\": container with ID starting with e09d2889270f03a1a2710e1eedb583da8696096d2d0903240186438a06d8fabb not found: ID does not exist" Feb 20 09:34:14 crc kubenswrapper[5094]: I0220 09:34:14.191111 5094 scope.go:117] "RemoveContainer" containerID="b699a1c50072acbdab88a18e4209ddc7b6f1665995200477921d9b14cf4874bc" Feb 20 09:34:14 crc kubenswrapper[5094]: E0220 09:34:14.191442 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b699a1c50072acbdab88a18e4209ddc7b6f1665995200477921d9b14cf4874bc\": container with ID starting with b699a1c50072acbdab88a18e4209ddc7b6f1665995200477921d9b14cf4874bc not found: ID does not exist" containerID="b699a1c50072acbdab88a18e4209ddc7b6f1665995200477921d9b14cf4874bc" Feb 20 09:34:14 crc kubenswrapper[5094]: I0220 09:34:14.191462 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b699a1c50072acbdab88a18e4209ddc7b6f1665995200477921d9b14cf4874bc"} err="failed to get container status \"b699a1c50072acbdab88a18e4209ddc7b6f1665995200477921d9b14cf4874bc\": rpc error: code = NotFound desc = could not find container \"b699a1c50072acbdab88a18e4209ddc7b6f1665995200477921d9b14cf4874bc\": container with ID starting with b699a1c50072acbdab88a18e4209ddc7b6f1665995200477921d9b14cf4874bc not found: ID does not exist" Feb 20 09:34:14 crc kubenswrapper[5094]: I0220 09:34:14.191474 5094 scope.go:117] "RemoveContainer" containerID="0e5f06dce56f85bd90954f924e3c901b30cb5173e84b14dd5d12f82eb9ebf4a0" Feb 20 09:34:14 crc kubenswrapper[5094]: E0220 
09:34:14.191830 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e5f06dce56f85bd90954f924e3c901b30cb5173e84b14dd5d12f82eb9ebf4a0\": container with ID starting with 0e5f06dce56f85bd90954f924e3c901b30cb5173e84b14dd5d12f82eb9ebf4a0 not found: ID does not exist" containerID="0e5f06dce56f85bd90954f924e3c901b30cb5173e84b14dd5d12f82eb9ebf4a0" Feb 20 09:34:14 crc kubenswrapper[5094]: I0220 09:34:14.191849 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e5f06dce56f85bd90954f924e3c901b30cb5173e84b14dd5d12f82eb9ebf4a0"} err="failed to get container status \"0e5f06dce56f85bd90954f924e3c901b30cb5173e84b14dd5d12f82eb9ebf4a0\": rpc error: code = NotFound desc = could not find container \"0e5f06dce56f85bd90954f924e3c901b30cb5173e84b14dd5d12f82eb9ebf4a0\": container with ID starting with 0e5f06dce56f85bd90954f924e3c901b30cb5173e84b14dd5d12f82eb9ebf4a0 not found: ID does not exist" Feb 20 09:34:15 crc kubenswrapper[5094]: I0220 09:34:15.851652 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6db9ece-1aa7-4ea4-b800-b710a760edf6" path="/var/lib/kubelet/pods/c6db9ece-1aa7-4ea4-b800-b710a760edf6/volumes" Feb 20 09:34:25 crc kubenswrapper[5094]: I0220 09:34:25.847359 5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:34:25 crc kubenswrapper[5094]: E0220 09:34:25.854815 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:34:37 crc kubenswrapper[5094]: I0220 09:34:37.841771 
5094 scope.go:117] "RemoveContainer" containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:34:38 crc kubenswrapper[5094]: I0220 09:34:38.300643 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"31a97918d224bfc5f6552272a8a0ed77ef798862ba6a87755f4f2752a98e41a2"} Feb 20 09:35:42 crc kubenswrapper[5094]: I0220 09:35:42.989128 5094 generic.go:334] "Generic (PLEG): container finished" podID="5b30b185-0b70-4ad8-8eca-a292b76fb410" containerID="40b6b847514554d6d2915d390a68f23820ddde7e93d36e6d67b4ad209729e928" exitCode=0 Feb 20 09:35:42 crc kubenswrapper[5094]: I0220 09:35:42.989834 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" event={"ID":"5b30b185-0b70-4ad8-8eca-a292b76fb410","Type":"ContainerDied","Data":"40b6b847514554d6d2915d390a68f23820ddde7e93d36e6d67b4ad209729e928"} Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.505281 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.594214 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-ceph\") pod \"5b30b185-0b70-4ad8-8eca-a292b76fb410\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.594340 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-combined-ca-bundle\") pod \"5b30b185-0b70-4ad8-8eca-a292b76fb410\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.594423 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-migration-ssh-key-1\") pod \"5b30b185-0b70-4ad8-8eca-a292b76fb410\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.594462 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cells-global-config-1\") pod \"5b30b185-0b70-4ad8-8eca-a292b76fb410\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.594488 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cells-global-config-0\") pod \"5b30b185-0b70-4ad8-8eca-a292b76fb410\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 
09:35:44.594510 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-0\") pod \"5b30b185-0b70-4ad8-8eca-a292b76fb410\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.594544 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-ssh-key-openstack-cell1\") pod \"5b30b185-0b70-4ad8-8eca-a292b76fb410\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.594579 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbtc5\" (UniqueName: \"kubernetes.io/projected/5b30b185-0b70-4ad8-8eca-a292b76fb410-kube-api-access-sbtc5\") pod \"5b30b185-0b70-4ad8-8eca-a292b76fb410\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.594621 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-1\") pod \"5b30b185-0b70-4ad8-8eca-a292b76fb410\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.594661 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-2\") pod \"5b30b185-0b70-4ad8-8eca-a292b76fb410\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.594688 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-inventory\") pod \"5b30b185-0b70-4ad8-8eca-a292b76fb410\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.594759 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-3\") pod \"5b30b185-0b70-4ad8-8eca-a292b76fb410\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.594807 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-migration-ssh-key-0\") pod \"5b30b185-0b70-4ad8-8eca-a292b76fb410\" (UID: \"5b30b185-0b70-4ad8-8eca-a292b76fb410\") " Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.603400 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b30b185-0b70-4ad8-8eca-a292b76fb410-kube-api-access-sbtc5" (OuterVolumeSpecName: "kube-api-access-sbtc5") pod "5b30b185-0b70-4ad8-8eca-a292b76fb410" (UID: "5b30b185-0b70-4ad8-8eca-a292b76fb410"). InnerVolumeSpecName "kube-api-access-sbtc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.604194 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "5b30b185-0b70-4ad8-8eca-a292b76fb410" (UID: "5b30b185-0b70-4ad8-8eca-a292b76fb410"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.615155 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-ceph" (OuterVolumeSpecName: "ceph") pod "5b30b185-0b70-4ad8-8eca-a292b76fb410" (UID: "5b30b185-0b70-4ad8-8eca-a292b76fb410"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.637630 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "5b30b185-0b70-4ad8-8eca-a292b76fb410" (UID: "5b30b185-0b70-4ad8-8eca-a292b76fb410"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.638464 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "5b30b185-0b70-4ad8-8eca-a292b76fb410" (UID: "5b30b185-0b70-4ad8-8eca-a292b76fb410"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.638505 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "5b30b185-0b70-4ad8-8eca-a292b76fb410" (UID: "5b30b185-0b70-4ad8-8eca-a292b76fb410"). InnerVolumeSpecName "nova-cells-global-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.643399 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-inventory" (OuterVolumeSpecName: "inventory") pod "5b30b185-0b70-4ad8-8eca-a292b76fb410" (UID: "5b30b185-0b70-4ad8-8eca-a292b76fb410"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.647053 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "5b30b185-0b70-4ad8-8eca-a292b76fb410" (UID: "5b30b185-0b70-4ad8-8eca-a292b76fb410"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.657142 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "5b30b185-0b70-4ad8-8eca-a292b76fb410" (UID: "5b30b185-0b70-4ad8-8eca-a292b76fb410"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.658044 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "5b30b185-0b70-4ad8-8eca-a292b76fb410" (UID: "5b30b185-0b70-4ad8-8eca-a292b76fb410"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.660261 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "5b30b185-0b70-4ad8-8eca-a292b76fb410" (UID: "5b30b185-0b70-4ad8-8eca-a292b76fb410"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.661340 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "5b30b185-0b70-4ad8-8eca-a292b76fb410" (UID: "5b30b185-0b70-4ad8-8eca-a292b76fb410"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.670728 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "5b30b185-0b70-4ad8-8eca-a292b76fb410" (UID: "5b30b185-0b70-4ad8-8eca-a292b76fb410"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.697369 5094 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.697402 5094 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.697412 5094 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.697421 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.697430 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbtc5\" (UniqueName: \"kubernetes.io/projected/5b30b185-0b70-4ad8-8eca-a292b76fb410-kube-api-access-sbtc5\") on node \"crc\" DevicePath \"\"" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.697439 5094 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.697447 5094 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: 
\"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.697457 5094 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-inventory\") on node \"crc\" DevicePath \"\"" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.697466 5094 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.697474 5094 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.697482 5094 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-ceph\") on node \"crc\" DevicePath \"\"" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.697490 5094 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 09:35:44 crc kubenswrapper[5094]: I0220 09:35:44.697508 5094 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/5b30b185-0b70-4ad8-8eca-a292b76fb410-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 20 09:35:45 crc kubenswrapper[5094]: I0220 09:35:45.016738 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" 
event={"ID":"5b30b185-0b70-4ad8-8eca-a292b76fb410","Type":"ContainerDied","Data":"135ff20bf16d42798db0e45ebf2925fe4bfb43459a1c4911adfd22e9f5907df7"} Feb 20 09:35:45 crc kubenswrapper[5094]: I0220 09:35:45.016789 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd" Feb 20 09:35:45 crc kubenswrapper[5094]: I0220 09:35:45.016798 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="135ff20bf16d42798db0e45ebf2925fe4bfb43459a1c4911adfd22e9f5907df7" Feb 20 09:37:04 crc kubenswrapper[5094]: I0220 09:37:04.106356 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:37:04 crc kubenswrapper[5094]: I0220 09:37:04.107076 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:37:19 crc kubenswrapper[5094]: I0220 09:37:19.970258 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Feb 20 09:37:19 crc kubenswrapper[5094]: I0220 09:37:19.971271 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="38c66beb-a97f-470c-8999-e15f5c4a9b60" containerName="adoption" containerID="cri-o://0ce7542b1a8ed6f96aeda268ad15c227863bbd5946b3eaac7eb6134db4ce52f0" gracePeriod=30 Feb 20 09:37:34 crc kubenswrapper[5094]: I0220 09:37:34.106578 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:37:34 crc kubenswrapper[5094]: I0220 09:37:34.107137 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:37:50 crc kubenswrapper[5094]: I0220 09:37:50.412583 5094 generic.go:334] "Generic (PLEG): container finished" podID="38c66beb-a97f-470c-8999-e15f5c4a9b60" containerID="0ce7542b1a8ed6f96aeda268ad15c227863bbd5946b3eaac7eb6134db4ce52f0" exitCode=137 Feb 20 09:37:50 crc kubenswrapper[5094]: I0220 09:37:50.412666 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"38c66beb-a97f-470c-8999-e15f5c4a9b60","Type":"ContainerDied","Data":"0ce7542b1a8ed6f96aeda268ad15c227863bbd5946b3eaac7eb6134db4ce52f0"} Feb 20 09:37:50 crc kubenswrapper[5094]: I0220 09:37:50.413220 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"38c66beb-a97f-470c-8999-e15f5c4a9b60","Type":"ContainerDied","Data":"9eef613ade19e2ed2f7272eef05d7fc30f774f3cc73f115f6f763788afc1cc96"} Feb 20 09:37:50 crc kubenswrapper[5094]: I0220 09:37:50.413236 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9eef613ade19e2ed2f7272eef05d7fc30f774f3cc73f115f6f763788afc1cc96" Feb 20 09:37:50 crc kubenswrapper[5094]: I0220 09:37:50.449975 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Feb 20 09:37:50 crc kubenswrapper[5094]: I0220 09:37:50.569872 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvljf\" (UniqueName: \"kubernetes.io/projected/38c66beb-a97f-470c-8999-e15f5c4a9b60-kube-api-access-rvljf\") pod \"38c66beb-a97f-470c-8999-e15f5c4a9b60\" (UID: \"38c66beb-a97f-470c-8999-e15f5c4a9b60\") " Feb 20 09:37:50 crc kubenswrapper[5094]: I0220 09:37:50.572005 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7\") pod \"38c66beb-a97f-470c-8999-e15f5c4a9b60\" (UID: \"38c66beb-a97f-470c-8999-e15f5c4a9b60\") " Feb 20 09:37:50 crc kubenswrapper[5094]: I0220 09:37:50.577887 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38c66beb-a97f-470c-8999-e15f5c4a9b60-kube-api-access-rvljf" (OuterVolumeSpecName: "kube-api-access-rvljf") pod "38c66beb-a97f-470c-8999-e15f5c4a9b60" (UID: "38c66beb-a97f-470c-8999-e15f5c4a9b60"). InnerVolumeSpecName "kube-api-access-rvljf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:37:50 crc kubenswrapper[5094]: I0220 09:37:50.590197 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7" (OuterVolumeSpecName: "mariadb-data") pod "38c66beb-a97f-470c-8999-e15f5c4a9b60" (UID: "38c66beb-a97f-470c-8999-e15f5c4a9b60"). InnerVolumeSpecName "pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 20 09:37:50 crc kubenswrapper[5094]: I0220 09:37:50.674615 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvljf\" (UniqueName: \"kubernetes.io/projected/38c66beb-a97f-470c-8999-e15f5c4a9b60-kube-api-access-rvljf\") on node \"crc\" DevicePath \"\"" Feb 20 09:37:50 crc kubenswrapper[5094]: I0220 09:37:50.674875 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7\") on node \"crc\" " Feb 20 09:37:50 crc kubenswrapper[5094]: I0220 09:37:50.712908 5094 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 20 09:37:50 crc kubenswrapper[5094]: I0220 09:37:50.713311 5094 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7") on node "crc" Feb 20 09:37:50 crc kubenswrapper[5094]: I0220 09:37:50.776527 5094 reconciler_common.go:293] "Volume detached for volume \"pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-002e8bbf-16e4-4a0c-9602-e6c12782fca7\") on node \"crc\" DevicePath \"\"" Feb 20 09:37:51 crc kubenswrapper[5094]: I0220 09:37:51.421990 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Feb 20 09:37:51 crc kubenswrapper[5094]: I0220 09:37:51.469687 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Feb 20 09:37:51 crc kubenswrapper[5094]: I0220 09:37:51.483343 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Feb 20 09:37:51 crc kubenswrapper[5094]: I0220 09:37:51.856680 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38c66beb-a97f-470c-8999-e15f5c4a9b60" path="/var/lib/kubelet/pods/38c66beb-a97f-470c-8999-e15f5c4a9b60/volumes" Feb 20 09:37:52 crc kubenswrapper[5094]: I0220 09:37:52.310293 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Feb 20 09:37:52 crc kubenswrapper[5094]: I0220 09:37:52.311461 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="4111d2dd-641f-4113-8751-4151d435e934" containerName="adoption" containerID="cri-o://0a962e0ed36b20eeee3760655985d477e38322eaa7ec12060bc14e41416dcf5e" gracePeriod=30 Feb 20 09:38:04 crc kubenswrapper[5094]: I0220 09:38:04.106902 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:38:04 crc kubenswrapper[5094]: I0220 09:38:04.108022 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:38:04 crc kubenswrapper[5094]: I0220 09:38:04.108121 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 09:38:04 crc kubenswrapper[5094]: I0220 09:38:04.109687 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"31a97918d224bfc5f6552272a8a0ed77ef798862ba6a87755f4f2752a98e41a2"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 09:38:04 crc kubenswrapper[5094]: I0220 09:38:04.109812 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://31a97918d224bfc5f6552272a8a0ed77ef798862ba6a87755f4f2752a98e41a2" gracePeriod=600 Feb 20 09:38:04 crc kubenswrapper[5094]: I0220 09:38:04.609689 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="31a97918d224bfc5f6552272a8a0ed77ef798862ba6a87755f4f2752a98e41a2" exitCode=0 Feb 20 09:38:04 crc kubenswrapper[5094]: I0220 09:38:04.610195 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"31a97918d224bfc5f6552272a8a0ed77ef798862ba6a87755f4f2752a98e41a2"} Feb 20 09:38:04 crc kubenswrapper[5094]: I0220 09:38:04.610236 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3"} Feb 20 09:38:04 crc kubenswrapper[5094]: I0220 09:38:04.610257 5094 scope.go:117] "RemoveContainer" 
containerID="88a19d411ccf38a75b3d80d3f50e5d3a5dd47df564860d92b3e23500fce855e4" Feb 20 09:38:22 crc kubenswrapper[5094]: I0220 09:38:22.823181 5094 generic.go:334] "Generic (PLEG): container finished" podID="4111d2dd-641f-4113-8751-4151d435e934" containerID="0a962e0ed36b20eeee3760655985d477e38322eaa7ec12060bc14e41416dcf5e" exitCode=137 Feb 20 09:38:22 crc kubenswrapper[5094]: I0220 09:38:22.823229 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"4111d2dd-641f-4113-8751-4151d435e934","Type":"ContainerDied","Data":"0a962e0ed36b20eeee3760655985d477e38322eaa7ec12060bc14e41416dcf5e"} Feb 20 09:38:22 crc kubenswrapper[5094]: I0220 09:38:22.823678 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"4111d2dd-641f-4113-8751-4151d435e934","Type":"ContainerDied","Data":"12052248c5408eacb24c4b1b460564df0f8c856ad57e93b0f1d03df46cc8aee4"} Feb 20 09:38:22 crc kubenswrapper[5094]: I0220 09:38:22.823689 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12052248c5408eacb24c4b1b460564df0f8c856ad57e93b0f1d03df46cc8aee4" Feb 20 09:38:22 crc kubenswrapper[5094]: I0220 09:38:22.839055 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Feb 20 09:38:23 crc kubenswrapper[5094]: I0220 09:38:23.008526 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/4111d2dd-641f-4113-8751-4151d435e934-ovn-data-cert\") pod \"4111d2dd-641f-4113-8751-4151d435e934\" (UID: \"4111d2dd-641f-4113-8751-4151d435e934\") " Feb 20 09:38:23 crc kubenswrapper[5094]: I0220 09:38:23.008689 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trgg5\" (UniqueName: \"kubernetes.io/projected/4111d2dd-641f-4113-8751-4151d435e934-kube-api-access-trgg5\") pod \"4111d2dd-641f-4113-8751-4151d435e934\" (UID: \"4111d2dd-641f-4113-8751-4151d435e934\") " Feb 20 09:38:23 crc kubenswrapper[5094]: I0220 09:38:23.009613 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87\") pod \"4111d2dd-641f-4113-8751-4151d435e934\" (UID: \"4111d2dd-641f-4113-8751-4151d435e934\") " Feb 20 09:38:23 crc kubenswrapper[5094]: I0220 09:38:23.015144 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4111d2dd-641f-4113-8751-4151d435e934-kube-api-access-trgg5" (OuterVolumeSpecName: "kube-api-access-trgg5") pod "4111d2dd-641f-4113-8751-4151d435e934" (UID: "4111d2dd-641f-4113-8751-4151d435e934"). InnerVolumeSpecName "kube-api-access-trgg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:38:23 crc kubenswrapper[5094]: I0220 09:38:23.017161 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4111d2dd-641f-4113-8751-4151d435e934-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "4111d2dd-641f-4113-8751-4151d435e934" (UID: "4111d2dd-641f-4113-8751-4151d435e934"). InnerVolumeSpecName "ovn-data-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:38:23 crc kubenswrapper[5094]: I0220 09:38:23.032385 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87" (OuterVolumeSpecName: "ovn-data") pod "4111d2dd-641f-4113-8751-4151d435e934" (UID: "4111d2dd-641f-4113-8751-4151d435e934"). InnerVolumeSpecName "pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 20 09:38:23 crc kubenswrapper[5094]: I0220 09:38:23.112267 5094 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/4111d2dd-641f-4113-8751-4151d435e934-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Feb 20 09:38:23 crc kubenswrapper[5094]: I0220 09:38:23.112325 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87\") on node \"crc\" " Feb 20 09:38:23 crc kubenswrapper[5094]: I0220 09:38:23.112336 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trgg5\" (UniqueName: \"kubernetes.io/projected/4111d2dd-641f-4113-8751-4151d435e934-kube-api-access-trgg5\") on node \"crc\" DevicePath \"\"" Feb 20 09:38:23 crc kubenswrapper[5094]: I0220 09:38:23.134522 5094 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 20 09:38:23 crc kubenswrapper[5094]: I0220 09:38:23.134693 5094 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87") on node "crc" Feb 20 09:38:23 crc kubenswrapper[5094]: I0220 09:38:23.215671 5094 reconciler_common.go:293] "Volume detached for volume \"pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-0be5a452-f8fb-44ce-97ed-5f65e0ad0e87\") on node \"crc\" DevicePath \"\"" Feb 20 09:38:23 crc kubenswrapper[5094]: I0220 09:38:23.832631 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Feb 20 09:38:23 crc kubenswrapper[5094]: I0220 09:38:23.873044 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Feb 20 09:38:23 crc kubenswrapper[5094]: I0220 09:38:23.883375 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Feb 20 09:38:25 crc kubenswrapper[5094]: I0220 09:38:25.853576 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4111d2dd-641f-4113-8751-4151d435e934" path="/var/lib/kubelet/pods/4111d2dd-641f-4113-8751-4151d435e934/volumes" Feb 20 09:38:44 crc kubenswrapper[5094]: I0220 09:38:44.935052 5094 scope.go:117] "RemoveContainer" containerID="0a962e0ed36b20eeee3760655985d477e38322eaa7ec12060bc14e41416dcf5e" Feb 20 09:38:44 crc kubenswrapper[5094]: I0220 09:38:44.993829 5094 scope.go:117] "RemoveContainer" containerID="0ce7542b1a8ed6f96aeda268ad15c227863bbd5946b3eaac7eb6134db4ce52f0" Feb 20 09:40:04 crc kubenswrapper[5094]: I0220 09:40:04.106446 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Feb 20 09:40:04 crc kubenswrapper[5094]: I0220 09:40:04.107059 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:40:34 crc kubenswrapper[5094]: I0220 09:40:34.107154 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:40:34 crc kubenswrapper[5094]: I0220 09:40:34.107985 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.039673 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vkzrm"] Feb 20 09:41:04 crc kubenswrapper[5094]: E0220 09:41:04.041555 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b30b185-0b70-4ad8-8eca-a292b76fb410" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.041584 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b30b185-0b70-4ad8-8eca-a292b76fb410" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Feb 20 09:41:04 crc kubenswrapper[5094]: E0220 09:41:04.041608 5094 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c6db9ece-1aa7-4ea4-b800-b710a760edf6" containerName="extract-utilities" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.041622 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6db9ece-1aa7-4ea4-b800-b710a760edf6" containerName="extract-utilities" Feb 20 09:41:04 crc kubenswrapper[5094]: E0220 09:41:04.041636 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6db9ece-1aa7-4ea4-b800-b710a760edf6" containerName="extract-content" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.041644 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6db9ece-1aa7-4ea4-b800-b710a760edf6" containerName="extract-content" Feb 20 09:41:04 crc kubenswrapper[5094]: E0220 09:41:04.041665 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6db9ece-1aa7-4ea4-b800-b710a760edf6" containerName="registry-server" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.041673 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6db9ece-1aa7-4ea4-b800-b710a760edf6" containerName="registry-server" Feb 20 09:41:04 crc kubenswrapper[5094]: E0220 09:41:04.041742 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38c66beb-a97f-470c-8999-e15f5c4a9b60" containerName="adoption" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.041751 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="38c66beb-a97f-470c-8999-e15f5c4a9b60" containerName="adoption" Feb 20 09:41:04 crc kubenswrapper[5094]: E0220 09:41:04.041764 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4111d2dd-641f-4113-8751-4151d435e934" containerName="adoption" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.041770 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="4111d2dd-641f-4113-8751-4151d435e934" containerName="adoption" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.041966 5094 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="5b30b185-0b70-4ad8-8eca-a292b76fb410" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.041987 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6db9ece-1aa7-4ea4-b800-b710a760edf6" containerName="registry-server" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.042010 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="4111d2dd-641f-4113-8751-4151d435e934" containerName="adoption" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.042022 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="38c66beb-a97f-470c-8999-e15f5c4a9b60" containerName="adoption" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.043917 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.054343 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vkzrm"] Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.106762 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.106837 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.106887 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.107821 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.107884 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" gracePeriod=600 Feb 20 09:41:04 crc kubenswrapper[5094]: E0220 09:41:04.228782 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.229204 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f77272af-e36b-4ab1-9029-6485a5d2c95f-utilities\") pod \"community-operators-vkzrm\" (UID: \"f77272af-e36b-4ab1-9029-6485a5d2c95f\") " pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.229352 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfpgp\" (UniqueName: 
\"kubernetes.io/projected/f77272af-e36b-4ab1-9029-6485a5d2c95f-kube-api-access-dfpgp\") pod \"community-operators-vkzrm\" (UID: \"f77272af-e36b-4ab1-9029-6485a5d2c95f\") " pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.229424 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f77272af-e36b-4ab1-9029-6485a5d2c95f-catalog-content\") pod \"community-operators-vkzrm\" (UID: \"f77272af-e36b-4ab1-9029-6485a5d2c95f\") " pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.331014 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f77272af-e36b-4ab1-9029-6485a5d2c95f-utilities\") pod \"community-operators-vkzrm\" (UID: \"f77272af-e36b-4ab1-9029-6485a5d2c95f\") " pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.331136 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfpgp\" (UniqueName: \"kubernetes.io/projected/f77272af-e36b-4ab1-9029-6485a5d2c95f-kube-api-access-dfpgp\") pod \"community-operators-vkzrm\" (UID: \"f77272af-e36b-4ab1-9029-6485a5d2c95f\") " pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.331172 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f77272af-e36b-4ab1-9029-6485a5d2c95f-catalog-content\") pod \"community-operators-vkzrm\" (UID: \"f77272af-e36b-4ab1-9029-6485a5d2c95f\") " pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.331546 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f77272af-e36b-4ab1-9029-6485a5d2c95f-utilities\") pod \"community-operators-vkzrm\" (UID: \"f77272af-e36b-4ab1-9029-6485a5d2c95f\") " pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.331739 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f77272af-e36b-4ab1-9029-6485a5d2c95f-catalog-content\") pod \"community-operators-vkzrm\" (UID: \"f77272af-e36b-4ab1-9029-6485a5d2c95f\") " pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.349889 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfpgp\" (UniqueName: \"kubernetes.io/projected/f77272af-e36b-4ab1-9029-6485a5d2c95f-kube-api-access-dfpgp\") pod \"community-operators-vkzrm\" (UID: \"f77272af-e36b-4ab1-9029-6485a5d2c95f\") " pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.362903 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.663811 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" exitCode=0 Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.663854 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3"} Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.664164 5094 scope.go:117] "RemoveContainer" containerID="31a97918d224bfc5f6552272a8a0ed77ef798862ba6a87755f4f2752a98e41a2" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.665166 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:41:04 crc kubenswrapper[5094]: E0220 09:41:04.665612 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:41:04 crc kubenswrapper[5094]: I0220 09:41:04.933991 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vkzrm"] Feb 20 09:41:05 crc kubenswrapper[5094]: I0220 09:41:05.680526 5094 generic.go:334] "Generic (PLEG): container finished" podID="f77272af-e36b-4ab1-9029-6485a5d2c95f" containerID="c5346f105c3a7aa086fca7b0b87e5e8579a3e80ebf063450c83b214c198260b6" exitCode=0 Feb 20 09:41:05 crc kubenswrapper[5094]: I0220 
09:41:05.680635 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkzrm" event={"ID":"f77272af-e36b-4ab1-9029-6485a5d2c95f","Type":"ContainerDied","Data":"c5346f105c3a7aa086fca7b0b87e5e8579a3e80ebf063450c83b214c198260b6"} Feb 20 09:41:05 crc kubenswrapper[5094]: I0220 09:41:05.680930 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkzrm" event={"ID":"f77272af-e36b-4ab1-9029-6485a5d2c95f","Type":"ContainerStarted","Data":"71a63e5a0ea54fcc02961c1ea8fc4c44cc227dcf955deaa310c10c70c63fbaf3"} Feb 20 09:41:05 crc kubenswrapper[5094]: I0220 09:41:05.683376 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 09:41:07 crc kubenswrapper[5094]: I0220 09:41:07.701316 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkzrm" event={"ID":"f77272af-e36b-4ab1-9029-6485a5d2c95f","Type":"ContainerStarted","Data":"7af968c16caa79d5bb5f6a8736bb5e65c93aad23723c3283822323271ece7400"} Feb 20 09:41:09 crc kubenswrapper[5094]: I0220 09:41:09.727196 5094 generic.go:334] "Generic (PLEG): container finished" podID="f77272af-e36b-4ab1-9029-6485a5d2c95f" containerID="7af968c16caa79d5bb5f6a8736bb5e65c93aad23723c3283822323271ece7400" exitCode=0 Feb 20 09:41:09 crc kubenswrapper[5094]: I0220 09:41:09.727588 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkzrm" event={"ID":"f77272af-e36b-4ab1-9029-6485a5d2c95f","Type":"ContainerDied","Data":"7af968c16caa79d5bb5f6a8736bb5e65c93aad23723c3283822323271ece7400"} Feb 20 09:41:10 crc kubenswrapper[5094]: I0220 09:41:10.740884 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkzrm" event={"ID":"f77272af-e36b-4ab1-9029-6485a5d2c95f","Type":"ContainerStarted","Data":"75d574b6e0f2f2f5d758f86e35fb24ddf85408817e99d01d1c91bc0b34a2b3e5"} Feb 20 
09:41:10 crc kubenswrapper[5094]: I0220 09:41:10.772372 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vkzrm" podStartSLOduration=2.315427207 podStartE2EDuration="6.772351574s" podCreationTimestamp="2026-02-20 09:41:04 +0000 UTC" firstStartedPulling="2026-02-20 09:41:05.683149634 +0000 UTC m=+10480.555776335" lastFinishedPulling="2026-02-20 09:41:10.140073981 +0000 UTC m=+10485.012700702" observedRunningTime="2026-02-20 09:41:10.763445725 +0000 UTC m=+10485.636072446" watchObservedRunningTime="2026-02-20 09:41:10.772351574 +0000 UTC m=+10485.644978295" Feb 20 09:41:14 crc kubenswrapper[5094]: I0220 09:41:14.363599 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:14 crc kubenswrapper[5094]: I0220 09:41:14.366928 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:14 crc kubenswrapper[5094]: I0220 09:41:14.433449 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:16 crc kubenswrapper[5094]: I0220 09:41:16.841531 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:41:16 crc kubenswrapper[5094]: E0220 09:41:16.842466 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:41:24 crc kubenswrapper[5094]: I0220 09:41:24.414504 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:24 crc kubenswrapper[5094]: I0220 09:41:24.457458 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vkzrm"] Feb 20 09:41:24 crc kubenswrapper[5094]: I0220 09:41:24.926765 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vkzrm" podUID="f77272af-e36b-4ab1-9029-6485a5d2c95f" containerName="registry-server" containerID="cri-o://75d574b6e0f2f2f5d758f86e35fb24ddf85408817e99d01d1c91bc0b34a2b3e5" gracePeriod=2 Feb 20 09:41:25 crc kubenswrapper[5094]: I0220 09:41:25.941810 5094 generic.go:334] "Generic (PLEG): container finished" podID="f77272af-e36b-4ab1-9029-6485a5d2c95f" containerID="75d574b6e0f2f2f5d758f86e35fb24ddf85408817e99d01d1c91bc0b34a2b3e5" exitCode=0 Feb 20 09:41:25 crc kubenswrapper[5094]: I0220 09:41:25.941868 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkzrm" event={"ID":"f77272af-e36b-4ab1-9029-6485a5d2c95f","Type":"ContainerDied","Data":"75d574b6e0f2f2f5d758f86e35fb24ddf85408817e99d01d1c91bc0b34a2b3e5"} Feb 20 09:41:25 crc kubenswrapper[5094]: I0220 09:41:25.942310 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkzrm" event={"ID":"f77272af-e36b-4ab1-9029-6485a5d2c95f","Type":"ContainerDied","Data":"71a63e5a0ea54fcc02961c1ea8fc4c44cc227dcf955deaa310c10c70c63fbaf3"} Feb 20 09:41:25 crc kubenswrapper[5094]: I0220 09:41:25.942327 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71a63e5a0ea54fcc02961c1ea8fc4c44cc227dcf955deaa310c10c70c63fbaf3" Feb 20 09:41:25 crc kubenswrapper[5094]: I0220 09:41:25.945921 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:26 crc kubenswrapper[5094]: I0220 09:41:26.081421 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f77272af-e36b-4ab1-9029-6485a5d2c95f-catalog-content\") pod \"f77272af-e36b-4ab1-9029-6485a5d2c95f\" (UID: \"f77272af-e36b-4ab1-9029-6485a5d2c95f\") " Feb 20 09:41:26 crc kubenswrapper[5094]: I0220 09:41:26.081641 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfpgp\" (UniqueName: \"kubernetes.io/projected/f77272af-e36b-4ab1-9029-6485a5d2c95f-kube-api-access-dfpgp\") pod \"f77272af-e36b-4ab1-9029-6485a5d2c95f\" (UID: \"f77272af-e36b-4ab1-9029-6485a5d2c95f\") " Feb 20 09:41:26 crc kubenswrapper[5094]: I0220 09:41:26.081725 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f77272af-e36b-4ab1-9029-6485a5d2c95f-utilities\") pod \"f77272af-e36b-4ab1-9029-6485a5d2c95f\" (UID: \"f77272af-e36b-4ab1-9029-6485a5d2c95f\") " Feb 20 09:41:26 crc kubenswrapper[5094]: I0220 09:41:26.082904 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f77272af-e36b-4ab1-9029-6485a5d2c95f-utilities" (OuterVolumeSpecName: "utilities") pod "f77272af-e36b-4ab1-9029-6485a5d2c95f" (UID: "f77272af-e36b-4ab1-9029-6485a5d2c95f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:41:26 crc kubenswrapper[5094]: I0220 09:41:26.143375 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f77272af-e36b-4ab1-9029-6485a5d2c95f-kube-api-access-dfpgp" (OuterVolumeSpecName: "kube-api-access-dfpgp") pod "f77272af-e36b-4ab1-9029-6485a5d2c95f" (UID: "f77272af-e36b-4ab1-9029-6485a5d2c95f"). InnerVolumeSpecName "kube-api-access-dfpgp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:41:26 crc kubenswrapper[5094]: I0220 09:41:26.150378 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f77272af-e36b-4ab1-9029-6485a5d2c95f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f77272af-e36b-4ab1-9029-6485a5d2c95f" (UID: "f77272af-e36b-4ab1-9029-6485a5d2c95f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:41:26 crc kubenswrapper[5094]: I0220 09:41:26.183451 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfpgp\" (UniqueName: \"kubernetes.io/projected/f77272af-e36b-4ab1-9029-6485a5d2c95f-kube-api-access-dfpgp\") on node \"crc\" DevicePath \"\"" Feb 20 09:41:26 crc kubenswrapper[5094]: I0220 09:41:26.183504 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f77272af-e36b-4ab1-9029-6485a5d2c95f-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:41:26 crc kubenswrapper[5094]: I0220 09:41:26.183515 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f77272af-e36b-4ab1-9029-6485a5d2c95f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:41:26 crc kubenswrapper[5094]: I0220 09:41:26.951579 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vkzrm" Feb 20 09:41:26 crc kubenswrapper[5094]: I0220 09:41:26.996332 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vkzrm"] Feb 20 09:41:27 crc kubenswrapper[5094]: I0220 09:41:27.007173 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vkzrm"] Feb 20 09:41:27 crc kubenswrapper[5094]: I0220 09:41:27.858237 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f77272af-e36b-4ab1-9029-6485a5d2c95f" path="/var/lib/kubelet/pods/f77272af-e36b-4ab1-9029-6485a5d2c95f/volumes" Feb 20 09:41:31 crc kubenswrapper[5094]: I0220 09:41:31.841279 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:41:31 crc kubenswrapper[5094]: E0220 09:41:31.842125 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:41:45 crc kubenswrapper[5094]: I0220 09:41:45.853282 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:41:45 crc kubenswrapper[5094]: E0220 09:41:45.854294 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:41:58 crc kubenswrapper[5094]: I0220 09:41:58.840829 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:41:58 crc kubenswrapper[5094]: E0220 09:41:58.841469 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:42:10 crc kubenswrapper[5094]: I0220 09:42:10.841047 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:42:10 crc kubenswrapper[5094]: E0220 09:42:10.843395 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:42:25 crc kubenswrapper[5094]: I0220 09:42:25.858510 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:42:25 crc kubenswrapper[5094]: E0220 09:42:25.859791 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:42:36 crc kubenswrapper[5094]: I0220 09:42:36.840757 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:42:36 crc kubenswrapper[5094]: E0220 09:42:36.842819 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:42:51 crc kubenswrapper[5094]: I0220 09:42:51.840791 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:42:51 crc kubenswrapper[5094]: E0220 09:42:51.841855 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.765821 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jhzc2"] Feb 20 09:43:02 crc kubenswrapper[5094]: E0220 09:43:02.766789 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f77272af-e36b-4ab1-9029-6485a5d2c95f" containerName="extract-utilities" Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.766803 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f77272af-e36b-4ab1-9029-6485a5d2c95f" 
containerName="extract-utilities" Feb 20 09:43:02 crc kubenswrapper[5094]: E0220 09:43:02.766822 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f77272af-e36b-4ab1-9029-6485a5d2c95f" containerName="extract-content" Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.766828 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f77272af-e36b-4ab1-9029-6485a5d2c95f" containerName="extract-content" Feb 20 09:43:02 crc kubenswrapper[5094]: E0220 09:43:02.766843 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f77272af-e36b-4ab1-9029-6485a5d2c95f" containerName="registry-server" Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.766850 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="f77272af-e36b-4ab1-9029-6485a5d2c95f" containerName="registry-server" Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.767041 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="f77272af-e36b-4ab1-9029-6485a5d2c95f" containerName="registry-server" Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.768429 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jhzc2" Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.784222 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jhzc2"] Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.872698 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz2cg\" (UniqueName: \"kubernetes.io/projected/32d8687f-685c-44e3-866d-2f5f1eb289e2-kube-api-access-kz2cg\") pod \"redhat-marketplace-jhzc2\" (UID: \"32d8687f-685c-44e3-866d-2f5f1eb289e2\") " pod="openshift-marketplace/redhat-marketplace-jhzc2" Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.872792 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32d8687f-685c-44e3-866d-2f5f1eb289e2-catalog-content\") pod \"redhat-marketplace-jhzc2\" (UID: \"32d8687f-685c-44e3-866d-2f5f1eb289e2\") " pod="openshift-marketplace/redhat-marketplace-jhzc2" Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.872827 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32d8687f-685c-44e3-866d-2f5f1eb289e2-utilities\") pod \"redhat-marketplace-jhzc2\" (UID: \"32d8687f-685c-44e3-866d-2f5f1eb289e2\") " pod="openshift-marketplace/redhat-marketplace-jhzc2" Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.974619 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz2cg\" (UniqueName: \"kubernetes.io/projected/32d8687f-685c-44e3-866d-2f5f1eb289e2-kube-api-access-kz2cg\") pod \"redhat-marketplace-jhzc2\" (UID: \"32d8687f-685c-44e3-866d-2f5f1eb289e2\") " pod="openshift-marketplace/redhat-marketplace-jhzc2" Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.975610 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32d8687f-685c-44e3-866d-2f5f1eb289e2-catalog-content\") pod \"redhat-marketplace-jhzc2\" (UID: \"32d8687f-685c-44e3-866d-2f5f1eb289e2\") " pod="openshift-marketplace/redhat-marketplace-jhzc2" Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.975816 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32d8687f-685c-44e3-866d-2f5f1eb289e2-utilities\") pod \"redhat-marketplace-jhzc2\" (UID: \"32d8687f-685c-44e3-866d-2f5f1eb289e2\") " pod="openshift-marketplace/redhat-marketplace-jhzc2" Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.976042 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32d8687f-685c-44e3-866d-2f5f1eb289e2-catalog-content\") pod \"redhat-marketplace-jhzc2\" (UID: \"32d8687f-685c-44e3-866d-2f5f1eb289e2\") " pod="openshift-marketplace/redhat-marketplace-jhzc2" Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.976335 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32d8687f-685c-44e3-866d-2f5f1eb289e2-utilities\") pod \"redhat-marketplace-jhzc2\" (UID: \"32d8687f-685c-44e3-866d-2f5f1eb289e2\") " pod="openshift-marketplace/redhat-marketplace-jhzc2" Feb 20 09:43:02 crc kubenswrapper[5094]: I0220 09:43:02.995486 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz2cg\" (UniqueName: \"kubernetes.io/projected/32d8687f-685c-44e3-866d-2f5f1eb289e2-kube-api-access-kz2cg\") pod \"redhat-marketplace-jhzc2\" (UID: \"32d8687f-685c-44e3-866d-2f5f1eb289e2\") " pod="openshift-marketplace/redhat-marketplace-jhzc2" Feb 20 09:43:03 crc kubenswrapper[5094]: I0220 09:43:03.089800 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jhzc2" Feb 20 09:43:03 crc kubenswrapper[5094]: I0220 09:43:03.557084 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jhzc2"] Feb 20 09:43:03 crc kubenswrapper[5094]: I0220 09:43:03.841203 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:43:03 crc kubenswrapper[5094]: E0220 09:43:03.841905 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:43:04 crc kubenswrapper[5094]: I0220 09:43:04.192527 5094 generic.go:334] "Generic (PLEG): container finished" podID="32d8687f-685c-44e3-866d-2f5f1eb289e2" containerID="2f1931458d437e43cbd8bc95c262d54ee9ce7bd1a90e96c0ff847beb2e9b6bd5" exitCode=0 Feb 20 09:43:04 crc kubenswrapper[5094]: I0220 09:43:04.192571 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jhzc2" event={"ID":"32d8687f-685c-44e3-866d-2f5f1eb289e2","Type":"ContainerDied","Data":"2f1931458d437e43cbd8bc95c262d54ee9ce7bd1a90e96c0ff847beb2e9b6bd5"} Feb 20 09:43:04 crc kubenswrapper[5094]: I0220 09:43:04.192595 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jhzc2" event={"ID":"32d8687f-685c-44e3-866d-2f5f1eb289e2","Type":"ContainerStarted","Data":"19574da3fb2273ee9563d8d3859e607ecb55ce1e9e72fd86b08f90594aef0a9e"} Feb 20 09:43:05 crc kubenswrapper[5094]: I0220 09:43:05.191043 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kc77r"] Feb 20 
09:43:05 crc kubenswrapper[5094]: I0220 09:43:05.193619 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kc77r" Feb 20 09:43:05 crc kubenswrapper[5094]: I0220 09:43:05.219223 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jhzc2" event={"ID":"32d8687f-685c-44e3-866d-2f5f1eb289e2","Type":"ContainerStarted","Data":"4f0b0b106fd6f1617a10c750586039348c94f17c58bdd125e394ec6e270c5e25"} Feb 20 09:43:05 crc kubenswrapper[5094]: I0220 09:43:05.220544 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kc77r"] Feb 20 09:43:05 crc kubenswrapper[5094]: I0220 09:43:05.325576 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-catalog-content\") pod \"redhat-operators-kc77r\" (UID: \"d8c7e904-200a-45ed-aaa1-cd6bf6c71399\") " pod="openshift-marketplace/redhat-operators-kc77r" Feb 20 09:43:05 crc kubenswrapper[5094]: I0220 09:43:05.325644 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-utilities\") pod \"redhat-operators-kc77r\" (UID: \"d8c7e904-200a-45ed-aaa1-cd6bf6c71399\") " pod="openshift-marketplace/redhat-operators-kc77r" Feb 20 09:43:05 crc kubenswrapper[5094]: I0220 09:43:05.325666 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqw9n\" (UniqueName: \"kubernetes.io/projected/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-kube-api-access-jqw9n\") pod \"redhat-operators-kc77r\" (UID: \"d8c7e904-200a-45ed-aaa1-cd6bf6c71399\") " pod="openshift-marketplace/redhat-operators-kc77r" Feb 20 09:43:05 crc kubenswrapper[5094]: I0220 09:43:05.427350 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-catalog-content\") pod \"redhat-operators-kc77r\" (UID: \"d8c7e904-200a-45ed-aaa1-cd6bf6c71399\") " pod="openshift-marketplace/redhat-operators-kc77r" Feb 20 09:43:05 crc kubenswrapper[5094]: I0220 09:43:05.427436 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-utilities\") pod \"redhat-operators-kc77r\" (UID: \"d8c7e904-200a-45ed-aaa1-cd6bf6c71399\") " pod="openshift-marketplace/redhat-operators-kc77r" Feb 20 09:43:05 crc kubenswrapper[5094]: I0220 09:43:05.427468 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqw9n\" (UniqueName: \"kubernetes.io/projected/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-kube-api-access-jqw9n\") pod \"redhat-operators-kc77r\" (UID: \"d8c7e904-200a-45ed-aaa1-cd6bf6c71399\") " pod="openshift-marketplace/redhat-operators-kc77r" Feb 20 09:43:05 crc kubenswrapper[5094]: I0220 09:43:05.427909 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-catalog-content\") pod \"redhat-operators-kc77r\" (UID: \"d8c7e904-200a-45ed-aaa1-cd6bf6c71399\") " pod="openshift-marketplace/redhat-operators-kc77r" Feb 20 09:43:05 crc kubenswrapper[5094]: I0220 09:43:05.428255 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-utilities\") pod \"redhat-operators-kc77r\" (UID: \"d8c7e904-200a-45ed-aaa1-cd6bf6c71399\") " pod="openshift-marketplace/redhat-operators-kc77r" Feb 20 09:43:05 crc kubenswrapper[5094]: I0220 09:43:05.449424 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqw9n\" 
(UniqueName: \"kubernetes.io/projected/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-kube-api-access-jqw9n\") pod \"redhat-operators-kc77r\" (UID: \"d8c7e904-200a-45ed-aaa1-cd6bf6c71399\") " pod="openshift-marketplace/redhat-operators-kc77r" Feb 20 09:43:05 crc kubenswrapper[5094]: I0220 09:43:05.512152 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kc77r" Feb 20 09:43:06 crc kubenswrapper[5094]: I0220 09:43:06.151978 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kc77r"] Feb 20 09:43:06 crc kubenswrapper[5094]: I0220 09:43:06.235909 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kc77r" event={"ID":"d8c7e904-200a-45ed-aaa1-cd6bf6c71399","Type":"ContainerStarted","Data":"0e99f289f884916a03a641feb2a0bffd99bb555f0d1d3483e2c7822f04a8fa4b"} Feb 20 09:43:07 crc kubenswrapper[5094]: I0220 09:43:07.248024 5094 generic.go:334] "Generic (PLEG): container finished" podID="32d8687f-685c-44e3-866d-2f5f1eb289e2" containerID="4f0b0b106fd6f1617a10c750586039348c94f17c58bdd125e394ec6e270c5e25" exitCode=0 Feb 20 09:43:07 crc kubenswrapper[5094]: I0220 09:43:07.248106 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jhzc2" event={"ID":"32d8687f-685c-44e3-866d-2f5f1eb289e2","Type":"ContainerDied","Data":"4f0b0b106fd6f1617a10c750586039348c94f17c58bdd125e394ec6e270c5e25"} Feb 20 09:43:07 crc kubenswrapper[5094]: I0220 09:43:07.250118 5094 generic.go:334] "Generic (PLEG): container finished" podID="d8c7e904-200a-45ed-aaa1-cd6bf6c71399" containerID="9b09c11b51a431f98ac5ba6e5d43378a44dac84a11dcdb9bf7d45440a3c402d0" exitCode=0 Feb 20 09:43:07 crc kubenswrapper[5094]: I0220 09:43:07.250158 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kc77r" 
event={"ID":"d8c7e904-200a-45ed-aaa1-cd6bf6c71399","Type":"ContainerDied","Data":"9b09c11b51a431f98ac5ba6e5d43378a44dac84a11dcdb9bf7d45440a3c402d0"} Feb 20 09:43:08 crc kubenswrapper[5094]: I0220 09:43:08.261019 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jhzc2" event={"ID":"32d8687f-685c-44e3-866d-2f5f1eb289e2","Type":"ContainerStarted","Data":"060b3efae204d4906112a48a27f9459a57d1fc7a2f6cbfe75aa69dc1876919ab"} Feb 20 09:43:08 crc kubenswrapper[5094]: I0220 09:43:08.278428 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jhzc2" podStartSLOduration=2.677876233 podStartE2EDuration="6.27841036s" podCreationTimestamp="2026-02-20 09:43:02 +0000 UTC" firstStartedPulling="2026-02-20 09:43:04.194102718 +0000 UTC m=+10599.066729419" lastFinishedPulling="2026-02-20 09:43:07.794636845 +0000 UTC m=+10602.667263546" observedRunningTime="2026-02-20 09:43:08.276909394 +0000 UTC m=+10603.149536115" watchObservedRunningTime="2026-02-20 09:43:08.27841036 +0000 UTC m=+10603.151037071" Feb 20 09:43:09 crc kubenswrapper[5094]: I0220 09:43:09.283616 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kc77r" event={"ID":"d8c7e904-200a-45ed-aaa1-cd6bf6c71399","Type":"ContainerStarted","Data":"d29a2dc3f8c45c9dac09b075bbcc08642c86a1e55b6bc87ee2ee5870dcbcf2f0"} Feb 20 09:43:13 crc kubenswrapper[5094]: I0220 09:43:13.090150 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jhzc2" Feb 20 09:43:13 crc kubenswrapper[5094]: I0220 09:43:13.090827 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jhzc2" Feb 20 09:43:14 crc kubenswrapper[5094]: I0220 09:43:14.338557 5094 generic.go:334] "Generic (PLEG): container finished" podID="d8c7e904-200a-45ed-aaa1-cd6bf6c71399" 
containerID="d29a2dc3f8c45c9dac09b075bbcc08642c86a1e55b6bc87ee2ee5870dcbcf2f0" exitCode=0 Feb 20 09:43:14 crc kubenswrapper[5094]: I0220 09:43:14.338770 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kc77r" event={"ID":"d8c7e904-200a-45ed-aaa1-cd6bf6c71399","Type":"ContainerDied","Data":"d29a2dc3f8c45c9dac09b075bbcc08642c86a1e55b6bc87ee2ee5870dcbcf2f0"} Feb 20 09:43:14 crc kubenswrapper[5094]: I0220 09:43:14.686919 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-jhzc2" podUID="32d8687f-685c-44e3-866d-2f5f1eb289e2" containerName="registry-server" probeResult="failure" output=< Feb 20 09:43:14 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 09:43:14 crc kubenswrapper[5094]: > Feb 20 09:43:15 crc kubenswrapper[5094]: I0220 09:43:15.350000 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kc77r" event={"ID":"d8c7e904-200a-45ed-aaa1-cd6bf6c71399","Type":"ContainerStarted","Data":"186c26df9cf23d1d26fa3ebea038f1baebef58766ecdcf8d81b0a71c3ca51321"} Feb 20 09:43:15 crc kubenswrapper[5094]: I0220 09:43:15.378520 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kc77r" podStartSLOduration=2.903414118 podStartE2EDuration="10.378500871s" podCreationTimestamp="2026-02-20 09:43:05 +0000 UTC" firstStartedPulling="2026-02-20 09:43:07.258918206 +0000 UTC m=+10602.131544927" lastFinishedPulling="2026-02-20 09:43:14.734004959 +0000 UTC m=+10609.606631680" observedRunningTime="2026-02-20 09:43:15.372494034 +0000 UTC m=+10610.245120745" watchObservedRunningTime="2026-02-20 09:43:15.378500871 +0000 UTC m=+10610.251127592" Feb 20 09:43:15 crc kubenswrapper[5094]: I0220 09:43:15.513344 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kc77r" Feb 20 09:43:15 crc 
kubenswrapper[5094]: I0220 09:43:15.513415 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kc77r" Feb 20 09:43:15 crc kubenswrapper[5094]: I0220 09:43:15.846611 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:43:15 crc kubenswrapper[5094]: E0220 09:43:15.846898 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:43:16 crc kubenswrapper[5094]: I0220 09:43:16.566124 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kc77r" podUID="d8c7e904-200a-45ed-aaa1-cd6bf6c71399" containerName="registry-server" probeResult="failure" output=< Feb 20 09:43:16 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 09:43:16 crc kubenswrapper[5094]: > Feb 20 09:43:23 crc kubenswrapper[5094]: I0220 09:43:23.144954 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jhzc2" Feb 20 09:43:24 crc kubenswrapper[5094]: I0220 09:43:24.069050 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jhzc2" Feb 20 09:43:24 crc kubenswrapper[5094]: I0220 09:43:24.120468 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jhzc2"] Feb 20 09:43:24 crc kubenswrapper[5094]: I0220 09:43:24.436288 5094 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-jhzc2" podUID="32d8687f-685c-44e3-866d-2f5f1eb289e2" containerName="registry-server" containerID="cri-o://060b3efae204d4906112a48a27f9459a57d1fc7a2f6cbfe75aa69dc1876919ab" gracePeriod=2 Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.000937 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jhzc2" Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.027223 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32d8687f-685c-44e3-866d-2f5f1eb289e2-catalog-content\") pod \"32d8687f-685c-44e3-866d-2f5f1eb289e2\" (UID: \"32d8687f-685c-44e3-866d-2f5f1eb289e2\") " Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.027512 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32d8687f-685c-44e3-866d-2f5f1eb289e2-utilities\") pod \"32d8687f-685c-44e3-866d-2f5f1eb289e2\" (UID: \"32d8687f-685c-44e3-866d-2f5f1eb289e2\") " Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.027552 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kz2cg\" (UniqueName: \"kubernetes.io/projected/32d8687f-685c-44e3-866d-2f5f1eb289e2-kube-api-access-kz2cg\") pod \"32d8687f-685c-44e3-866d-2f5f1eb289e2\" (UID: \"32d8687f-685c-44e3-866d-2f5f1eb289e2\") " Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.028363 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32d8687f-685c-44e3-866d-2f5f1eb289e2-utilities" (OuterVolumeSpecName: "utilities") pod "32d8687f-685c-44e3-866d-2f5f1eb289e2" (UID: "32d8687f-685c-44e3-866d-2f5f1eb289e2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.034588 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32d8687f-685c-44e3-866d-2f5f1eb289e2-kube-api-access-kz2cg" (OuterVolumeSpecName: "kube-api-access-kz2cg") pod "32d8687f-685c-44e3-866d-2f5f1eb289e2" (UID: "32d8687f-685c-44e3-866d-2f5f1eb289e2"). InnerVolumeSpecName "kube-api-access-kz2cg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.057513 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32d8687f-685c-44e3-866d-2f5f1eb289e2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "32d8687f-685c-44e3-866d-2f5f1eb289e2" (UID: "32d8687f-685c-44e3-866d-2f5f1eb289e2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.130634 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32d8687f-685c-44e3-866d-2f5f1eb289e2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.130698 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32d8687f-685c-44e3-866d-2f5f1eb289e2-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.130743 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kz2cg\" (UniqueName: \"kubernetes.io/projected/32d8687f-685c-44e3-866d-2f5f1eb289e2-kube-api-access-kz2cg\") on node \"crc\" DevicePath \"\"" Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.447042 5094 generic.go:334] "Generic (PLEG): container finished" podID="32d8687f-685c-44e3-866d-2f5f1eb289e2" 
containerID="060b3efae204d4906112a48a27f9459a57d1fc7a2f6cbfe75aa69dc1876919ab" exitCode=0 Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.447087 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jhzc2" event={"ID":"32d8687f-685c-44e3-866d-2f5f1eb289e2","Type":"ContainerDied","Data":"060b3efae204d4906112a48a27f9459a57d1fc7a2f6cbfe75aa69dc1876919ab"} Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.447120 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jhzc2" event={"ID":"32d8687f-685c-44e3-866d-2f5f1eb289e2","Type":"ContainerDied","Data":"19574da3fb2273ee9563d8d3859e607ecb55ce1e9e72fd86b08f90594aef0a9e"} Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.447141 5094 scope.go:117] "RemoveContainer" containerID="060b3efae204d4906112a48a27f9459a57d1fc7a2f6cbfe75aa69dc1876919ab" Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.447163 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jhzc2" Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.468506 5094 scope.go:117] "RemoveContainer" containerID="4f0b0b106fd6f1617a10c750586039348c94f17c58bdd125e394ec6e270c5e25" Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.494851 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jhzc2"] Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.504589 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jhzc2"] Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.663595 5094 scope.go:117] "RemoveContainer" containerID="2f1931458d437e43cbd8bc95c262d54ee9ce7bd1a90e96c0ff847beb2e9b6bd5" Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.745579 5094 scope.go:117] "RemoveContainer" containerID="060b3efae204d4906112a48a27f9459a57d1fc7a2f6cbfe75aa69dc1876919ab" Feb 20 09:43:25 crc kubenswrapper[5094]: E0220 09:43:25.746687 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"060b3efae204d4906112a48a27f9459a57d1fc7a2f6cbfe75aa69dc1876919ab\": container with ID starting with 060b3efae204d4906112a48a27f9459a57d1fc7a2f6cbfe75aa69dc1876919ab not found: ID does not exist" containerID="060b3efae204d4906112a48a27f9459a57d1fc7a2f6cbfe75aa69dc1876919ab" Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.746762 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"060b3efae204d4906112a48a27f9459a57d1fc7a2f6cbfe75aa69dc1876919ab"} err="failed to get container status \"060b3efae204d4906112a48a27f9459a57d1fc7a2f6cbfe75aa69dc1876919ab\": rpc error: code = NotFound desc = could not find container \"060b3efae204d4906112a48a27f9459a57d1fc7a2f6cbfe75aa69dc1876919ab\": container with ID starting with 060b3efae204d4906112a48a27f9459a57d1fc7a2f6cbfe75aa69dc1876919ab not found: 
ID does not exist" Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.746795 5094 scope.go:117] "RemoveContainer" containerID="4f0b0b106fd6f1617a10c750586039348c94f17c58bdd125e394ec6e270c5e25" Feb 20 09:43:25 crc kubenswrapper[5094]: E0220 09:43:25.747293 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f0b0b106fd6f1617a10c750586039348c94f17c58bdd125e394ec6e270c5e25\": container with ID starting with 4f0b0b106fd6f1617a10c750586039348c94f17c58bdd125e394ec6e270c5e25 not found: ID does not exist" containerID="4f0b0b106fd6f1617a10c750586039348c94f17c58bdd125e394ec6e270c5e25" Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.747327 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f0b0b106fd6f1617a10c750586039348c94f17c58bdd125e394ec6e270c5e25"} err="failed to get container status \"4f0b0b106fd6f1617a10c750586039348c94f17c58bdd125e394ec6e270c5e25\": rpc error: code = NotFound desc = could not find container \"4f0b0b106fd6f1617a10c750586039348c94f17c58bdd125e394ec6e270c5e25\": container with ID starting with 4f0b0b106fd6f1617a10c750586039348c94f17c58bdd125e394ec6e270c5e25 not found: ID does not exist" Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.747350 5094 scope.go:117] "RemoveContainer" containerID="2f1931458d437e43cbd8bc95c262d54ee9ce7bd1a90e96c0ff847beb2e9b6bd5" Feb 20 09:43:25 crc kubenswrapper[5094]: E0220 09:43:25.747601 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f1931458d437e43cbd8bc95c262d54ee9ce7bd1a90e96c0ff847beb2e9b6bd5\": container with ID starting with 2f1931458d437e43cbd8bc95c262d54ee9ce7bd1a90e96c0ff847beb2e9b6bd5 not found: ID does not exist" containerID="2f1931458d437e43cbd8bc95c262d54ee9ce7bd1a90e96c0ff847beb2e9b6bd5" Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.747630 5094 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f1931458d437e43cbd8bc95c262d54ee9ce7bd1a90e96c0ff847beb2e9b6bd5"} err="failed to get container status \"2f1931458d437e43cbd8bc95c262d54ee9ce7bd1a90e96c0ff847beb2e9b6bd5\": rpc error: code = NotFound desc = could not find container \"2f1931458d437e43cbd8bc95c262d54ee9ce7bd1a90e96c0ff847beb2e9b6bd5\": container with ID starting with 2f1931458d437e43cbd8bc95c262d54ee9ce7bd1a90e96c0ff847beb2e9b6bd5 not found: ID does not exist" Feb 20 09:43:25 crc kubenswrapper[5094]: I0220 09:43:25.854009 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32d8687f-685c-44e3-866d-2f5f1eb289e2" path="/var/lib/kubelet/pods/32d8687f-685c-44e3-866d-2f5f1eb289e2/volumes" Feb 20 09:43:26 crc kubenswrapper[5094]: I0220 09:43:26.691789 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kc77r" podUID="d8c7e904-200a-45ed-aaa1-cd6bf6c71399" containerName="registry-server" probeResult="failure" output=< Feb 20 09:43:26 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 09:43:26 crc kubenswrapper[5094]: > Feb 20 09:43:26 crc kubenswrapper[5094]: I0220 09:43:26.840830 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:43:26 crc kubenswrapper[5094]: E0220 09:43:26.841283 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:43:36 crc kubenswrapper[5094]: I0220 09:43:36.558301 5094 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-kc77r" podUID="d8c7e904-200a-45ed-aaa1-cd6bf6c71399" containerName="registry-server" probeResult="failure" output=< Feb 20 09:43:36 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 09:43:36 crc kubenswrapper[5094]: > Feb 20 09:43:38 crc kubenswrapper[5094]: I0220 09:43:38.840569 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:43:38 crc kubenswrapper[5094]: E0220 09:43:38.841377 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:43:45 crc kubenswrapper[5094]: I0220 09:43:45.571957 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kc77r" Feb 20 09:43:45 crc kubenswrapper[5094]: I0220 09:43:45.648648 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kc77r" Feb 20 09:43:45 crc kubenswrapper[5094]: I0220 09:43:45.815448 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kc77r"] Feb 20 09:43:46 crc kubenswrapper[5094]: I0220 09:43:46.692038 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kc77r" podUID="d8c7e904-200a-45ed-aaa1-cd6bf6c71399" containerName="registry-server" containerID="cri-o://186c26df9cf23d1d26fa3ebea038f1baebef58766ecdcf8d81b0a71c3ca51321" gracePeriod=2 Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.214684 5094 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kc77r" Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.284076 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqw9n\" (UniqueName: \"kubernetes.io/projected/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-kube-api-access-jqw9n\") pod \"d8c7e904-200a-45ed-aaa1-cd6bf6c71399\" (UID: \"d8c7e904-200a-45ed-aaa1-cd6bf6c71399\") " Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.284642 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-catalog-content\") pod \"d8c7e904-200a-45ed-aaa1-cd6bf6c71399\" (UID: \"d8c7e904-200a-45ed-aaa1-cd6bf6c71399\") " Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.284905 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-utilities\") pod \"d8c7e904-200a-45ed-aaa1-cd6bf6c71399\" (UID: \"d8c7e904-200a-45ed-aaa1-cd6bf6c71399\") " Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.287727 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-utilities" (OuterVolumeSpecName: "utilities") pod "d8c7e904-200a-45ed-aaa1-cd6bf6c71399" (UID: "d8c7e904-200a-45ed-aaa1-cd6bf6c71399"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.299280 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-kube-api-access-jqw9n" (OuterVolumeSpecName: "kube-api-access-jqw9n") pod "d8c7e904-200a-45ed-aaa1-cd6bf6c71399" (UID: "d8c7e904-200a-45ed-aaa1-cd6bf6c71399"). InnerVolumeSpecName "kube-api-access-jqw9n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.388800 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqw9n\" (UniqueName: \"kubernetes.io/projected/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-kube-api-access-jqw9n\") on node \"crc\" DevicePath \"\"" Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.389159 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.432244 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d8c7e904-200a-45ed-aaa1-cd6bf6c71399" (UID: "d8c7e904-200a-45ed-aaa1-cd6bf6c71399"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.491310 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8c7e904-200a-45ed-aaa1-cd6bf6c71399-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.703870 5094 generic.go:334] "Generic (PLEG): container finished" podID="d8c7e904-200a-45ed-aaa1-cd6bf6c71399" containerID="186c26df9cf23d1d26fa3ebea038f1baebef58766ecdcf8d81b0a71c3ca51321" exitCode=0 Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.703926 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kc77r" event={"ID":"d8c7e904-200a-45ed-aaa1-cd6bf6c71399","Type":"ContainerDied","Data":"186c26df9cf23d1d26fa3ebea038f1baebef58766ecdcf8d81b0a71c3ca51321"} Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.703958 5094 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-kc77r" event={"ID":"d8c7e904-200a-45ed-aaa1-cd6bf6c71399","Type":"ContainerDied","Data":"0e99f289f884916a03a641feb2a0bffd99bb555f0d1d3483e2c7822f04a8fa4b"} Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.703958 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kc77r" Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.704047 5094 scope.go:117] "RemoveContainer" containerID="186c26df9cf23d1d26fa3ebea038f1baebef58766ecdcf8d81b0a71c3ca51321" Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.742677 5094 scope.go:117] "RemoveContainer" containerID="d29a2dc3f8c45c9dac09b075bbcc08642c86a1e55b6bc87ee2ee5870dcbcf2f0" Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.742682 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kc77r"] Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.752080 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kc77r"] Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.771965 5094 scope.go:117] "RemoveContainer" containerID="9b09c11b51a431f98ac5ba6e5d43378a44dac84a11dcdb9bf7d45440a3c402d0" Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.815522 5094 scope.go:117] "RemoveContainer" containerID="186c26df9cf23d1d26fa3ebea038f1baebef58766ecdcf8d81b0a71c3ca51321" Feb 20 09:43:47 crc kubenswrapper[5094]: E0220 09:43:47.815892 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"186c26df9cf23d1d26fa3ebea038f1baebef58766ecdcf8d81b0a71c3ca51321\": container with ID starting with 186c26df9cf23d1d26fa3ebea038f1baebef58766ecdcf8d81b0a71c3ca51321 not found: ID does not exist" containerID="186c26df9cf23d1d26fa3ebea038f1baebef58766ecdcf8d81b0a71c3ca51321" Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.815938 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"186c26df9cf23d1d26fa3ebea038f1baebef58766ecdcf8d81b0a71c3ca51321"} err="failed to get container status \"186c26df9cf23d1d26fa3ebea038f1baebef58766ecdcf8d81b0a71c3ca51321\": rpc error: code = NotFound desc = could not find container \"186c26df9cf23d1d26fa3ebea038f1baebef58766ecdcf8d81b0a71c3ca51321\": container with ID starting with 186c26df9cf23d1d26fa3ebea038f1baebef58766ecdcf8d81b0a71c3ca51321 not found: ID does not exist" Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.815968 5094 scope.go:117] "RemoveContainer" containerID="d29a2dc3f8c45c9dac09b075bbcc08642c86a1e55b6bc87ee2ee5870dcbcf2f0" Feb 20 09:43:47 crc kubenswrapper[5094]: E0220 09:43:47.816622 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d29a2dc3f8c45c9dac09b075bbcc08642c86a1e55b6bc87ee2ee5870dcbcf2f0\": container with ID starting with d29a2dc3f8c45c9dac09b075bbcc08642c86a1e55b6bc87ee2ee5870dcbcf2f0 not found: ID does not exist" containerID="d29a2dc3f8c45c9dac09b075bbcc08642c86a1e55b6bc87ee2ee5870dcbcf2f0" Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.816742 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d29a2dc3f8c45c9dac09b075bbcc08642c86a1e55b6bc87ee2ee5870dcbcf2f0"} err="failed to get container status \"d29a2dc3f8c45c9dac09b075bbcc08642c86a1e55b6bc87ee2ee5870dcbcf2f0\": rpc error: code = NotFound desc = could not find container \"d29a2dc3f8c45c9dac09b075bbcc08642c86a1e55b6bc87ee2ee5870dcbcf2f0\": container with ID starting with d29a2dc3f8c45c9dac09b075bbcc08642c86a1e55b6bc87ee2ee5870dcbcf2f0 not found: ID does not exist" Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.816821 5094 scope.go:117] "RemoveContainer" containerID="9b09c11b51a431f98ac5ba6e5d43378a44dac84a11dcdb9bf7d45440a3c402d0" Feb 20 09:43:47 crc kubenswrapper[5094]: E0220 
09:43:47.818351 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b09c11b51a431f98ac5ba6e5d43378a44dac84a11dcdb9bf7d45440a3c402d0\": container with ID starting with 9b09c11b51a431f98ac5ba6e5d43378a44dac84a11dcdb9bf7d45440a3c402d0 not found: ID does not exist" containerID="9b09c11b51a431f98ac5ba6e5d43378a44dac84a11dcdb9bf7d45440a3c402d0" Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.818379 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b09c11b51a431f98ac5ba6e5d43378a44dac84a11dcdb9bf7d45440a3c402d0"} err="failed to get container status \"9b09c11b51a431f98ac5ba6e5d43378a44dac84a11dcdb9bf7d45440a3c402d0\": rpc error: code = NotFound desc = could not find container \"9b09c11b51a431f98ac5ba6e5d43378a44dac84a11dcdb9bf7d45440a3c402d0\": container with ID starting with 9b09c11b51a431f98ac5ba6e5d43378a44dac84a11dcdb9bf7d45440a3c402d0 not found: ID does not exist" Feb 20 09:43:47 crc kubenswrapper[5094]: I0220 09:43:47.852503 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8c7e904-200a-45ed-aaa1-cd6bf6c71399" path="/var/lib/kubelet/pods/d8c7e904-200a-45ed-aaa1-cd6bf6c71399/volumes" Feb 20 09:43:49 crc kubenswrapper[5094]: I0220 09:43:49.840324 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:43:49 crc kubenswrapper[5094]: E0220 09:43:49.841270 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:44:04 crc kubenswrapper[5094]: I0220 09:44:04.842057 
5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:44:04 crc kubenswrapper[5094]: E0220 09:44:04.842981 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:44:16 crc kubenswrapper[5094]: I0220 09:44:16.840897 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:44:16 crc kubenswrapper[5094]: E0220 09:44:16.842007 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:44:31 crc kubenswrapper[5094]: I0220 09:44:31.840580 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:44:31 crc kubenswrapper[5094]: E0220 09:44:31.842048 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:44:45 crc kubenswrapper[5094]: I0220 
09:44:45.852932 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:44:45 crc kubenswrapper[5094]: E0220 09:44:45.853768 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.752743 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wzmgc"] Feb 20 09:44:55 crc kubenswrapper[5094]: E0220 09:44:55.754060 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8c7e904-200a-45ed-aaa1-cd6bf6c71399" containerName="extract-content" Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.754084 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8c7e904-200a-45ed-aaa1-cd6bf6c71399" containerName="extract-content" Feb 20 09:44:55 crc kubenswrapper[5094]: E0220 09:44:55.754120 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8c7e904-200a-45ed-aaa1-cd6bf6c71399" containerName="registry-server" Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.754134 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8c7e904-200a-45ed-aaa1-cd6bf6c71399" containerName="registry-server" Feb 20 09:44:55 crc kubenswrapper[5094]: E0220 09:44:55.754164 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32d8687f-685c-44e3-866d-2f5f1eb289e2" containerName="extract-content" Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.754178 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="32d8687f-685c-44e3-866d-2f5f1eb289e2" containerName="extract-content" Feb 20 
09:44:55 crc kubenswrapper[5094]: E0220 09:44:55.754224 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8c7e904-200a-45ed-aaa1-cd6bf6c71399" containerName="extract-utilities" Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.754242 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8c7e904-200a-45ed-aaa1-cd6bf6c71399" containerName="extract-utilities" Feb 20 09:44:55 crc kubenswrapper[5094]: E0220 09:44:55.754265 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32d8687f-685c-44e3-866d-2f5f1eb289e2" containerName="registry-server" Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.754277 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="32d8687f-685c-44e3-866d-2f5f1eb289e2" containerName="registry-server" Feb 20 09:44:55 crc kubenswrapper[5094]: E0220 09:44:55.754318 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32d8687f-685c-44e3-866d-2f5f1eb289e2" containerName="extract-utilities" Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.754333 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="32d8687f-685c-44e3-866d-2f5f1eb289e2" containerName="extract-utilities" Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.755510 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8c7e904-200a-45ed-aaa1-cd6bf6c71399" containerName="registry-server" Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.755555 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="32d8687f-685c-44e3-866d-2f5f1eb289e2" containerName="registry-server" Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.758559 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wzmgc" Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.773086 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wzmgc"] Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.822052 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858ceda4-5973-45c6-8eed-2ae8f1da9129-catalog-content\") pod \"certified-operators-wzmgc\" (UID: \"858ceda4-5973-45c6-8eed-2ae8f1da9129\") " pod="openshift-marketplace/certified-operators-wzmgc" Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.822146 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858ceda4-5973-45c6-8eed-2ae8f1da9129-utilities\") pod \"certified-operators-wzmgc\" (UID: \"858ceda4-5973-45c6-8eed-2ae8f1da9129\") " pod="openshift-marketplace/certified-operators-wzmgc" Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.822200 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgnjr\" (UniqueName: \"kubernetes.io/projected/858ceda4-5973-45c6-8eed-2ae8f1da9129-kube-api-access-dgnjr\") pod \"certified-operators-wzmgc\" (UID: \"858ceda4-5973-45c6-8eed-2ae8f1da9129\") " pod="openshift-marketplace/certified-operators-wzmgc" Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.924260 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858ceda4-5973-45c6-8eed-2ae8f1da9129-catalog-content\") pod \"certified-operators-wzmgc\" (UID: \"858ceda4-5973-45c6-8eed-2ae8f1da9129\") " pod="openshift-marketplace/certified-operators-wzmgc" Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.924790 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858ceda4-5973-45c6-8eed-2ae8f1da9129-utilities\") pod \"certified-operators-wzmgc\" (UID: \"858ceda4-5973-45c6-8eed-2ae8f1da9129\") " pod="openshift-marketplace/certified-operators-wzmgc" Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.924796 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858ceda4-5973-45c6-8eed-2ae8f1da9129-catalog-content\") pod \"certified-operators-wzmgc\" (UID: \"858ceda4-5973-45c6-8eed-2ae8f1da9129\") " pod="openshift-marketplace/certified-operators-wzmgc" Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.924875 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgnjr\" (UniqueName: \"kubernetes.io/projected/858ceda4-5973-45c6-8eed-2ae8f1da9129-kube-api-access-dgnjr\") pod \"certified-operators-wzmgc\" (UID: \"858ceda4-5973-45c6-8eed-2ae8f1da9129\") " pod="openshift-marketplace/certified-operators-wzmgc" Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.925385 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858ceda4-5973-45c6-8eed-2ae8f1da9129-utilities\") pod \"certified-operators-wzmgc\" (UID: \"858ceda4-5973-45c6-8eed-2ae8f1da9129\") " pod="openshift-marketplace/certified-operators-wzmgc" Feb 20 09:44:55 crc kubenswrapper[5094]: I0220 09:44:55.949165 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgnjr\" (UniqueName: \"kubernetes.io/projected/858ceda4-5973-45c6-8eed-2ae8f1da9129-kube-api-access-dgnjr\") pod \"certified-operators-wzmgc\" (UID: \"858ceda4-5973-45c6-8eed-2ae8f1da9129\") " pod="openshift-marketplace/certified-operators-wzmgc" Feb 20 09:44:56 crc kubenswrapper[5094]: I0220 09:44:56.085630 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wzmgc" Feb 20 09:44:56 crc kubenswrapper[5094]: I0220 09:44:56.613910 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wzmgc"] Feb 20 09:44:57 crc kubenswrapper[5094]: I0220 09:44:57.512157 5094 generic.go:334] "Generic (PLEG): container finished" podID="858ceda4-5973-45c6-8eed-2ae8f1da9129" containerID="f8828c57a2b8ec7317d67d2a0bf9858177944fa933e6d8df847de341eeb0cb4b" exitCode=0 Feb 20 09:44:57 crc kubenswrapper[5094]: I0220 09:44:57.512250 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wzmgc" event={"ID":"858ceda4-5973-45c6-8eed-2ae8f1da9129","Type":"ContainerDied","Data":"f8828c57a2b8ec7317d67d2a0bf9858177944fa933e6d8df847de341eeb0cb4b"} Feb 20 09:44:57 crc kubenswrapper[5094]: I0220 09:44:57.513452 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wzmgc" event={"ID":"858ceda4-5973-45c6-8eed-2ae8f1da9129","Type":"ContainerStarted","Data":"00b97c75da7493ed5a9472f088e87955e8d31c3a5e258ff8fda53af82feaa89a"} Feb 20 09:44:58 crc kubenswrapper[5094]: I0220 09:44:58.524397 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wzmgc" event={"ID":"858ceda4-5973-45c6-8eed-2ae8f1da9129","Type":"ContainerStarted","Data":"f5dac48e57d264ca97f7999f084ce806acbde6bbc09182550bc88626fa4584c3"} Feb 20 09:44:59 crc kubenswrapper[5094]: I0220 09:44:59.538523 5094 generic.go:334] "Generic (PLEG): container finished" podID="858ceda4-5973-45c6-8eed-2ae8f1da9129" containerID="f5dac48e57d264ca97f7999f084ce806acbde6bbc09182550bc88626fa4584c3" exitCode=0 Feb 20 09:44:59 crc kubenswrapper[5094]: I0220 09:44:59.538573 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wzmgc" 
event={"ID":"858ceda4-5973-45c6-8eed-2ae8f1da9129","Type":"ContainerDied","Data":"f5dac48e57d264ca97f7999f084ce806acbde6bbc09182550bc88626fa4584c3"} Feb 20 09:44:59 crc kubenswrapper[5094]: I0220 09:44:59.841360 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:44:59 crc kubenswrapper[5094]: E0220 09:44:59.842233 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.180835 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg"] Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.182534 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.184669 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.186315 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.199385 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg"] Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.217453 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6226166e-0c34-4bcc-a689-160cc6141fd2-config-volume\") pod \"collect-profiles-29526345-k9rhg\" (UID: \"6226166e-0c34-4bcc-a689-160cc6141fd2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.217509 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhpd5\" (UniqueName: \"kubernetes.io/projected/6226166e-0c34-4bcc-a689-160cc6141fd2-kube-api-access-jhpd5\") pod \"collect-profiles-29526345-k9rhg\" (UID: \"6226166e-0c34-4bcc-a689-160cc6141fd2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.217622 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6226166e-0c34-4bcc-a689-160cc6141fd2-secret-volume\") pod \"collect-profiles-29526345-k9rhg\" (UID: \"6226166e-0c34-4bcc-a689-160cc6141fd2\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.319888 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6226166e-0c34-4bcc-a689-160cc6141fd2-config-volume\") pod \"collect-profiles-29526345-k9rhg\" (UID: \"6226166e-0c34-4bcc-a689-160cc6141fd2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.319961 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhpd5\" (UniqueName: \"kubernetes.io/projected/6226166e-0c34-4bcc-a689-160cc6141fd2-kube-api-access-jhpd5\") pod \"collect-profiles-29526345-k9rhg\" (UID: \"6226166e-0c34-4bcc-a689-160cc6141fd2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.320111 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6226166e-0c34-4bcc-a689-160cc6141fd2-secret-volume\") pod \"collect-profiles-29526345-k9rhg\" (UID: \"6226166e-0c34-4bcc-a689-160cc6141fd2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.320829 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6226166e-0c34-4bcc-a689-160cc6141fd2-config-volume\") pod \"collect-profiles-29526345-k9rhg\" (UID: \"6226166e-0c34-4bcc-a689-160cc6141fd2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.337438 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/6226166e-0c34-4bcc-a689-160cc6141fd2-secret-volume\") pod \"collect-profiles-29526345-k9rhg\" (UID: \"6226166e-0c34-4bcc-a689-160cc6141fd2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.343041 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhpd5\" (UniqueName: \"kubernetes.io/projected/6226166e-0c34-4bcc-a689-160cc6141fd2-kube-api-access-jhpd5\") pod \"collect-profiles-29526345-k9rhg\" (UID: \"6226166e-0c34-4bcc-a689-160cc6141fd2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.505855 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.558475 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wzmgc" event={"ID":"858ceda4-5973-45c6-8eed-2ae8f1da9129","Type":"ContainerStarted","Data":"270d662de4a3565bb7d20d3d8f4ee3707451e243e6a69d02a0a43ca78b68e241"} Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.592921 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wzmgc" podStartSLOduration=3.180805014 podStartE2EDuration="5.592899038s" podCreationTimestamp="2026-02-20 09:44:55 +0000 UTC" firstStartedPulling="2026-02-20 09:44:57.515117902 +0000 UTC m=+10712.387744613" lastFinishedPulling="2026-02-20 09:44:59.927211926 +0000 UTC m=+10714.799838637" observedRunningTime="2026-02-20 09:45:00.590163141 +0000 UTC m=+10715.462789872" watchObservedRunningTime="2026-02-20 09:45:00.592899038 +0000 UTC m=+10715.465525749" Feb 20 09:45:00 crc kubenswrapper[5094]: I0220 09:45:00.960107 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg"] Feb 20 09:45:01 crc kubenswrapper[5094]: I0220 09:45:01.567533 5094 generic.go:334] "Generic (PLEG): container finished" podID="6226166e-0c34-4bcc-a689-160cc6141fd2" containerID="7f687d9b94feb89c77fd301d62419bc2a64a6c336bb3c4b73ccdb4bc555200cb" exitCode=0 Feb 20 09:45:01 crc kubenswrapper[5094]: I0220 09:45:01.567599 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" event={"ID":"6226166e-0c34-4bcc-a689-160cc6141fd2","Type":"ContainerDied","Data":"7f687d9b94feb89c77fd301d62419bc2a64a6c336bb3c4b73ccdb4bc555200cb"} Feb 20 09:45:01 crc kubenswrapper[5094]: I0220 09:45:01.567987 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" event={"ID":"6226166e-0c34-4bcc-a689-160cc6141fd2","Type":"ContainerStarted","Data":"514e7f34ca8fdc23ec75b68771b5e24c6ed0e5ae53f1402417ec6f9a03d88da2"} Feb 20 09:45:02 crc kubenswrapper[5094]: I0220 09:45:02.946337 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" Feb 20 09:45:02 crc kubenswrapper[5094]: I0220 09:45:02.975645 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6226166e-0c34-4bcc-a689-160cc6141fd2-config-volume\") pod \"6226166e-0c34-4bcc-a689-160cc6141fd2\" (UID: \"6226166e-0c34-4bcc-a689-160cc6141fd2\") " Feb 20 09:45:02 crc kubenswrapper[5094]: I0220 09:45:02.975766 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhpd5\" (UniqueName: \"kubernetes.io/projected/6226166e-0c34-4bcc-a689-160cc6141fd2-kube-api-access-jhpd5\") pod \"6226166e-0c34-4bcc-a689-160cc6141fd2\" (UID: \"6226166e-0c34-4bcc-a689-160cc6141fd2\") " Feb 20 09:45:02 crc kubenswrapper[5094]: I0220 09:45:02.975960 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6226166e-0c34-4bcc-a689-160cc6141fd2-secret-volume\") pod \"6226166e-0c34-4bcc-a689-160cc6141fd2\" (UID: \"6226166e-0c34-4bcc-a689-160cc6141fd2\") " Feb 20 09:45:02 crc kubenswrapper[5094]: I0220 09:45:02.976419 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6226166e-0c34-4bcc-a689-160cc6141fd2-config-volume" (OuterVolumeSpecName: "config-volume") pod "6226166e-0c34-4bcc-a689-160cc6141fd2" (UID: "6226166e-0c34-4bcc-a689-160cc6141fd2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 09:45:02 crc kubenswrapper[5094]: I0220 09:45:02.982192 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6226166e-0c34-4bcc-a689-160cc6141fd2-kube-api-access-jhpd5" (OuterVolumeSpecName: "kube-api-access-jhpd5") pod "6226166e-0c34-4bcc-a689-160cc6141fd2" (UID: "6226166e-0c34-4bcc-a689-160cc6141fd2"). 
InnerVolumeSpecName "kube-api-access-jhpd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:45:02 crc kubenswrapper[5094]: I0220 09:45:02.987014 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6226166e-0c34-4bcc-a689-160cc6141fd2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6226166e-0c34-4bcc-a689-160cc6141fd2" (UID: "6226166e-0c34-4bcc-a689-160cc6141fd2"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 09:45:03 crc kubenswrapper[5094]: I0220 09:45:03.078168 5094 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6226166e-0c34-4bcc-a689-160cc6141fd2-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 09:45:03 crc kubenswrapper[5094]: I0220 09:45:03.078480 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6226166e-0c34-4bcc-a689-160cc6141fd2-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 09:45:03 crc kubenswrapper[5094]: I0220 09:45:03.078493 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhpd5\" (UniqueName: \"kubernetes.io/projected/6226166e-0c34-4bcc-a689-160cc6141fd2-kube-api-access-jhpd5\") on node \"crc\" DevicePath \"\"" Feb 20 09:45:03 crc kubenswrapper[5094]: I0220 09:45:03.589390 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" event={"ID":"6226166e-0c34-4bcc-a689-160cc6141fd2","Type":"ContainerDied","Data":"514e7f34ca8fdc23ec75b68771b5e24c6ed0e5ae53f1402417ec6f9a03d88da2"} Feb 20 09:45:03 crc kubenswrapper[5094]: I0220 09:45:03.589430 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="514e7f34ca8fdc23ec75b68771b5e24c6ed0e5ae53f1402417ec6f9a03d88da2" Feb 20 09:45:03 crc kubenswrapper[5094]: I0220 09:45:03.589483 5094 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526345-k9rhg" Feb 20 09:45:04 crc kubenswrapper[5094]: I0220 09:45:04.030598 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l"] Feb 20 09:45:04 crc kubenswrapper[5094]: I0220 09:45:04.040246 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526300-bqm8l"] Feb 20 09:45:05 crc kubenswrapper[5094]: I0220 09:45:05.854888 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30508b7a-ac76-48d8-822c-65a32552ca80" path="/var/lib/kubelet/pods/30508b7a-ac76-48d8-822c-65a32552ca80/volumes" Feb 20 09:45:06 crc kubenswrapper[5094]: I0220 09:45:06.086142 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wzmgc" Feb 20 09:45:06 crc kubenswrapper[5094]: I0220 09:45:06.086220 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wzmgc" Feb 20 09:45:06 crc kubenswrapper[5094]: I0220 09:45:06.167202 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wzmgc" Feb 20 09:45:06 crc kubenswrapper[5094]: I0220 09:45:06.665053 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wzmgc" Feb 20 09:45:06 crc kubenswrapper[5094]: I0220 09:45:06.727570 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wzmgc"] Feb 20 09:45:08 crc kubenswrapper[5094]: I0220 09:45:08.637111 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wzmgc" podUID="858ceda4-5973-45c6-8eed-2ae8f1da9129" containerName="registry-server" 
containerID="cri-o://270d662de4a3565bb7d20d3d8f4ee3707451e243e6a69d02a0a43ca78b68e241" gracePeriod=2 Feb 20 09:45:09 crc kubenswrapper[5094]: I0220 09:45:09.647964 5094 generic.go:334] "Generic (PLEG): container finished" podID="858ceda4-5973-45c6-8eed-2ae8f1da9129" containerID="270d662de4a3565bb7d20d3d8f4ee3707451e243e6a69d02a0a43ca78b68e241" exitCode=0 Feb 20 09:45:09 crc kubenswrapper[5094]: I0220 09:45:09.648284 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wzmgc" event={"ID":"858ceda4-5973-45c6-8eed-2ae8f1da9129","Type":"ContainerDied","Data":"270d662de4a3565bb7d20d3d8f4ee3707451e243e6a69d02a0a43ca78b68e241"} Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.078697 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wzmgc" Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.151958 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858ceda4-5973-45c6-8eed-2ae8f1da9129-catalog-content\") pod \"858ceda4-5973-45c6-8eed-2ae8f1da9129\" (UID: \"858ceda4-5973-45c6-8eed-2ae8f1da9129\") " Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.152125 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858ceda4-5973-45c6-8eed-2ae8f1da9129-utilities\") pod \"858ceda4-5973-45c6-8eed-2ae8f1da9129\" (UID: \"858ceda4-5973-45c6-8eed-2ae8f1da9129\") " Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.152215 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgnjr\" (UniqueName: \"kubernetes.io/projected/858ceda4-5973-45c6-8eed-2ae8f1da9129-kube-api-access-dgnjr\") pod \"858ceda4-5973-45c6-8eed-2ae8f1da9129\" (UID: \"858ceda4-5973-45c6-8eed-2ae8f1da9129\") " Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 
09:45:10.153007 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/858ceda4-5973-45c6-8eed-2ae8f1da9129-utilities" (OuterVolumeSpecName: "utilities") pod "858ceda4-5973-45c6-8eed-2ae8f1da9129" (UID: "858ceda4-5973-45c6-8eed-2ae8f1da9129"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.158027 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/858ceda4-5973-45c6-8eed-2ae8f1da9129-kube-api-access-dgnjr" (OuterVolumeSpecName: "kube-api-access-dgnjr") pod "858ceda4-5973-45c6-8eed-2ae8f1da9129" (UID: "858ceda4-5973-45c6-8eed-2ae8f1da9129"). InnerVolumeSpecName "kube-api-access-dgnjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.202785 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/858ceda4-5973-45c6-8eed-2ae8f1da9129-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "858ceda4-5973-45c6-8eed-2ae8f1da9129" (UID: "858ceda4-5973-45c6-8eed-2ae8f1da9129"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.254774 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/858ceda4-5973-45c6-8eed-2ae8f1da9129-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.254815 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/858ceda4-5973-45c6-8eed-2ae8f1da9129-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.254830 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgnjr\" (UniqueName: \"kubernetes.io/projected/858ceda4-5973-45c6-8eed-2ae8f1da9129-kube-api-access-dgnjr\") on node \"crc\" DevicePath \"\"" Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.660215 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wzmgc" event={"ID":"858ceda4-5973-45c6-8eed-2ae8f1da9129","Type":"ContainerDied","Data":"00b97c75da7493ed5a9472f088e87955e8d31c3a5e258ff8fda53af82feaa89a"} Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.660288 5094 scope.go:117] "RemoveContainer" containerID="270d662de4a3565bb7d20d3d8f4ee3707451e243e6a69d02a0a43ca78b68e241" Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.660295 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wzmgc" Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.686639 5094 scope.go:117] "RemoveContainer" containerID="f5dac48e57d264ca97f7999f084ce806acbde6bbc09182550bc88626fa4584c3" Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.704640 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wzmgc"] Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.726125 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wzmgc"] Feb 20 09:45:10 crc kubenswrapper[5094]: I0220 09:45:10.732134 5094 scope.go:117] "RemoveContainer" containerID="f8828c57a2b8ec7317d67d2a0bf9858177944fa933e6d8df847de341eeb0cb4b" Feb 20 09:45:11 crc kubenswrapper[5094]: I0220 09:45:11.861429 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="858ceda4-5973-45c6-8eed-2ae8f1da9129" path="/var/lib/kubelet/pods/858ceda4-5973-45c6-8eed-2ae8f1da9129/volumes" Feb 20 09:45:14 crc kubenswrapper[5094]: I0220 09:45:14.841215 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:45:14 crc kubenswrapper[5094]: E0220 09:45:14.842093 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:45:27 crc kubenswrapper[5094]: I0220 09:45:27.842468 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:45:27 crc kubenswrapper[5094]: E0220 09:45:27.843793 5094 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:45:41 crc kubenswrapper[5094]: I0220 09:45:41.840513 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:45:41 crc kubenswrapper[5094]: E0220 09:45:41.841464 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:45:45 crc kubenswrapper[5094]: I0220 09:45:45.287785 5094 scope.go:117] "RemoveContainer" containerID="9dd7ec3da040b20e94b1ef4ad1e6147baa04fe89a2ea1cd7d18c7a1def8587f9" Feb 20 09:45:54 crc kubenswrapper[5094]: I0220 09:45:54.840479 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:45:54 crc kubenswrapper[5094]: E0220 09:45:54.841306 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:46:07 crc kubenswrapper[5094]: I0220 09:46:07.841160 5094 scope.go:117] 
"RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:46:09 crc kubenswrapper[5094]: I0220 09:46:09.608719 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"02670694c17a296cbc1327519023ef00691c5e296f13bbf2b5914297f2d0efd8"} Feb 20 09:47:45 crc kubenswrapper[5094]: I0220 09:47:45.422940 5094 scope.go:117] "RemoveContainer" containerID="7af968c16caa79d5bb5f6a8736bb5e65c93aad23723c3283822323271ece7400" Feb 20 09:47:45 crc kubenswrapper[5094]: I0220 09:47:45.461241 5094 scope.go:117] "RemoveContainer" containerID="c5346f105c3a7aa086fca7b0b87e5e8579a3e80ebf063450c83b214c198260b6" Feb 20 09:47:45 crc kubenswrapper[5094]: I0220 09:47:45.529810 5094 scope.go:117] "RemoveContainer" containerID="75d574b6e0f2f2f5d758f86e35fb24ddf85408817e99d01d1c91bc0b34a2b3e5" Feb 20 09:48:34 crc kubenswrapper[5094]: I0220 09:48:34.106880 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:48:34 crc kubenswrapper[5094]: I0220 09:48:34.107534 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.327835 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Feb 20 09:48:38 crc kubenswrapper[5094]: E0220 09:48:38.328734 5094 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="858ceda4-5973-45c6-8eed-2ae8f1da9129" containerName="extract-utilities" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.328750 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="858ceda4-5973-45c6-8eed-2ae8f1da9129" containerName="extract-utilities" Feb 20 09:48:38 crc kubenswrapper[5094]: E0220 09:48:38.328781 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6226166e-0c34-4bcc-a689-160cc6141fd2" containerName="collect-profiles" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.328787 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="6226166e-0c34-4bcc-a689-160cc6141fd2" containerName="collect-profiles" Feb 20 09:48:38 crc kubenswrapper[5094]: E0220 09:48:38.328819 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="858ceda4-5973-45c6-8eed-2ae8f1da9129" containerName="registry-server" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.328827 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="858ceda4-5973-45c6-8eed-2ae8f1da9129" containerName="registry-server" Feb 20 09:48:38 crc kubenswrapper[5094]: E0220 09:48:38.328842 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="858ceda4-5973-45c6-8eed-2ae8f1da9129" containerName="extract-content" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.328849 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="858ceda4-5973-45c6-8eed-2ae8f1da9129" containerName="extract-content" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.329044 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="6226166e-0c34-4bcc-a689-160cc6141fd2" containerName="collect-profiles" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.329069 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="858ceda4-5973-45c6-8eed-2ae8f1da9129" containerName="registry-server" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.329882 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.332095 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-d88p7" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.332683 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.333336 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.334878 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.342673 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.414492 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7j29\" (UniqueName: \"kubernetes.io/projected/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-kube-api-access-v7j29\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.415092 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.415329 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.415558 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.415815 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.416040 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.416223 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-config-data\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.416412 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.416749 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.518599 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7j29\" (UniqueName: \"kubernetes.io/projected/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-kube-api-access-v7j29\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.518687 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.518983 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.519048 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.519131 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.519191 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.519232 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-config-data\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.519291 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.519425 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: 
\"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.519905 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.520463 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.521375 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.521806 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.523911 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-config-data\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " 
pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.527487 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.530755 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.536357 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.540915 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7j29\" (UniqueName: \"kubernetes.io/projected/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-kube-api-access-v7j29\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.579428 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " pod="openstack/tempest-tests-tempest" Feb 20 09:48:38 crc kubenswrapper[5094]: I0220 09:48:38.657034 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 20 09:48:39 crc kubenswrapper[5094]: I0220 09:48:39.191751 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 20 09:48:39 crc kubenswrapper[5094]: I0220 09:48:39.207252 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 09:48:39 crc kubenswrapper[5094]: I0220 09:48:39.414161 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9","Type":"ContainerStarted","Data":"e3fe55f15912750e6ef07e8fa7b4632b5f9782d82892810f96924fcbf7aff5a8"} Feb 20 09:49:04 crc kubenswrapper[5094]: I0220 09:49:04.107382 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:49:04 crc kubenswrapper[5094]: I0220 09:49:04.108058 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:49:29 crc kubenswrapper[5094]: E0220 09:49:29.492876 5094 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:f0473f3e18dd17d7021c02e991298923" Feb 20 09:49:29 crc kubenswrapper[5094]: E0220 09:49:29.493509 5094 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:f0473f3e18dd17d7021c02e991298923" Feb 20 09:49:29 crc kubenswrapper[5094]: E0220 09:49:29.493842 5094 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:f0473f3e18dd17d7021c02e991298923,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,Mou
ntPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v7j29,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(8e2aa894-2a09-4fad-bcc7-1f259ca48ac9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 20 09:49:29 crc kubenswrapper[5094]: E0220 09:49:29.495099 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="8e2aa894-2a09-4fad-bcc7-1f259ca48ac9" Feb 20 09:49:30 crc kubenswrapper[5094]: E0220 09:49:30.292645 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-antelope-centos9/openstack-tempest-all:f0473f3e18dd17d7021c02e991298923\\\"\"" pod="openstack/tempest-tests-tempest" podUID="8e2aa894-2a09-4fad-bcc7-1f259ca48ac9" Feb 20 09:49:34 crc kubenswrapper[5094]: I0220 09:49:34.106903 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:49:34 crc kubenswrapper[5094]: I0220 09:49:34.107627 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:49:34 crc kubenswrapper[5094]: I0220 09:49:34.107690 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 09:49:34 crc kubenswrapper[5094]: I0220 09:49:34.108980 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"02670694c17a296cbc1327519023ef00691c5e296f13bbf2b5914297f2d0efd8"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 09:49:34 crc kubenswrapper[5094]: I0220 09:49:34.109091 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" 
containerID="cri-o://02670694c17a296cbc1327519023ef00691c5e296f13bbf2b5914297f2d0efd8" gracePeriod=600 Feb 20 09:49:34 crc kubenswrapper[5094]: I0220 09:49:34.333781 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="02670694c17a296cbc1327519023ef00691c5e296f13bbf2b5914297f2d0efd8" exitCode=0 Feb 20 09:49:34 crc kubenswrapper[5094]: I0220 09:49:34.333940 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"02670694c17a296cbc1327519023ef00691c5e296f13bbf2b5914297f2d0efd8"} Feb 20 09:49:34 crc kubenswrapper[5094]: I0220 09:49:34.334121 5094 scope.go:117] "RemoveContainer" containerID="7de2062cf6acc77a9fb430b4afee2caedc8591c7103c02e27ddd2224ff1a60f3" Feb 20 09:49:35 crc kubenswrapper[5094]: I0220 09:49:35.344291 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1"} Feb 20 09:49:46 crc kubenswrapper[5094]: I0220 09:49:46.086425 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 20 09:49:47 crc kubenswrapper[5094]: I0220 09:49:47.471435 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9","Type":"ContainerStarted","Data":"2c94eb29c8d1a5480a6899dd7732430e17544eb4ff0b06be93fdd212c2a48558"} Feb 20 09:49:47 crc kubenswrapper[5094]: I0220 09:49:47.496101 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.620296081 podStartE2EDuration="1m10.496078753s" podCreationTimestamp="2026-02-20 09:48:37 +0000 UTC" 
firstStartedPulling="2026-02-20 09:48:39.206974034 +0000 UTC m=+10934.079600745" lastFinishedPulling="2026-02-20 09:49:46.082756666 +0000 UTC m=+11000.955383417" observedRunningTime="2026-02-20 09:49:47.492377513 +0000 UTC m=+11002.365004264" watchObservedRunningTime="2026-02-20 09:49:47.496078753 +0000 UTC m=+11002.368705504" Feb 20 09:51:34 crc kubenswrapper[5094]: I0220 09:51:34.107225 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:51:34 crc kubenswrapper[5094]: I0220 09:51:34.107852 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:52:04 crc kubenswrapper[5094]: I0220 09:52:04.107017 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:52:04 crc kubenswrapper[5094]: I0220 09:52:04.107807 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:52:23 crc kubenswrapper[5094]: I0220 09:52:23.989729 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vsfp4"] Feb 20 
09:52:24 crc kubenswrapper[5094]: I0220 09:52:24.006337 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vsfp4"] Feb 20 09:52:24 crc kubenswrapper[5094]: I0220 09:52:24.006439 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vsfp4" Feb 20 09:52:24 crc kubenswrapper[5094]: I0220 09:52:24.182300 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efc939c9-c470-49ad-aa1f-cef315df8594-utilities\") pod \"community-operators-vsfp4\" (UID: \"efc939c9-c470-49ad-aa1f-cef315df8594\") " pod="openshift-marketplace/community-operators-vsfp4" Feb 20 09:52:24 crc kubenswrapper[5094]: I0220 09:52:24.182943 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g4ct\" (UniqueName: \"kubernetes.io/projected/efc939c9-c470-49ad-aa1f-cef315df8594-kube-api-access-4g4ct\") pod \"community-operators-vsfp4\" (UID: \"efc939c9-c470-49ad-aa1f-cef315df8594\") " pod="openshift-marketplace/community-operators-vsfp4" Feb 20 09:52:24 crc kubenswrapper[5094]: I0220 09:52:24.183155 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efc939c9-c470-49ad-aa1f-cef315df8594-catalog-content\") pod \"community-operators-vsfp4\" (UID: \"efc939c9-c470-49ad-aa1f-cef315df8594\") " pod="openshift-marketplace/community-operators-vsfp4" Feb 20 09:52:24 crc kubenswrapper[5094]: I0220 09:52:24.285116 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efc939c9-c470-49ad-aa1f-cef315df8594-catalog-content\") pod \"community-operators-vsfp4\" (UID: \"efc939c9-c470-49ad-aa1f-cef315df8594\") " 
pod="openshift-marketplace/community-operators-vsfp4" Feb 20 09:52:24 crc kubenswrapper[5094]: I0220 09:52:24.285230 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efc939c9-c470-49ad-aa1f-cef315df8594-utilities\") pod \"community-operators-vsfp4\" (UID: \"efc939c9-c470-49ad-aa1f-cef315df8594\") " pod="openshift-marketplace/community-operators-vsfp4" Feb 20 09:52:24 crc kubenswrapper[5094]: I0220 09:52:24.285291 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g4ct\" (UniqueName: \"kubernetes.io/projected/efc939c9-c470-49ad-aa1f-cef315df8594-kube-api-access-4g4ct\") pod \"community-operators-vsfp4\" (UID: \"efc939c9-c470-49ad-aa1f-cef315df8594\") " pod="openshift-marketplace/community-operators-vsfp4" Feb 20 09:52:24 crc kubenswrapper[5094]: I0220 09:52:24.285826 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efc939c9-c470-49ad-aa1f-cef315df8594-catalog-content\") pod \"community-operators-vsfp4\" (UID: \"efc939c9-c470-49ad-aa1f-cef315df8594\") " pod="openshift-marketplace/community-operators-vsfp4" Feb 20 09:52:24 crc kubenswrapper[5094]: I0220 09:52:24.285835 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efc939c9-c470-49ad-aa1f-cef315df8594-utilities\") pod \"community-operators-vsfp4\" (UID: \"efc939c9-c470-49ad-aa1f-cef315df8594\") " pod="openshift-marketplace/community-operators-vsfp4" Feb 20 09:52:24 crc kubenswrapper[5094]: I0220 09:52:24.306299 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g4ct\" (UniqueName: \"kubernetes.io/projected/efc939c9-c470-49ad-aa1f-cef315df8594-kube-api-access-4g4ct\") pod \"community-operators-vsfp4\" (UID: \"efc939c9-c470-49ad-aa1f-cef315df8594\") " 
pod="openshift-marketplace/community-operators-vsfp4" Feb 20 09:52:24 crc kubenswrapper[5094]: I0220 09:52:24.334551 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vsfp4" Feb 20 09:52:25 crc kubenswrapper[5094]: I0220 09:52:25.197124 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vsfp4"] Feb 20 09:52:26 crc kubenswrapper[5094]: I0220 09:52:26.118799 5094 generic.go:334] "Generic (PLEG): container finished" podID="efc939c9-c470-49ad-aa1f-cef315df8594" containerID="27aa8e074cf94425baae3d26fc77836a8417361144044c62aaa5db7d2e4906b9" exitCode=0 Feb 20 09:52:26 crc kubenswrapper[5094]: I0220 09:52:26.118882 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vsfp4" event={"ID":"efc939c9-c470-49ad-aa1f-cef315df8594","Type":"ContainerDied","Data":"27aa8e074cf94425baae3d26fc77836a8417361144044c62aaa5db7d2e4906b9"} Feb 20 09:52:26 crc kubenswrapper[5094]: I0220 09:52:26.119267 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vsfp4" event={"ID":"efc939c9-c470-49ad-aa1f-cef315df8594","Type":"ContainerStarted","Data":"d6f7074d93aaef589bc935f694e2d12082a5a977dd0619fbfa03bcb7292208bb"} Feb 20 09:52:27 crc kubenswrapper[5094]: I0220 09:52:27.129908 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vsfp4" event={"ID":"efc939c9-c470-49ad-aa1f-cef315df8594","Type":"ContainerStarted","Data":"a584c7044025912681642a3c255cf431b3a697e6309a87e6ce161da58ed43b0c"} Feb 20 09:52:29 crc kubenswrapper[5094]: I0220 09:52:29.146559 5094 generic.go:334] "Generic (PLEG): container finished" podID="efc939c9-c470-49ad-aa1f-cef315df8594" containerID="a584c7044025912681642a3c255cf431b3a697e6309a87e6ce161da58ed43b0c" exitCode=0 Feb 20 09:52:29 crc kubenswrapper[5094]: I0220 09:52:29.146640 5094 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-vsfp4" event={"ID":"efc939c9-c470-49ad-aa1f-cef315df8594","Type":"ContainerDied","Data":"a584c7044025912681642a3c255cf431b3a697e6309a87e6ce161da58ed43b0c"} Feb 20 09:52:30 crc kubenswrapper[5094]: I0220 09:52:30.165187 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vsfp4" event={"ID":"efc939c9-c470-49ad-aa1f-cef315df8594","Type":"ContainerStarted","Data":"1fe924a4977428558f95a93b1fe697e7f7fb7904a6ff0282d16c4b0dc17a4153"} Feb 20 09:52:30 crc kubenswrapper[5094]: I0220 09:52:30.195994 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vsfp4" podStartSLOduration=3.79358038 podStartE2EDuration="7.195974585s" podCreationTimestamp="2026-02-20 09:52:23 +0000 UTC" firstStartedPulling="2026-02-20 09:52:26.120539439 +0000 UTC m=+11160.993166150" lastFinishedPulling="2026-02-20 09:52:29.522933634 +0000 UTC m=+11164.395560355" observedRunningTime="2026-02-20 09:52:30.184427081 +0000 UTC m=+11165.057053792" watchObservedRunningTime="2026-02-20 09:52:30.195974585 +0000 UTC m=+11165.068601286" Feb 20 09:52:34 crc kubenswrapper[5094]: I0220 09:52:34.107198 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 09:52:34 crc kubenswrapper[5094]: I0220 09:52:34.107582 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 09:52:34 crc kubenswrapper[5094]: I0220 09:52:34.107623 
5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 09:52:34 crc kubenswrapper[5094]: I0220 09:52:34.108272 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 09:52:34 crc kubenswrapper[5094]: I0220 09:52:34.108320 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" gracePeriod=600 Feb 20 09:52:34 crc kubenswrapper[5094]: E0220 09:52:34.235142 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:52:34 crc kubenswrapper[5094]: I0220 09:52:34.335643 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vsfp4" Feb 20 09:52:34 crc kubenswrapper[5094]: I0220 09:52:34.335715 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vsfp4" Feb 20 09:52:34 crc kubenswrapper[5094]: I0220 09:52:34.391485 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-vsfp4" Feb 20 09:52:35 crc kubenswrapper[5094]: I0220 09:52:35.218963 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" exitCode=0 Feb 20 09:52:35 crc kubenswrapper[5094]: I0220 09:52:35.219067 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1"} Feb 20 09:52:35 crc kubenswrapper[5094]: I0220 09:52:35.219301 5094 scope.go:117] "RemoveContainer" containerID="02670694c17a296cbc1327519023ef00691c5e296f13bbf2b5914297f2d0efd8" Feb 20 09:52:35 crc kubenswrapper[5094]: I0220 09:52:35.220901 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:52:35 crc kubenswrapper[5094]: E0220 09:52:35.221390 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:52:35 crc kubenswrapper[5094]: I0220 09:52:35.293187 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vsfp4" Feb 20 09:52:35 crc kubenswrapper[5094]: I0220 09:52:35.350114 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vsfp4"] Feb 20 09:52:37 crc kubenswrapper[5094]: I0220 09:52:37.238718 5094 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-vsfp4" podUID="efc939c9-c470-49ad-aa1f-cef315df8594" containerName="registry-server" containerID="cri-o://1fe924a4977428558f95a93b1fe697e7f7fb7904a6ff0282d16c4b0dc17a4153" gracePeriod=2 Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.035270 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vsfp4" Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.191441 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efc939c9-c470-49ad-aa1f-cef315df8594-utilities\") pod \"efc939c9-c470-49ad-aa1f-cef315df8594\" (UID: \"efc939c9-c470-49ad-aa1f-cef315df8594\") " Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.191657 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efc939c9-c470-49ad-aa1f-cef315df8594-catalog-content\") pod \"efc939c9-c470-49ad-aa1f-cef315df8594\" (UID: \"efc939c9-c470-49ad-aa1f-cef315df8594\") " Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.191743 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g4ct\" (UniqueName: \"kubernetes.io/projected/efc939c9-c470-49ad-aa1f-cef315df8594-kube-api-access-4g4ct\") pod \"efc939c9-c470-49ad-aa1f-cef315df8594\" (UID: \"efc939c9-c470-49ad-aa1f-cef315df8594\") " Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.192639 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efc939c9-c470-49ad-aa1f-cef315df8594-utilities" (OuterVolumeSpecName: "utilities") pod "efc939c9-c470-49ad-aa1f-cef315df8594" (UID: "efc939c9-c470-49ad-aa1f-cef315df8594"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.199889 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efc939c9-c470-49ad-aa1f-cef315df8594-kube-api-access-4g4ct" (OuterVolumeSpecName: "kube-api-access-4g4ct") pod "efc939c9-c470-49ad-aa1f-cef315df8594" (UID: "efc939c9-c470-49ad-aa1f-cef315df8594"). InnerVolumeSpecName "kube-api-access-4g4ct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.252972 5094 generic.go:334] "Generic (PLEG): container finished" podID="efc939c9-c470-49ad-aa1f-cef315df8594" containerID="1fe924a4977428558f95a93b1fe697e7f7fb7904a6ff0282d16c4b0dc17a4153" exitCode=0 Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.253037 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vsfp4" Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.253053 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vsfp4" event={"ID":"efc939c9-c470-49ad-aa1f-cef315df8594","Type":"ContainerDied","Data":"1fe924a4977428558f95a93b1fe697e7f7fb7904a6ff0282d16c4b0dc17a4153"} Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.253089 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vsfp4" event={"ID":"efc939c9-c470-49ad-aa1f-cef315df8594","Type":"ContainerDied","Data":"d6f7074d93aaef589bc935f694e2d12082a5a977dd0619fbfa03bcb7292208bb"} Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.253105 5094 scope.go:117] "RemoveContainer" containerID="1fe924a4977428558f95a93b1fe697e7f7fb7904a6ff0282d16c4b0dc17a4153" Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.258115 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/efc939c9-c470-49ad-aa1f-cef315df8594-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "efc939c9-c470-49ad-aa1f-cef315df8594" (UID: "efc939c9-c470-49ad-aa1f-cef315df8594"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.276692 5094 scope.go:117] "RemoveContainer" containerID="a584c7044025912681642a3c255cf431b3a697e6309a87e6ce161da58ed43b0c" Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.294568 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efc939c9-c470-49ad-aa1f-cef315df8594-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.294600 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g4ct\" (UniqueName: \"kubernetes.io/projected/efc939c9-c470-49ad-aa1f-cef315df8594-kube-api-access-4g4ct\") on node \"crc\" DevicePath \"\"" Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.294611 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efc939c9-c470-49ad-aa1f-cef315df8594-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.297085 5094 scope.go:117] "RemoveContainer" containerID="27aa8e074cf94425baae3d26fc77836a8417361144044c62aaa5db7d2e4906b9" Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.337263 5094 scope.go:117] "RemoveContainer" containerID="1fe924a4977428558f95a93b1fe697e7f7fb7904a6ff0282d16c4b0dc17a4153" Feb 20 09:52:38 crc kubenswrapper[5094]: E0220 09:52:38.337719 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fe924a4977428558f95a93b1fe697e7f7fb7904a6ff0282d16c4b0dc17a4153\": container with ID starting with 
1fe924a4977428558f95a93b1fe697e7f7fb7904a6ff0282d16c4b0dc17a4153 not found: ID does not exist" containerID="1fe924a4977428558f95a93b1fe697e7f7fb7904a6ff0282d16c4b0dc17a4153" Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.337772 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fe924a4977428558f95a93b1fe697e7f7fb7904a6ff0282d16c4b0dc17a4153"} err="failed to get container status \"1fe924a4977428558f95a93b1fe697e7f7fb7904a6ff0282d16c4b0dc17a4153\": rpc error: code = NotFound desc = could not find container \"1fe924a4977428558f95a93b1fe697e7f7fb7904a6ff0282d16c4b0dc17a4153\": container with ID starting with 1fe924a4977428558f95a93b1fe697e7f7fb7904a6ff0282d16c4b0dc17a4153 not found: ID does not exist" Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.337800 5094 scope.go:117] "RemoveContainer" containerID="a584c7044025912681642a3c255cf431b3a697e6309a87e6ce161da58ed43b0c" Feb 20 09:52:38 crc kubenswrapper[5094]: E0220 09:52:38.338096 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a584c7044025912681642a3c255cf431b3a697e6309a87e6ce161da58ed43b0c\": container with ID starting with a584c7044025912681642a3c255cf431b3a697e6309a87e6ce161da58ed43b0c not found: ID does not exist" containerID="a584c7044025912681642a3c255cf431b3a697e6309a87e6ce161da58ed43b0c" Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.338129 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a584c7044025912681642a3c255cf431b3a697e6309a87e6ce161da58ed43b0c"} err="failed to get container status \"a584c7044025912681642a3c255cf431b3a697e6309a87e6ce161da58ed43b0c\": rpc error: code = NotFound desc = could not find container \"a584c7044025912681642a3c255cf431b3a697e6309a87e6ce161da58ed43b0c\": container with ID starting with a584c7044025912681642a3c255cf431b3a697e6309a87e6ce161da58ed43b0c not found: ID does not 
exist" Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.338151 5094 scope.go:117] "RemoveContainer" containerID="27aa8e074cf94425baae3d26fc77836a8417361144044c62aaa5db7d2e4906b9" Feb 20 09:52:38 crc kubenswrapper[5094]: E0220 09:52:38.338384 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27aa8e074cf94425baae3d26fc77836a8417361144044c62aaa5db7d2e4906b9\": container with ID starting with 27aa8e074cf94425baae3d26fc77836a8417361144044c62aaa5db7d2e4906b9 not found: ID does not exist" containerID="27aa8e074cf94425baae3d26fc77836a8417361144044c62aaa5db7d2e4906b9" Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.338414 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27aa8e074cf94425baae3d26fc77836a8417361144044c62aaa5db7d2e4906b9"} err="failed to get container status \"27aa8e074cf94425baae3d26fc77836a8417361144044c62aaa5db7d2e4906b9\": rpc error: code = NotFound desc = could not find container \"27aa8e074cf94425baae3d26fc77836a8417361144044c62aaa5db7d2e4906b9\": container with ID starting with 27aa8e074cf94425baae3d26fc77836a8417361144044c62aaa5db7d2e4906b9 not found: ID does not exist" Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.599451 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vsfp4"] Feb 20 09:52:38 crc kubenswrapper[5094]: I0220 09:52:38.608203 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vsfp4"] Feb 20 09:52:39 crc kubenswrapper[5094]: I0220 09:52:39.852812 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efc939c9-c470-49ad-aa1f-cef315df8594" path="/var/lib/kubelet/pods/efc939c9-c470-49ad-aa1f-cef315df8594/volumes" Feb 20 09:52:48 crc kubenswrapper[5094]: I0220 09:52:48.840086 5094 scope.go:117] "RemoveContainer" 
containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:52:48 crc kubenswrapper[5094]: E0220 09:52:48.840788 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:53:02 crc kubenswrapper[5094]: I0220 09:53:02.841289 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:53:02 crc kubenswrapper[5094]: E0220 09:53:02.842735 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:53:14 crc kubenswrapper[5094]: I0220 09:53:14.840822 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:53:14 crc kubenswrapper[5094]: E0220 09:53:14.841822 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:53:25 crc kubenswrapper[5094]: I0220 09:53:25.852080 5094 scope.go:117] 
"RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:53:25 crc kubenswrapper[5094]: E0220 09:53:25.853307 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:53:26 crc kubenswrapper[5094]: I0220 09:53:26.673446 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q457n"] Feb 20 09:53:26 crc kubenswrapper[5094]: E0220 09:53:26.674540 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efc939c9-c470-49ad-aa1f-cef315df8594" containerName="registry-server" Feb 20 09:53:26 crc kubenswrapper[5094]: I0220 09:53:26.674562 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="efc939c9-c470-49ad-aa1f-cef315df8594" containerName="registry-server" Feb 20 09:53:26 crc kubenswrapper[5094]: E0220 09:53:26.674600 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efc939c9-c470-49ad-aa1f-cef315df8594" containerName="extract-utilities" Feb 20 09:53:26 crc kubenswrapper[5094]: I0220 09:53:26.674610 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="efc939c9-c470-49ad-aa1f-cef315df8594" containerName="extract-utilities" Feb 20 09:53:26 crc kubenswrapper[5094]: E0220 09:53:26.674660 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efc939c9-c470-49ad-aa1f-cef315df8594" containerName="extract-content" Feb 20 09:53:26 crc kubenswrapper[5094]: I0220 09:53:26.674669 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="efc939c9-c470-49ad-aa1f-cef315df8594" containerName="extract-content" Feb 20 09:53:26 crc kubenswrapper[5094]: 
I0220 09:53:26.674957 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="efc939c9-c470-49ad-aa1f-cef315df8594" containerName="registry-server" Feb 20 09:53:26 crc kubenswrapper[5094]: I0220 09:53:26.676872 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q457n" Feb 20 09:53:26 crc kubenswrapper[5094]: I0220 09:53:26.687247 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q457n"] Feb 20 09:53:26 crc kubenswrapper[5094]: I0220 09:53:26.689848 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8bnb\" (UniqueName: \"kubernetes.io/projected/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-kube-api-access-k8bnb\") pod \"redhat-marketplace-q457n\" (UID: \"ad834bd9-9f2a-4b16-a79b-4c66429e20f8\") " pod="openshift-marketplace/redhat-marketplace-q457n" Feb 20 09:53:26 crc kubenswrapper[5094]: I0220 09:53:26.689945 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-catalog-content\") pod \"redhat-marketplace-q457n\" (UID: \"ad834bd9-9f2a-4b16-a79b-4c66429e20f8\") " pod="openshift-marketplace/redhat-marketplace-q457n" Feb 20 09:53:26 crc kubenswrapper[5094]: I0220 09:53:26.689997 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-utilities\") pod \"redhat-marketplace-q457n\" (UID: \"ad834bd9-9f2a-4b16-a79b-4c66429e20f8\") " pod="openshift-marketplace/redhat-marketplace-q457n" Feb 20 09:53:26 crc kubenswrapper[5094]: I0220 09:53:26.791650 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8bnb\" (UniqueName: 
\"kubernetes.io/projected/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-kube-api-access-k8bnb\") pod \"redhat-marketplace-q457n\" (UID: \"ad834bd9-9f2a-4b16-a79b-4c66429e20f8\") " pod="openshift-marketplace/redhat-marketplace-q457n" Feb 20 09:53:26 crc kubenswrapper[5094]: I0220 09:53:26.791843 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-catalog-content\") pod \"redhat-marketplace-q457n\" (UID: \"ad834bd9-9f2a-4b16-a79b-4c66429e20f8\") " pod="openshift-marketplace/redhat-marketplace-q457n" Feb 20 09:53:26 crc kubenswrapper[5094]: I0220 09:53:26.791923 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-utilities\") pod \"redhat-marketplace-q457n\" (UID: \"ad834bd9-9f2a-4b16-a79b-4c66429e20f8\") " pod="openshift-marketplace/redhat-marketplace-q457n" Feb 20 09:53:26 crc kubenswrapper[5094]: I0220 09:53:26.792639 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-utilities\") pod \"redhat-marketplace-q457n\" (UID: \"ad834bd9-9f2a-4b16-a79b-4c66429e20f8\") " pod="openshift-marketplace/redhat-marketplace-q457n" Feb 20 09:53:26 crc kubenswrapper[5094]: I0220 09:53:26.793277 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-catalog-content\") pod \"redhat-marketplace-q457n\" (UID: \"ad834bd9-9f2a-4b16-a79b-4c66429e20f8\") " pod="openshift-marketplace/redhat-marketplace-q457n" Feb 20 09:53:26 crc kubenswrapper[5094]: I0220 09:53:26.811962 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8bnb\" (UniqueName: 
\"kubernetes.io/projected/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-kube-api-access-k8bnb\") pod \"redhat-marketplace-q457n\" (UID: \"ad834bd9-9f2a-4b16-a79b-4c66429e20f8\") " pod="openshift-marketplace/redhat-marketplace-q457n" Feb 20 09:53:27 crc kubenswrapper[5094]: I0220 09:53:27.010676 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q457n" Feb 20 09:53:27 crc kubenswrapper[5094]: I0220 09:53:27.510112 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q457n"] Feb 20 09:53:27 crc kubenswrapper[5094]: I0220 09:53:27.775240 5094 generic.go:334] "Generic (PLEG): container finished" podID="ad834bd9-9f2a-4b16-a79b-4c66429e20f8" containerID="c16488e5fb16ae96037e74b2aae3b51fdba4cbc02b6de899da1a24c12e71f63d" exitCode=0 Feb 20 09:53:27 crc kubenswrapper[5094]: I0220 09:53:27.775299 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q457n" event={"ID":"ad834bd9-9f2a-4b16-a79b-4c66429e20f8","Type":"ContainerDied","Data":"c16488e5fb16ae96037e74b2aae3b51fdba4cbc02b6de899da1a24c12e71f63d"} Feb 20 09:53:27 crc kubenswrapper[5094]: I0220 09:53:27.775595 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q457n" event={"ID":"ad834bd9-9f2a-4b16-a79b-4c66429e20f8","Type":"ContainerStarted","Data":"face48fb76ab2bf77d89b9d776328f22131f9e1aaa8d0edf5235ddc072c31877"} Feb 20 09:53:28 crc kubenswrapper[5094]: I0220 09:53:28.786492 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q457n" event={"ID":"ad834bd9-9f2a-4b16-a79b-4c66429e20f8","Type":"ContainerStarted","Data":"755c2c518e2215405d0efe3a4a63d2ff18c5e12fcc9943b8705da558428ab6c0"} Feb 20 09:53:29 crc kubenswrapper[5094]: I0220 09:53:29.798640 5094 generic.go:334] "Generic (PLEG): container finished" podID="ad834bd9-9f2a-4b16-a79b-4c66429e20f8" 
containerID="755c2c518e2215405d0efe3a4a63d2ff18c5e12fcc9943b8705da558428ab6c0" exitCode=0 Feb 20 09:53:29 crc kubenswrapper[5094]: I0220 09:53:29.798691 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q457n" event={"ID":"ad834bd9-9f2a-4b16-a79b-4c66429e20f8","Type":"ContainerDied","Data":"755c2c518e2215405d0efe3a4a63d2ff18c5e12fcc9943b8705da558428ab6c0"} Feb 20 09:53:31 crc kubenswrapper[5094]: I0220 09:53:31.824219 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q457n" event={"ID":"ad834bd9-9f2a-4b16-a79b-4c66429e20f8","Type":"ContainerStarted","Data":"7288ca6871af091c0022ad4194db65a5e5e37bd8f3422e075c3f3372d2df6ee3"} Feb 20 09:53:31 crc kubenswrapper[5094]: I0220 09:53:31.879108 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q457n" podStartSLOduration=2.4457314119999998 podStartE2EDuration="5.879064472s" podCreationTimestamp="2026-02-20 09:53:26 +0000 UTC" firstStartedPulling="2026-02-20 09:53:27.777445747 +0000 UTC m=+11222.650072448" lastFinishedPulling="2026-02-20 09:53:31.210778797 +0000 UTC m=+11226.083405508" observedRunningTime="2026-02-20 09:53:31.856193936 +0000 UTC m=+11226.728820667" watchObservedRunningTime="2026-02-20 09:53:31.879064472 +0000 UTC m=+11226.751691183" Feb 20 09:53:37 crc kubenswrapper[5094]: I0220 09:53:37.011391 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q457n" Feb 20 09:53:37 crc kubenswrapper[5094]: I0220 09:53:37.011983 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q457n" Feb 20 09:53:37 crc kubenswrapper[5094]: I0220 09:53:37.075155 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q457n" Feb 20 09:53:37 crc kubenswrapper[5094]: I0220 
09:53:37.943017 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q457n" Feb 20 09:53:38 crc kubenswrapper[5094]: I0220 09:53:38.009866 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q457n"] Feb 20 09:53:38 crc kubenswrapper[5094]: I0220 09:53:38.840096 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:53:38 crc kubenswrapper[5094]: E0220 09:53:38.840591 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:53:39 crc kubenswrapper[5094]: I0220 09:53:39.908761 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q457n" podUID="ad834bd9-9f2a-4b16-a79b-4c66429e20f8" containerName="registry-server" containerID="cri-o://7288ca6871af091c0022ad4194db65a5e5e37bd8f3422e075c3f3372d2df6ee3" gracePeriod=2 Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.635162 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q457n" Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.784665 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-utilities\") pod \"ad834bd9-9f2a-4b16-a79b-4c66429e20f8\" (UID: \"ad834bd9-9f2a-4b16-a79b-4c66429e20f8\") " Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.784752 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-catalog-content\") pod \"ad834bd9-9f2a-4b16-a79b-4c66429e20f8\" (UID: \"ad834bd9-9f2a-4b16-a79b-4c66429e20f8\") " Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.784835 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8bnb\" (UniqueName: \"kubernetes.io/projected/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-kube-api-access-k8bnb\") pod \"ad834bd9-9f2a-4b16-a79b-4c66429e20f8\" (UID: \"ad834bd9-9f2a-4b16-a79b-4c66429e20f8\") " Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.787188 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-utilities" (OuterVolumeSpecName: "utilities") pod "ad834bd9-9f2a-4b16-a79b-4c66429e20f8" (UID: "ad834bd9-9f2a-4b16-a79b-4c66429e20f8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.801015 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-kube-api-access-k8bnb" (OuterVolumeSpecName: "kube-api-access-k8bnb") pod "ad834bd9-9f2a-4b16-a79b-4c66429e20f8" (UID: "ad834bd9-9f2a-4b16-a79b-4c66429e20f8"). InnerVolumeSpecName "kube-api-access-k8bnb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.808794 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ad834bd9-9f2a-4b16-a79b-4c66429e20f8" (UID: "ad834bd9-9f2a-4b16-a79b-4c66429e20f8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.886940 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8bnb\" (UniqueName: \"kubernetes.io/projected/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-kube-api-access-k8bnb\") on node \"crc\" DevicePath \"\"" Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.886979 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.886992 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ad834bd9-9f2a-4b16-a79b-4c66429e20f8-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.922676 5094 generic.go:334] "Generic (PLEG): container finished" podID="ad834bd9-9f2a-4b16-a79b-4c66429e20f8" containerID="7288ca6871af091c0022ad4194db65a5e5e37bd8f3422e075c3f3372d2df6ee3" exitCode=0 Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.922740 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q457n" event={"ID":"ad834bd9-9f2a-4b16-a79b-4c66429e20f8","Type":"ContainerDied","Data":"7288ca6871af091c0022ad4194db65a5e5e37bd8f3422e075c3f3372d2df6ee3"} Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.922777 5094 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q457n" Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.922803 5094 scope.go:117] "RemoveContainer" containerID="7288ca6871af091c0022ad4194db65a5e5e37bd8f3422e075c3f3372d2df6ee3" Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.922790 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q457n" event={"ID":"ad834bd9-9f2a-4b16-a79b-4c66429e20f8","Type":"ContainerDied","Data":"face48fb76ab2bf77d89b9d776328f22131f9e1aaa8d0edf5235ddc072c31877"} Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.946446 5094 scope.go:117] "RemoveContainer" containerID="755c2c518e2215405d0efe3a4a63d2ff18c5e12fcc9943b8705da558428ab6c0" Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.960565 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q457n"] Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.970739 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q457n"] Feb 20 09:53:40 crc kubenswrapper[5094]: I0220 09:53:40.977414 5094 scope.go:117] "RemoveContainer" containerID="c16488e5fb16ae96037e74b2aae3b51fdba4cbc02b6de899da1a24c12e71f63d" Feb 20 09:53:41 crc kubenswrapper[5094]: I0220 09:53:41.031657 5094 scope.go:117] "RemoveContainer" containerID="7288ca6871af091c0022ad4194db65a5e5e37bd8f3422e075c3f3372d2df6ee3" Feb 20 09:53:41 crc kubenswrapper[5094]: E0220 09:53:41.032285 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7288ca6871af091c0022ad4194db65a5e5e37bd8f3422e075c3f3372d2df6ee3\": container with ID starting with 7288ca6871af091c0022ad4194db65a5e5e37bd8f3422e075c3f3372d2df6ee3 not found: ID does not exist" containerID="7288ca6871af091c0022ad4194db65a5e5e37bd8f3422e075c3f3372d2df6ee3" Feb 20 09:53:41 crc kubenswrapper[5094]: I0220 09:53:41.032322 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7288ca6871af091c0022ad4194db65a5e5e37bd8f3422e075c3f3372d2df6ee3"} err="failed to get container status \"7288ca6871af091c0022ad4194db65a5e5e37bd8f3422e075c3f3372d2df6ee3\": rpc error: code = NotFound desc = could not find container \"7288ca6871af091c0022ad4194db65a5e5e37bd8f3422e075c3f3372d2df6ee3\": container with ID starting with 7288ca6871af091c0022ad4194db65a5e5e37bd8f3422e075c3f3372d2df6ee3 not found: ID does not exist" Feb 20 09:53:41 crc kubenswrapper[5094]: I0220 09:53:41.032342 5094 scope.go:117] "RemoveContainer" containerID="755c2c518e2215405d0efe3a4a63d2ff18c5e12fcc9943b8705da558428ab6c0" Feb 20 09:53:41 crc kubenswrapper[5094]: E0220 09:53:41.032641 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"755c2c518e2215405d0efe3a4a63d2ff18c5e12fcc9943b8705da558428ab6c0\": container with ID starting with 755c2c518e2215405d0efe3a4a63d2ff18c5e12fcc9943b8705da558428ab6c0 not found: ID does not exist" containerID="755c2c518e2215405d0efe3a4a63d2ff18c5e12fcc9943b8705da558428ab6c0" Feb 20 09:53:41 crc kubenswrapper[5094]: I0220 09:53:41.032662 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"755c2c518e2215405d0efe3a4a63d2ff18c5e12fcc9943b8705da558428ab6c0"} err="failed to get container status \"755c2c518e2215405d0efe3a4a63d2ff18c5e12fcc9943b8705da558428ab6c0\": rpc error: code = NotFound desc = could not find container \"755c2c518e2215405d0efe3a4a63d2ff18c5e12fcc9943b8705da558428ab6c0\": container with ID starting with 755c2c518e2215405d0efe3a4a63d2ff18c5e12fcc9943b8705da558428ab6c0 not found: ID does not exist" Feb 20 09:53:41 crc kubenswrapper[5094]: I0220 09:53:41.032675 5094 scope.go:117] "RemoveContainer" containerID="c16488e5fb16ae96037e74b2aae3b51fdba4cbc02b6de899da1a24c12e71f63d" Feb 20 09:53:41 crc kubenswrapper[5094]: E0220 
09:53:41.033265 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c16488e5fb16ae96037e74b2aae3b51fdba4cbc02b6de899da1a24c12e71f63d\": container with ID starting with c16488e5fb16ae96037e74b2aae3b51fdba4cbc02b6de899da1a24c12e71f63d not found: ID does not exist" containerID="c16488e5fb16ae96037e74b2aae3b51fdba4cbc02b6de899da1a24c12e71f63d" Feb 20 09:53:41 crc kubenswrapper[5094]: I0220 09:53:41.033300 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c16488e5fb16ae96037e74b2aae3b51fdba4cbc02b6de899da1a24c12e71f63d"} err="failed to get container status \"c16488e5fb16ae96037e74b2aae3b51fdba4cbc02b6de899da1a24c12e71f63d\": rpc error: code = NotFound desc = could not find container \"c16488e5fb16ae96037e74b2aae3b51fdba4cbc02b6de899da1a24c12e71f63d\": container with ID starting with c16488e5fb16ae96037e74b2aae3b51fdba4cbc02b6de899da1a24c12e71f63d not found: ID does not exist" Feb 20 09:53:41 crc kubenswrapper[5094]: I0220 09:53:41.850780 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad834bd9-9f2a-4b16-a79b-4c66429e20f8" path="/var/lib/kubelet/pods/ad834bd9-9f2a-4b16-a79b-4c66429e20f8/volumes" Feb 20 09:53:51 crc kubenswrapper[5094]: I0220 09:53:51.839560 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:53:51 crc kubenswrapper[5094]: E0220 09:53:51.840266 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:54:06 crc kubenswrapper[5094]: I0220 09:54:06.840126 
5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:54:06 crc kubenswrapper[5094]: E0220 09:54:06.841177 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:54:17 crc kubenswrapper[5094]: I0220 09:54:17.840512 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:54:17 crc kubenswrapper[5094]: E0220 09:54:17.841404 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:54:29 crc kubenswrapper[5094]: I0220 09:54:29.840456 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:54:29 crc kubenswrapper[5094]: E0220 09:54:29.841335 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:54:40 crc kubenswrapper[5094]: I0220 
09:54:40.841053 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:54:40 crc kubenswrapper[5094]: E0220 09:54:40.842091 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.766484 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-z8kfp"] Feb 20 09:54:44 crc kubenswrapper[5094]: E0220 09:54:44.767411 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad834bd9-9f2a-4b16-a79b-4c66429e20f8" containerName="registry-server" Feb 20 09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.767423 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad834bd9-9f2a-4b16-a79b-4c66429e20f8" containerName="registry-server" Feb 20 09:54:44 crc kubenswrapper[5094]: E0220 09:54:44.767433 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad834bd9-9f2a-4b16-a79b-4c66429e20f8" containerName="extract-utilities" Feb 20 09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.767440 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad834bd9-9f2a-4b16-a79b-4c66429e20f8" containerName="extract-utilities" Feb 20 09:54:44 crc kubenswrapper[5094]: E0220 09:54:44.767468 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad834bd9-9f2a-4b16-a79b-4c66429e20f8" containerName="extract-content" Feb 20 09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.767475 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad834bd9-9f2a-4b16-a79b-4c66429e20f8" containerName="extract-content" Feb 20 
09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.767681 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad834bd9-9f2a-4b16-a79b-4c66429e20f8" containerName="registry-server" Feb 20 09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.769152 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.782689 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z8kfp"] Feb 20 09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.836186 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95eeef2c-7672-4fd2-a004-9b285e87b509-utilities\") pod \"redhat-operators-z8kfp\" (UID: \"95eeef2c-7672-4fd2-a004-9b285e87b509\") " pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.836260 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcfmg\" (UniqueName: \"kubernetes.io/projected/95eeef2c-7672-4fd2-a004-9b285e87b509-kube-api-access-wcfmg\") pod \"redhat-operators-z8kfp\" (UID: \"95eeef2c-7672-4fd2-a004-9b285e87b509\") " pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.836366 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95eeef2c-7672-4fd2-a004-9b285e87b509-catalog-content\") pod \"redhat-operators-z8kfp\" (UID: \"95eeef2c-7672-4fd2-a004-9b285e87b509\") " pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.937857 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/95eeef2c-7672-4fd2-a004-9b285e87b509-utilities\") pod \"redhat-operators-z8kfp\" (UID: \"95eeef2c-7672-4fd2-a004-9b285e87b509\") " pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.937929 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcfmg\" (UniqueName: \"kubernetes.io/projected/95eeef2c-7672-4fd2-a004-9b285e87b509-kube-api-access-wcfmg\") pod \"redhat-operators-z8kfp\" (UID: \"95eeef2c-7672-4fd2-a004-9b285e87b509\") " pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.938016 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95eeef2c-7672-4fd2-a004-9b285e87b509-catalog-content\") pod \"redhat-operators-z8kfp\" (UID: \"95eeef2c-7672-4fd2-a004-9b285e87b509\") " pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.938373 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95eeef2c-7672-4fd2-a004-9b285e87b509-utilities\") pod \"redhat-operators-z8kfp\" (UID: \"95eeef2c-7672-4fd2-a004-9b285e87b509\") " pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.938443 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95eeef2c-7672-4fd2-a004-9b285e87b509-catalog-content\") pod \"redhat-operators-z8kfp\" (UID: \"95eeef2c-7672-4fd2-a004-9b285e87b509\") " pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:54:44 crc kubenswrapper[5094]: I0220 09:54:44.963622 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcfmg\" (UniqueName: 
\"kubernetes.io/projected/95eeef2c-7672-4fd2-a004-9b285e87b509-kube-api-access-wcfmg\") pod \"redhat-operators-z8kfp\" (UID: \"95eeef2c-7672-4fd2-a004-9b285e87b509\") " pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:54:45 crc kubenswrapper[5094]: I0220 09:54:45.102073 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:54:45 crc kubenswrapper[5094]: I0220 09:54:45.673546 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-z8kfp"] Feb 20 09:54:45 crc kubenswrapper[5094]: I0220 09:54:45.856983 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z8kfp" event={"ID":"95eeef2c-7672-4fd2-a004-9b285e87b509","Type":"ContainerStarted","Data":"cb0f0380618df27adf79ad832178308b9549c7e55f77907064673108fdec1afe"} Feb 20 09:54:46 crc kubenswrapper[5094]: I0220 09:54:46.871163 5094 generic.go:334] "Generic (PLEG): container finished" podID="95eeef2c-7672-4fd2-a004-9b285e87b509" containerID="733b415eecb17e443b2a977aa65d667050a0186d232caf045539d883be4a2e35" exitCode=0 Feb 20 09:54:46 crc kubenswrapper[5094]: I0220 09:54:46.871248 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z8kfp" event={"ID":"95eeef2c-7672-4fd2-a004-9b285e87b509","Type":"ContainerDied","Data":"733b415eecb17e443b2a977aa65d667050a0186d232caf045539d883be4a2e35"} Feb 20 09:54:46 crc kubenswrapper[5094]: I0220 09:54:46.877648 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 09:54:47 crc kubenswrapper[5094]: I0220 09:54:47.884899 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z8kfp" event={"ID":"95eeef2c-7672-4fd2-a004-9b285e87b509","Type":"ContainerStarted","Data":"4ec1be828492e4f64e6a996ef9a73b351bfe9297664ce17a592b38d797f83ccb"} Feb 20 09:54:51 crc 
kubenswrapper[5094]: I0220 09:54:51.841003 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:54:51 crc kubenswrapper[5094]: E0220 09:54:51.841947 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:54:52 crc kubenswrapper[5094]: I0220 09:54:52.936686 5094 generic.go:334] "Generic (PLEG): container finished" podID="95eeef2c-7672-4fd2-a004-9b285e87b509" containerID="4ec1be828492e4f64e6a996ef9a73b351bfe9297664ce17a592b38d797f83ccb" exitCode=0 Feb 20 09:54:52 crc kubenswrapper[5094]: I0220 09:54:52.936746 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z8kfp" event={"ID":"95eeef2c-7672-4fd2-a004-9b285e87b509","Type":"ContainerDied","Data":"4ec1be828492e4f64e6a996ef9a73b351bfe9297664ce17a592b38d797f83ccb"} Feb 20 09:54:53 crc kubenswrapper[5094]: I0220 09:54:53.950096 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z8kfp" event={"ID":"95eeef2c-7672-4fd2-a004-9b285e87b509","Type":"ContainerStarted","Data":"6fbd7c0cf87af4612d1030e214d2b13aaa3b54c21c11da74a8bdc769bfa9206f"} Feb 20 09:54:53 crc kubenswrapper[5094]: I0220 09:54:53.976922 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-z8kfp" podStartSLOduration=3.508961127 podStartE2EDuration="9.976899807s" podCreationTimestamp="2026-02-20 09:54:44 +0000 UTC" firstStartedPulling="2026-02-20 09:54:46.877447393 +0000 UTC m=+11301.750074104" lastFinishedPulling="2026-02-20 09:54:53.345386073 +0000 UTC 
m=+11308.218012784" observedRunningTime="2026-02-20 09:54:53.966666634 +0000 UTC m=+11308.839293365" watchObservedRunningTime="2026-02-20 09:54:53.976899807 +0000 UTC m=+11308.849526518" Feb 20 09:54:55 crc kubenswrapper[5094]: I0220 09:54:55.102629 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:54:55 crc kubenswrapper[5094]: I0220 09:54:55.102922 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:54:56 crc kubenswrapper[5094]: I0220 09:54:56.158590 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-z8kfp" podUID="95eeef2c-7672-4fd2-a004-9b285e87b509" containerName="registry-server" probeResult="failure" output=< Feb 20 09:54:56 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 09:54:56 crc kubenswrapper[5094]: > Feb 20 09:54:59 crc kubenswrapper[5094]: I0220 09:54:59.575506 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6swkl"] Feb 20 09:54:59 crc kubenswrapper[5094]: I0220 09:54:59.578869 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:54:59 crc kubenswrapper[5094]: I0220 09:54:59.586943 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6swkl"] Feb 20 09:54:59 crc kubenswrapper[5094]: I0220 09:54:59.632378 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-catalog-content\") pod \"certified-operators-6swkl\" (UID: \"5c1bdcc1-da0d-4443-a24b-2f521f7b24db\") " pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:54:59 crc kubenswrapper[5094]: I0220 09:54:59.632515 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-utilities\") pod \"certified-operators-6swkl\" (UID: \"5c1bdcc1-da0d-4443-a24b-2f521f7b24db\") " pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:54:59 crc kubenswrapper[5094]: I0220 09:54:59.632626 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spklh\" (UniqueName: \"kubernetes.io/projected/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-kube-api-access-spklh\") pod \"certified-operators-6swkl\" (UID: \"5c1bdcc1-da0d-4443-a24b-2f521f7b24db\") " pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:54:59 crc kubenswrapper[5094]: I0220 09:54:59.735038 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-catalog-content\") pod \"certified-operators-6swkl\" (UID: \"5c1bdcc1-da0d-4443-a24b-2f521f7b24db\") " pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:54:59 crc kubenswrapper[5094]: I0220 09:54:59.735186 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-utilities\") pod \"certified-operators-6swkl\" (UID: \"5c1bdcc1-da0d-4443-a24b-2f521f7b24db\") " pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:54:59 crc kubenswrapper[5094]: I0220 09:54:59.735276 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spklh\" (UniqueName: \"kubernetes.io/projected/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-kube-api-access-spklh\") pod \"certified-operators-6swkl\" (UID: \"5c1bdcc1-da0d-4443-a24b-2f521f7b24db\") " pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:54:59 crc kubenswrapper[5094]: I0220 09:54:59.735524 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-catalog-content\") pod \"certified-operators-6swkl\" (UID: \"5c1bdcc1-da0d-4443-a24b-2f521f7b24db\") " pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:54:59 crc kubenswrapper[5094]: I0220 09:54:59.735793 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-utilities\") pod \"certified-operators-6swkl\" (UID: \"5c1bdcc1-da0d-4443-a24b-2f521f7b24db\") " pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:54:59 crc kubenswrapper[5094]: I0220 09:54:59.756601 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spklh\" (UniqueName: \"kubernetes.io/projected/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-kube-api-access-spklh\") pod \"certified-operators-6swkl\" (UID: \"5c1bdcc1-da0d-4443-a24b-2f521f7b24db\") " pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:54:59 crc kubenswrapper[5094]: I0220 09:54:59.927843 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:55:00 crc kubenswrapper[5094]: I0220 09:55:00.426413 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6swkl"] Feb 20 09:55:00 crc kubenswrapper[5094]: W0220 09:55:00.426492 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c1bdcc1_da0d_4443_a24b_2f521f7b24db.slice/crio-d7a5896337c0cc6a0e7364a77cf815cb86c8be932da0cbfaf8920552e7e8750c WatchSource:0}: Error finding container d7a5896337c0cc6a0e7364a77cf815cb86c8be932da0cbfaf8920552e7e8750c: Status 404 returned error can't find the container with id d7a5896337c0cc6a0e7364a77cf815cb86c8be932da0cbfaf8920552e7e8750c Feb 20 09:55:01 crc kubenswrapper[5094]: I0220 09:55:01.013430 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6swkl" event={"ID":"5c1bdcc1-da0d-4443-a24b-2f521f7b24db","Type":"ContainerDied","Data":"3f817346f66c47d2c53704651b8c09d5875145007ac55393f4b849a904c6cee8"} Feb 20 09:55:01 crc kubenswrapper[5094]: I0220 09:55:01.013291 5094 generic.go:334] "Generic (PLEG): container finished" podID="5c1bdcc1-da0d-4443-a24b-2f521f7b24db" containerID="3f817346f66c47d2c53704651b8c09d5875145007ac55393f4b849a904c6cee8" exitCode=0 Feb 20 09:55:01 crc kubenswrapper[5094]: I0220 09:55:01.013818 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6swkl" event={"ID":"5c1bdcc1-da0d-4443-a24b-2f521f7b24db","Type":"ContainerStarted","Data":"d7a5896337c0cc6a0e7364a77cf815cb86c8be932da0cbfaf8920552e7e8750c"} Feb 20 09:55:02 crc kubenswrapper[5094]: I0220 09:55:02.025650 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6swkl" 
event={"ID":"5c1bdcc1-da0d-4443-a24b-2f521f7b24db","Type":"ContainerStarted","Data":"15aed613ef93b6809b605945723aa28d2ca2f64f99e1321457be80b59c878827"} Feb 20 09:55:04 crc kubenswrapper[5094]: I0220 09:55:04.046008 5094 generic.go:334] "Generic (PLEG): container finished" podID="5c1bdcc1-da0d-4443-a24b-2f521f7b24db" containerID="15aed613ef93b6809b605945723aa28d2ca2f64f99e1321457be80b59c878827" exitCode=0 Feb 20 09:55:04 crc kubenswrapper[5094]: I0220 09:55:04.046080 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6swkl" event={"ID":"5c1bdcc1-da0d-4443-a24b-2f521f7b24db","Type":"ContainerDied","Data":"15aed613ef93b6809b605945723aa28d2ca2f64f99e1321457be80b59c878827"} Feb 20 09:55:04 crc kubenswrapper[5094]: I0220 09:55:04.840065 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:55:04 crc kubenswrapper[5094]: E0220 09:55:04.840556 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:55:05 crc kubenswrapper[5094]: I0220 09:55:05.056553 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6swkl" event={"ID":"5c1bdcc1-da0d-4443-a24b-2f521f7b24db","Type":"ContainerStarted","Data":"4092720bcfb743e860d3b4bab28abea7806f206dce64d8a4c626c69fb75bac3b"} Feb 20 09:55:05 crc kubenswrapper[5094]: I0220 09:55:05.084055 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6swkl" podStartSLOduration=2.652506004 podStartE2EDuration="6.08403166s" 
podCreationTimestamp="2026-02-20 09:54:59 +0000 UTC" firstStartedPulling="2026-02-20 09:55:01.016109568 +0000 UTC m=+11315.888736279" lastFinishedPulling="2026-02-20 09:55:04.447635224 +0000 UTC m=+11319.320261935" observedRunningTime="2026-02-20 09:55:05.079550819 +0000 UTC m=+11319.952177550" watchObservedRunningTime="2026-02-20 09:55:05.08403166 +0000 UTC m=+11319.956658371" Feb 20 09:55:05 crc kubenswrapper[5094]: I0220 09:55:05.156044 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:55:05 crc kubenswrapper[5094]: I0220 09:55:05.226517 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:55:06 crc kubenswrapper[5094]: I0220 09:55:06.952673 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z8kfp"] Feb 20 09:55:07 crc kubenswrapper[5094]: I0220 09:55:07.072342 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-z8kfp" podUID="95eeef2c-7672-4fd2-a004-9b285e87b509" containerName="registry-server" containerID="cri-o://6fbd7c0cf87af4612d1030e214d2b13aaa3b54c21c11da74a8bdc769bfa9206f" gracePeriod=2 Feb 20 09:55:07 crc kubenswrapper[5094]: I0220 09:55:07.756203 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:55:07 crc kubenswrapper[5094]: I0220 09:55:07.816557 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95eeef2c-7672-4fd2-a004-9b285e87b509-catalog-content\") pod \"95eeef2c-7672-4fd2-a004-9b285e87b509\" (UID: \"95eeef2c-7672-4fd2-a004-9b285e87b509\") " Feb 20 09:55:07 crc kubenswrapper[5094]: I0220 09:55:07.816683 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95eeef2c-7672-4fd2-a004-9b285e87b509-utilities\") pod \"95eeef2c-7672-4fd2-a004-9b285e87b509\" (UID: \"95eeef2c-7672-4fd2-a004-9b285e87b509\") " Feb 20 09:55:07 crc kubenswrapper[5094]: I0220 09:55:07.816827 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcfmg\" (UniqueName: \"kubernetes.io/projected/95eeef2c-7672-4fd2-a004-9b285e87b509-kube-api-access-wcfmg\") pod \"95eeef2c-7672-4fd2-a004-9b285e87b509\" (UID: \"95eeef2c-7672-4fd2-a004-9b285e87b509\") " Feb 20 09:55:07 crc kubenswrapper[5094]: I0220 09:55:07.818172 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95eeef2c-7672-4fd2-a004-9b285e87b509-utilities" (OuterVolumeSpecName: "utilities") pod "95eeef2c-7672-4fd2-a004-9b285e87b509" (UID: "95eeef2c-7672-4fd2-a004-9b285e87b509"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:55:07 crc kubenswrapper[5094]: I0220 09:55:07.835445 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95eeef2c-7672-4fd2-a004-9b285e87b509-kube-api-access-wcfmg" (OuterVolumeSpecName: "kube-api-access-wcfmg") pod "95eeef2c-7672-4fd2-a004-9b285e87b509" (UID: "95eeef2c-7672-4fd2-a004-9b285e87b509"). InnerVolumeSpecName "kube-api-access-wcfmg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:07 crc kubenswrapper[5094]: I0220 09:55:07.919490 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95eeef2c-7672-4fd2-a004-9b285e87b509-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:07 crc kubenswrapper[5094]: I0220 09:55:07.919525 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcfmg\" (UniqueName: \"kubernetes.io/projected/95eeef2c-7672-4fd2-a004-9b285e87b509-kube-api-access-wcfmg\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:07 crc kubenswrapper[5094]: I0220 09:55:07.959383 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95eeef2c-7672-4fd2-a004-9b285e87b509-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "95eeef2c-7672-4fd2-a004-9b285e87b509" (UID: "95eeef2c-7672-4fd2-a004-9b285e87b509"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.021026 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95eeef2c-7672-4fd2-a004-9b285e87b509-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.104458 5094 generic.go:334] "Generic (PLEG): container finished" podID="95eeef2c-7672-4fd2-a004-9b285e87b509" containerID="6fbd7c0cf87af4612d1030e214d2b13aaa3b54c21c11da74a8bdc769bfa9206f" exitCode=0 Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.104514 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-z8kfp" event={"ID":"95eeef2c-7672-4fd2-a004-9b285e87b509","Type":"ContainerDied","Data":"6fbd7c0cf87af4612d1030e214d2b13aaa3b54c21c11da74a8bdc769bfa9206f"} Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.104547 5094 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-z8kfp" event={"ID":"95eeef2c-7672-4fd2-a004-9b285e87b509","Type":"ContainerDied","Data":"cb0f0380618df27adf79ad832178308b9549c7e55f77907064673108fdec1afe"} Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.104581 5094 scope.go:117] "RemoveContainer" containerID="6fbd7c0cf87af4612d1030e214d2b13aaa3b54c21c11da74a8bdc769bfa9206f" Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.104640 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-z8kfp" Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.142583 5094 scope.go:117] "RemoveContainer" containerID="4ec1be828492e4f64e6a996ef9a73b351bfe9297664ce17a592b38d797f83ccb" Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.144927 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-z8kfp"] Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.159689 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-z8kfp"] Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.179274 5094 scope.go:117] "RemoveContainer" containerID="733b415eecb17e443b2a977aa65d667050a0186d232caf045539d883be4a2e35" Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.227925 5094 scope.go:117] "RemoveContainer" containerID="6fbd7c0cf87af4612d1030e214d2b13aaa3b54c21c11da74a8bdc769bfa9206f" Feb 20 09:55:08 crc kubenswrapper[5094]: E0220 09:55:08.229302 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fbd7c0cf87af4612d1030e214d2b13aaa3b54c21c11da74a8bdc769bfa9206f\": container with ID starting with 6fbd7c0cf87af4612d1030e214d2b13aaa3b54c21c11da74a8bdc769bfa9206f not found: ID does not exist" containerID="6fbd7c0cf87af4612d1030e214d2b13aaa3b54c21c11da74a8bdc769bfa9206f" Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.229610 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fbd7c0cf87af4612d1030e214d2b13aaa3b54c21c11da74a8bdc769bfa9206f"} err="failed to get container status \"6fbd7c0cf87af4612d1030e214d2b13aaa3b54c21c11da74a8bdc769bfa9206f\": rpc error: code = NotFound desc = could not find container \"6fbd7c0cf87af4612d1030e214d2b13aaa3b54c21c11da74a8bdc769bfa9206f\": container with ID starting with 6fbd7c0cf87af4612d1030e214d2b13aaa3b54c21c11da74a8bdc769bfa9206f not found: ID does not exist" Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.229636 5094 scope.go:117] "RemoveContainer" containerID="4ec1be828492e4f64e6a996ef9a73b351bfe9297664ce17a592b38d797f83ccb" Feb 20 09:55:08 crc kubenswrapper[5094]: E0220 09:55:08.229991 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ec1be828492e4f64e6a996ef9a73b351bfe9297664ce17a592b38d797f83ccb\": container with ID starting with 4ec1be828492e4f64e6a996ef9a73b351bfe9297664ce17a592b38d797f83ccb not found: ID does not exist" containerID="4ec1be828492e4f64e6a996ef9a73b351bfe9297664ce17a592b38d797f83ccb" Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.230028 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ec1be828492e4f64e6a996ef9a73b351bfe9297664ce17a592b38d797f83ccb"} err="failed to get container status \"4ec1be828492e4f64e6a996ef9a73b351bfe9297664ce17a592b38d797f83ccb\": rpc error: code = NotFound desc = could not find container \"4ec1be828492e4f64e6a996ef9a73b351bfe9297664ce17a592b38d797f83ccb\": container with ID starting with 4ec1be828492e4f64e6a996ef9a73b351bfe9297664ce17a592b38d797f83ccb not found: ID does not exist" Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.230053 5094 scope.go:117] "RemoveContainer" containerID="733b415eecb17e443b2a977aa65d667050a0186d232caf045539d883be4a2e35" Feb 20 09:55:08 crc kubenswrapper[5094]: E0220 
09:55:08.233301 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"733b415eecb17e443b2a977aa65d667050a0186d232caf045539d883be4a2e35\": container with ID starting with 733b415eecb17e443b2a977aa65d667050a0186d232caf045539d883be4a2e35 not found: ID does not exist" containerID="733b415eecb17e443b2a977aa65d667050a0186d232caf045539d883be4a2e35" Feb 20 09:55:08 crc kubenswrapper[5094]: I0220 09:55:08.233326 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"733b415eecb17e443b2a977aa65d667050a0186d232caf045539d883be4a2e35"} err="failed to get container status \"733b415eecb17e443b2a977aa65d667050a0186d232caf045539d883be4a2e35\": rpc error: code = NotFound desc = could not find container \"733b415eecb17e443b2a977aa65d667050a0186d232caf045539d883be4a2e35\": container with ID starting with 733b415eecb17e443b2a977aa65d667050a0186d232caf045539d883be4a2e35 not found: ID does not exist" Feb 20 09:55:08 crc kubenswrapper[5094]: E0220 09:55:08.282249 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95eeef2c_7672_4fd2_a004_9b285e87b509.slice/crio-cb0f0380618df27adf79ad832178308b9549c7e55f77907064673108fdec1afe\": RecentStats: unable to find data in memory cache]" Feb 20 09:55:09 crc kubenswrapper[5094]: I0220 09:55:09.850790 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95eeef2c-7672-4fd2-a004-9b285e87b509" path="/var/lib/kubelet/pods/95eeef2c-7672-4fd2-a004-9b285e87b509/volumes" Feb 20 09:55:09 crc kubenswrapper[5094]: I0220 09:55:09.928587 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:55:09 crc kubenswrapper[5094]: I0220 09:55:09.928633 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:55:09 crc kubenswrapper[5094]: I0220 09:55:09.982952 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:55:10 crc kubenswrapper[5094]: I0220 09:55:10.185098 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:55:11 crc kubenswrapper[5094]: I0220 09:55:11.151399 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6swkl"] Feb 20 09:55:12 crc kubenswrapper[5094]: I0220 09:55:12.145114 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6swkl" podUID="5c1bdcc1-da0d-4443-a24b-2f521f7b24db" containerName="registry-server" containerID="cri-o://4092720bcfb743e860d3b4bab28abea7806f206dce64d8a4c626c69fb75bac3b" gracePeriod=2 Feb 20 09:55:12 crc kubenswrapper[5094]: I0220 09:55:12.835528 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:55:12 crc kubenswrapper[5094]: I0220 09:55:12.923896 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-catalog-content\") pod \"5c1bdcc1-da0d-4443-a24b-2f521f7b24db\" (UID: \"5c1bdcc1-da0d-4443-a24b-2f521f7b24db\") " Feb 20 09:55:12 crc kubenswrapper[5094]: I0220 09:55:12.924079 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spklh\" (UniqueName: \"kubernetes.io/projected/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-kube-api-access-spklh\") pod \"5c1bdcc1-da0d-4443-a24b-2f521f7b24db\" (UID: \"5c1bdcc1-da0d-4443-a24b-2f521f7b24db\") " Feb 20 09:55:12 crc kubenswrapper[5094]: I0220 09:55:12.924238 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-utilities\") pod \"5c1bdcc1-da0d-4443-a24b-2f521f7b24db\" (UID: \"5c1bdcc1-da0d-4443-a24b-2f521f7b24db\") " Feb 20 09:55:12 crc kubenswrapper[5094]: I0220 09:55:12.925057 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-utilities" (OuterVolumeSpecName: "utilities") pod "5c1bdcc1-da0d-4443-a24b-2f521f7b24db" (UID: "5c1bdcc1-da0d-4443-a24b-2f521f7b24db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:55:12 crc kubenswrapper[5094]: I0220 09:55:12.930514 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-kube-api-access-spklh" (OuterVolumeSpecName: "kube-api-access-spklh") pod "5c1bdcc1-da0d-4443-a24b-2f521f7b24db" (UID: "5c1bdcc1-da0d-4443-a24b-2f521f7b24db"). InnerVolumeSpecName "kube-api-access-spklh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.001443 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c1bdcc1-da0d-4443-a24b-2f521f7b24db" (UID: "5c1bdcc1-da0d-4443-a24b-2f521f7b24db"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.025891 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.025926 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.025940 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spklh\" (UniqueName: \"kubernetes.io/projected/5c1bdcc1-da0d-4443-a24b-2f521f7b24db-kube-api-access-spklh\") on node \"crc\" DevicePath \"\"" Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.154247 5094 generic.go:334] "Generic (PLEG): container finished" podID="5c1bdcc1-da0d-4443-a24b-2f521f7b24db" containerID="4092720bcfb743e860d3b4bab28abea7806f206dce64d8a4c626c69fb75bac3b" exitCode=0 Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.154291 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6swkl" event={"ID":"5c1bdcc1-da0d-4443-a24b-2f521f7b24db","Type":"ContainerDied","Data":"4092720bcfb743e860d3b4bab28abea7806f206dce64d8a4c626c69fb75bac3b"} Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.154325 5094 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-6swkl" event={"ID":"5c1bdcc1-da0d-4443-a24b-2f521f7b24db","Type":"ContainerDied","Data":"d7a5896337c0cc6a0e7364a77cf815cb86c8be932da0cbfaf8920552e7e8750c"} Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.154346 5094 scope.go:117] "RemoveContainer" containerID="4092720bcfb743e860d3b4bab28abea7806f206dce64d8a4c626c69fb75bac3b" Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.154339 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6swkl" Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.182359 5094 scope.go:117] "RemoveContainer" containerID="15aed613ef93b6809b605945723aa28d2ca2f64f99e1321457be80b59c878827" Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.190595 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6swkl"] Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.199791 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6swkl"] Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.208858 5094 scope.go:117] "RemoveContainer" containerID="3f817346f66c47d2c53704651b8c09d5875145007ac55393f4b849a904c6cee8" Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.259983 5094 scope.go:117] "RemoveContainer" containerID="4092720bcfb743e860d3b4bab28abea7806f206dce64d8a4c626c69fb75bac3b" Feb 20 09:55:13 crc kubenswrapper[5094]: E0220 09:55:13.260564 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4092720bcfb743e860d3b4bab28abea7806f206dce64d8a4c626c69fb75bac3b\": container with ID starting with 4092720bcfb743e860d3b4bab28abea7806f206dce64d8a4c626c69fb75bac3b not found: ID does not exist" containerID="4092720bcfb743e860d3b4bab28abea7806f206dce64d8a4c626c69fb75bac3b" Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 
09:55:13.260590 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4092720bcfb743e860d3b4bab28abea7806f206dce64d8a4c626c69fb75bac3b"} err="failed to get container status \"4092720bcfb743e860d3b4bab28abea7806f206dce64d8a4c626c69fb75bac3b\": rpc error: code = NotFound desc = could not find container \"4092720bcfb743e860d3b4bab28abea7806f206dce64d8a4c626c69fb75bac3b\": container with ID starting with 4092720bcfb743e860d3b4bab28abea7806f206dce64d8a4c626c69fb75bac3b not found: ID does not exist" Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.260610 5094 scope.go:117] "RemoveContainer" containerID="15aed613ef93b6809b605945723aa28d2ca2f64f99e1321457be80b59c878827" Feb 20 09:55:13 crc kubenswrapper[5094]: E0220 09:55:13.261284 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15aed613ef93b6809b605945723aa28d2ca2f64f99e1321457be80b59c878827\": container with ID starting with 15aed613ef93b6809b605945723aa28d2ca2f64f99e1321457be80b59c878827 not found: ID does not exist" containerID="15aed613ef93b6809b605945723aa28d2ca2f64f99e1321457be80b59c878827" Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.261487 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15aed613ef93b6809b605945723aa28d2ca2f64f99e1321457be80b59c878827"} err="failed to get container status \"15aed613ef93b6809b605945723aa28d2ca2f64f99e1321457be80b59c878827\": rpc error: code = NotFound desc = could not find container \"15aed613ef93b6809b605945723aa28d2ca2f64f99e1321457be80b59c878827\": container with ID starting with 15aed613ef93b6809b605945723aa28d2ca2f64f99e1321457be80b59c878827 not found: ID does not exist" Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.261585 5094 scope.go:117] "RemoveContainer" containerID="3f817346f66c47d2c53704651b8c09d5875145007ac55393f4b849a904c6cee8" Feb 20 09:55:13 crc 
kubenswrapper[5094]: E0220 09:55:13.265809 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f817346f66c47d2c53704651b8c09d5875145007ac55393f4b849a904c6cee8\": container with ID starting with 3f817346f66c47d2c53704651b8c09d5875145007ac55393f4b849a904c6cee8 not found: ID does not exist" containerID="3f817346f66c47d2c53704651b8c09d5875145007ac55393f4b849a904c6cee8" Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.265841 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f817346f66c47d2c53704651b8c09d5875145007ac55393f4b849a904c6cee8"} err="failed to get container status \"3f817346f66c47d2c53704651b8c09d5875145007ac55393f4b849a904c6cee8\": rpc error: code = NotFound desc = could not find container \"3f817346f66c47d2c53704651b8c09d5875145007ac55393f4b849a904c6cee8\": container with ID starting with 3f817346f66c47d2c53704651b8c09d5875145007ac55393f4b849a904c6cee8 not found: ID does not exist" Feb 20 09:55:13 crc kubenswrapper[5094]: I0220 09:55:13.851823 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c1bdcc1-da0d-4443-a24b-2f521f7b24db" path="/var/lib/kubelet/pods/5c1bdcc1-da0d-4443-a24b-2f521f7b24db/volumes" Feb 20 09:55:17 crc kubenswrapper[5094]: I0220 09:55:17.840621 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:55:17 crc kubenswrapper[5094]: E0220 09:55:17.841395 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:55:28 crc 
kubenswrapper[5094]: I0220 09:55:28.840850 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:55:28 crc kubenswrapper[5094]: E0220 09:55:28.841558 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:55:40 crc kubenswrapper[5094]: I0220 09:55:40.841425 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:55:40 crc kubenswrapper[5094]: E0220 09:55:40.842152 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:55:51 crc kubenswrapper[5094]: I0220 09:55:51.840156 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:55:51 crc kubenswrapper[5094]: E0220 09:55:51.841032 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 
20 09:56:06 crc kubenswrapper[5094]: I0220 09:56:06.840466 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:56:06 crc kubenswrapper[5094]: E0220 09:56:06.841381 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:56:21 crc kubenswrapper[5094]: I0220 09:56:21.841016 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:56:21 crc kubenswrapper[5094]: E0220 09:56:21.851529 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:56:35 crc kubenswrapper[5094]: I0220 09:56:35.851788 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:56:35 crc kubenswrapper[5094]: E0220 09:56:35.852669 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:56:49 crc kubenswrapper[5094]: I0220 09:56:49.840860 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:56:49 crc kubenswrapper[5094]: E0220 09:56:49.841684 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:57:03 crc kubenswrapper[5094]: I0220 09:57:03.841112 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:57:03 crc kubenswrapper[5094]: E0220 09:57:03.841917 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:57:15 crc kubenswrapper[5094]: I0220 09:57:15.846831 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:57:15 crc kubenswrapper[5094]: E0220 09:57:15.847608 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:57:29 crc kubenswrapper[5094]: I0220 09:57:29.840265 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:57:29 crc kubenswrapper[5094]: E0220 09:57:29.840998 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 09:57:40 crc kubenswrapper[5094]: I0220 09:57:40.840929 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 09:57:41 crc kubenswrapper[5094]: I0220 09:57:41.613549 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"93e51fc131dd81ce33232a8e4dd68a98c48acfa4e4dbb0b00dd430da14bb99d1"} Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.206547 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56"] Feb 20 10:00:00 crc kubenswrapper[5094]: E0220 10:00:00.207550 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c1bdcc1-da0d-4443-a24b-2f521f7b24db" containerName="registry-server" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.207563 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c1bdcc1-da0d-4443-a24b-2f521f7b24db" containerName="registry-server" Feb 20 10:00:00 crc kubenswrapper[5094]: E0220 10:00:00.207584 5094 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="95eeef2c-7672-4fd2-a004-9b285e87b509" containerName="extract-utilities" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.207591 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="95eeef2c-7672-4fd2-a004-9b285e87b509" containerName="extract-utilities" Feb 20 10:00:00 crc kubenswrapper[5094]: E0220 10:00:00.207607 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95eeef2c-7672-4fd2-a004-9b285e87b509" containerName="registry-server" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.207614 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="95eeef2c-7672-4fd2-a004-9b285e87b509" containerName="registry-server" Feb 20 10:00:00 crc kubenswrapper[5094]: E0220 10:00:00.207627 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c1bdcc1-da0d-4443-a24b-2f521f7b24db" containerName="extract-utilities" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.207633 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c1bdcc1-da0d-4443-a24b-2f521f7b24db" containerName="extract-utilities" Feb 20 10:00:00 crc kubenswrapper[5094]: E0220 10:00:00.207645 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95eeef2c-7672-4fd2-a004-9b285e87b509" containerName="extract-content" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.207650 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="95eeef2c-7672-4fd2-a004-9b285e87b509" containerName="extract-content" Feb 20 10:00:00 crc kubenswrapper[5094]: E0220 10:00:00.207661 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c1bdcc1-da0d-4443-a24b-2f521f7b24db" containerName="extract-content" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.207668 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c1bdcc1-da0d-4443-a24b-2f521f7b24db" containerName="extract-content" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.207887 5094 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="5c1bdcc1-da0d-4443-a24b-2f521f7b24db" containerName="registry-server" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.207915 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="95eeef2c-7672-4fd2-a004-9b285e87b509" containerName="registry-server" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.208661 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.210366 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.210489 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.218538 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56"] Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.267683 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85ctv\" (UniqueName: \"kubernetes.io/projected/c403611d-ad00-4c45-be94-83ae57d9a75f-kube-api-access-85ctv\") pod \"collect-profiles-29526360-cnw56\" (UID: \"c403611d-ad00-4c45-be94-83ae57d9a75f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.267787 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c403611d-ad00-4c45-be94-83ae57d9a75f-secret-volume\") pod \"collect-profiles-29526360-cnw56\" (UID: \"c403611d-ad00-4c45-be94-83ae57d9a75f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56" 
Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.267890 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c403611d-ad00-4c45-be94-83ae57d9a75f-config-volume\") pod \"collect-profiles-29526360-cnw56\" (UID: \"c403611d-ad00-4c45-be94-83ae57d9a75f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.370174 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c403611d-ad00-4c45-be94-83ae57d9a75f-config-volume\") pod \"collect-profiles-29526360-cnw56\" (UID: \"c403611d-ad00-4c45-be94-83ae57d9a75f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.370333 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85ctv\" (UniqueName: \"kubernetes.io/projected/c403611d-ad00-4c45-be94-83ae57d9a75f-kube-api-access-85ctv\") pod \"collect-profiles-29526360-cnw56\" (UID: \"c403611d-ad00-4c45-be94-83ae57d9a75f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.370358 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c403611d-ad00-4c45-be94-83ae57d9a75f-secret-volume\") pod \"collect-profiles-29526360-cnw56\" (UID: \"c403611d-ad00-4c45-be94-83ae57d9a75f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.371506 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c403611d-ad00-4c45-be94-83ae57d9a75f-config-volume\") pod \"collect-profiles-29526360-cnw56\" 
(UID: \"c403611d-ad00-4c45-be94-83ae57d9a75f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.385933 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c403611d-ad00-4c45-be94-83ae57d9a75f-secret-volume\") pod \"collect-profiles-29526360-cnw56\" (UID: \"c403611d-ad00-4c45-be94-83ae57d9a75f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.386394 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85ctv\" (UniqueName: \"kubernetes.io/projected/c403611d-ad00-4c45-be94-83ae57d9a75f-kube-api-access-85ctv\") pod \"collect-profiles-29526360-cnw56\" (UID: \"c403611d-ad00-4c45-be94-83ae57d9a75f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56" Feb 20 10:00:00 crc kubenswrapper[5094]: I0220 10:00:00.545765 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56" Feb 20 10:00:01 crc kubenswrapper[5094]: I0220 10:00:01.165093 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56"] Feb 20 10:00:02 crc kubenswrapper[5094]: I0220 10:00:02.048430 5094 generic.go:334] "Generic (PLEG): container finished" podID="c403611d-ad00-4c45-be94-83ae57d9a75f" containerID="06958deee8569d614e28d60783078685bc1897a15343fd5aeef03a44767b5c64" exitCode=0 Feb 20 10:00:02 crc kubenswrapper[5094]: I0220 10:00:02.048532 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56" event={"ID":"c403611d-ad00-4c45-be94-83ae57d9a75f","Type":"ContainerDied","Data":"06958deee8569d614e28d60783078685bc1897a15343fd5aeef03a44767b5c64"} Feb 20 10:00:02 crc kubenswrapper[5094]: I0220 10:00:02.048727 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56" event={"ID":"c403611d-ad00-4c45-be94-83ae57d9a75f","Type":"ContainerStarted","Data":"d613a215ebd306e51f10950d4c1029ae3399dedbaf5e5a148636460b587a851f"} Feb 20 10:00:03 crc kubenswrapper[5094]: I0220 10:00:03.610349 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56" Feb 20 10:00:03 crc kubenswrapper[5094]: I0220 10:00:03.739294 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85ctv\" (UniqueName: \"kubernetes.io/projected/c403611d-ad00-4c45-be94-83ae57d9a75f-kube-api-access-85ctv\") pod \"c403611d-ad00-4c45-be94-83ae57d9a75f\" (UID: \"c403611d-ad00-4c45-be94-83ae57d9a75f\") " Feb 20 10:00:03 crc kubenswrapper[5094]: I0220 10:00:03.739385 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c403611d-ad00-4c45-be94-83ae57d9a75f-secret-volume\") pod \"c403611d-ad00-4c45-be94-83ae57d9a75f\" (UID: \"c403611d-ad00-4c45-be94-83ae57d9a75f\") " Feb 20 10:00:03 crc kubenswrapper[5094]: I0220 10:00:03.739524 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c403611d-ad00-4c45-be94-83ae57d9a75f-config-volume\") pod \"c403611d-ad00-4c45-be94-83ae57d9a75f\" (UID: \"c403611d-ad00-4c45-be94-83ae57d9a75f\") " Feb 20 10:00:03 crc kubenswrapper[5094]: I0220 10:00:03.740243 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c403611d-ad00-4c45-be94-83ae57d9a75f-config-volume" (OuterVolumeSpecName: "config-volume") pod "c403611d-ad00-4c45-be94-83ae57d9a75f" (UID: "c403611d-ad00-4c45-be94-83ae57d9a75f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:00:03 crc kubenswrapper[5094]: I0220 10:00:03.746020 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c403611d-ad00-4c45-be94-83ae57d9a75f-kube-api-access-85ctv" (OuterVolumeSpecName: "kube-api-access-85ctv") pod "c403611d-ad00-4c45-be94-83ae57d9a75f" (UID: "c403611d-ad00-4c45-be94-83ae57d9a75f"). 
InnerVolumeSpecName "kube-api-access-85ctv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:00:03 crc kubenswrapper[5094]: I0220 10:00:03.750830 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c403611d-ad00-4c45-be94-83ae57d9a75f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c403611d-ad00-4c45-be94-83ae57d9a75f" (UID: "c403611d-ad00-4c45-be94-83ae57d9a75f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:00:03 crc kubenswrapper[5094]: I0220 10:00:03.841761 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85ctv\" (UniqueName: \"kubernetes.io/projected/c403611d-ad00-4c45-be94-83ae57d9a75f-kube-api-access-85ctv\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:03 crc kubenswrapper[5094]: I0220 10:00:03.842101 5094 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c403611d-ad00-4c45-be94-83ae57d9a75f-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:03 crc kubenswrapper[5094]: I0220 10:00:03.842122 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c403611d-ad00-4c45-be94-83ae57d9a75f-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 10:00:04 crc kubenswrapper[5094]: I0220 10:00:04.088942 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56" event={"ID":"c403611d-ad00-4c45-be94-83ae57d9a75f","Type":"ContainerDied","Data":"d613a215ebd306e51f10950d4c1029ae3399dedbaf5e5a148636460b587a851f"} Feb 20 10:00:04 crc kubenswrapper[5094]: I0220 10:00:04.089042 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d613a215ebd306e51f10950d4c1029ae3399dedbaf5e5a148636460b587a851f" Feb 20 10:00:04 crc kubenswrapper[5094]: I0220 10:00:04.089119 5094 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526360-cnw56" Feb 20 10:00:04 crc kubenswrapper[5094]: I0220 10:00:04.107390 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:00:04 crc kubenswrapper[5094]: I0220 10:00:04.107597 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:00:04 crc kubenswrapper[5094]: I0220 10:00:04.783980 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm"] Feb 20 10:00:04 crc kubenswrapper[5094]: I0220 10:00:04.797089 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526315-md8jm"] Feb 20 10:00:05 crc kubenswrapper[5094]: I0220 10:00:05.851497 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1" path="/var/lib/kubelet/pods/ef8c3ef5-0034-42fc-b91b-c7574fe5dcc1/volumes" Feb 20 10:00:34 crc kubenswrapper[5094]: I0220 10:00:34.107238 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:00:34 crc kubenswrapper[5094]: I0220 10:00:34.107803 5094 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:00:45 crc kubenswrapper[5094]: I0220 10:00:45.969146 5094 scope.go:117] "RemoveContainer" containerID="54cabe5f22fe8cc34888629b6ad81ec4c78f22e5ddf13beea44532e8ad37533e" Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.147995 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29526361-srdg8"] Feb 20 10:01:00 crc kubenswrapper[5094]: E0220 10:01:00.148978 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c403611d-ad00-4c45-be94-83ae57d9a75f" containerName="collect-profiles" Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.148991 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="c403611d-ad00-4c45-be94-83ae57d9a75f" containerName="collect-profiles" Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.149209 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="c403611d-ad00-4c45-be94-83ae57d9a75f" containerName="collect-profiles" Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.150029 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29526361-srdg8" Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.169933 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29526361-srdg8"] Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.246374 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-config-data\") pod \"keystone-cron-29526361-srdg8\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") " pod="openstack/keystone-cron-29526361-srdg8" Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.246446 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9ck8\" (UniqueName: \"kubernetes.io/projected/5a279e74-ee64-4ff9-8a0f-2700c30a770d-kube-api-access-v9ck8\") pod \"keystone-cron-29526361-srdg8\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") " pod="openstack/keystone-cron-29526361-srdg8" Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.246831 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-fernet-keys\") pod \"keystone-cron-29526361-srdg8\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") " pod="openstack/keystone-cron-29526361-srdg8" Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.246952 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-combined-ca-bundle\") pod \"keystone-cron-29526361-srdg8\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") " pod="openstack/keystone-cron-29526361-srdg8" Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.348953 5094 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-combined-ca-bundle\") pod \"keystone-cron-29526361-srdg8\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") " pod="openstack/keystone-cron-29526361-srdg8" Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.349208 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-config-data\") pod \"keystone-cron-29526361-srdg8\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") " pod="openstack/keystone-cron-29526361-srdg8" Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.349269 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9ck8\" (UniqueName: \"kubernetes.io/projected/5a279e74-ee64-4ff9-8a0f-2700c30a770d-kube-api-access-v9ck8\") pod \"keystone-cron-29526361-srdg8\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") " pod="openstack/keystone-cron-29526361-srdg8" Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.349357 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-fernet-keys\") pod \"keystone-cron-29526361-srdg8\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") " pod="openstack/keystone-cron-29526361-srdg8" Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.356053 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-config-data\") pod \"keystone-cron-29526361-srdg8\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") " pod="openstack/keystone-cron-29526361-srdg8" Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.356825 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-combined-ca-bundle\") pod \"keystone-cron-29526361-srdg8\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") " pod="openstack/keystone-cron-29526361-srdg8" Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.357678 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-fernet-keys\") pod \"keystone-cron-29526361-srdg8\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") " pod="openstack/keystone-cron-29526361-srdg8" Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.369946 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9ck8\" (UniqueName: \"kubernetes.io/projected/5a279e74-ee64-4ff9-8a0f-2700c30a770d-kube-api-access-v9ck8\") pod \"keystone-cron-29526361-srdg8\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") " pod="openstack/keystone-cron-29526361-srdg8" Feb 20 10:01:00 crc kubenswrapper[5094]: I0220 10:01:00.476262 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29526361-srdg8" Feb 20 10:01:01 crc kubenswrapper[5094]: I0220 10:01:01.123499 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29526361-srdg8"] Feb 20 10:01:01 crc kubenswrapper[5094]: I0220 10:01:01.665480 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29526361-srdg8" event={"ID":"5a279e74-ee64-4ff9-8a0f-2700c30a770d","Type":"ContainerStarted","Data":"e01a6f191157f2a182b310659da19066acb8623209878837eb3ecfca8150fc03"} Feb 20 10:01:01 crc kubenswrapper[5094]: I0220 10:01:01.666237 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29526361-srdg8" event={"ID":"5a279e74-ee64-4ff9-8a0f-2700c30a770d","Type":"ContainerStarted","Data":"866d1dd340cedbaa173010fb9fe515d661516c32b1e799f0b7ad9a2f543304b4"} Feb 20 10:01:01 crc kubenswrapper[5094]: I0220 10:01:01.687858 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29526361-srdg8" podStartSLOduration=1.687837838 podStartE2EDuration="1.687837838s" podCreationTimestamp="2026-02-20 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-20 10:01:01.680962877 +0000 UTC m=+11676.553589588" watchObservedRunningTime="2026-02-20 10:01:01.687837838 +0000 UTC m=+11676.560464549" Feb 20 10:01:04 crc kubenswrapper[5094]: I0220 10:01:04.107184 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:01:04 crc kubenswrapper[5094]: I0220 10:01:04.107851 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:01:04 crc kubenswrapper[5094]: I0220 10:01:04.107997 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 10:01:04 crc kubenswrapper[5094]: I0220 10:01:04.109573 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"93e51fc131dd81ce33232a8e4dd68a98c48acfa4e4dbb0b00dd430da14bb99d1"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 10:01:04 crc kubenswrapper[5094]: I0220 10:01:04.109651 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://93e51fc131dd81ce33232a8e4dd68a98c48acfa4e4dbb0b00dd430da14bb99d1" gracePeriod=600 Feb 20 10:01:04 crc kubenswrapper[5094]: I0220 10:01:04.694160 5094 generic.go:334] "Generic (PLEG): container finished" podID="5a279e74-ee64-4ff9-8a0f-2700c30a770d" containerID="e01a6f191157f2a182b310659da19066acb8623209878837eb3ecfca8150fc03" exitCode=0 Feb 20 10:01:04 crc kubenswrapper[5094]: I0220 10:01:04.694272 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29526361-srdg8" event={"ID":"5a279e74-ee64-4ff9-8a0f-2700c30a770d","Type":"ContainerDied","Data":"e01a6f191157f2a182b310659da19066acb8623209878837eb3ecfca8150fc03"} Feb 20 10:01:04 crc kubenswrapper[5094]: I0220 10:01:04.698670 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" 
containerID="93e51fc131dd81ce33232a8e4dd68a98c48acfa4e4dbb0b00dd430da14bb99d1" exitCode=0 Feb 20 10:01:04 crc kubenswrapper[5094]: I0220 10:01:04.698734 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"93e51fc131dd81ce33232a8e4dd68a98c48acfa4e4dbb0b00dd430da14bb99d1"} Feb 20 10:01:04 crc kubenswrapper[5094]: I0220 10:01:04.698763 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf"} Feb 20 10:01:04 crc kubenswrapper[5094]: I0220 10:01:04.698783 5094 scope.go:117] "RemoveContainer" containerID="ba9c852a6bec08d0245f5b2e3af90cbc1240b1e101fcdbc5bee17de710d229f1" Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.336369 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29526361-srdg8" Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.381046 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-combined-ca-bundle\") pod \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") " Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.381230 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9ck8\" (UniqueName: \"kubernetes.io/projected/5a279e74-ee64-4ff9-8a0f-2700c30a770d-kube-api-access-v9ck8\") pod \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") " Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.381326 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-fernet-keys\") pod \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") " Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.381362 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-config-data\") pod \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\" (UID: \"5a279e74-ee64-4ff9-8a0f-2700c30a770d\") " Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.388871 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5a279e74-ee64-4ff9-8a0f-2700c30a770d" (UID: "5a279e74-ee64-4ff9-8a0f-2700c30a770d"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.395002 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a279e74-ee64-4ff9-8a0f-2700c30a770d-kube-api-access-v9ck8" (OuterVolumeSpecName: "kube-api-access-v9ck8") pod "5a279e74-ee64-4ff9-8a0f-2700c30a770d" (UID: "5a279e74-ee64-4ff9-8a0f-2700c30a770d"). InnerVolumeSpecName "kube-api-access-v9ck8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.433905 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5a279e74-ee64-4ff9-8a0f-2700c30a770d" (UID: "5a279e74-ee64-4ff9-8a0f-2700c30a770d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.466405 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-config-data" (OuterVolumeSpecName: "config-data") pod "5a279e74-ee64-4ff9-8a0f-2700c30a770d" (UID: "5a279e74-ee64-4ff9-8a0f-2700c30a770d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.484114 5094 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.484140 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9ck8\" (UniqueName: \"kubernetes.io/projected/5a279e74-ee64-4ff9-8a0f-2700c30a770d-kube-api-access-v9ck8\") on node \"crc\" DevicePath \"\"" Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.484151 5094 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.484158 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5a279e74-ee64-4ff9-8a0f-2700c30a770d-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.729390 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29526361-srdg8" event={"ID":"5a279e74-ee64-4ff9-8a0f-2700c30a770d","Type":"ContainerDied","Data":"866d1dd340cedbaa173010fb9fe515d661516c32b1e799f0b7ad9a2f543304b4"} Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.729430 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="866d1dd340cedbaa173010fb9fe515d661516c32b1e799f0b7ad9a2f543304b4" Feb 20 10:01:06 crc kubenswrapper[5094]: I0220 10:01:06.729482 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29526361-srdg8" Feb 20 10:03:04 crc kubenswrapper[5094]: I0220 10:03:04.107181 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:03:04 crc kubenswrapper[5094]: I0220 10:03:04.107770 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:03:34 crc kubenswrapper[5094]: I0220 10:03:34.106830 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:03:34 crc kubenswrapper[5094]: I0220 10:03:34.107492 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:04:04 crc kubenswrapper[5094]: I0220 10:04:04.106686 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:04:04 crc kubenswrapper[5094]: I0220 10:04:04.107480 5094 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:04:04 crc kubenswrapper[5094]: I0220 10:04:04.107535 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 10:04:04 crc kubenswrapper[5094]: I0220 10:04:04.108651 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 10:04:04 crc kubenswrapper[5094]: I0220 10:04:04.108791 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" gracePeriod=600 Feb 20 10:04:04 crc kubenswrapper[5094]: E0220 10:04:04.227220 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:04:04 crc kubenswrapper[5094]: I0220 10:04:04.285272 5094 generic.go:334] "Generic (PLEG): container finished" 
podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" exitCode=0 Feb 20 10:04:04 crc kubenswrapper[5094]: I0220 10:04:04.285317 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf"} Feb 20 10:04:04 crc kubenswrapper[5094]: I0220 10:04:04.285354 5094 scope.go:117] "RemoveContainer" containerID="93e51fc131dd81ce33232a8e4dd68a98c48acfa4e4dbb0b00dd430da14bb99d1" Feb 20 10:04:04 crc kubenswrapper[5094]: I0220 10:04:04.286215 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:04:04 crc kubenswrapper[5094]: E0220 10:04:04.286831 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:04:16 crc kubenswrapper[5094]: I0220 10:04:16.861630 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:04:16 crc kubenswrapper[5094]: E0220 10:04:16.864788 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 
10:04:27 crc kubenswrapper[5094]: I0220 10:04:27.840994 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:04:27 crc kubenswrapper[5094]: E0220 10:04:27.841875 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:04:42 crc kubenswrapper[5094]: I0220 10:04:42.841300 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:04:42 crc kubenswrapper[5094]: E0220 10:04:42.842369 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:04:52 crc kubenswrapper[5094]: I0220 10:04:52.876179 5094 generic.go:334] "Generic (PLEG): container finished" podID="8e2aa894-2a09-4fad-bcc7-1f259ca48ac9" containerID="2c94eb29c8d1a5480a6899dd7732430e17544eb4ff0b06be93fdd212c2a48558" exitCode=0 Feb 20 10:04:52 crc kubenswrapper[5094]: I0220 10:04:52.876435 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9","Type":"ContainerDied","Data":"2c94eb29c8d1a5480a6899dd7732430e17544eb4ff0b06be93fdd212c2a48558"} Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.380546 5094 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.532868 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-config-data\") pod \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.532944 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-test-operator-ephemeral-temporary\") pod \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.532981 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-test-operator-ephemeral-workdir\") pod \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.533033 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7j29\" (UniqueName: \"kubernetes.io/projected/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-kube-api-access-v7j29\") pod \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.533119 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.533148 5094 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-ssh-key\") pod \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.533173 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-openstack-config-secret\") pod \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.533261 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-openstack-config\") pod \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.533278 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-ca-certs\") pod \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\" (UID: \"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9\") " Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.533804 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-config-data" (OuterVolumeSpecName: "config-data") pod "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9" (UID: "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.538526 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9" (UID: "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.555393 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "test-operator-logs") pod "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9" (UID: "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.574274 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9" (UID: "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.580044 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-kube-api-access-v7j29" (OuterVolumeSpecName: "kube-api-access-v7j29") pod "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9" (UID: "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9"). InnerVolumeSpecName "kube-api-access-v7j29". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.584035 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9" (UID: "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.590815 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9" (UID: "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.603305 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9" (UID: "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.603767 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9" (UID: "8e2aa894-2a09-4fad-bcc7-1f259ca48ac9"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.636090 5094 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-config-data\") on node \"crc\" DevicePath \"\"" Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.636126 5094 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.636141 5094 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.636157 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7j29\" (UniqueName: \"kubernetes.io/projected/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-kube-api-access-v7j29\") on node \"crc\" DevicePath \"\"" Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.636196 5094 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.636208 5094 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-ssh-key\") on node \"crc\" DevicePath \"\"" Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.636220 5094 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 
20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.636233 5094 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.636244 5094 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/8e2aa894-2a09-4fad-bcc7-1f259ca48ac9-ca-certs\") on node \"crc\" DevicePath \"\"" Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.661002 5094 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.740758 5094 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.904646 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"8e2aa894-2a09-4fad-bcc7-1f259ca48ac9","Type":"ContainerDied","Data":"e3fe55f15912750e6ef07e8fa7b4632b5f9782d82892810f96924fcbf7aff5a8"} Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.904687 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3fe55f15912750e6ef07e8fa7b4632b5f9782d82892810f96924fcbf7aff5a8" Feb 20 10:04:54 crc kubenswrapper[5094]: I0220 10:04:54.904783 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 20 10:04:55 crc kubenswrapper[5094]: I0220 10:04:55.841061 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:04:55 crc kubenswrapper[5094]: E0220 10:04:55.841914 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:04:58 crc kubenswrapper[5094]: I0220 10:04:58.772142 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 20 10:04:58 crc kubenswrapper[5094]: E0220 10:04:58.773524 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e2aa894-2a09-4fad-bcc7-1f259ca48ac9" containerName="tempest-tests-tempest-tests-runner" Feb 20 10:04:58 crc kubenswrapper[5094]: I0220 10:04:58.773549 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e2aa894-2a09-4fad-bcc7-1f259ca48ac9" containerName="tempest-tests-tempest-tests-runner" Feb 20 10:04:58 crc kubenswrapper[5094]: E0220 10:04:58.773607 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a279e74-ee64-4ff9-8a0f-2700c30a770d" containerName="keystone-cron" Feb 20 10:04:58 crc kubenswrapper[5094]: I0220 10:04:58.773620 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a279e74-ee64-4ff9-8a0f-2700c30a770d" containerName="keystone-cron" Feb 20 10:04:58 crc kubenswrapper[5094]: I0220 10:04:58.774061 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e2aa894-2a09-4fad-bcc7-1f259ca48ac9" containerName="tempest-tests-tempest-tests-runner" Feb 20 10:04:58 crc 
kubenswrapper[5094]: I0220 10:04:58.774085 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a279e74-ee64-4ff9-8a0f-2700c30a770d" containerName="keystone-cron" Feb 20 10:04:58 crc kubenswrapper[5094]: I0220 10:04:58.775566 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 20 10:04:58 crc kubenswrapper[5094]: I0220 10:04:58.778608 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-d88p7" Feb 20 10:04:58 crc kubenswrapper[5094]: I0220 10:04:58.785701 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 20 10:04:58 crc kubenswrapper[5094]: I0220 10:04:58.934901 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6798c144-2ada-4a54-98c4-72db0e7bd732\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 20 10:04:58 crc kubenswrapper[5094]: I0220 10:04:58.935082 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvgx9\" (UniqueName: \"kubernetes.io/projected/6798c144-2ada-4a54-98c4-72db0e7bd732-kube-api-access-qvgx9\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6798c144-2ada-4a54-98c4-72db0e7bd732\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 20 10:04:59 crc kubenswrapper[5094]: I0220 10:04:59.041625 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvgx9\" (UniqueName: \"kubernetes.io/projected/6798c144-2ada-4a54-98c4-72db0e7bd732-kube-api-access-qvgx9\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: 
\"6798c144-2ada-4a54-98c4-72db0e7bd732\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 20 10:04:59 crc kubenswrapper[5094]: I0220 10:04:59.045533 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6798c144-2ada-4a54-98c4-72db0e7bd732\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 20 10:04:59 crc kubenswrapper[5094]: I0220 10:04:59.046334 5094 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6798c144-2ada-4a54-98c4-72db0e7bd732\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 20 10:04:59 crc kubenswrapper[5094]: I0220 10:04:59.080950 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvgx9\" (UniqueName: \"kubernetes.io/projected/6798c144-2ada-4a54-98c4-72db0e7bd732-kube-api-access-qvgx9\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6798c144-2ada-4a54-98c4-72db0e7bd732\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 20 10:04:59 crc kubenswrapper[5094]: I0220 10:04:59.091485 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"6798c144-2ada-4a54-98c4-72db0e7bd732\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 20 10:04:59 crc kubenswrapper[5094]: I0220 10:04:59.101860 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 20 10:04:59 crc kubenswrapper[5094]: W0220 10:04:59.682987 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6798c144_2ada_4a54_98c4_72db0e7bd732.slice/crio-faa5652cd80e1bf603556f618cfb1e0c37581b32692c0405d3e45fc435b025e2 WatchSource:0}: Error finding container faa5652cd80e1bf603556f618cfb1e0c37581b32692c0405d3e45fc435b025e2: Status 404 returned error can't find the container with id faa5652cd80e1bf603556f618cfb1e0c37581b32692c0405d3e45fc435b025e2 Feb 20 10:04:59 crc kubenswrapper[5094]: I0220 10:04:59.688110 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 20 10:04:59 crc kubenswrapper[5094]: I0220 10:04:59.689934 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 10:04:59 crc kubenswrapper[5094]: I0220 10:04:59.966579 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"6798c144-2ada-4a54-98c4-72db0e7bd732","Type":"ContainerStarted","Data":"faa5652cd80e1bf603556f618cfb1e0c37581b32692c0405d3e45fc435b025e2"} Feb 20 10:05:00 crc kubenswrapper[5094]: I0220 10:05:00.979453 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"6798c144-2ada-4a54-98c4-72db0e7bd732","Type":"ContainerStarted","Data":"fd52b835d21d44adbc9e9b037575e81b3574cf3fc67df2488927d93b8648fb3b"} Feb 20 10:05:01 crc kubenswrapper[5094]: I0220 10:05:01.002869 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.140427751 podStartE2EDuration="3.002850243s" podCreationTimestamp="2026-02-20 10:04:58 +0000 UTC" 
firstStartedPulling="2026-02-20 10:04:59.689510098 +0000 UTC m=+11914.562136829" lastFinishedPulling="2026-02-20 10:05:00.5519326 +0000 UTC m=+11915.424559321" observedRunningTime="2026-02-20 10:05:00.993639734 +0000 UTC m=+11915.866266455" watchObservedRunningTime="2026-02-20 10:05:01.002850243 +0000 UTC m=+11915.875476964" Feb 20 10:05:09 crc kubenswrapper[5094]: I0220 10:05:09.840921 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:05:09 crc kubenswrapper[5094]: E0220 10:05:09.841997 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:05:21 crc kubenswrapper[5094]: I0220 10:05:21.840254 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:05:21 crc kubenswrapper[5094]: E0220 10:05:21.841564 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:05:34 crc kubenswrapper[5094]: I0220 10:05:34.841632 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:05:34 crc kubenswrapper[5094]: E0220 10:05:34.843300 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:05:45 crc kubenswrapper[5094]: I0220 10:05:45.849966 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:05:45 crc kubenswrapper[5094]: E0220 10:05:45.850765 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:05:46 crc kubenswrapper[5094]: I0220 10:05:46.922006 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jvm7h"] Feb 20 10:05:46 crc kubenswrapper[5094]: I0220 10:05:46.929343 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:05:46 crc kubenswrapper[5094]: I0220 10:05:46.945092 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jvm7h"] Feb 20 10:05:47 crc kubenswrapper[5094]: I0220 10:05:47.031584 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9gbm\" (UniqueName: \"kubernetes.io/projected/da91a28e-9af4-44ae-a45e-542551dc917c-kube-api-access-r9gbm\") pod \"certified-operators-jvm7h\" (UID: \"da91a28e-9af4-44ae-a45e-542551dc917c\") " pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:05:47 crc kubenswrapper[5094]: I0220 10:05:47.031974 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da91a28e-9af4-44ae-a45e-542551dc917c-utilities\") pod \"certified-operators-jvm7h\" (UID: \"da91a28e-9af4-44ae-a45e-542551dc917c\") " pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:05:47 crc kubenswrapper[5094]: I0220 10:05:47.032138 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da91a28e-9af4-44ae-a45e-542551dc917c-catalog-content\") pod \"certified-operators-jvm7h\" (UID: \"da91a28e-9af4-44ae-a45e-542551dc917c\") " pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:05:47 crc kubenswrapper[5094]: I0220 10:05:47.135826 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9gbm\" (UniqueName: \"kubernetes.io/projected/da91a28e-9af4-44ae-a45e-542551dc917c-kube-api-access-r9gbm\") pod \"certified-operators-jvm7h\" (UID: \"da91a28e-9af4-44ae-a45e-542551dc917c\") " pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:05:47 crc kubenswrapper[5094]: I0220 10:05:47.136029 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da91a28e-9af4-44ae-a45e-542551dc917c-utilities\") pod \"certified-operators-jvm7h\" (UID: \"da91a28e-9af4-44ae-a45e-542551dc917c\") " pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:05:47 crc kubenswrapper[5094]: I0220 10:05:47.136131 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da91a28e-9af4-44ae-a45e-542551dc917c-catalog-content\") pod \"certified-operators-jvm7h\" (UID: \"da91a28e-9af4-44ae-a45e-542551dc917c\") " pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:05:47 crc kubenswrapper[5094]: I0220 10:05:47.136888 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da91a28e-9af4-44ae-a45e-542551dc917c-utilities\") pod \"certified-operators-jvm7h\" (UID: \"da91a28e-9af4-44ae-a45e-542551dc917c\") " pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:05:47 crc kubenswrapper[5094]: I0220 10:05:47.137132 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da91a28e-9af4-44ae-a45e-542551dc917c-catalog-content\") pod \"certified-operators-jvm7h\" (UID: \"da91a28e-9af4-44ae-a45e-542551dc917c\") " pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:05:47 crc kubenswrapper[5094]: I0220 10:05:47.169660 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9gbm\" (UniqueName: \"kubernetes.io/projected/da91a28e-9af4-44ae-a45e-542551dc917c-kube-api-access-r9gbm\") pod \"certified-operators-jvm7h\" (UID: \"da91a28e-9af4-44ae-a45e-542551dc917c\") " pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:05:47 crc kubenswrapper[5094]: I0220 10:05:47.260302 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:05:47 crc kubenswrapper[5094]: I0220 10:05:47.801392 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jvm7h"] Feb 20 10:05:48 crc kubenswrapper[5094]: I0220 10:05:48.678854 5094 generic.go:334] "Generic (PLEG): container finished" podID="da91a28e-9af4-44ae-a45e-542551dc917c" containerID="643acf5c4e38b51acc994c44b2f4a36fa33838309f3581a2417f33e550ad8045" exitCode=0 Feb 20 10:05:48 crc kubenswrapper[5094]: I0220 10:05:48.678951 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvm7h" event={"ID":"da91a28e-9af4-44ae-a45e-542551dc917c","Type":"ContainerDied","Data":"643acf5c4e38b51acc994c44b2f4a36fa33838309f3581a2417f33e550ad8045"} Feb 20 10:05:48 crc kubenswrapper[5094]: I0220 10:05:48.679448 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvm7h" event={"ID":"da91a28e-9af4-44ae-a45e-542551dc917c","Type":"ContainerStarted","Data":"b9d4df055f589b6802cc89131a9aee1fa42a3cbe8017b6961f057dd461070128"} Feb 20 10:05:49 crc kubenswrapper[5094]: I0220 10:05:49.691891 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvm7h" event={"ID":"da91a28e-9af4-44ae-a45e-542551dc917c","Type":"ContainerStarted","Data":"456b50afe4baf8b6444d92f51999b9e08874fa6e10737779b4c59ac9974d405b"} Feb 20 10:05:51 crc kubenswrapper[5094]: I0220 10:05:51.724299 5094 generic.go:334] "Generic (PLEG): container finished" podID="da91a28e-9af4-44ae-a45e-542551dc917c" containerID="456b50afe4baf8b6444d92f51999b9e08874fa6e10737779b4c59ac9974d405b" exitCode=0 Feb 20 10:05:51 crc kubenswrapper[5094]: I0220 10:05:51.724394 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvm7h" 
event={"ID":"da91a28e-9af4-44ae-a45e-542551dc917c","Type":"ContainerDied","Data":"456b50afe4baf8b6444d92f51999b9e08874fa6e10737779b4c59ac9974d405b"} Feb 20 10:05:52 crc kubenswrapper[5094]: I0220 10:05:52.758621 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvm7h" event={"ID":"da91a28e-9af4-44ae-a45e-542551dc917c","Type":"ContainerStarted","Data":"463f1aaecc85c5377345b6c575dabab218080f4dba32820025711c0711c3e85c"} Feb 20 10:05:52 crc kubenswrapper[5094]: I0220 10:05:52.800732 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jvm7h" podStartSLOduration=3.388236217 podStartE2EDuration="6.80069024s" podCreationTimestamp="2026-02-20 10:05:46 +0000 UTC" firstStartedPulling="2026-02-20 10:05:48.683387696 +0000 UTC m=+11963.556014417" lastFinishedPulling="2026-02-20 10:05:52.095841689 +0000 UTC m=+11966.968468440" observedRunningTime="2026-02-20 10:05:52.779024743 +0000 UTC m=+11967.651651464" watchObservedRunningTime="2026-02-20 10:05:52.80069024 +0000 UTC m=+11967.673316961" Feb 20 10:05:56 crc kubenswrapper[5094]: I0220 10:05:56.840359 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:05:56 crc kubenswrapper[5094]: E0220 10:05:56.841328 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:05:57 crc kubenswrapper[5094]: I0220 10:05:57.260821 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:05:57 crc 
kubenswrapper[5094]: I0220 10:05:57.260968 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:05:57 crc kubenswrapper[5094]: I0220 10:05:57.334505 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:05:57 crc kubenswrapper[5094]: I0220 10:05:57.898649 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:05:57 crc kubenswrapper[5094]: I0220 10:05:57.962725 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jvm7h"] Feb 20 10:05:59 crc kubenswrapper[5094]: I0220 10:05:59.854109 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jvm7h" podUID="da91a28e-9af4-44ae-a45e-542551dc917c" containerName="registry-server" containerID="cri-o://463f1aaecc85c5377345b6c575dabab218080f4dba32820025711c0711c3e85c" gracePeriod=2 Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.345696 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.379444 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9gbm\" (UniqueName: \"kubernetes.io/projected/da91a28e-9af4-44ae-a45e-542551dc917c-kube-api-access-r9gbm\") pod \"da91a28e-9af4-44ae-a45e-542551dc917c\" (UID: \"da91a28e-9af4-44ae-a45e-542551dc917c\") " Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.379668 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da91a28e-9af4-44ae-a45e-542551dc917c-utilities\") pod \"da91a28e-9af4-44ae-a45e-542551dc917c\" (UID: \"da91a28e-9af4-44ae-a45e-542551dc917c\") " Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.380009 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da91a28e-9af4-44ae-a45e-542551dc917c-catalog-content\") pod \"da91a28e-9af4-44ae-a45e-542551dc917c\" (UID: \"da91a28e-9af4-44ae-a45e-542551dc917c\") " Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.380610 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da91a28e-9af4-44ae-a45e-542551dc917c-utilities" (OuterVolumeSpecName: "utilities") pod "da91a28e-9af4-44ae-a45e-542551dc917c" (UID: "da91a28e-9af4-44ae-a45e-542551dc917c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.380884 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da91a28e-9af4-44ae-a45e-542551dc917c-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.390456 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da91a28e-9af4-44ae-a45e-542551dc917c-kube-api-access-r9gbm" (OuterVolumeSpecName: "kube-api-access-r9gbm") pod "da91a28e-9af4-44ae-a45e-542551dc917c" (UID: "da91a28e-9af4-44ae-a45e-542551dc917c"). InnerVolumeSpecName "kube-api-access-r9gbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.469872 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da91a28e-9af4-44ae-a45e-542551dc917c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da91a28e-9af4-44ae-a45e-542551dc917c" (UID: "da91a28e-9af4-44ae-a45e-542551dc917c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.483125 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da91a28e-9af4-44ae-a45e-542551dc917c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.483177 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9gbm\" (UniqueName: \"kubernetes.io/projected/da91a28e-9af4-44ae-a45e-542551dc917c-kube-api-access-r9gbm\") on node \"crc\" DevicePath \"\"" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.866739 5094 generic.go:334] "Generic (PLEG): container finished" podID="da91a28e-9af4-44ae-a45e-542551dc917c" containerID="463f1aaecc85c5377345b6c575dabab218080f4dba32820025711c0711c3e85c" exitCode=0 Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.866826 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvm7h" event={"ID":"da91a28e-9af4-44ae-a45e-542551dc917c","Type":"ContainerDied","Data":"463f1aaecc85c5377345b6c575dabab218080f4dba32820025711c0711c3e85c"} Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.867241 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jvm7h" event={"ID":"da91a28e-9af4-44ae-a45e-542551dc917c","Type":"ContainerDied","Data":"b9d4df055f589b6802cc89131a9aee1fa42a3cbe8017b6961f057dd461070128"} Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.867269 5094 scope.go:117] "RemoveContainer" containerID="463f1aaecc85c5377345b6c575dabab218080f4dba32820025711c0711c3e85c" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.866865 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jvm7h" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.900435 5094 scope.go:117] "RemoveContainer" containerID="456b50afe4baf8b6444d92f51999b9e08874fa6e10737779b4c59ac9974d405b" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.908862 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jvm7h"] Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.919204 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jvm7h"] Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.952916 5094 scope.go:117] "RemoveContainer" containerID="643acf5c4e38b51acc994c44b2f4a36fa33838309f3581a2417f33e550ad8045" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.980263 5094 scope.go:117] "RemoveContainer" containerID="463f1aaecc85c5377345b6c575dabab218080f4dba32820025711c0711c3e85c" Feb 20 10:06:00 crc kubenswrapper[5094]: E0220 10:06:00.986170 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"463f1aaecc85c5377345b6c575dabab218080f4dba32820025711c0711c3e85c\": container with ID starting with 463f1aaecc85c5377345b6c575dabab218080f4dba32820025711c0711c3e85c not found: ID does not exist" containerID="463f1aaecc85c5377345b6c575dabab218080f4dba32820025711c0711c3e85c" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.986207 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"463f1aaecc85c5377345b6c575dabab218080f4dba32820025711c0711c3e85c"} err="failed to get container status \"463f1aaecc85c5377345b6c575dabab218080f4dba32820025711c0711c3e85c\": rpc error: code = NotFound desc = could not find container \"463f1aaecc85c5377345b6c575dabab218080f4dba32820025711c0711c3e85c\": container with ID starting with 463f1aaecc85c5377345b6c575dabab218080f4dba32820025711c0711c3e85c not 
found: ID does not exist" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.986255 5094 scope.go:117] "RemoveContainer" containerID="456b50afe4baf8b6444d92f51999b9e08874fa6e10737779b4c59ac9974d405b" Feb 20 10:06:00 crc kubenswrapper[5094]: E0220 10:06:00.986677 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"456b50afe4baf8b6444d92f51999b9e08874fa6e10737779b4c59ac9974d405b\": container with ID starting with 456b50afe4baf8b6444d92f51999b9e08874fa6e10737779b4c59ac9974d405b not found: ID does not exist" containerID="456b50afe4baf8b6444d92f51999b9e08874fa6e10737779b4c59ac9974d405b" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.986798 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"456b50afe4baf8b6444d92f51999b9e08874fa6e10737779b4c59ac9974d405b"} err="failed to get container status \"456b50afe4baf8b6444d92f51999b9e08874fa6e10737779b4c59ac9974d405b\": rpc error: code = NotFound desc = could not find container \"456b50afe4baf8b6444d92f51999b9e08874fa6e10737779b4c59ac9974d405b\": container with ID starting with 456b50afe4baf8b6444d92f51999b9e08874fa6e10737779b4c59ac9974d405b not found: ID does not exist" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.986905 5094 scope.go:117] "RemoveContainer" containerID="643acf5c4e38b51acc994c44b2f4a36fa33838309f3581a2417f33e550ad8045" Feb 20 10:06:00 crc kubenswrapper[5094]: E0220 10:06:00.987361 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"643acf5c4e38b51acc994c44b2f4a36fa33838309f3581a2417f33e550ad8045\": container with ID starting with 643acf5c4e38b51acc994c44b2f4a36fa33838309f3581a2417f33e550ad8045 not found: ID does not exist" containerID="643acf5c4e38b51acc994c44b2f4a36fa33838309f3581a2417f33e550ad8045" Feb 20 10:06:00 crc kubenswrapper[5094]: I0220 10:06:00.987410 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"643acf5c4e38b51acc994c44b2f4a36fa33838309f3581a2417f33e550ad8045"} err="failed to get container status \"643acf5c4e38b51acc994c44b2f4a36fa33838309f3581a2417f33e550ad8045\": rpc error: code = NotFound desc = could not find container \"643acf5c4e38b51acc994c44b2f4a36fa33838309f3581a2417f33e550ad8045\": container with ID starting with 643acf5c4e38b51acc994c44b2f4a36fa33838309f3581a2417f33e550ad8045 not found: ID does not exist" Feb 20 10:06:01 crc kubenswrapper[5094]: I0220 10:06:01.859284 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da91a28e-9af4-44ae-a45e-542551dc917c" path="/var/lib/kubelet/pods/da91a28e-9af4-44ae-a45e-542551dc917c/volumes" Feb 20 10:06:11 crc kubenswrapper[5094]: I0220 10:06:11.146243 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:06:11 crc kubenswrapper[5094]: E0220 10:06:11.149644 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.005014 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pc5cx/must-gather-n6jcq"] Feb 20 10:06:22 crc kubenswrapper[5094]: E0220 10:06:22.005992 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da91a28e-9af4-44ae-a45e-542551dc917c" containerName="extract-content" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.006006 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="da91a28e-9af4-44ae-a45e-542551dc917c" 
containerName="extract-content" Feb 20 10:06:22 crc kubenswrapper[5094]: E0220 10:06:22.006029 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da91a28e-9af4-44ae-a45e-542551dc917c" containerName="registry-server" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.006035 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="da91a28e-9af4-44ae-a45e-542551dc917c" containerName="registry-server" Feb 20 10:06:22 crc kubenswrapper[5094]: E0220 10:06:22.006063 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da91a28e-9af4-44ae-a45e-542551dc917c" containerName="extract-utilities" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.006070 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="da91a28e-9af4-44ae-a45e-542551dc917c" containerName="extract-utilities" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.006262 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="da91a28e-9af4-44ae-a45e-542551dc917c" containerName="registry-server" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.010014 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pc5cx/must-gather-n6jcq" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.011779 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-pc5cx"/"default-dockercfg-6hz5p" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.014239 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-pc5cx"/"openshift-service-ca.crt" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.014244 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-pc5cx"/"kube-root-ca.crt" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.020590 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pc5cx/must-gather-n6jcq"] Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.085628 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7jgv\" (UniqueName: \"kubernetes.io/projected/2578360e-4830-4223-b07f-031c6c2df11e-kube-api-access-c7jgv\") pod \"must-gather-n6jcq\" (UID: \"2578360e-4830-4223-b07f-031c6c2df11e\") " pod="openshift-must-gather-pc5cx/must-gather-n6jcq" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.086067 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2578360e-4830-4223-b07f-031c6c2df11e-must-gather-output\") pod \"must-gather-n6jcq\" (UID: \"2578360e-4830-4223-b07f-031c6c2df11e\") " pod="openshift-must-gather-pc5cx/must-gather-n6jcq" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.188613 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2578360e-4830-4223-b07f-031c6c2df11e-must-gather-output\") pod \"must-gather-n6jcq\" (UID: \"2578360e-4830-4223-b07f-031c6c2df11e\") " 
pod="openshift-must-gather-pc5cx/must-gather-n6jcq" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.188832 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7jgv\" (UniqueName: \"kubernetes.io/projected/2578360e-4830-4223-b07f-031c6c2df11e-kube-api-access-c7jgv\") pod \"must-gather-n6jcq\" (UID: \"2578360e-4830-4223-b07f-031c6c2df11e\") " pod="openshift-must-gather-pc5cx/must-gather-n6jcq" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.189233 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2578360e-4830-4223-b07f-031c6c2df11e-must-gather-output\") pod \"must-gather-n6jcq\" (UID: \"2578360e-4830-4223-b07f-031c6c2df11e\") " pod="openshift-must-gather-pc5cx/must-gather-n6jcq" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.214538 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7jgv\" (UniqueName: \"kubernetes.io/projected/2578360e-4830-4223-b07f-031c6c2df11e-kube-api-access-c7jgv\") pod \"must-gather-n6jcq\" (UID: \"2578360e-4830-4223-b07f-031c6c2df11e\") " pod="openshift-must-gather-pc5cx/must-gather-n6jcq" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.328466 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pc5cx/must-gather-n6jcq" Feb 20 10:06:22 crc kubenswrapper[5094]: I0220 10:06:22.847757 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pc5cx/must-gather-n6jcq"] Feb 20 10:06:23 crc kubenswrapper[5094]: I0220 10:06:23.295153 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pc5cx/must-gather-n6jcq" event={"ID":"2578360e-4830-4223-b07f-031c6c2df11e","Type":"ContainerStarted","Data":"2ec39c0a5b5164de0b8530de19fc0bc02b72ed51c4f04ce0cff45952f8f5ad7f"} Feb 20 10:06:24 crc kubenswrapper[5094]: I0220 10:06:24.840427 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:06:24 crc kubenswrapper[5094]: E0220 10:06:24.840968 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:06:31 crc kubenswrapper[5094]: I0220 10:06:31.404344 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pc5cx/must-gather-n6jcq" event={"ID":"2578360e-4830-4223-b07f-031c6c2df11e","Type":"ContainerStarted","Data":"ef7151eae2ca0ca37c08fec6a465f8f62ef97648b0bec7c8a5a1cfa8e04d3147"} Feb 20 10:06:31 crc kubenswrapper[5094]: I0220 10:06:31.405083 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pc5cx/must-gather-n6jcq" event={"ID":"2578360e-4830-4223-b07f-031c6c2df11e","Type":"ContainerStarted","Data":"bf28c01ded4fd08d2344ae69fe0783b98b33fb64650c719c32f78d2a847e5899"} Feb 20 10:06:31 crc kubenswrapper[5094]: I0220 10:06:31.421941 5094 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-pc5cx/must-gather-n6jcq" podStartSLOduration=3.014911463 podStartE2EDuration="10.421917605s" podCreationTimestamp="2026-02-20 10:06:21 +0000 UTC" firstStartedPulling="2026-02-20 10:06:22.85228688 +0000 UTC m=+11997.724913591" lastFinishedPulling="2026-02-20 10:06:30.259293022 +0000 UTC m=+12005.131919733" observedRunningTime="2026-02-20 10:06:31.420294095 +0000 UTC m=+12006.292920806" watchObservedRunningTime="2026-02-20 10:06:31.421917605 +0000 UTC m=+12006.294544316" Feb 20 10:06:36 crc kubenswrapper[5094]: I0220 10:06:36.342845 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pc5cx/crc-debug-4vf4g"] Feb 20 10:06:36 crc kubenswrapper[5094]: I0220 10:06:36.345451 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pc5cx/crc-debug-4vf4g" Feb 20 10:06:36 crc kubenswrapper[5094]: I0220 10:06:36.449105 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfx9r\" (UniqueName: \"kubernetes.io/projected/603fd649-921c-4ddc-aaea-ca487d399cdc-kube-api-access-rfx9r\") pod \"crc-debug-4vf4g\" (UID: \"603fd649-921c-4ddc-aaea-ca487d399cdc\") " pod="openshift-must-gather-pc5cx/crc-debug-4vf4g" Feb 20 10:06:36 crc kubenswrapper[5094]: I0220 10:06:36.449153 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/603fd649-921c-4ddc-aaea-ca487d399cdc-host\") pod \"crc-debug-4vf4g\" (UID: \"603fd649-921c-4ddc-aaea-ca487d399cdc\") " pod="openshift-must-gather-pc5cx/crc-debug-4vf4g" Feb 20 10:06:36 crc kubenswrapper[5094]: I0220 10:06:36.551556 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfx9r\" (UniqueName: \"kubernetes.io/projected/603fd649-921c-4ddc-aaea-ca487d399cdc-kube-api-access-rfx9r\") pod \"crc-debug-4vf4g\" (UID: 
\"603fd649-921c-4ddc-aaea-ca487d399cdc\") " pod="openshift-must-gather-pc5cx/crc-debug-4vf4g" Feb 20 10:06:36 crc kubenswrapper[5094]: I0220 10:06:36.551661 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/603fd649-921c-4ddc-aaea-ca487d399cdc-host\") pod \"crc-debug-4vf4g\" (UID: \"603fd649-921c-4ddc-aaea-ca487d399cdc\") " pod="openshift-must-gather-pc5cx/crc-debug-4vf4g" Feb 20 10:06:36 crc kubenswrapper[5094]: I0220 10:06:36.551818 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/603fd649-921c-4ddc-aaea-ca487d399cdc-host\") pod \"crc-debug-4vf4g\" (UID: \"603fd649-921c-4ddc-aaea-ca487d399cdc\") " pod="openshift-must-gather-pc5cx/crc-debug-4vf4g" Feb 20 10:06:36 crc kubenswrapper[5094]: I0220 10:06:36.578403 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfx9r\" (UniqueName: \"kubernetes.io/projected/603fd649-921c-4ddc-aaea-ca487d399cdc-kube-api-access-rfx9r\") pod \"crc-debug-4vf4g\" (UID: \"603fd649-921c-4ddc-aaea-ca487d399cdc\") " pod="openshift-must-gather-pc5cx/crc-debug-4vf4g" Feb 20 10:06:36 crc kubenswrapper[5094]: I0220 10:06:36.662212 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pc5cx/crc-debug-4vf4g" Feb 20 10:06:37 crc kubenswrapper[5094]: I0220 10:06:37.488199 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pc5cx/crc-debug-4vf4g" event={"ID":"603fd649-921c-4ddc-aaea-ca487d399cdc","Type":"ContainerStarted","Data":"d8e323d826373496a351adc42cad6ef3517155efe7a5cb6c8fc3f30807bc0181"} Feb 20 10:06:37 crc kubenswrapper[5094]: I0220 10:06:37.840972 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:06:37 crc kubenswrapper[5094]: E0220 10:06:37.841240 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:06:46 crc kubenswrapper[5094]: I0220 10:06:46.593795 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pc5cx/crc-debug-4vf4g" event={"ID":"603fd649-921c-4ddc-aaea-ca487d399cdc","Type":"ContainerStarted","Data":"5e963cdb7c15a481066283be89f67b32c2bfd26d63091f9de055d580de5cc75f"} Feb 20 10:06:46 crc kubenswrapper[5094]: I0220 10:06:46.627767 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pc5cx/crc-debug-4vf4g" podStartSLOduration=1.376188005 podStartE2EDuration="10.62774171s" podCreationTimestamp="2026-02-20 10:06:36 +0000 UTC" firstStartedPulling="2026-02-20 10:06:36.725421578 +0000 UTC m=+12011.598048299" lastFinishedPulling="2026-02-20 10:06:45.976975253 +0000 UTC m=+12020.849602004" observedRunningTime="2026-02-20 10:06:46.61196169 +0000 UTC m=+12021.484588431" watchObservedRunningTime="2026-02-20 10:06:46.62774171 +0000 UTC 
m=+12021.500368461" Feb 20 10:06:50 crc kubenswrapper[5094]: I0220 10:06:50.840958 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:06:50 crc kubenswrapper[5094]: E0220 10:06:50.843344 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:07:04 crc kubenswrapper[5094]: I0220 10:07:04.841382 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:07:04 crc kubenswrapper[5094]: E0220 10:07:04.842633 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:07:16 crc kubenswrapper[5094]: I0220 10:07:16.840223 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:07:16 crc kubenswrapper[5094]: E0220 10:07:16.841332 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:07:27 crc kubenswrapper[5094]: I0220 10:07:27.840768 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:07:27 crc kubenswrapper[5094]: E0220 10:07:27.841973 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.089136 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gb92s"] Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.091641 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.104725 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gb92s"] Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.152167 5094 generic.go:334] "Generic (PLEG): container finished" podID="603fd649-921c-4ddc-aaea-ca487d399cdc" containerID="5e963cdb7c15a481066283be89f67b32c2bfd26d63091f9de055d580de5cc75f" exitCode=0 Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.152492 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pc5cx/crc-debug-4vf4g" event={"ID":"603fd649-921c-4ddc-aaea-ca487d399cdc","Type":"ContainerDied","Data":"5e963cdb7c15a481066283be89f67b32c2bfd26d63091f9de055d580de5cc75f"} Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.240583 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/324e88a2-c843-406e-a1c1-3bffb0b5a812-utilities\") pod \"redhat-marketplace-gb92s\" (UID: \"324e88a2-c843-406e-a1c1-3bffb0b5a812\") " pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.240808 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/324e88a2-c843-406e-a1c1-3bffb0b5a812-catalog-content\") pod \"redhat-marketplace-gb92s\" (UID: \"324e88a2-c843-406e-a1c1-3bffb0b5a812\") " pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.240920 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99qk9\" (UniqueName: \"kubernetes.io/projected/324e88a2-c843-406e-a1c1-3bffb0b5a812-kube-api-access-99qk9\") pod \"redhat-marketplace-gb92s\" (UID: \"324e88a2-c843-406e-a1c1-3bffb0b5a812\") " pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.343162 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/324e88a2-c843-406e-a1c1-3bffb0b5a812-catalog-content\") pod \"redhat-marketplace-gb92s\" (UID: \"324e88a2-c843-406e-a1c1-3bffb0b5a812\") " pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.343310 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99qk9\" (UniqueName: \"kubernetes.io/projected/324e88a2-c843-406e-a1c1-3bffb0b5a812-kube-api-access-99qk9\") pod \"redhat-marketplace-gb92s\" (UID: \"324e88a2-c843-406e-a1c1-3bffb0b5a812\") " pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.343369 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/324e88a2-c843-406e-a1c1-3bffb0b5a812-utilities\") pod \"redhat-marketplace-gb92s\" (UID: \"324e88a2-c843-406e-a1c1-3bffb0b5a812\") " pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.343807 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/324e88a2-c843-406e-a1c1-3bffb0b5a812-utilities\") pod \"redhat-marketplace-gb92s\" (UID: \"324e88a2-c843-406e-a1c1-3bffb0b5a812\") " pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.343804 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/324e88a2-c843-406e-a1c1-3bffb0b5a812-catalog-content\") pod \"redhat-marketplace-gb92s\" (UID: \"324e88a2-c843-406e-a1c1-3bffb0b5a812\") " pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.373968 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99qk9\" (UniqueName: \"kubernetes.io/projected/324e88a2-c843-406e-a1c1-3bffb0b5a812-kube-api-access-99qk9\") pod \"redhat-marketplace-gb92s\" (UID: \"324e88a2-c843-406e-a1c1-3bffb0b5a812\") " pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.443111 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:29 crc kubenswrapper[5094]: I0220 10:07:29.913734 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gb92s"] Feb 20 10:07:30 crc kubenswrapper[5094]: I0220 10:07:30.166436 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb92s" event={"ID":"324e88a2-c843-406e-a1c1-3bffb0b5a812","Type":"ContainerStarted","Data":"b93f0081cd0b061954685743624431c955a43dc58ffb6a377a892f811f13381d"} Feb 20 10:07:30 crc kubenswrapper[5094]: I0220 10:07:30.256990 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pc5cx/crc-debug-4vf4g" Feb 20 10:07:30 crc kubenswrapper[5094]: I0220 10:07:30.305951 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pc5cx/crc-debug-4vf4g"] Feb 20 10:07:30 crc kubenswrapper[5094]: I0220 10:07:30.315993 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pc5cx/crc-debug-4vf4g"] Feb 20 10:07:30 crc kubenswrapper[5094]: I0220 10:07:30.362023 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfx9r\" (UniqueName: \"kubernetes.io/projected/603fd649-921c-4ddc-aaea-ca487d399cdc-kube-api-access-rfx9r\") pod \"603fd649-921c-4ddc-aaea-ca487d399cdc\" (UID: \"603fd649-921c-4ddc-aaea-ca487d399cdc\") " Feb 20 10:07:30 crc kubenswrapper[5094]: I0220 10:07:30.362467 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/603fd649-921c-4ddc-aaea-ca487d399cdc-host\") pod \"603fd649-921c-4ddc-aaea-ca487d399cdc\" (UID: \"603fd649-921c-4ddc-aaea-ca487d399cdc\") " Feb 20 10:07:30 crc kubenswrapper[5094]: I0220 10:07:30.362551 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/603fd649-921c-4ddc-aaea-ca487d399cdc-host" (OuterVolumeSpecName: "host") pod "603fd649-921c-4ddc-aaea-ca487d399cdc" (UID: "603fd649-921c-4ddc-aaea-ca487d399cdc"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:07:30 crc kubenswrapper[5094]: I0220 10:07:30.363127 5094 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/603fd649-921c-4ddc-aaea-ca487d399cdc-host\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:30 crc kubenswrapper[5094]: I0220 10:07:30.374788 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/603fd649-921c-4ddc-aaea-ca487d399cdc-kube-api-access-rfx9r" (OuterVolumeSpecName: "kube-api-access-rfx9r") pod "603fd649-921c-4ddc-aaea-ca487d399cdc" (UID: "603fd649-921c-4ddc-aaea-ca487d399cdc"). InnerVolumeSpecName "kube-api-access-rfx9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:07:30 crc kubenswrapper[5094]: I0220 10:07:30.464772 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfx9r\" (UniqueName: \"kubernetes.io/projected/603fd649-921c-4ddc-aaea-ca487d399cdc-kube-api-access-rfx9r\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.194176 5094 generic.go:334] "Generic (PLEG): container finished" podID="324e88a2-c843-406e-a1c1-3bffb0b5a812" containerID="4bb3a37a2e4b96255d77a10488841b640f0e1ef7e0cf4f2be67173dbb605eff0" exitCode=0 Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.194357 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb92s" event={"ID":"324e88a2-c843-406e-a1c1-3bffb0b5a812","Type":"ContainerDied","Data":"4bb3a37a2e4b96255d77a10488841b640f0e1ef7e0cf4f2be67173dbb605eff0"} Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.197116 5094 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="d8e323d826373496a351adc42cad6ef3517155efe7a5cb6c8fc3f30807bc0181" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.197224 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pc5cx/crc-debug-4vf4g" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.467571 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kbj5v"] Feb 20 10:07:31 crc kubenswrapper[5094]: E0220 10:07:31.468528 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="603fd649-921c-4ddc-aaea-ca487d399cdc" containerName="container-00" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.468552 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="603fd649-921c-4ddc-aaea-ca487d399cdc" containerName="container-00" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.468942 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="603fd649-921c-4ddc-aaea-ca487d399cdc" containerName="container-00" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.470915 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.480930 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pc5cx/crc-debug-9mxxw"] Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.482812 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pc5cx/crc-debug-9mxxw" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.494693 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kbj5v"] Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.601692 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ceb59b7-8efc-478c-9663-ec454276c901-utilities\") pod \"redhat-operators-kbj5v\" (UID: \"0ceb59b7-8efc-478c-9663-ec454276c901\") " pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.601829 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c78r9\" (UniqueName: \"kubernetes.io/projected/0ceb59b7-8efc-478c-9663-ec454276c901-kube-api-access-c78r9\") pod \"redhat-operators-kbj5v\" (UID: \"0ceb59b7-8efc-478c-9663-ec454276c901\") " pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.601880 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cb3c5186-6d33-4799-a3b9-22b0645e5a68-host\") pod \"crc-debug-9mxxw\" (UID: \"cb3c5186-6d33-4799-a3b9-22b0645e5a68\") " pod="openshift-must-gather-pc5cx/crc-debug-9mxxw" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.601901 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j74wh\" (UniqueName: \"kubernetes.io/projected/cb3c5186-6d33-4799-a3b9-22b0645e5a68-kube-api-access-j74wh\") pod \"crc-debug-9mxxw\" (UID: \"cb3c5186-6d33-4799-a3b9-22b0645e5a68\") " pod="openshift-must-gather-pc5cx/crc-debug-9mxxw" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.601920 5094 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ceb59b7-8efc-478c-9663-ec454276c901-catalog-content\") pod \"redhat-operators-kbj5v\" (UID: \"0ceb59b7-8efc-478c-9663-ec454276c901\") " pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.703390 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c78r9\" (UniqueName: \"kubernetes.io/projected/0ceb59b7-8efc-478c-9663-ec454276c901-kube-api-access-c78r9\") pod \"redhat-operators-kbj5v\" (UID: \"0ceb59b7-8efc-478c-9663-ec454276c901\") " pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.703503 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cb3c5186-6d33-4799-a3b9-22b0645e5a68-host\") pod \"crc-debug-9mxxw\" (UID: \"cb3c5186-6d33-4799-a3b9-22b0645e5a68\") " pod="openshift-must-gather-pc5cx/crc-debug-9mxxw" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.703543 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j74wh\" (UniqueName: \"kubernetes.io/projected/cb3c5186-6d33-4799-a3b9-22b0645e5a68-kube-api-access-j74wh\") pod \"crc-debug-9mxxw\" (UID: \"cb3c5186-6d33-4799-a3b9-22b0645e5a68\") " pod="openshift-must-gather-pc5cx/crc-debug-9mxxw" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.703572 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ceb59b7-8efc-478c-9663-ec454276c901-catalog-content\") pod \"redhat-operators-kbj5v\" (UID: \"0ceb59b7-8efc-478c-9663-ec454276c901\") " pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.703668 5094 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ceb59b7-8efc-478c-9663-ec454276c901-utilities\") pod \"redhat-operators-kbj5v\" (UID: \"0ceb59b7-8efc-478c-9663-ec454276c901\") " pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.704125 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cb3c5186-6d33-4799-a3b9-22b0645e5a68-host\") pod \"crc-debug-9mxxw\" (UID: \"cb3c5186-6d33-4799-a3b9-22b0645e5a68\") " pod="openshift-must-gather-pc5cx/crc-debug-9mxxw" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.704282 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ceb59b7-8efc-478c-9663-ec454276c901-utilities\") pod \"redhat-operators-kbj5v\" (UID: \"0ceb59b7-8efc-478c-9663-ec454276c901\") " pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.704557 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ceb59b7-8efc-478c-9663-ec454276c901-catalog-content\") pod \"redhat-operators-kbj5v\" (UID: \"0ceb59b7-8efc-478c-9663-ec454276c901\") " pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.723604 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j74wh\" (UniqueName: \"kubernetes.io/projected/cb3c5186-6d33-4799-a3b9-22b0645e5a68-kube-api-access-j74wh\") pod \"crc-debug-9mxxw\" (UID: \"cb3c5186-6d33-4799-a3b9-22b0645e5a68\") " pod="openshift-must-gather-pc5cx/crc-debug-9mxxw" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.727438 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c78r9\" (UniqueName: 
\"kubernetes.io/projected/0ceb59b7-8efc-478c-9663-ec454276c901-kube-api-access-c78r9\") pod \"redhat-operators-kbj5v\" (UID: \"0ceb59b7-8efc-478c-9663-ec454276c901\") " pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.844575 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.858130 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pc5cx/crc-debug-9mxxw" Feb 20 10:07:31 crc kubenswrapper[5094]: I0220 10:07:31.861762 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="603fd649-921c-4ddc-aaea-ca487d399cdc" path="/var/lib/kubelet/pods/603fd649-921c-4ddc-aaea-ca487d399cdc/volumes" Feb 20 10:07:31 crc kubenswrapper[5094]: W0220 10:07:31.942657 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb3c5186_6d33_4799_a3b9_22b0645e5a68.slice/crio-d669063eb640c76d98b7be5169f5a7cec929eb3ed015a6758ef8d4e4cca11ab0 WatchSource:0}: Error finding container d669063eb640c76d98b7be5169f5a7cec929eb3ed015a6758ef8d4e4cca11ab0: Status 404 returned error can't find the container with id d669063eb640c76d98b7be5169f5a7cec929eb3ed015a6758ef8d4e4cca11ab0 Feb 20 10:07:32 crc kubenswrapper[5094]: I0220 10:07:32.072810 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9xvzd"] Feb 20 10:07:32 crc kubenswrapper[5094]: I0220 10:07:32.076080 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:32 crc kubenswrapper[5094]: I0220 10:07:32.096787 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9xvzd"] Feb 20 10:07:32 crc kubenswrapper[5094]: I0220 10:07:32.213160 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pc5cx/crc-debug-9mxxw" event={"ID":"cb3c5186-6d33-4799-a3b9-22b0645e5a68","Type":"ContainerStarted","Data":"d669063eb640c76d98b7be5169f5a7cec929eb3ed015a6758ef8d4e4cca11ab0"} Feb 20 10:07:32 crc kubenswrapper[5094]: I0220 10:07:32.213644 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43789bbd-d60e-4c83-96d4-83c2345aee73-catalog-content\") pod \"community-operators-9xvzd\" (UID: \"43789bbd-d60e-4c83-96d4-83c2345aee73\") " pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:32 crc kubenswrapper[5094]: I0220 10:07:32.213943 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhn6r\" (UniqueName: \"kubernetes.io/projected/43789bbd-d60e-4c83-96d4-83c2345aee73-kube-api-access-rhn6r\") pod \"community-operators-9xvzd\" (UID: \"43789bbd-d60e-4c83-96d4-83c2345aee73\") " pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:32 crc kubenswrapper[5094]: I0220 10:07:32.214017 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43789bbd-d60e-4c83-96d4-83c2345aee73-utilities\") pod \"community-operators-9xvzd\" (UID: \"43789bbd-d60e-4c83-96d4-83c2345aee73\") " pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:32 crc kubenswrapper[5094]: I0220 10:07:32.315979 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhn6r\" 
(UniqueName: \"kubernetes.io/projected/43789bbd-d60e-4c83-96d4-83c2345aee73-kube-api-access-rhn6r\") pod \"community-operators-9xvzd\" (UID: \"43789bbd-d60e-4c83-96d4-83c2345aee73\") " pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:32 crc kubenswrapper[5094]: I0220 10:07:32.316366 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43789bbd-d60e-4c83-96d4-83c2345aee73-utilities\") pod \"community-operators-9xvzd\" (UID: \"43789bbd-d60e-4c83-96d4-83c2345aee73\") " pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:32 crc kubenswrapper[5094]: I0220 10:07:32.316425 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43789bbd-d60e-4c83-96d4-83c2345aee73-catalog-content\") pod \"community-operators-9xvzd\" (UID: \"43789bbd-d60e-4c83-96d4-83c2345aee73\") " pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:32 crc kubenswrapper[5094]: I0220 10:07:32.316936 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43789bbd-d60e-4c83-96d4-83c2345aee73-catalog-content\") pod \"community-operators-9xvzd\" (UID: \"43789bbd-d60e-4c83-96d4-83c2345aee73\") " pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:32 crc kubenswrapper[5094]: I0220 10:07:32.317030 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43789bbd-d60e-4c83-96d4-83c2345aee73-utilities\") pod \"community-operators-9xvzd\" (UID: \"43789bbd-d60e-4c83-96d4-83c2345aee73\") " pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:32 crc kubenswrapper[5094]: I0220 10:07:32.336391 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhn6r\" (UniqueName: 
\"kubernetes.io/projected/43789bbd-d60e-4c83-96d4-83c2345aee73-kube-api-access-rhn6r\") pod \"community-operators-9xvzd\" (UID: \"43789bbd-d60e-4c83-96d4-83c2345aee73\") " pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:32 crc kubenswrapper[5094]: I0220 10:07:32.383741 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kbj5v"] Feb 20 10:07:32 crc kubenswrapper[5094]: W0220 10:07:32.384028 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ceb59b7_8efc_478c_9663_ec454276c901.slice/crio-3d888b35cb56fe1fa9ea75553edb55688480d55dec84f24f904b3addc94cb61a WatchSource:0}: Error finding container 3d888b35cb56fe1fa9ea75553edb55688480d55dec84f24f904b3addc94cb61a: Status 404 returned error can't find the container with id 3d888b35cb56fe1fa9ea75553edb55688480d55dec84f24f904b3addc94cb61a Feb 20 10:07:32 crc kubenswrapper[5094]: I0220 10:07:32.435538 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:33 crc kubenswrapper[5094]: I0220 10:07:33.070436 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9xvzd"] Feb 20 10:07:33 crc kubenswrapper[5094]: W0220 10:07:33.073070 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43789bbd_d60e_4c83_96d4_83c2345aee73.slice/crio-750c9e2af19e95975eef09ae4a64b577b078248e036b359cd55dcbe9addcd394 WatchSource:0}: Error finding container 750c9e2af19e95975eef09ae4a64b577b078248e036b359cd55dcbe9addcd394: Status 404 returned error can't find the container with id 750c9e2af19e95975eef09ae4a64b577b078248e036b359cd55dcbe9addcd394 Feb 20 10:07:33 crc kubenswrapper[5094]: I0220 10:07:33.225779 5094 generic.go:334] "Generic (PLEG): container finished" podID="324e88a2-c843-406e-a1c1-3bffb0b5a812" containerID="aa415c8e000608811f44abd799e97a6c4bc5672d7a931d59fcd905a05f2484a6" exitCode=0 Feb 20 10:07:33 crc kubenswrapper[5094]: I0220 10:07:33.226101 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb92s" event={"ID":"324e88a2-c843-406e-a1c1-3bffb0b5a812","Type":"ContainerDied","Data":"aa415c8e000608811f44abd799e97a6c4bc5672d7a931d59fcd905a05f2484a6"} Feb 20 10:07:33 crc kubenswrapper[5094]: I0220 10:07:33.233658 5094 generic.go:334] "Generic (PLEG): container finished" podID="cb3c5186-6d33-4799-a3b9-22b0645e5a68" containerID="13e5f052b4ba6e08f0990abbd5ddfbbbb9cd2d06486f28fe2908d667d1f8f224" exitCode=0 Feb 20 10:07:33 crc kubenswrapper[5094]: I0220 10:07:33.233753 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pc5cx/crc-debug-9mxxw" event={"ID":"cb3c5186-6d33-4799-a3b9-22b0645e5a68","Type":"ContainerDied","Data":"13e5f052b4ba6e08f0990abbd5ddfbbbb9cd2d06486f28fe2908d667d1f8f224"} Feb 20 10:07:33 crc kubenswrapper[5094]: I0220 
10:07:33.235257 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xvzd" event={"ID":"43789bbd-d60e-4c83-96d4-83c2345aee73","Type":"ContainerStarted","Data":"750c9e2af19e95975eef09ae4a64b577b078248e036b359cd55dcbe9addcd394"} Feb 20 10:07:33 crc kubenswrapper[5094]: I0220 10:07:33.237258 5094 generic.go:334] "Generic (PLEG): container finished" podID="0ceb59b7-8efc-478c-9663-ec454276c901" containerID="1d12c5f188d75bcb0a4383d1e6a5aa247e237d38d8898ba63562c9fa92ef0479" exitCode=0 Feb 20 10:07:33 crc kubenswrapper[5094]: I0220 10:07:33.237298 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbj5v" event={"ID":"0ceb59b7-8efc-478c-9663-ec454276c901","Type":"ContainerDied","Data":"1d12c5f188d75bcb0a4383d1e6a5aa247e237d38d8898ba63562c9fa92ef0479"} Feb 20 10:07:33 crc kubenswrapper[5094]: I0220 10:07:33.237323 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbj5v" event={"ID":"0ceb59b7-8efc-478c-9663-ec454276c901","Type":"ContainerStarted","Data":"3d888b35cb56fe1fa9ea75553edb55688480d55dec84f24f904b3addc94cb61a"} Feb 20 10:07:33 crc kubenswrapper[5094]: I0220 10:07:33.960351 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pc5cx/crc-debug-9mxxw"] Feb 20 10:07:33 crc kubenswrapper[5094]: I0220 10:07:33.969874 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pc5cx/crc-debug-9mxxw"] Feb 20 10:07:34 crc kubenswrapper[5094]: I0220 10:07:34.255358 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb92s" event={"ID":"324e88a2-c843-406e-a1c1-3bffb0b5a812","Type":"ContainerStarted","Data":"9b4e02c7cb62d486bed17d29c6f8ea969625d60af6266510d9e1e3f136feafa1"} Feb 20 10:07:34 crc kubenswrapper[5094]: I0220 10:07:34.258731 5094 generic.go:334] "Generic (PLEG): container finished" podID="43789bbd-d60e-4c83-96d4-83c2345aee73" 
containerID="f8119439c03a485a5229d0e97986cc65db314d4847a7b6f720a044c8bc36d4db" exitCode=0 Feb 20 10:07:34 crc kubenswrapper[5094]: I0220 10:07:34.258814 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xvzd" event={"ID":"43789bbd-d60e-4c83-96d4-83c2345aee73","Type":"ContainerDied","Data":"f8119439c03a485a5229d0e97986cc65db314d4847a7b6f720a044c8bc36d4db"} Feb 20 10:07:34 crc kubenswrapper[5094]: I0220 10:07:34.321383 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gb92s" podStartSLOduration=2.9230023899999997 podStartE2EDuration="5.321361019s" podCreationTimestamp="2026-02-20 10:07:29 +0000 UTC" firstStartedPulling="2026-02-20 10:07:31.204421847 +0000 UTC m=+12066.077048588" lastFinishedPulling="2026-02-20 10:07:33.602780506 +0000 UTC m=+12068.475407217" observedRunningTime="2026-02-20 10:07:34.289737397 +0000 UTC m=+12069.162364108" watchObservedRunningTime="2026-02-20 10:07:34.321361019 +0000 UTC m=+12069.193987730" Feb 20 10:07:34 crc kubenswrapper[5094]: I0220 10:07:34.380228 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pc5cx/crc-debug-9mxxw" Feb 20 10:07:34 crc kubenswrapper[5094]: I0220 10:07:34.561384 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j74wh\" (UniqueName: \"kubernetes.io/projected/cb3c5186-6d33-4799-a3b9-22b0645e5a68-kube-api-access-j74wh\") pod \"cb3c5186-6d33-4799-a3b9-22b0645e5a68\" (UID: \"cb3c5186-6d33-4799-a3b9-22b0645e5a68\") " Feb 20 10:07:34 crc kubenswrapper[5094]: I0220 10:07:34.561448 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cb3c5186-6d33-4799-a3b9-22b0645e5a68-host\") pod \"cb3c5186-6d33-4799-a3b9-22b0645e5a68\" (UID: \"cb3c5186-6d33-4799-a3b9-22b0645e5a68\") " Feb 20 10:07:34 crc kubenswrapper[5094]: I0220 10:07:34.561626 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb3c5186-6d33-4799-a3b9-22b0645e5a68-host" (OuterVolumeSpecName: "host") pod "cb3c5186-6d33-4799-a3b9-22b0645e5a68" (UID: "cb3c5186-6d33-4799-a3b9-22b0645e5a68"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:07:34 crc kubenswrapper[5094]: I0220 10:07:34.562326 5094 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cb3c5186-6d33-4799-a3b9-22b0645e5a68-host\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:34 crc kubenswrapper[5094]: I0220 10:07:34.568251 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb3c5186-6d33-4799-a3b9-22b0645e5a68-kube-api-access-j74wh" (OuterVolumeSpecName: "kube-api-access-j74wh") pod "cb3c5186-6d33-4799-a3b9-22b0645e5a68" (UID: "cb3c5186-6d33-4799-a3b9-22b0645e5a68"). InnerVolumeSpecName "kube-api-access-j74wh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:07:34 crc kubenswrapper[5094]: I0220 10:07:34.664815 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j74wh\" (UniqueName: \"kubernetes.io/projected/cb3c5186-6d33-4799-a3b9-22b0645e5a68-kube-api-access-j74wh\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.161491 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pc5cx/crc-debug-m44jr"] Feb 20 10:07:35 crc kubenswrapper[5094]: E0220 10:07:35.162414 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb3c5186-6d33-4799-a3b9-22b0645e5a68" containerName="container-00" Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.162438 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb3c5186-6d33-4799-a3b9-22b0645e5a68" containerName="container-00" Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.162757 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb3c5186-6d33-4799-a3b9-22b0645e5a68" containerName="container-00" Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.163680 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pc5cx/crc-debug-m44jr" Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.269231 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pc5cx/crc-debug-9mxxw" Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.269244 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d669063eb640c76d98b7be5169f5a7cec929eb3ed015a6758ef8d4e4cca11ab0" Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.271287 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xvzd" event={"ID":"43789bbd-d60e-4c83-96d4-83c2345aee73","Type":"ContainerStarted","Data":"fb56062dda1c7a3c5228b2bfae90e6ba2cd17b10ffd4f4a69e24518c8fd5ac93"} Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.276672 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbj5v" event={"ID":"0ceb59b7-8efc-478c-9663-ec454276c901","Type":"ContainerStarted","Data":"1efc2287f83a0bace8a1700e2e0bf78dcad66e1696bfcec2f605a9a57249bb7c"} Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.278120 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3479144c-1430-4bb4-8013-8c8fa47e3c75-host\") pod \"crc-debug-m44jr\" (UID: \"3479144c-1430-4bb4-8013-8c8fa47e3c75\") " pod="openshift-must-gather-pc5cx/crc-debug-m44jr" Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.278307 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhm9m\" (UniqueName: \"kubernetes.io/projected/3479144c-1430-4bb4-8013-8c8fa47e3c75-kube-api-access-vhm9m\") pod \"crc-debug-m44jr\" (UID: \"3479144c-1430-4bb4-8013-8c8fa47e3c75\") " pod="openshift-must-gather-pc5cx/crc-debug-m44jr" Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.380070 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3479144c-1430-4bb4-8013-8c8fa47e3c75-host\") pod \"crc-debug-m44jr\" (UID: 
\"3479144c-1430-4bb4-8013-8c8fa47e3c75\") " pod="openshift-must-gather-pc5cx/crc-debug-m44jr" Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.380200 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhm9m\" (UniqueName: \"kubernetes.io/projected/3479144c-1430-4bb4-8013-8c8fa47e3c75-kube-api-access-vhm9m\") pod \"crc-debug-m44jr\" (UID: \"3479144c-1430-4bb4-8013-8c8fa47e3c75\") " pod="openshift-must-gather-pc5cx/crc-debug-m44jr" Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.380288 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3479144c-1430-4bb4-8013-8c8fa47e3c75-host\") pod \"crc-debug-m44jr\" (UID: \"3479144c-1430-4bb4-8013-8c8fa47e3c75\") " pod="openshift-must-gather-pc5cx/crc-debug-m44jr" Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.409328 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhm9m\" (UniqueName: \"kubernetes.io/projected/3479144c-1430-4bb4-8013-8c8fa47e3c75-kube-api-access-vhm9m\") pod \"crc-debug-m44jr\" (UID: \"3479144c-1430-4bb4-8013-8c8fa47e3c75\") " pod="openshift-must-gather-pc5cx/crc-debug-m44jr" Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.481544 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pc5cx/crc-debug-m44jr" Feb 20 10:07:35 crc kubenswrapper[5094]: I0220 10:07:35.858009 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb3c5186-6d33-4799-a3b9-22b0645e5a68" path="/var/lib/kubelet/pods/cb3c5186-6d33-4799-a3b9-22b0645e5a68/volumes" Feb 20 10:07:36 crc kubenswrapper[5094]: I0220 10:07:36.296402 5094 generic.go:334] "Generic (PLEG): container finished" podID="3479144c-1430-4bb4-8013-8c8fa47e3c75" containerID="6915dff992e831dd1242868cc7a6d60444157071af3551c8f9494ba9a8bfdb74" exitCode=0 Feb 20 10:07:36 crc kubenswrapper[5094]: I0220 10:07:36.297837 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pc5cx/crc-debug-m44jr" event={"ID":"3479144c-1430-4bb4-8013-8c8fa47e3c75","Type":"ContainerDied","Data":"6915dff992e831dd1242868cc7a6d60444157071af3551c8f9494ba9a8bfdb74"} Feb 20 10:07:36 crc kubenswrapper[5094]: I0220 10:07:36.297937 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pc5cx/crc-debug-m44jr" event={"ID":"3479144c-1430-4bb4-8013-8c8fa47e3c75","Type":"ContainerStarted","Data":"84e21a0c9dbd3d1ca3086f37240de3faafc225e08e1d6f708168e851659f8d28"} Feb 20 10:07:36 crc kubenswrapper[5094]: I0220 10:07:36.352690 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pc5cx/crc-debug-m44jr"] Feb 20 10:07:36 crc kubenswrapper[5094]: I0220 10:07:36.364027 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pc5cx/crc-debug-m44jr"] Feb 20 10:07:37 crc kubenswrapper[5094]: I0220 10:07:37.416565 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pc5cx/crc-debug-m44jr" Feb 20 10:07:37 crc kubenswrapper[5094]: I0220 10:07:37.530664 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3479144c-1430-4bb4-8013-8c8fa47e3c75-host\") pod \"3479144c-1430-4bb4-8013-8c8fa47e3c75\" (UID: \"3479144c-1430-4bb4-8013-8c8fa47e3c75\") " Feb 20 10:07:37 crc kubenswrapper[5094]: I0220 10:07:37.530777 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3479144c-1430-4bb4-8013-8c8fa47e3c75-host" (OuterVolumeSpecName: "host") pod "3479144c-1430-4bb4-8013-8c8fa47e3c75" (UID: "3479144c-1430-4bb4-8013-8c8fa47e3c75"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 20 10:07:37 crc kubenswrapper[5094]: I0220 10:07:37.530827 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhm9m\" (UniqueName: \"kubernetes.io/projected/3479144c-1430-4bb4-8013-8c8fa47e3c75-kube-api-access-vhm9m\") pod \"3479144c-1430-4bb4-8013-8c8fa47e3c75\" (UID: \"3479144c-1430-4bb4-8013-8c8fa47e3c75\") " Feb 20 10:07:37 crc kubenswrapper[5094]: I0220 10:07:37.531306 5094 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3479144c-1430-4bb4-8013-8c8fa47e3c75-host\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:37 crc kubenswrapper[5094]: I0220 10:07:37.539012 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3479144c-1430-4bb4-8013-8c8fa47e3c75-kube-api-access-vhm9m" (OuterVolumeSpecName: "kube-api-access-vhm9m") pod "3479144c-1430-4bb4-8013-8c8fa47e3c75" (UID: "3479144c-1430-4bb4-8013-8c8fa47e3c75"). InnerVolumeSpecName "kube-api-access-vhm9m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:07:37 crc kubenswrapper[5094]: I0220 10:07:37.633548 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhm9m\" (UniqueName: \"kubernetes.io/projected/3479144c-1430-4bb4-8013-8c8fa47e3c75-kube-api-access-vhm9m\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:37 crc kubenswrapper[5094]: I0220 10:07:37.854571 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3479144c-1430-4bb4-8013-8c8fa47e3c75" path="/var/lib/kubelet/pods/3479144c-1430-4bb4-8013-8c8fa47e3c75/volumes" Feb 20 10:07:38 crc kubenswrapper[5094]: I0220 10:07:38.322643 5094 scope.go:117] "RemoveContainer" containerID="6915dff992e831dd1242868cc7a6d60444157071af3551c8f9494ba9a8bfdb74" Feb 20 10:07:38 crc kubenswrapper[5094]: I0220 10:07:38.322658 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pc5cx/crc-debug-m44jr" Feb 20 10:07:38 crc kubenswrapper[5094]: I0220 10:07:38.327035 5094 generic.go:334] "Generic (PLEG): container finished" podID="43789bbd-d60e-4c83-96d4-83c2345aee73" containerID="fb56062dda1c7a3c5228b2bfae90e6ba2cd17b10ffd4f4a69e24518c8fd5ac93" exitCode=0 Feb 20 10:07:38 crc kubenswrapper[5094]: I0220 10:07:38.327091 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xvzd" event={"ID":"43789bbd-d60e-4c83-96d4-83c2345aee73","Type":"ContainerDied","Data":"fb56062dda1c7a3c5228b2bfae90e6ba2cd17b10ffd4f4a69e24518c8fd5ac93"} Feb 20 10:07:38 crc kubenswrapper[5094]: I0220 10:07:38.841484 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:07:38 crc kubenswrapper[5094]: E0220 10:07:38.842379 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:07:39 crc kubenswrapper[5094]: I0220 10:07:39.337975 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xvzd" event={"ID":"43789bbd-d60e-4c83-96d4-83c2345aee73","Type":"ContainerStarted","Data":"1832bf4907eae6f63efbbcc6531e6b2881234e3dac426afd8fb12dcd33c7cc27"} Feb 20 10:07:39 crc kubenswrapper[5094]: I0220 10:07:39.365302 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9xvzd" podStartSLOduration=2.850274258 podStartE2EDuration="7.365281592s" podCreationTimestamp="2026-02-20 10:07:32 +0000 UTC" firstStartedPulling="2026-02-20 10:07:34.264425826 +0000 UTC m=+12069.137052537" lastFinishedPulling="2026-02-20 10:07:38.77943312 +0000 UTC m=+12073.652059871" observedRunningTime="2026-02-20 10:07:39.356657314 +0000 UTC m=+12074.229284045" watchObservedRunningTime="2026-02-20 10:07:39.365281592 +0000 UTC m=+12074.237908303" Feb 20 10:07:39 crc kubenswrapper[5094]: I0220 10:07:39.443650 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:39 crc kubenswrapper[5094]: I0220 10:07:39.443743 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:39 crc kubenswrapper[5094]: I0220 10:07:39.608981 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:40 crc kubenswrapper[5094]: I0220 10:07:40.351679 5094 generic.go:334] "Generic (PLEG): container finished" podID="0ceb59b7-8efc-478c-9663-ec454276c901" containerID="1efc2287f83a0bace8a1700e2e0bf78dcad66e1696bfcec2f605a9a57249bb7c" 
exitCode=0 Feb 20 10:07:40 crc kubenswrapper[5094]: I0220 10:07:40.352968 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbj5v" event={"ID":"0ceb59b7-8efc-478c-9663-ec454276c901","Type":"ContainerDied","Data":"1efc2287f83a0bace8a1700e2e0bf78dcad66e1696bfcec2f605a9a57249bb7c"} Feb 20 10:07:40 crc kubenswrapper[5094]: I0220 10:07:40.417419 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:41 crc kubenswrapper[5094]: I0220 10:07:41.365785 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbj5v" event={"ID":"0ceb59b7-8efc-478c-9663-ec454276c901","Type":"ContainerStarted","Data":"7e42e835aeee4918acfaec9056623491679e7d0c0e4891e225083eb83fa5aff8"} Feb 20 10:07:41 crc kubenswrapper[5094]: I0220 10:07:41.383436 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kbj5v" podStartSLOduration=2.8882224020000002 podStartE2EDuration="10.38341327s" podCreationTimestamp="2026-02-20 10:07:31 +0000 UTC" firstStartedPulling="2026-02-20 10:07:33.239059583 +0000 UTC m=+12068.111686294" lastFinishedPulling="2026-02-20 10:07:40.734250451 +0000 UTC m=+12075.606877162" observedRunningTime="2026-02-20 10:07:41.382072388 +0000 UTC m=+12076.254699109" watchObservedRunningTime="2026-02-20 10:07:41.38341327 +0000 UTC m=+12076.256039981" Feb 20 10:07:41 crc kubenswrapper[5094]: I0220 10:07:41.861161 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:41 crc kubenswrapper[5094]: I0220 10:07:41.861204 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gb92s"] Feb 20 10:07:41 crc kubenswrapper[5094]: I0220 10:07:41.861229 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:42 crc kubenswrapper[5094]: I0220 10:07:42.437251 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:42 crc kubenswrapper[5094]: I0220 10:07:42.437317 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:42 crc kubenswrapper[5094]: I0220 10:07:42.910872 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kbj5v" podUID="0ceb59b7-8efc-478c-9663-ec454276c901" containerName="registry-server" probeResult="failure" output=< Feb 20 10:07:42 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 10:07:42 crc kubenswrapper[5094]: > Feb 20 10:07:43 crc kubenswrapper[5094]: I0220 10:07:43.387644 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gb92s" podUID="324e88a2-c843-406e-a1c1-3bffb0b5a812" containerName="registry-server" containerID="cri-o://9b4e02c7cb62d486bed17d29c6f8ea969625d60af6266510d9e1e3f136feafa1" gracePeriod=2 Feb 20 10:07:43 crc kubenswrapper[5094]: I0220 10:07:43.493238 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-9xvzd" podUID="43789bbd-d60e-4c83-96d4-83c2345aee73" containerName="registry-server" probeResult="failure" output=< Feb 20 10:07:43 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 10:07:43 crc kubenswrapper[5094]: > Feb 20 10:07:43 crc kubenswrapper[5094]: I0220 10:07:43.961935 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.087954 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/324e88a2-c843-406e-a1c1-3bffb0b5a812-utilities\") pod \"324e88a2-c843-406e-a1c1-3bffb0b5a812\" (UID: \"324e88a2-c843-406e-a1c1-3bffb0b5a812\") " Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.088088 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/324e88a2-c843-406e-a1c1-3bffb0b5a812-catalog-content\") pod \"324e88a2-c843-406e-a1c1-3bffb0b5a812\" (UID: \"324e88a2-c843-406e-a1c1-3bffb0b5a812\") " Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.088255 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99qk9\" (UniqueName: \"kubernetes.io/projected/324e88a2-c843-406e-a1c1-3bffb0b5a812-kube-api-access-99qk9\") pod \"324e88a2-c843-406e-a1c1-3bffb0b5a812\" (UID: \"324e88a2-c843-406e-a1c1-3bffb0b5a812\") " Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.088900 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/324e88a2-c843-406e-a1c1-3bffb0b5a812-utilities" (OuterVolumeSpecName: "utilities") pod "324e88a2-c843-406e-a1c1-3bffb0b5a812" (UID: "324e88a2-c843-406e-a1c1-3bffb0b5a812"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.094853 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/324e88a2-c843-406e-a1c1-3bffb0b5a812-kube-api-access-99qk9" (OuterVolumeSpecName: "kube-api-access-99qk9") pod "324e88a2-c843-406e-a1c1-3bffb0b5a812" (UID: "324e88a2-c843-406e-a1c1-3bffb0b5a812"). InnerVolumeSpecName "kube-api-access-99qk9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.107330 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/324e88a2-c843-406e-a1c1-3bffb0b5a812-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "324e88a2-c843-406e-a1c1-3bffb0b5a812" (UID: "324e88a2-c843-406e-a1c1-3bffb0b5a812"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.190776 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/324e88a2-c843-406e-a1c1-3bffb0b5a812-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.190806 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/324e88a2-c843-406e-a1c1-3bffb0b5a812-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.190817 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99qk9\" (UniqueName: \"kubernetes.io/projected/324e88a2-c843-406e-a1c1-3bffb0b5a812-kube-api-access-99qk9\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.401780 5094 generic.go:334] "Generic (PLEG): container finished" podID="324e88a2-c843-406e-a1c1-3bffb0b5a812" containerID="9b4e02c7cb62d486bed17d29c6f8ea969625d60af6266510d9e1e3f136feafa1" exitCode=0 Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.401876 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb92s" event={"ID":"324e88a2-c843-406e-a1c1-3bffb0b5a812","Type":"ContainerDied","Data":"9b4e02c7cb62d486bed17d29c6f8ea969625d60af6266510d9e1e3f136feafa1"} Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.402024 5094 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gb92s" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.402145 5094 scope.go:117] "RemoveContainer" containerID="9b4e02c7cb62d486bed17d29c6f8ea969625d60af6266510d9e1e3f136feafa1" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.402076 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gb92s" event={"ID":"324e88a2-c843-406e-a1c1-3bffb0b5a812","Type":"ContainerDied","Data":"b93f0081cd0b061954685743624431c955a43dc58ffb6a377a892f811f13381d"} Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.443338 5094 scope.go:117] "RemoveContainer" containerID="aa415c8e000608811f44abd799e97a6c4bc5672d7a931d59fcd905a05f2484a6" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.468976 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gb92s"] Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.479064 5094 scope.go:117] "RemoveContainer" containerID="4bb3a37a2e4b96255d77a10488841b640f0e1ef7e0cf4f2be67173dbb605eff0" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.483928 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gb92s"] Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.516768 5094 scope.go:117] "RemoveContainer" containerID="9b4e02c7cb62d486bed17d29c6f8ea969625d60af6266510d9e1e3f136feafa1" Feb 20 10:07:44 crc kubenswrapper[5094]: E0220 10:07:44.517187 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b4e02c7cb62d486bed17d29c6f8ea969625d60af6266510d9e1e3f136feafa1\": container with ID starting with 9b4e02c7cb62d486bed17d29c6f8ea969625d60af6266510d9e1e3f136feafa1 not found: ID does not exist" containerID="9b4e02c7cb62d486bed17d29c6f8ea969625d60af6266510d9e1e3f136feafa1" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.517235 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b4e02c7cb62d486bed17d29c6f8ea969625d60af6266510d9e1e3f136feafa1"} err="failed to get container status \"9b4e02c7cb62d486bed17d29c6f8ea969625d60af6266510d9e1e3f136feafa1\": rpc error: code = NotFound desc = could not find container \"9b4e02c7cb62d486bed17d29c6f8ea969625d60af6266510d9e1e3f136feafa1\": container with ID starting with 9b4e02c7cb62d486bed17d29c6f8ea969625d60af6266510d9e1e3f136feafa1 not found: ID does not exist" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.517267 5094 scope.go:117] "RemoveContainer" containerID="aa415c8e000608811f44abd799e97a6c4bc5672d7a931d59fcd905a05f2484a6" Feb 20 10:07:44 crc kubenswrapper[5094]: E0220 10:07:44.517603 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa415c8e000608811f44abd799e97a6c4bc5672d7a931d59fcd905a05f2484a6\": container with ID starting with aa415c8e000608811f44abd799e97a6c4bc5672d7a931d59fcd905a05f2484a6 not found: ID does not exist" containerID="aa415c8e000608811f44abd799e97a6c4bc5672d7a931d59fcd905a05f2484a6" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.517624 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa415c8e000608811f44abd799e97a6c4bc5672d7a931d59fcd905a05f2484a6"} err="failed to get container status \"aa415c8e000608811f44abd799e97a6c4bc5672d7a931d59fcd905a05f2484a6\": rpc error: code = NotFound desc = could not find container \"aa415c8e000608811f44abd799e97a6c4bc5672d7a931d59fcd905a05f2484a6\": container with ID starting with aa415c8e000608811f44abd799e97a6c4bc5672d7a931d59fcd905a05f2484a6 not found: ID does not exist" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.517641 5094 scope.go:117] "RemoveContainer" containerID="4bb3a37a2e4b96255d77a10488841b640f0e1ef7e0cf4f2be67173dbb605eff0" Feb 20 10:07:44 crc kubenswrapper[5094]: E0220 
10:07:44.517837 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bb3a37a2e4b96255d77a10488841b640f0e1ef7e0cf4f2be67173dbb605eff0\": container with ID starting with 4bb3a37a2e4b96255d77a10488841b640f0e1ef7e0cf4f2be67173dbb605eff0 not found: ID does not exist" containerID="4bb3a37a2e4b96255d77a10488841b640f0e1ef7e0cf4f2be67173dbb605eff0" Feb 20 10:07:44 crc kubenswrapper[5094]: I0220 10:07:44.517858 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bb3a37a2e4b96255d77a10488841b640f0e1ef7e0cf4f2be67173dbb605eff0"} err="failed to get container status \"4bb3a37a2e4b96255d77a10488841b640f0e1ef7e0cf4f2be67173dbb605eff0\": rpc error: code = NotFound desc = could not find container \"4bb3a37a2e4b96255d77a10488841b640f0e1ef7e0cf4f2be67173dbb605eff0\": container with ID starting with 4bb3a37a2e4b96255d77a10488841b640f0e1ef7e0cf4f2be67173dbb605eff0 not found: ID does not exist" Feb 20 10:07:45 crc kubenswrapper[5094]: I0220 10:07:45.853015 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="324e88a2-c843-406e-a1c1-3bffb0b5a812" path="/var/lib/kubelet/pods/324e88a2-c843-406e-a1c1-3bffb0b5a812/volumes" Feb 20 10:07:49 crc kubenswrapper[5094]: I0220 10:07:49.840650 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:07:49 crc kubenswrapper[5094]: E0220 10:07:49.841374 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:07:51 crc kubenswrapper[5094]: I0220 10:07:51.913050 
5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:51 crc kubenswrapper[5094]: I0220 10:07:51.967478 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:52 crc kubenswrapper[5094]: I0220 10:07:52.150820 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kbj5v"] Feb 20 10:07:52 crc kubenswrapper[5094]: I0220 10:07:52.496586 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:52 crc kubenswrapper[5094]: I0220 10:07:52.568898 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:53 crc kubenswrapper[5094]: I0220 10:07:53.493364 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kbj5v" podUID="0ceb59b7-8efc-478c-9663-ec454276c901" containerName="registry-server" containerID="cri-o://7e42e835aeee4918acfaec9056623491679e7d0c0e4891e225083eb83fa5aff8" gracePeriod=2 Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.061170 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.122963 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c78r9\" (UniqueName: \"kubernetes.io/projected/0ceb59b7-8efc-478c-9663-ec454276c901-kube-api-access-c78r9\") pod \"0ceb59b7-8efc-478c-9663-ec454276c901\" (UID: \"0ceb59b7-8efc-478c-9663-ec454276c901\") " Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.123070 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ceb59b7-8efc-478c-9663-ec454276c901-catalog-content\") pod \"0ceb59b7-8efc-478c-9663-ec454276c901\" (UID: \"0ceb59b7-8efc-478c-9663-ec454276c901\") " Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.123090 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ceb59b7-8efc-478c-9663-ec454276c901-utilities\") pod \"0ceb59b7-8efc-478c-9663-ec454276c901\" (UID: \"0ceb59b7-8efc-478c-9663-ec454276c901\") " Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.124278 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ceb59b7-8efc-478c-9663-ec454276c901-utilities" (OuterVolumeSpecName: "utilities") pod "0ceb59b7-8efc-478c-9663-ec454276c901" (UID: "0ceb59b7-8efc-478c-9663-ec454276c901"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.133924 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ceb59b7-8efc-478c-9663-ec454276c901-kube-api-access-c78r9" (OuterVolumeSpecName: "kube-api-access-c78r9") pod "0ceb59b7-8efc-478c-9663-ec454276c901" (UID: "0ceb59b7-8efc-478c-9663-ec454276c901"). InnerVolumeSpecName "kube-api-access-c78r9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.225021 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ceb59b7-8efc-478c-9663-ec454276c901-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.225238 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c78r9\" (UniqueName: \"kubernetes.io/projected/0ceb59b7-8efc-478c-9663-ec454276c901-kube-api-access-c78r9\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.285948 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ceb59b7-8efc-478c-9663-ec454276c901-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0ceb59b7-8efc-478c-9663-ec454276c901" (UID: "0ceb59b7-8efc-478c-9663-ec454276c901"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.327371 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ceb59b7-8efc-478c-9663-ec454276c901-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.507115 5094 generic.go:334] "Generic (PLEG): container finished" podID="0ceb59b7-8efc-478c-9663-ec454276c901" containerID="7e42e835aeee4918acfaec9056623491679e7d0c0e4891e225083eb83fa5aff8" exitCode=0 Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.507155 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kbj5v" event={"ID":"0ceb59b7-8efc-478c-9663-ec454276c901","Type":"ContainerDied","Data":"7e42e835aeee4918acfaec9056623491679e7d0c0e4891e225083eb83fa5aff8"} Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.507436 5094 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-kbj5v" event={"ID":"0ceb59b7-8efc-478c-9663-ec454276c901","Type":"ContainerDied","Data":"3d888b35cb56fe1fa9ea75553edb55688480d55dec84f24f904b3addc94cb61a"} Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.507462 5094 scope.go:117] "RemoveContainer" containerID="7e42e835aeee4918acfaec9056623491679e7d0c0e4891e225083eb83fa5aff8" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.507204 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kbj5v" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.532075 5094 scope.go:117] "RemoveContainer" containerID="1efc2287f83a0bace8a1700e2e0bf78dcad66e1696bfcec2f605a9a57249bb7c" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.541455 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kbj5v"] Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.558971 5094 scope.go:117] "RemoveContainer" containerID="1d12c5f188d75bcb0a4383d1e6a5aa247e237d38d8898ba63562c9fa92ef0479" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.566943 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kbj5v"] Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.596700 5094 scope.go:117] "RemoveContainer" containerID="7e42e835aeee4918acfaec9056623491679e7d0c0e4891e225083eb83fa5aff8" Feb 20 10:07:54 crc kubenswrapper[5094]: E0220 10:07:54.597186 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e42e835aeee4918acfaec9056623491679e7d0c0e4891e225083eb83fa5aff8\": container with ID starting with 7e42e835aeee4918acfaec9056623491679e7d0c0e4891e225083eb83fa5aff8 not found: ID does not exist" containerID="7e42e835aeee4918acfaec9056623491679e7d0c0e4891e225083eb83fa5aff8" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.597213 5094 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e42e835aeee4918acfaec9056623491679e7d0c0e4891e225083eb83fa5aff8"} err="failed to get container status \"7e42e835aeee4918acfaec9056623491679e7d0c0e4891e225083eb83fa5aff8\": rpc error: code = NotFound desc = could not find container \"7e42e835aeee4918acfaec9056623491679e7d0c0e4891e225083eb83fa5aff8\": container with ID starting with 7e42e835aeee4918acfaec9056623491679e7d0c0e4891e225083eb83fa5aff8 not found: ID does not exist" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.597234 5094 scope.go:117] "RemoveContainer" containerID="1efc2287f83a0bace8a1700e2e0bf78dcad66e1696bfcec2f605a9a57249bb7c" Feb 20 10:07:54 crc kubenswrapper[5094]: E0220 10:07:54.597584 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1efc2287f83a0bace8a1700e2e0bf78dcad66e1696bfcec2f605a9a57249bb7c\": container with ID starting with 1efc2287f83a0bace8a1700e2e0bf78dcad66e1696bfcec2f605a9a57249bb7c not found: ID does not exist" containerID="1efc2287f83a0bace8a1700e2e0bf78dcad66e1696bfcec2f605a9a57249bb7c" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.597650 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1efc2287f83a0bace8a1700e2e0bf78dcad66e1696bfcec2f605a9a57249bb7c"} err="failed to get container status \"1efc2287f83a0bace8a1700e2e0bf78dcad66e1696bfcec2f605a9a57249bb7c\": rpc error: code = NotFound desc = could not find container \"1efc2287f83a0bace8a1700e2e0bf78dcad66e1696bfcec2f605a9a57249bb7c\": container with ID starting with 1efc2287f83a0bace8a1700e2e0bf78dcad66e1696bfcec2f605a9a57249bb7c not found: ID does not exist" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.597676 5094 scope.go:117] "RemoveContainer" containerID="1d12c5f188d75bcb0a4383d1e6a5aa247e237d38d8898ba63562c9fa92ef0479" Feb 20 10:07:54 crc kubenswrapper[5094]: E0220 
10:07:54.598312 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d12c5f188d75bcb0a4383d1e6a5aa247e237d38d8898ba63562c9fa92ef0479\": container with ID starting with 1d12c5f188d75bcb0a4383d1e6a5aa247e237d38d8898ba63562c9fa92ef0479 not found: ID does not exist" containerID="1d12c5f188d75bcb0a4383d1e6a5aa247e237d38d8898ba63562c9fa92ef0479" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.598337 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d12c5f188d75bcb0a4383d1e6a5aa247e237d38d8898ba63562c9fa92ef0479"} err="failed to get container status \"1d12c5f188d75bcb0a4383d1e6a5aa247e237d38d8898ba63562c9fa92ef0479\": rpc error: code = NotFound desc = could not find container \"1d12c5f188d75bcb0a4383d1e6a5aa247e237d38d8898ba63562c9fa92ef0479\": container with ID starting with 1d12c5f188d75bcb0a4383d1e6a5aa247e237d38d8898ba63562c9fa92ef0479 not found: ID does not exist" Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.748235 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9xvzd"] Feb 20 10:07:54 crc kubenswrapper[5094]: I0220 10:07:54.748689 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9xvzd" podUID="43789bbd-d60e-4c83-96d4-83c2345aee73" containerName="registry-server" containerID="cri-o://1832bf4907eae6f63efbbcc6531e6b2881234e3dac426afd8fb12dcd33c7cc27" gracePeriod=2 Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.245879 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.347357 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43789bbd-d60e-4c83-96d4-83c2345aee73-utilities\") pod \"43789bbd-d60e-4c83-96d4-83c2345aee73\" (UID: \"43789bbd-d60e-4c83-96d4-83c2345aee73\") " Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.347422 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhn6r\" (UniqueName: \"kubernetes.io/projected/43789bbd-d60e-4c83-96d4-83c2345aee73-kube-api-access-rhn6r\") pod \"43789bbd-d60e-4c83-96d4-83c2345aee73\" (UID: \"43789bbd-d60e-4c83-96d4-83c2345aee73\") " Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.347491 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43789bbd-d60e-4c83-96d4-83c2345aee73-catalog-content\") pod \"43789bbd-d60e-4c83-96d4-83c2345aee73\" (UID: \"43789bbd-d60e-4c83-96d4-83c2345aee73\") " Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.350183 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43789bbd-d60e-4c83-96d4-83c2345aee73-utilities" (OuterVolumeSpecName: "utilities") pod "43789bbd-d60e-4c83-96d4-83c2345aee73" (UID: "43789bbd-d60e-4c83-96d4-83c2345aee73"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.359998 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43789bbd-d60e-4c83-96d4-83c2345aee73-kube-api-access-rhn6r" (OuterVolumeSpecName: "kube-api-access-rhn6r") pod "43789bbd-d60e-4c83-96d4-83c2345aee73" (UID: "43789bbd-d60e-4c83-96d4-83c2345aee73"). InnerVolumeSpecName "kube-api-access-rhn6r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.402522 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43789bbd-d60e-4c83-96d4-83c2345aee73-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "43789bbd-d60e-4c83-96d4-83c2345aee73" (UID: "43789bbd-d60e-4c83-96d4-83c2345aee73"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.450159 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/43789bbd-d60e-4c83-96d4-83c2345aee73-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.450623 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhn6r\" (UniqueName: \"kubernetes.io/projected/43789bbd-d60e-4c83-96d4-83c2345aee73-kube-api-access-rhn6r\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.450688 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/43789bbd-d60e-4c83-96d4-83c2345aee73-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.520455 5094 generic.go:334] "Generic (PLEG): container finished" podID="43789bbd-d60e-4c83-96d4-83c2345aee73" containerID="1832bf4907eae6f63efbbcc6531e6b2881234e3dac426afd8fb12dcd33c7cc27" exitCode=0 Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.520508 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9xvzd" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.520524 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xvzd" event={"ID":"43789bbd-d60e-4c83-96d4-83c2345aee73","Type":"ContainerDied","Data":"1832bf4907eae6f63efbbcc6531e6b2881234e3dac426afd8fb12dcd33c7cc27"} Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.520566 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9xvzd" event={"ID":"43789bbd-d60e-4c83-96d4-83c2345aee73","Type":"ContainerDied","Data":"750c9e2af19e95975eef09ae4a64b577b078248e036b359cd55dcbe9addcd394"} Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.520584 5094 scope.go:117] "RemoveContainer" containerID="1832bf4907eae6f63efbbcc6531e6b2881234e3dac426afd8fb12dcd33c7cc27" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.559346 5094 scope.go:117] "RemoveContainer" containerID="fb56062dda1c7a3c5228b2bfae90e6ba2cd17b10ffd4f4a69e24518c8fd5ac93" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.574256 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9xvzd"] Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.589123 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9xvzd"] Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.593057 5094 scope.go:117] "RemoveContainer" containerID="f8119439c03a485a5229d0e97986cc65db314d4847a7b6f720a044c8bc36d4db" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.623308 5094 scope.go:117] "RemoveContainer" containerID="1832bf4907eae6f63efbbcc6531e6b2881234e3dac426afd8fb12dcd33c7cc27" Feb 20 10:07:55 crc kubenswrapper[5094]: E0220 10:07:55.623838 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1832bf4907eae6f63efbbcc6531e6b2881234e3dac426afd8fb12dcd33c7cc27\": container with ID starting with 1832bf4907eae6f63efbbcc6531e6b2881234e3dac426afd8fb12dcd33c7cc27 not found: ID does not exist" containerID="1832bf4907eae6f63efbbcc6531e6b2881234e3dac426afd8fb12dcd33c7cc27" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.623974 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1832bf4907eae6f63efbbcc6531e6b2881234e3dac426afd8fb12dcd33c7cc27"} err="failed to get container status \"1832bf4907eae6f63efbbcc6531e6b2881234e3dac426afd8fb12dcd33c7cc27\": rpc error: code = NotFound desc = could not find container \"1832bf4907eae6f63efbbcc6531e6b2881234e3dac426afd8fb12dcd33c7cc27\": container with ID starting with 1832bf4907eae6f63efbbcc6531e6b2881234e3dac426afd8fb12dcd33c7cc27 not found: ID does not exist" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.624089 5094 scope.go:117] "RemoveContainer" containerID="fb56062dda1c7a3c5228b2bfae90e6ba2cd17b10ffd4f4a69e24518c8fd5ac93" Feb 20 10:07:55 crc kubenswrapper[5094]: E0220 10:07:55.624563 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb56062dda1c7a3c5228b2bfae90e6ba2cd17b10ffd4f4a69e24518c8fd5ac93\": container with ID starting with fb56062dda1c7a3c5228b2bfae90e6ba2cd17b10ffd4f4a69e24518c8fd5ac93 not found: ID does not exist" containerID="fb56062dda1c7a3c5228b2bfae90e6ba2cd17b10ffd4f4a69e24518c8fd5ac93" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.624670 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb56062dda1c7a3c5228b2bfae90e6ba2cd17b10ffd4f4a69e24518c8fd5ac93"} err="failed to get container status \"fb56062dda1c7a3c5228b2bfae90e6ba2cd17b10ffd4f4a69e24518c8fd5ac93\": rpc error: code = NotFound desc = could not find container \"fb56062dda1c7a3c5228b2bfae90e6ba2cd17b10ffd4f4a69e24518c8fd5ac93\": container with ID 
starting with fb56062dda1c7a3c5228b2bfae90e6ba2cd17b10ffd4f4a69e24518c8fd5ac93 not found: ID does not exist" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.624797 5094 scope.go:117] "RemoveContainer" containerID="f8119439c03a485a5229d0e97986cc65db314d4847a7b6f720a044c8bc36d4db" Feb 20 10:07:55 crc kubenswrapper[5094]: E0220 10:07:55.625221 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8119439c03a485a5229d0e97986cc65db314d4847a7b6f720a044c8bc36d4db\": container with ID starting with f8119439c03a485a5229d0e97986cc65db314d4847a7b6f720a044c8bc36d4db not found: ID does not exist" containerID="f8119439c03a485a5229d0e97986cc65db314d4847a7b6f720a044c8bc36d4db" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.625333 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8119439c03a485a5229d0e97986cc65db314d4847a7b6f720a044c8bc36d4db"} err="failed to get container status \"f8119439c03a485a5229d0e97986cc65db314d4847a7b6f720a044c8bc36d4db\": rpc error: code = NotFound desc = could not find container \"f8119439c03a485a5229d0e97986cc65db314d4847a7b6f720a044c8bc36d4db\": container with ID starting with f8119439c03a485a5229d0e97986cc65db314d4847a7b6f720a044c8bc36d4db not found: ID does not exist" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.874828 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ceb59b7-8efc-478c-9663-ec454276c901" path="/var/lib/kubelet/pods/0ceb59b7-8efc-478c-9663-ec454276c901/volumes" Feb 20 10:07:55 crc kubenswrapper[5094]: I0220 10:07:55.880128 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43789bbd-d60e-4c83-96d4-83c2345aee73" path="/var/lib/kubelet/pods/43789bbd-d60e-4c83-96d4-83c2345aee73/volumes" Feb 20 10:08:00 crc kubenswrapper[5094]: I0220 10:08:00.840095 5094 scope.go:117] "RemoveContainer" 
containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:08:00 crc kubenswrapper[5094]: E0220 10:08:00.840674 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:08:14 crc kubenswrapper[5094]: I0220 10:08:14.840469 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:08:14 crc kubenswrapper[5094]: E0220 10:08:14.841296 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:08:26 crc kubenswrapper[5094]: I0220 10:08:26.840464 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:08:26 crc kubenswrapper[5094]: E0220 10:08:26.841383 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:08:37 crc kubenswrapper[5094]: I0220 10:08:37.840555 5094 scope.go:117] 
"RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:08:37 crc kubenswrapper[5094]: E0220 10:08:37.842733 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:08:48 crc kubenswrapper[5094]: I0220 10:08:48.841164 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:08:48 crc kubenswrapper[5094]: E0220 10:08:48.841931 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:09:02 crc kubenswrapper[5094]: I0220 10:09:02.840591 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:09:02 crc kubenswrapper[5094]: E0220 10:09:02.841389 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:09:16 crc kubenswrapper[5094]: I0220 10:09:16.841865 
5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:09:17 crc kubenswrapper[5094]: I0220 10:09:17.939017 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"314cf401191c15263264dcbedb42c99f095a2284b06c68e30858221c2e104a07"} Feb 20 10:11:04 crc kubenswrapper[5094]: I0220 10:11:04.530861 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_c09fbf6b-1221-4e3d-b29d-6432848a564b/init-config-reloader/0.log" Feb 20 10:11:04 crc kubenswrapper[5094]: I0220 10:11:04.732471 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_c09fbf6b-1221-4e3d-b29d-6432848a564b/alertmanager/0.log" Feb 20 10:11:04 crc kubenswrapper[5094]: I0220 10:11:04.802568 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_c09fbf6b-1221-4e3d-b29d-6432848a564b/init-config-reloader/0.log" Feb 20 10:11:04 crc kubenswrapper[5094]: I0220 10:11:04.810380 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_c09fbf6b-1221-4e3d-b29d-6432848a564b/config-reloader/0.log" Feb 20 10:11:04 crc kubenswrapper[5094]: I0220 10:11:04.942234 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_4f38802d-49cb-413c-ac61-665d5c77a1a3/aodh-api/0.log" Feb 20 10:11:05 crc kubenswrapper[5094]: I0220 10:11:05.006838 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_4f38802d-49cb-413c-ac61-665d5c77a1a3/aodh-listener/0.log" Feb 20 10:11:05 crc kubenswrapper[5094]: I0220 10:11:05.016628 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_4f38802d-49cb-413c-ac61-665d5c77a1a3/aodh-evaluator/0.log" Feb 20 10:11:05 crc 
kubenswrapper[5094]: I0220 10:11:05.021454 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_4f38802d-49cb-413c-ac61-665d5c77a1a3/aodh-notifier/0.log" Feb 20 10:11:05 crc kubenswrapper[5094]: I0220 10:11:05.235226 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-69fdd7dd98-bm4fc_3e777e53-5dbe-4779-bc99-90bbf12cea8f/barbican-api-log/0.log" Feb 20 10:11:05 crc kubenswrapper[5094]: I0220 10:11:05.237112 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-69fdd7dd98-bm4fc_3e777e53-5dbe-4779-bc99-90bbf12cea8f/barbican-api/0.log" Feb 20 10:11:05 crc kubenswrapper[5094]: I0220 10:11:05.687900 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7f6984ff88-5xqtx_b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a/barbican-keystone-listener/0.log" Feb 20 10:11:05 crc kubenswrapper[5094]: I0220 10:11:05.743400 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-84b95bd745-mrk5m_13560fbf-48aa-45ac-8c10-067377d1adfa/barbican-worker/0.log" Feb 20 10:11:05 crc kubenswrapper[5094]: I0220 10:11:05.917129 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-84b95bd745-mrk5m_13560fbf-48aa-45ac-8c10-067377d1adfa/barbican-worker-log/0.log" Feb 20 10:11:06 crc kubenswrapper[5094]: I0220 10:11:06.073639 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-m7mmg_30a55d13-2efe-4d90-bcef-14aedc741079/bootstrap-openstack-openstack-cell1/0.log" Feb 20 10:11:06 crc kubenswrapper[5094]: I0220 10:11:06.286414 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-networker-452ln_e3baf01f-744b-44ed-b3c8-2ec288f77e59/bootstrap-openstack-openstack-networker/0.log" Feb 20 10:11:06 crc kubenswrapper[5094]: I0220 10:11:06.347964 5094 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-7f6984ff88-5xqtx_b128e8c6-6bcb-4e4b-b648-d3f932ad0a0a/barbican-keystone-listener-log/0.log" Feb 20 10:11:06 crc kubenswrapper[5094]: I0220 10:11:06.426205 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2f751c26-9b9c-4a25-a388-cc52b0934ab6/ceilometer-central-agent/0.log" Feb 20 10:11:06 crc kubenswrapper[5094]: I0220 10:11:06.511783 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2f751c26-9b9c-4a25-a388-cc52b0934ab6/ceilometer-notification-agent/0.log" Feb 20 10:11:06 crc kubenswrapper[5094]: I0220 10:11:06.574501 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2f751c26-9b9c-4a25-a388-cc52b0934ab6/proxy-httpd/0.log" Feb 20 10:11:06 crc kubenswrapper[5094]: I0220 10:11:06.600944 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_2f751c26-9b9c-4a25-a388-cc52b0934ab6/sg-core/0.log" Feb 20 10:11:06 crc kubenswrapper[5094]: I0220 10:11:06.709290 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-openstack-openstack-cell1-ffl97_e01778d5-c4a7-44c6-a9e9-cf7d3cb299db/ceph-client-openstack-openstack-cell1/0.log" Feb 20 10:11:07 crc kubenswrapper[5094]: I0220 10:11:07.037215 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_3b8551a6-6aac-4c12-b3ce-913397a5316f/cinder-api-log/0.log" Feb 20 10:11:07 crc kubenswrapper[5094]: I0220 10:11:07.125881 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_3b8551a6-6aac-4c12-b3ce-913397a5316f/cinder-api/0.log" Feb 20 10:11:07 crc kubenswrapper[5094]: I0220 10:11:07.296153 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_d7f13f97-3504-4faa-a8cf-8ad4a7973623/probe/0.log" Feb 20 10:11:07 crc kubenswrapper[5094]: I0220 10:11:07.436529 5094 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-scheduler-0_88808044-5011-40de-9088-154284495e1a/cinder-scheduler/0.log" Feb 20 10:11:07 crc kubenswrapper[5094]: I0220 10:11:07.570857 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_88808044-5011-40de-9088-154284495e1a/probe/0.log" Feb 20 10:11:07 crc kubenswrapper[5094]: I0220 10:11:07.874352 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_571a6098-6e30-438f-a6a9-fb751a79ca27/probe/0.log" Feb 20 10:11:08 crc kubenswrapper[5094]: I0220 10:11:08.129839 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-ggj5f_ea277d62-feb7-40a2-80a9-ad1a9d82cb13/configure-network-openstack-openstack-cell1/0.log" Feb 20 10:11:08 crc kubenswrapper[5094]: I0220 10:11:08.357072 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-networker-59p5g_de84413a-d424-4ec1-bb6d-e91b2278b854/configure-network-openstack-openstack-networker/0.log" Feb 20 10:11:08 crc kubenswrapper[5094]: I0220 10:11:08.477538 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_d7f13f97-3504-4faa-a8cf-8ad4a7973623/cinder-backup/0.log" Feb 20 10:11:08 crc kubenswrapper[5094]: I0220 10:11:08.628129 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-x6wr7_dd90b879-7bfd-480f-b25e-b7aef96a4b08/configure-os-openstack-openstack-cell1/0.log" Feb 20 10:11:08 crc kubenswrapper[5094]: I0220 10:11:08.766453 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-networker-ts6jc_9278a86a-be7e-4e04-a187-52d0c119ccb5/configure-os-openstack-openstack-networker/0.log" Feb 20 10:11:08 crc kubenswrapper[5094]: I0220 10:11:08.873662 5094 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-8479b9d65f-4mzqh_bee88947-a5ae-4438-9283-a3fc34fde9e4/init/0.log" Feb 20 10:11:09 crc kubenswrapper[5094]: I0220 10:11:09.112804 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8479b9d65f-4mzqh_bee88947-a5ae-4438-9283-a3fc34fde9e4/init/0.log" Feb 20 10:11:09 crc kubenswrapper[5094]: I0220 10:11:09.168882 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-4zfxb_ee932d3e-c52d-491d-92d8-8e21f7e1adbb/download-cache-openstack-openstack-cell1/0.log" Feb 20 10:11:09 crc kubenswrapper[5094]: I0220 10:11:09.313777 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-8479b9d65f-4mzqh_bee88947-a5ae-4438-9283-a3fc34fde9e4/dnsmasq-dns/0.log" Feb 20 10:11:09 crc kubenswrapper[5094]: I0220 10:11:09.358845 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-networker-s48qg_74755119-ad5b-439b-80bc-57779ffb5161/download-cache-openstack-openstack-networker/0.log" Feb 20 10:11:09 crc kubenswrapper[5094]: I0220 10:11:09.543225 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f67b4c32-25f3-4bc0-af69-ff9a9aa04404/glance-log/0.log" Feb 20 10:11:09 crc kubenswrapper[5094]: I0220 10:11:09.575426 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_f67b4c32-25f3-4bc0-af69-ff9a9aa04404/glance-httpd/0.log" Feb 20 10:11:09 crc kubenswrapper[5094]: I0220 10:11:09.673873 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_571a6098-6e30-438f-a6a9-fb751a79ca27/cinder-volume/0.log" Feb 20 10:11:10 crc kubenswrapper[5094]: I0220 10:11:10.421447 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e9c121ca-4074-4775-a8e5-0c7f8a00ce22/glance-log/0.log" Feb 20 10:11:10 crc 
kubenswrapper[5094]: I0220 10:11:10.482189 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-68d6fbc7c5-czl7r_f4697fe9-ee95-4003-81d9-c6d7935b46cd/heat-engine/0.log" Feb 20 10:11:10 crc kubenswrapper[5094]: I0220 10:11:10.518109 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e9c121ca-4074-4775-a8e5-0c7f8a00ce22/glance-httpd/0.log" Feb 20 10:11:10 crc kubenswrapper[5094]: I0220 10:11:10.614159 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-76899f657-g7f8m_891348e7-69c8-46e3-a5c2-86c001574a89/heat-cfnapi/0.log" Feb 20 10:11:10 crc kubenswrapper[5094]: I0220 10:11:10.733688 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-6cc7f55d5c-lvdts_128b27b4-464a-4392-af17-51d79bdd1e1e/heat-api/0.log" Feb 20 10:11:10 crc kubenswrapper[5094]: I0220 10:11:10.792977 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-85f686b8b5-kz5d4_dd051d85-41b3-420b-9999-5c9dee9aafe3/horizon/0.log" Feb 20 10:11:10 crc kubenswrapper[5094]: I0220 10:11:10.838245 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-85f686b8b5-kz5d4_dd051d85-41b3-420b-9999-5c9dee9aafe3/horizon-log/0.log" Feb 20 10:11:10 crc kubenswrapper[5094]: I0220 10:11:10.981853 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-8nqcz_8f8dd6dc-a03c-4873-8ecb-e23bc464edff/install-certs-openstack-openstack-cell1/0.log" Feb 20 10:11:11 crc kubenswrapper[5094]: I0220 10:11:11.004029 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-networker-xdx54_21d27f85-64a1-4dc5-af39-89275cce2427/install-certs-openstack-openstack-networker/0.log" Feb 20 10:11:11 crc kubenswrapper[5094]: I0220 10:11:11.097052 5094 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-8j6nf_27e4bec3-7ef3-4f1d-897d-99909f817f5e/install-os-openstack-openstack-cell1/0.log" Feb 20 10:11:11 crc kubenswrapper[5094]: I0220 10:11:11.218371 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-networker-tqhtr_e890bf4c-20ec-4b45-936d-d08d3a73b5ee/install-os-openstack-openstack-networker/0.log" Feb 20 10:11:11 crc kubenswrapper[5094]: I0220 10:11:11.498035 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29526301-wg5rm_ba68c6b8-04f9-4515-8e85-3e7b4ca9615b/keystone-cron/0.log" Feb 20 10:11:11 crc kubenswrapper[5094]: I0220 10:11:11.544132 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29526361-srdg8_5a279e74-ee64-4ff9-8a0f-2700c30a770d/keystone-cron/0.log" Feb 20 10:11:11 crc kubenswrapper[5094]: I0220 10:11:11.697064 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_640e24e6-f89c-45ee-999a-e5aa0816aab2/kube-state-metrics/0.log" Feb 20 10:11:11 crc kubenswrapper[5094]: I0220 10:11:11.862820 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-f6rmf_a552adeb-5834-4cfe-8ee3-56472dda5cab/libvirt-openstack-openstack-cell1/0.log" Feb 20 10:11:12 crc kubenswrapper[5094]: I0220 10:11:12.188942 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6f444df446-vdhbp_167ab003-3908-4714-95b2-bfad7c1e1e00/keystone-api/0.log" Feb 20 10:11:12 crc kubenswrapper[5094]: I0220 10:11:12.299457 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_29ad14f6-de76-4992-b46a-29f0822654c7/probe/0.log" Feb 20 10:11:12 crc kubenswrapper[5094]: I0220 10:11:12.314786 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_29ad14f6-de76-4992-b46a-29f0822654c7/manila-scheduler/0.log" Feb 20 
10:11:12 crc kubenswrapper[5094]: I0220 10:11:12.385498 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_fc6c3c7a-374b-49fc-98d5-852785c56ee7/manila-api/0.log" Feb 20 10:11:12 crc kubenswrapper[5094]: I0220 10:11:12.446644 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_fc6c3c7a-374b-49fc-98d5-852785c56ee7/manila-api-log/0.log" Feb 20 10:11:12 crc kubenswrapper[5094]: I0220 10:11:12.493417 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b/probe/0.log" Feb 20 10:11:12 crc kubenswrapper[5094]: I0220 10:11:12.521718 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_e7d29528-f8fb-4d9d-9fbd-e2eca7cd386b/manila-share/0.log" Feb 20 10:11:12 crc kubenswrapper[5094]: I0220 10:11:12.936850 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-g7hl9_f58790dc-4468-40ad-ba58-bb433a926abe/neutron-dhcp-openstack-openstack-cell1/0.log" Feb 20 10:11:13 crc kubenswrapper[5094]: I0220 10:11:13.046746 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-585ff4fdf7-llqts_78fff8ae-90d4-490d-b302-45fce0bd0101/neutron-httpd/0.log" Feb 20 10:11:13 crc kubenswrapper[5094]: I0220 10:11:13.149249 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-d2d4n_bf84fab1-aae6-4c92-982e-a4c5b1c7cefe/neutron-metadata-openstack-openstack-cell1/0.log" Feb 20 10:11:13 crc kubenswrapper[5094]: I0220 10:11:13.411975 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-585ff4fdf7-llqts_78fff8ae-90d4-490d-b302-45fce0bd0101/neutron-api/0.log" Feb 20 10:11:13 crc kubenswrapper[5094]: I0220 10:11:13.575958 5094 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-networker-f5lls_b741c1f4-f408-486b-bd44-3ae1fcadc83b/neutron-metadata-openstack-openstack-networker/0.log" Feb 20 10:11:13 crc kubenswrapper[5094]: I0220 10:11:13.732779 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-6sqzq_6d3bf727-1eae-408c-be3d-2df97b387704/neutron-sriov-openstack-openstack-cell1/0.log" Feb 20 10:11:13 crc kubenswrapper[5094]: I0220 10:11:13.987303 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b6886613-4f07-498a-911f-4d77704ab4df/nova-api-api/0.log" Feb 20 10:11:14 crc kubenswrapper[5094]: I0220 10:11:14.168517 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_9ae582b9-3951-4670-91bf-5d044269ff1c/nova-cell0-conductor-conductor/0.log" Feb 20 10:11:14 crc kubenswrapper[5094]: I0220 10:11:14.313153 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b6886613-4f07-498a-911f-4d77704ab4df/nova-api-log/0.log" Feb 20 10:11:14 crc kubenswrapper[5094]: I0220 10:11:14.319147 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_23153570-19e2-4a29-9533-5db90a0c5d09/nova-cell1-conductor-conductor/0.log" Feb 20 10:11:14 crc kubenswrapper[5094]: I0220 10:11:14.524429 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_deb9f0ba-64bb-4eea-bcb3-34c371c8cdeb/nova-cell1-novncproxy-novncproxy/0.log" Feb 20 10:11:14 crc kubenswrapper[5094]: I0220 10:11:14.581811 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpr5gd_5b30b185-0b70-4ad8-8eca-a292b76fb410/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Feb 20 10:11:14 crc kubenswrapper[5094]: I0220 10:11:14.763944 5094 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-tclm2_086935dd-74d5-4657-a6a1-25bd11f6455f/nova-cell1-openstack-openstack-cell1/0.log" Feb 20 10:11:14 crc kubenswrapper[5094]: I0220 10:11:14.985960 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_87f99e6f-46a8-4a46-bcae-81947aa95700/nova-metadata-log/0.log" Feb 20 10:11:15 crc kubenswrapper[5094]: I0220 10:11:15.008759 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_87f99e6f-46a8-4a46-bcae-81947aa95700/nova-metadata-metadata/0.log" Feb 20 10:11:15 crc kubenswrapper[5094]: I0220 10:11:15.145220 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_67a3bd12-be26-46a3-bd66-982bea39049a/nova-scheduler-scheduler/0.log" Feb 20 10:11:15 crc kubenswrapper[5094]: I0220 10:11:15.195731 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_98dd23d5-7a26-4a06-a35a-e818b8feba3c/mysql-bootstrap/0.log" Feb 20 10:11:15 crc kubenswrapper[5094]: I0220 10:11:15.381450 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_98dd23d5-7a26-4a06-a35a-e818b8feba3c/mysql-bootstrap/0.log" Feb 20 10:11:15 crc kubenswrapper[5094]: I0220 10:11:15.400493 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_98dd23d5-7a26-4a06-a35a-e818b8feba3c/galera/0.log" Feb 20 10:11:15 crc kubenswrapper[5094]: I0220 10:11:15.508942 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_542d99bc-6049-42dc-9036-8a795552e896/mysql-bootstrap/0.log" Feb 20 10:11:15 crc kubenswrapper[5094]: I0220 10:11:15.632377 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_542d99bc-6049-42dc-9036-8a795552e896/mysql-bootstrap/0.log" Feb 20 10:11:15 crc kubenswrapper[5094]: I0220 10:11:15.659000 5094 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_openstack-galera-0_542d99bc-6049-42dc-9036-8a795552e896/galera/0.log" Feb 20 10:11:15 crc kubenswrapper[5094]: I0220 10:11:15.777314 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_3c21f8d0-ca22-4206-9cdf-26edee70eac2/openstackclient/0.log" Feb 20 10:11:15 crc kubenswrapper[5094]: I0220 10:11:15.834939 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_4bd6bf8e-8e67-4de1-a294-6b5d50f1797a/openstack-network-exporter/0.log" Feb 20 10:11:15 crc kubenswrapper[5094]: I0220 10:11:15.960745 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_4bd6bf8e-8e67-4de1-a294-6b5d50f1797a/ovn-northd/0.log" Feb 20 10:11:16 crc kubenswrapper[5094]: I0220 10:11:16.176052 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-pswsx_3f35b6d1-3070-44cf-bdf8-6376b2434586/ovn-openstack-openstack-cell1/0.log" Feb 20 10:11:16 crc kubenswrapper[5094]: I0220 10:11:16.356386 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_464f9fe3-85bf-4e78-adc3-3feedbaf1dac/openstack-network-exporter/0.log" Feb 20 10:11:16 crc kubenswrapper[5094]: I0220 10:11:16.372855 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-networker-r6vzk_3ef4f2ef-92a7-4d12-94a9-e3ee55412547/ovn-openstack-openstack-networker/0.log" Feb 20 10:11:16 crc kubenswrapper[5094]: I0220 10:11:16.431537 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_464f9fe3-85bf-4e78-adc3-3feedbaf1dac/ovsdbserver-nb/0.log" Feb 20 10:11:16 crc kubenswrapper[5094]: I0220 10:11:16.565693 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf/openstack-network-exporter/0.log" Feb 20 10:11:16 crc kubenswrapper[5094]: I0220 10:11:16.639094 5094 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_5a5fd3fa-b5c3-4e02-a9e6-26be7e747baf/ovsdbserver-nb/0.log" Feb 20 10:11:16 crc kubenswrapper[5094]: I0220 10:11:16.813811 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_79dfde5a-85a9-437f-979d-1fdb99a1bb5f/openstack-network-exporter/0.log" Feb 20 10:11:16 crc kubenswrapper[5094]: I0220 10:11:16.876180 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_79dfde5a-85a9-437f-979d-1fdb99a1bb5f/ovsdbserver-nb/0.log" Feb 20 10:11:16 crc kubenswrapper[5094]: I0220 10:11:16.999280 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6ff74bc5-95bf-47fd-969e-cecbf1317e5d/openstack-network-exporter/0.log" Feb 20 10:11:17 crc kubenswrapper[5094]: I0220 10:11:17.041336 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_6ff74bc5-95bf-47fd-969e-cecbf1317e5d/ovsdbserver-sb/0.log" Feb 20 10:11:17 crc kubenswrapper[5094]: I0220 10:11:17.224030 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_daa57141-e76b-43a8-b363-2a1c7129d7c2/openstack-network-exporter/0.log" Feb 20 10:11:17 crc kubenswrapper[5094]: I0220 10:11:17.277241 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_daa57141-e76b-43a8-b363-2a1c7129d7c2/ovsdbserver-sb/0.log" Feb 20 10:11:17 crc kubenswrapper[5094]: I0220 10:11:17.380951 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_9e6d0be3-167e-49e9-8450-a563f9115817/openstack-network-exporter/0.log" Feb 20 10:11:17 crc kubenswrapper[5094]: I0220 10:11:17.430381 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_9e6d0be3-167e-49e9-8450-a563f9115817/ovsdbserver-sb/0.log" Feb 20 10:11:17 crc kubenswrapper[5094]: I0220 10:11:17.833473 5094 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_placement-64d8d4f69d-shjqs_cc482d5b-0b27-4293-b02b-7b02007cf790/placement-api/0.log" Feb 20 10:11:17 crc kubenswrapper[5094]: I0220 10:11:17.915794 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-64d8d4f69d-shjqs_cc482d5b-0b27-4293-b02b-7b02007cf790/placement-log/0.log" Feb 20 10:11:18 crc kubenswrapper[5094]: I0220 10:11:18.081273 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-c6dx6m_aee17d13-1b0d-49a2-a515-cc63a2f62c63/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Feb 20 10:11:18 crc kubenswrapper[5094]: I0220 10:11:18.122216 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-nc8v7b_750a5132-7613-40c0-a360-2f1a589d2554/pre-adoption-validation-openstack-pre-adoption-openstack-networ/0.log" Feb 20 10:11:18 crc kubenswrapper[5094]: I0220 10:11:18.258445 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_22516d8a-bb80-405e-8258-01fd733495ef/init-config-reloader/0.log" Feb 20 10:11:18 crc kubenswrapper[5094]: I0220 10:11:18.463490 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_22516d8a-bb80-405e-8258-01fd733495ef/init-config-reloader/0.log" Feb 20 10:11:18 crc kubenswrapper[5094]: I0220 10:11:18.491777 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_22516d8a-bb80-405e-8258-01fd733495ef/config-reloader/0.log" Feb 20 10:11:18 crc kubenswrapper[5094]: I0220 10:11:18.531182 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_22516d8a-bb80-405e-8258-01fd733495ef/prometheus/0.log" Feb 20 10:11:18 crc kubenswrapper[5094]: I0220 10:11:18.539719 5094 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_prometheus-metric-storage-0_22516d8a-bb80-405e-8258-01fd733495ef/thanos-sidecar/0.log" Feb 20 10:11:18 crc kubenswrapper[5094]: I0220 10:11:18.703633 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_392a6bbf-c80d-4142-adb2-b4828517b1c6/setup-container/0.log" Feb 20 10:11:18 crc kubenswrapper[5094]: I0220 10:11:18.904943 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_392a6bbf-c80d-4142-adb2-b4828517b1c6/setup-container/0.log" Feb 20 10:11:18 crc kubenswrapper[5094]: I0220 10:11:18.960340 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_392a6bbf-c80d-4142-adb2-b4828517b1c6/rabbitmq/0.log" Feb 20 10:11:19 crc kubenswrapper[5094]: I0220 10:11:19.033563 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b/setup-container/0.log" Feb 20 10:11:19 crc kubenswrapper[5094]: I0220 10:11:19.231032 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b/setup-container/0.log" Feb 20 10:11:19 crc kubenswrapper[5094]: I0220 10:11:19.292626 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-gpz7d_134b93a3-9f6d-41c3-ac1c-41dfc7e3bc0e/reboot-os-openstack-openstack-cell1/0.log" Feb 20 10:11:19 crc kubenswrapper[5094]: I0220 10:11:19.295776 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_fdb6b55c-4fa2-40c0-930f-5d3e7d03fb6b/rabbitmq/0.log" Feb 20 10:11:19 crc kubenswrapper[5094]: I0220 10:11:19.540410 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-networker-x57s5_5e673130-22d3-4300-a143-c2821deb8cac/reboot-os-openstack-openstack-networker/0.log" Feb 20 10:11:19 crc kubenswrapper[5094]: I0220 10:11:19.548006 
5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-fwcg9_73a5aa76-8e8f-4235-bd0d-294f718698fa/run-os-openstack-openstack-cell1/0.log" Feb 20 10:11:19 crc kubenswrapper[5094]: I0220 10:11:19.718814 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-networker-f78nb_95f054cd-db3e-45e0-9e12-55c2da3b5a23/run-os-openstack-openstack-networker/0.log" Feb 20 10:11:19 crc kubenswrapper[5094]: I0220 10:11:19.789255 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-zb287_387ade3f-0ebb-4488-8a04-389a018fc31d/ssh-known-hosts-openstack/0.log" Feb 20 10:11:20 crc kubenswrapper[5094]: I0220 10:11:20.070045 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-s5mln_a134a8f4-8450-4f7c-9988-11686cbdcd19/telemetry-openstack-openstack-cell1/0.log" Feb 20 10:11:20 crc kubenswrapper[5094]: I0220 10:11:20.207573 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_8e2aa894-2a09-4fad-bcc7-1f259ca48ac9/tempest-tests-tempest-tests-runner/0.log" Feb 20 10:11:20 crc kubenswrapper[5094]: I0220 10:11:20.278134 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_6798c144-2ada-4a54-98c4-72db0e7bd732/test-operator-logs-container/0.log" Feb 20 10:11:20 crc kubenswrapper[5094]: I0220 10:11:20.460366 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-p98s6_110791b2-a067-409d-9970-9db4868f0d4d/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Feb 20 10:11:20 crc kubenswrapper[5094]: I0220 10:11:20.543162 5094 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-networker-j2z5f_7894eb94-d4dd-4035-af5b-5994b4ae6d2f/tripleo-cleanup-tripleo-cleanup-openstack-networker/0.log" Feb 20 10:11:20 crc kubenswrapper[5094]: I0220 10:11:20.666689 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-k27rr_61a917df-8faa-482f-9582-0c5737301057/validate-network-openstack-openstack-cell1/0.log" Feb 20 10:11:20 crc kubenswrapper[5094]: I0220 10:11:20.807719 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-networker-2pr7n_f04426ab-50e7-4345-842b-69bfcc58207c/validate-network-openstack-openstack-networker/0.log" Feb 20 10:11:32 crc kubenswrapper[5094]: I0220 10:11:32.586130 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_5074d037-240e-4685-8c3b-3dd7b963beb0/memcached/0.log" Feb 20 10:11:34 crc kubenswrapper[5094]: I0220 10:11:34.106336 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:11:34 crc kubenswrapper[5094]: I0220 10:11:34.106826 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:11:46 crc kubenswrapper[5094]: I0220 10:11:46.702155 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9_9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8/util/0.log" Feb 20 10:11:46 crc kubenswrapper[5094]: I0220 
10:11:46.927658 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9_9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8/pull/0.log" Feb 20 10:11:46 crc kubenswrapper[5094]: I0220 10:11:46.933283 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9_9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8/util/0.log" Feb 20 10:11:46 crc kubenswrapper[5094]: I0220 10:11:46.959952 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9_9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8/pull/0.log" Feb 20 10:11:47 crc kubenswrapper[5094]: I0220 10:11:47.223525 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9_9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8/extract/0.log" Feb 20 10:11:47 crc kubenswrapper[5094]: I0220 10:11:47.258258 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9_9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8/pull/0.log" Feb 20 10:11:47 crc kubenswrapper[5094]: I0220 10:11:47.273381 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8f52c407bdc9ecc5c9ed04cde121370cff57ca187d042afc6ea79b7967f2qs9_9ec7e575-6ff8-40e7-8ec0-63a6ec7b7fe8/util/0.log" Feb 20 10:11:47 crc kubenswrapper[5094]: I0220 10:11:47.680440 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-xjng5_a91a9b82-fc6b-4900-becb-6dc3c100e429/manager/0.log" Feb 20 10:11:48 crc kubenswrapper[5094]: I0220 10:11:48.175152 5094 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-26vtn_f6c8e20e-ecca-42d4-9e0e-5547ae567d9f/manager/0.log" Feb 20 10:11:48 crc kubenswrapper[5094]: I0220 10:11:48.306303 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-p689m_36d60210-52d5-4f28-ae0b-28cce632d5cb/manager/0.log" Feb 20 10:11:48 crc kubenswrapper[5094]: I0220 10:11:48.536672 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-mnd7v_292cb132-b03c-4d20-8bee-c90ad3c4486b/manager/0.log" Feb 20 10:11:49 crc kubenswrapper[5094]: I0220 10:11:49.042081 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-8j8pv_fcf15128-56ef-42dc-b230-1cd8b7638d33/manager/0.log" Feb 20 10:11:49 crc kubenswrapper[5094]: I0220 10:11:49.636098 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-nxtc7_eb67a9bc-35a6-4ce3-bca8-a08ee824cda7/manager/0.log" Feb 20 10:11:49 crc kubenswrapper[5094]: I0220 10:11:49.736190 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-86bxl_8a1c02cd-3546-45fa-b7db-5903c80681a4/manager/0.log" Feb 20 10:11:50 crc kubenswrapper[5094]: I0220 10:11:50.003478 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-hfkff_b863d4f9-063a-4102-8c3d-f7e092e4e2c0/manager/0.log" Feb 20 10:11:50 crc kubenswrapper[5094]: I0220 10:11:50.326883 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-n5dgn_6177108e-bc02-497c-80ab-312f61fbd1c2/manager/0.log" Feb 20 10:11:50 crc kubenswrapper[5094]: I0220 10:11:50.686040 5094 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-bd9tr_eba1b8e0-b529-47ad-a657-75ce01bad56a/manager/0.log" Feb 20 10:11:51 crc kubenswrapper[5094]: I0220 10:11:51.390142 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-2ftdz_93dbc041-00c2-4189-abca-6bb3a00abc2d/manager/0.log" Feb 20 10:11:51 crc kubenswrapper[5094]: I0220 10:11:51.632013 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-fb5fcc5b8-tsrv8_57b4cb2c-e7bc-4430-bfb8-3642dab61d84/manager/0.log" Feb 20 10:11:52 crc kubenswrapper[5094]: I0220 10:11:52.182146 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6679bf9b57-9glnw_234632e4-6191-4ec8-94c5-c93d71c13ad0/operator/0.log" Feb 20 10:11:52 crc kubenswrapper[5094]: I0220 10:11:52.987303 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-72vrj_be2cc842-778e-4963-80f8-bb5c7426f175/registry-server/0.log" Feb 20 10:11:53 crc kubenswrapper[5094]: I0220 10:11:53.532003 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-ct2h7_74861845-de37-4091-9226-bcb1bbe64b35/manager/0.log" Feb 20 10:11:53 crc kubenswrapper[5094]: I0220 10:11:53.588757 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-24cv7_32338b54-c33f-4dc5-b328-9cf4d92d1db6/manager/0.log" Feb 20 10:11:53 crc kubenswrapper[5094]: I0220 10:11:53.809304 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-9c2k5_c510ecc1-53ce-4611-af6a-09488f9317ed/manager/0.log" Feb 20 10:11:54 crc kubenswrapper[5094]: I0220 10:11:54.091057 5094 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-fq9n6_f45a4211-8890-4e4a-af96-ccffec62160c/operator/0.log" Feb 20 10:11:54 crc kubenswrapper[5094]: I0220 10:11:54.306637 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-dh48q_a0d3c29b-4f57-4647-b1a5-bfd6c887b0b5/manager/0.log" Feb 20 10:11:54 crc kubenswrapper[5094]: I0220 10:11:54.781669 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-cbtkd_c45dcc1f-a95d-4492-9139-16d550809a8e/manager/0.log" Feb 20 10:11:54 crc kubenswrapper[5094]: I0220 10:11:54.790612 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-nll74_fce6c9b3-2075-479d-9a16-738831a871c4/manager/0.log" Feb 20 10:11:55 crc kubenswrapper[5094]: I0220 10:11:55.071485 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5db88f68c-lz57q_4413dc36-58b0-447a-ba69-cdd2cee9589c/manager/0.log" Feb 20 10:11:56 crc kubenswrapper[5094]: I0220 10:11:56.899900 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-69ff7bc449-57z9v_a1b74404-906b-4466-a3bd-289458ef90ea/manager/0.log" Feb 20 10:11:57 crc kubenswrapper[5094]: I0220 10:11:57.866548 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-k5dkn_6b09cc76-8cba-42ed-bb2c-fdf4473c9afe/manager/0.log" Feb 20 10:11:57 crc kubenswrapper[5094]: I0220 10:11:57.914950 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-ngkcq_683351ac-f508-4961-b07a-eaac9c26a4f3/manager/0.log" Feb 20 10:12:04 crc kubenswrapper[5094]: I0220 10:12:04.107612 5094 
patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:12:04 crc kubenswrapper[5094]: I0220 10:12:04.108448 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:12:17 crc kubenswrapper[5094]: I0220 10:12:17.328246 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-znrdm_38d9642e-3788-4e70-8232-138cd84e02dc/control-plane-machine-set-operator/0.log" Feb 20 10:12:17 crc kubenswrapper[5094]: I0220 10:12:17.462697 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ps6pv_2f348b60-0d81-490e-bfb4-ea32546c995a/kube-rbac-proxy/0.log" Feb 20 10:12:17 crc kubenswrapper[5094]: I0220 10:12:17.526321 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-ps6pv_2f348b60-0d81-490e-bfb4-ea32546c995a/machine-api-operator/0.log" Feb 20 10:12:31 crc kubenswrapper[5094]: I0220 10:12:31.054958 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-pdnlx_d6360113-cdd8-48a4-a145-4b54eb5510eb/cert-manager-controller/0.log" Feb 20 10:12:31 crc kubenswrapper[5094]: I0220 10:12:31.301737 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-mtw89_34f53f0e-6a22-42c9-a953-3ec38e87a70f/cert-manager-cainjector/0.log" Feb 20 10:12:31 crc kubenswrapper[5094]: I0220 
10:12:31.382260 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-sxrw7_bc1f2312-eb97-4f63-b37b-975d9dfb5a73/cert-manager-webhook/0.log" Feb 20 10:12:34 crc kubenswrapper[5094]: I0220 10:12:34.106886 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:12:34 crc kubenswrapper[5094]: I0220 10:12:34.107444 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:12:34 crc kubenswrapper[5094]: I0220 10:12:34.107524 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 10:12:34 crc kubenswrapper[5094]: I0220 10:12:34.109094 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"314cf401191c15263264dcbedb42c99f095a2284b06c68e30858221c2e104a07"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 10:12:34 crc kubenswrapper[5094]: I0220 10:12:34.109275 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://314cf401191c15263264dcbedb42c99f095a2284b06c68e30858221c2e104a07" gracePeriod=600 Feb 20 10:12:34 crc 
kubenswrapper[5094]: I0220 10:12:34.367041 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerID="314cf401191c15263264dcbedb42c99f095a2284b06c68e30858221c2e104a07" exitCode=0 Feb 20 10:12:34 crc kubenswrapper[5094]: I0220 10:12:34.367120 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"314cf401191c15263264dcbedb42c99f095a2284b06c68e30858221c2e104a07"} Feb 20 10:12:34 crc kubenswrapper[5094]: I0220 10:12:34.367286 5094 scope.go:117] "RemoveContainer" containerID="8e201ba8bb10d21eebd785b27460470c707ecf1528e95958708a42491b8bcaaf" Feb 20 10:12:35 crc kubenswrapper[5094]: I0220 10:12:35.379438 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512"} Feb 20 10:12:44 crc kubenswrapper[5094]: I0220 10:12:44.789491 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-74czm_edd001fc-3ddc-4010-8a98-54f4ffeaba72/nmstate-console-plugin/0.log" Feb 20 10:12:44 crc kubenswrapper[5094]: I0220 10:12:44.974716 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-jr284_55b1a421-7ec5-4442-b4c5-11767715cc4b/nmstate-handler/0.log" Feb 20 10:12:45 crc kubenswrapper[5094]: I0220 10:12:45.028977 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-gvgqm_93238aee-86f0-497a-8880-531338e8245f/kube-rbac-proxy/0.log" Feb 20 10:12:45 crc kubenswrapper[5094]: I0220 10:12:45.088823 5094 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-gvgqm_93238aee-86f0-497a-8880-531338e8245f/nmstate-metrics/0.log" Feb 20 10:12:45 crc kubenswrapper[5094]: I0220 10:12:45.188141 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-qg9ms_6804a7c3-a0d7-46d4-b317-e9c54265841e/nmstate-operator/0.log" Feb 20 10:12:45 crc kubenswrapper[5094]: I0220 10:12:45.267965 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-frvdg_df45fab4-d183-4702-b5b6-2a4e559eff22/nmstate-webhook/0.log" Feb 20 10:12:46 crc kubenswrapper[5094]: I0220 10:12:46.369050 5094 scope.go:117] "RemoveContainer" containerID="5e963cdb7c15a481066283be89f67b32c2bfd26d63091f9de055d580de5cc75f" Feb 20 10:13:00 crc kubenswrapper[5094]: I0220 10:13:00.068772 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-kxcc8_5a9736b1-aca8-4880-9d94-2d7c37efce50/prometheus-operator/0.log" Feb 20 10:13:00 crc kubenswrapper[5094]: I0220 10:13:00.219616 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-56764c7d84-47vjf_b0fb9831-f265-4976-9a1d-14ed3e08daf5/prometheus-operator-admission-webhook/0.log" Feb 20 10:13:00 crc kubenswrapper[5094]: I0220 10:13:00.313994 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-56764c7d84-d7v85_c70d95ea-5321-43fa-8df8-6d1138f0a732/prometheus-operator-admission-webhook/0.log" Feb 20 10:13:00 crc kubenswrapper[5094]: I0220 10:13:00.425030 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-dcm8l_724c1050-e6d7-49c3-8b63-a89a3de26894/operator/0.log" Feb 20 10:13:00 crc kubenswrapper[5094]: I0220 10:13:00.485148 5094 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-9hzgx_1ab531ae-b53c-4de1-b927-ca32c159c244/perses-operator/0.log" Feb 20 10:13:15 crc kubenswrapper[5094]: I0220 10:13:15.535495 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-s7ndd_2a03b7d3-8e22-4a62-98f0-8d72500fab69/kube-rbac-proxy/0.log" Feb 20 10:13:15 crc kubenswrapper[5094]: I0220 10:13:15.836837 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-s7ndd_2a03b7d3-8e22-4a62-98f0-8d72500fab69/controller/0.log" Feb 20 10:13:16 crc kubenswrapper[5094]: I0220 10:13:16.112228 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/cp-frr-files/0.log" Feb 20 10:13:16 crc kubenswrapper[5094]: I0220 10:13:16.238317 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/cp-reloader/0.log" Feb 20 10:13:16 crc kubenswrapper[5094]: I0220 10:13:16.283728 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/cp-frr-files/0.log" Feb 20 10:13:16 crc kubenswrapper[5094]: I0220 10:13:16.338807 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/cp-metrics/0.log" Feb 20 10:13:16 crc kubenswrapper[5094]: I0220 10:13:16.349479 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/cp-reloader/0.log" Feb 20 10:13:16 crc kubenswrapper[5094]: I0220 10:13:16.564079 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/cp-frr-files/0.log" Feb 20 10:13:16 crc kubenswrapper[5094]: I0220 10:13:16.583591 5094 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/cp-reloader/0.log" Feb 20 10:13:16 crc kubenswrapper[5094]: I0220 10:13:16.615512 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/cp-metrics/0.log" Feb 20 10:13:16 crc kubenswrapper[5094]: I0220 10:13:16.661206 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/cp-metrics/0.log" Feb 20 10:13:16 crc kubenswrapper[5094]: I0220 10:13:16.815360 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/cp-metrics/0.log" Feb 20 10:13:16 crc kubenswrapper[5094]: I0220 10:13:16.821668 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/cp-reloader/0.log" Feb 20 10:13:16 crc kubenswrapper[5094]: I0220 10:13:16.867557 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/controller/0.log" Feb 20 10:13:16 crc kubenswrapper[5094]: I0220 10:13:16.899027 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/cp-frr-files/0.log" Feb 20 10:13:17 crc kubenswrapper[5094]: I0220 10:13:17.016000 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/frr-metrics/0.log" Feb 20 10:13:17 crc kubenswrapper[5094]: I0220 10:13:17.064081 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/kube-rbac-proxy/0.log" Feb 20 10:13:17 crc kubenswrapper[5094]: I0220 10:13:17.121758 5094 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/kube-rbac-proxy-frr/0.log" Feb 20 10:13:17 crc kubenswrapper[5094]: I0220 10:13:17.266793 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/reloader/0.log" Feb 20 10:13:17 crc kubenswrapper[5094]: I0220 10:13:17.393968 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-dljs6_fe469d05-edeb-4d23-b06b-6bdbfc646e99/frr-k8s-webhook-server/0.log" Feb 20 10:13:17 crc kubenswrapper[5094]: I0220 10:13:17.579629 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6d969f468d-fd5gv_059e3724-d657-4f2e-beec-f4f55e09e498/manager/0.log" Feb 20 10:13:17 crc kubenswrapper[5094]: I0220 10:13:17.708931 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-c5fbff78-jk6cf_2dde7604-2a93-4dc0-9b15-b8fe41f79e1e/webhook-server/0.log" Feb 20 10:13:17 crc kubenswrapper[5094]: I0220 10:13:17.975045 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gjp5f_4d145cb8-0c5c-40f7-a99c-15f1575629c3/kube-rbac-proxy/0.log" Feb 20 10:13:18 crc kubenswrapper[5094]: I0220 10:13:18.762242 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gjp5f_4d145cb8-0c5c-40f7-a99c-15f1575629c3/speaker/0.log" Feb 20 10:13:20 crc kubenswrapper[5094]: I0220 10:13:20.277005 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-4p57m_f065adc1-f6c1-4895-a933-906a708555c1/frr/0.log" Feb 20 10:13:33 crc kubenswrapper[5094]: I0220 10:13:33.644032 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26_67055673-f25d-44d3-99e5-2ac1474b1872/util/0.log" Feb 20 10:13:33 crc kubenswrapper[5094]: 
I0220 10:13:33.845215 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26_67055673-f25d-44d3-99e5-2ac1474b1872/util/0.log" Feb 20 10:13:33 crc kubenswrapper[5094]: I0220 10:13:33.914430 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26_67055673-f25d-44d3-99e5-2ac1474b1872/pull/0.log" Feb 20 10:13:33 crc kubenswrapper[5094]: I0220 10:13:33.915538 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26_67055673-f25d-44d3-99e5-2ac1474b1872/pull/0.log" Feb 20 10:13:34 crc kubenswrapper[5094]: I0220 10:13:34.123820 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26_67055673-f25d-44d3-99e5-2ac1474b1872/pull/0.log" Feb 20 10:13:34 crc kubenswrapper[5094]: I0220 10:13:34.134364 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26_67055673-f25d-44d3-99e5-2ac1474b1872/util/0.log" Feb 20 10:13:34 crc kubenswrapper[5094]: I0220 10:13:34.157447 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c2r26_67055673-f25d-44d3-99e5-2ac1474b1872/extract/0.log" Feb 20 10:13:34 crc kubenswrapper[5094]: I0220 10:13:34.300730 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz_77007c08-6c58-4a19-9c49-09c1677b9070/util/0.log" Feb 20 10:13:34 crc kubenswrapper[5094]: I0220 10:13:34.488880 5094 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz_77007c08-6c58-4a19-9c49-09c1677b9070/util/0.log" Feb 20 10:13:34 crc kubenswrapper[5094]: I0220 10:13:34.505529 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz_77007c08-6c58-4a19-9c49-09c1677b9070/pull/0.log" Feb 20 10:13:34 crc kubenswrapper[5094]: I0220 10:13:34.520414 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz_77007c08-6c58-4a19-9c49-09c1677b9070/pull/0.log" Feb 20 10:13:34 crc kubenswrapper[5094]: I0220 10:13:34.695550 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz_77007c08-6c58-4a19-9c49-09c1677b9070/util/0.log" Feb 20 10:13:34 crc kubenswrapper[5094]: I0220 10:13:34.698734 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz_77007c08-6c58-4a19-9c49-09c1677b9070/pull/0.log" Feb 20 10:13:34 crc kubenswrapper[5094]: I0220 10:13:34.730795 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087sxzz_77007c08-6c58-4a19-9c49-09c1677b9070/extract/0.log" Feb 20 10:13:34 crc kubenswrapper[5094]: I0220 10:13:34.892472 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx_4ffda3d5-82a2-4a0c-9052-2546188c107a/util/0.log" Feb 20 10:13:35 crc kubenswrapper[5094]: I0220 10:13:35.068215 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx_4ffda3d5-82a2-4a0c-9052-2546188c107a/pull/0.log" Feb 20 
10:13:35 crc kubenswrapper[5094]: I0220 10:13:35.078690 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx_4ffda3d5-82a2-4a0c-9052-2546188c107a/util/0.log" Feb 20 10:13:35 crc kubenswrapper[5094]: I0220 10:13:35.080574 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx_4ffda3d5-82a2-4a0c-9052-2546188c107a/pull/0.log" Feb 20 10:13:35 crc kubenswrapper[5094]: I0220 10:13:35.291272 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx_4ffda3d5-82a2-4a0c-9052-2546188c107a/pull/0.log" Feb 20 10:13:35 crc kubenswrapper[5094]: I0220 10:13:35.340992 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx_4ffda3d5-82a2-4a0c-9052-2546188c107a/util/0.log" Feb 20 10:13:35 crc kubenswrapper[5094]: I0220 10:13:35.342006 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213ln4zx_4ffda3d5-82a2-4a0c-9052-2546188c107a/extract/0.log" Feb 20 10:13:35 crc kubenswrapper[5094]: I0220 10:13:35.468687 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qk79m_67d20448-086a-4d76-b547-768f68c018f2/extract-utilities/0.log" Feb 20 10:13:35 crc kubenswrapper[5094]: I0220 10:13:35.666434 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qk79m_67d20448-086a-4d76-b547-768f68c018f2/extract-utilities/0.log" Feb 20 10:13:35 crc kubenswrapper[5094]: I0220 10:13:35.700417 5094 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-qk79m_67d20448-086a-4d76-b547-768f68c018f2/extract-content/0.log" Feb 20 10:13:35 crc kubenswrapper[5094]: I0220 10:13:35.705937 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qk79m_67d20448-086a-4d76-b547-768f68c018f2/extract-content/0.log" Feb 20 10:13:35 crc kubenswrapper[5094]: I0220 10:13:35.879588 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qk79m_67d20448-086a-4d76-b547-768f68c018f2/extract-utilities/0.log" Feb 20 10:13:35 crc kubenswrapper[5094]: I0220 10:13:35.894630 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qk79m_67d20448-086a-4d76-b547-768f68c018f2/extract-content/0.log" Feb 20 10:13:36 crc kubenswrapper[5094]: I0220 10:13:36.159967 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bhkm8_966d704a-5474-4b23-b125-63789f45ee54/extract-utilities/0.log" Feb 20 10:13:36 crc kubenswrapper[5094]: I0220 10:13:36.245259 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qk79m_67d20448-086a-4d76-b547-768f68c018f2/registry-server/0.log" Feb 20 10:13:36 crc kubenswrapper[5094]: I0220 10:13:36.371590 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bhkm8_966d704a-5474-4b23-b125-63789f45ee54/extract-content/0.log" Feb 20 10:13:36 crc kubenswrapper[5094]: I0220 10:13:36.376103 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bhkm8_966d704a-5474-4b23-b125-63789f45ee54/extract-content/0.log" Feb 20 10:13:36 crc kubenswrapper[5094]: I0220 10:13:36.379401 5094 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-bhkm8_966d704a-5474-4b23-b125-63789f45ee54/extract-utilities/0.log" Feb 20 10:13:36 crc kubenswrapper[5094]: I0220 10:13:36.613075 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bhkm8_966d704a-5474-4b23-b125-63789f45ee54/extract-content/0.log" Feb 20 10:13:36 crc kubenswrapper[5094]: I0220 10:13:36.699046 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bhkm8_966d704a-5474-4b23-b125-63789f45ee54/extract-utilities/0.log" Feb 20 10:13:37 crc kubenswrapper[5094]: I0220 10:13:37.204066 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr_b191dcbc-7e3b-4a36-a913-1fe0f53c83d5/util/0.log" Feb 20 10:13:37 crc kubenswrapper[5094]: I0220 10:13:37.561518 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr_b191dcbc-7e3b-4a36-a913-1fe0f53c83d5/util/0.log" Feb 20 10:13:37 crc kubenswrapper[5094]: I0220 10:13:37.581248 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bhkm8_966d704a-5474-4b23-b125-63789f45ee54/registry-server/0.log" Feb 20 10:13:37 crc kubenswrapper[5094]: I0220 10:13:37.595283 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr_b191dcbc-7e3b-4a36-a913-1fe0f53c83d5/pull/0.log" Feb 20 10:13:37 crc kubenswrapper[5094]: I0220 10:13:37.605422 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr_b191dcbc-7e3b-4a36-a913-1fe0f53c83d5/pull/0.log" Feb 20 10:13:37 crc kubenswrapper[5094]: I0220 10:13:37.796203 5094 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr_b191dcbc-7e3b-4a36-a913-1fe0f53c83d5/util/0.log" Feb 20 10:13:37 crc kubenswrapper[5094]: I0220 10:13:37.800992 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr_b191dcbc-7e3b-4a36-a913-1fe0f53c83d5/pull/0.log" Feb 20 10:13:37 crc kubenswrapper[5094]: I0220 10:13:37.822906 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecacvkxr_b191dcbc-7e3b-4a36-a913-1fe0f53c83d5/extract/0.log" Feb 20 10:13:37 crc kubenswrapper[5094]: I0220 10:13:37.833608 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-j8j9k_a8e1ef0a-5b2f-42b8-a3aa-18aa22e560e9/marketplace-operator/0.log" Feb 20 10:13:38 crc kubenswrapper[5094]: I0220 10:13:38.005250 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zw5zl_4ba2a013-0ac4-4983-92e6-875272450307/extract-utilities/0.log" Feb 20 10:13:38 crc kubenswrapper[5094]: I0220 10:13:38.154321 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zw5zl_4ba2a013-0ac4-4983-92e6-875272450307/extract-utilities/0.log" Feb 20 10:13:38 crc kubenswrapper[5094]: I0220 10:13:38.198953 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zw5zl_4ba2a013-0ac4-4983-92e6-875272450307/extract-content/0.log" Feb 20 10:13:38 crc kubenswrapper[5094]: I0220 10:13:38.203244 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zw5zl_4ba2a013-0ac4-4983-92e6-875272450307/extract-content/0.log" Feb 20 10:13:38 crc kubenswrapper[5094]: I0220 10:13:38.367044 5094 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-zw5zl_4ba2a013-0ac4-4983-92e6-875272450307/extract-utilities/0.log" Feb 20 10:13:38 crc kubenswrapper[5094]: I0220 10:13:38.441138 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zw5zl_4ba2a013-0ac4-4983-92e6-875272450307/extract-content/0.log" Feb 20 10:13:38 crc kubenswrapper[5094]: I0220 10:13:38.464590 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qcxs4_447e3a00-67d2-44c4-89cd-def383a3693d/extract-utilities/0.log" Feb 20 10:13:38 crc kubenswrapper[5094]: I0220 10:13:38.677786 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qcxs4_447e3a00-67d2-44c4-89cd-def383a3693d/extract-utilities/0.log" Feb 20 10:13:38 crc kubenswrapper[5094]: I0220 10:13:38.718466 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qcxs4_447e3a00-67d2-44c4-89cd-def383a3693d/extract-content/0.log" Feb 20 10:13:38 crc kubenswrapper[5094]: I0220 10:13:38.767818 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qcxs4_447e3a00-67d2-44c4-89cd-def383a3693d/extract-content/0.log" Feb 20 10:13:38 crc kubenswrapper[5094]: I0220 10:13:38.800775 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zw5zl_4ba2a013-0ac4-4983-92e6-875272450307/registry-server/0.log" Feb 20 10:13:38 crc kubenswrapper[5094]: I0220 10:13:38.866131 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qcxs4_447e3a00-67d2-44c4-89cd-def383a3693d/extract-utilities/0.log" Feb 20 10:13:38 crc kubenswrapper[5094]: I0220 10:13:38.890437 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qcxs4_447e3a00-67d2-44c4-89cd-def383a3693d/extract-content/0.log" Feb 
20 10:13:40 crc kubenswrapper[5094]: I0220 10:13:40.149004 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-qcxs4_447e3a00-67d2-44c4-89cd-def383a3693d/registry-server/0.log" Feb 20 10:13:46 crc kubenswrapper[5094]: I0220 10:13:46.419693 5094 scope.go:117] "RemoveContainer" containerID="13e5f052b4ba6e08f0990abbd5ddfbbbb9cd2d06486f28fe2908d667d1f8f224" Feb 20 10:13:52 crc kubenswrapper[5094]: I0220 10:13:52.939466 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-kxcc8_5a9736b1-aca8-4880-9d94-2d7c37efce50/prometheus-operator/0.log" Feb 20 10:13:52 crc kubenswrapper[5094]: I0220 10:13:52.975501 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-56764c7d84-47vjf_b0fb9831-f265-4976-9a1d-14ed3e08daf5/prometheus-operator-admission-webhook/0.log" Feb 20 10:13:53 crc kubenswrapper[5094]: I0220 10:13:53.057050 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-56764c7d84-d7v85_c70d95ea-5321-43fa-8df8-6d1138f0a732/prometheus-operator-admission-webhook/0.log" Feb 20 10:13:53 crc kubenswrapper[5094]: I0220 10:13:53.227050 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-9hzgx_1ab531ae-b53c-4de1-b927-ca32c159c244/perses-operator/0.log" Feb 20 10:13:53 crc kubenswrapper[5094]: I0220 10:13:53.242872 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-dcm8l_724c1050-e6d7-49c3-8b63-a89a3de26894/operator/0.log" Feb 20 10:14:34 crc kubenswrapper[5094]: I0220 10:14:34.106454 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:14:34 crc kubenswrapper[5094]: I0220 10:14:34.107054 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.176423 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq"] Feb 20 10:15:00 crc kubenswrapper[5094]: E0220 10:15:00.178067 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="324e88a2-c843-406e-a1c1-3bffb0b5a812" containerName="extract-content" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.178089 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="324e88a2-c843-406e-a1c1-3bffb0b5a812" containerName="extract-content" Feb 20 10:15:00 crc kubenswrapper[5094]: E0220 10:15:00.178118 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ceb59b7-8efc-478c-9663-ec454276c901" containerName="extract-content" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.178127 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ceb59b7-8efc-478c-9663-ec454276c901" containerName="extract-content" Feb 20 10:15:00 crc kubenswrapper[5094]: E0220 10:15:00.178146 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3479144c-1430-4bb4-8013-8c8fa47e3c75" containerName="container-00" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.178155 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="3479144c-1430-4bb4-8013-8c8fa47e3c75" containerName="container-00" Feb 20 10:15:00 crc kubenswrapper[5094]: E0220 10:15:00.178171 5094 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0ceb59b7-8efc-478c-9663-ec454276c901" containerName="extract-utilities" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.178184 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ceb59b7-8efc-478c-9663-ec454276c901" containerName="extract-utilities" Feb 20 10:15:00 crc kubenswrapper[5094]: E0220 10:15:00.178216 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43789bbd-d60e-4c83-96d4-83c2345aee73" containerName="extract-utilities" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.178225 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="43789bbd-d60e-4c83-96d4-83c2345aee73" containerName="extract-utilities" Feb 20 10:15:00 crc kubenswrapper[5094]: E0220 10:15:00.178245 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ceb59b7-8efc-478c-9663-ec454276c901" containerName="registry-server" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.178254 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ceb59b7-8efc-478c-9663-ec454276c901" containerName="registry-server" Feb 20 10:15:00 crc kubenswrapper[5094]: E0220 10:15:00.178281 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43789bbd-d60e-4c83-96d4-83c2345aee73" containerName="extract-content" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.178289 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="43789bbd-d60e-4c83-96d4-83c2345aee73" containerName="extract-content" Feb 20 10:15:00 crc kubenswrapper[5094]: E0220 10:15:00.178306 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="324e88a2-c843-406e-a1c1-3bffb0b5a812" containerName="registry-server" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.178314 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="324e88a2-c843-406e-a1c1-3bffb0b5a812" containerName="registry-server" Feb 20 10:15:00 crc kubenswrapper[5094]: E0220 10:15:00.178328 5094 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="324e88a2-c843-406e-a1c1-3bffb0b5a812" containerName="extract-utilities" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.178337 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="324e88a2-c843-406e-a1c1-3bffb0b5a812" containerName="extract-utilities" Feb 20 10:15:00 crc kubenswrapper[5094]: E0220 10:15:00.178354 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43789bbd-d60e-4c83-96d4-83c2345aee73" containerName="registry-server" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.178363 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="43789bbd-d60e-4c83-96d4-83c2345aee73" containerName="registry-server" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.178662 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="324e88a2-c843-406e-a1c1-3bffb0b5a812" containerName="registry-server" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.178685 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="43789bbd-d60e-4c83-96d4-83c2345aee73" containerName="registry-server" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.178745 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ceb59b7-8efc-478c-9663-ec454276c901" containerName="registry-server" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.178763 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="3479144c-1430-4bb4-8013-8c8fa47e3c75" containerName="container-00" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.180120 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.187600 5094 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.187900 5094 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.209805 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq"] Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.357034 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lstjd\" (UniqueName: \"kubernetes.io/projected/dc4a9549-d420-4188-83ab-110e9585ad99-kube-api-access-lstjd\") pod \"collect-profiles-29526375-m4wtq\" (UID: \"dc4a9549-d420-4188-83ab-110e9585ad99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.357575 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc4a9549-d420-4188-83ab-110e9585ad99-secret-volume\") pod \"collect-profiles-29526375-m4wtq\" (UID: \"dc4a9549-d420-4188-83ab-110e9585ad99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.357665 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc4a9549-d420-4188-83ab-110e9585ad99-config-volume\") pod \"collect-profiles-29526375-m4wtq\" (UID: \"dc4a9549-d420-4188-83ab-110e9585ad99\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.459650 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc4a9549-d420-4188-83ab-110e9585ad99-secret-volume\") pod \"collect-profiles-29526375-m4wtq\" (UID: \"dc4a9549-d420-4188-83ab-110e9585ad99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.459845 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc4a9549-d420-4188-83ab-110e9585ad99-config-volume\") pod \"collect-profiles-29526375-m4wtq\" (UID: \"dc4a9549-d420-4188-83ab-110e9585ad99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.460040 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lstjd\" (UniqueName: \"kubernetes.io/projected/dc4a9549-d420-4188-83ab-110e9585ad99-kube-api-access-lstjd\") pod \"collect-profiles-29526375-m4wtq\" (UID: \"dc4a9549-d420-4188-83ab-110e9585ad99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.462108 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc4a9549-d420-4188-83ab-110e9585ad99-config-volume\") pod \"collect-profiles-29526375-m4wtq\" (UID: \"dc4a9549-d420-4188-83ab-110e9585ad99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.473718 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/dc4a9549-d420-4188-83ab-110e9585ad99-secret-volume\") pod \"collect-profiles-29526375-m4wtq\" (UID: \"dc4a9549-d420-4188-83ab-110e9585ad99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.477059 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lstjd\" (UniqueName: \"kubernetes.io/projected/dc4a9549-d420-4188-83ab-110e9585ad99-kube-api-access-lstjd\") pod \"collect-profiles-29526375-m4wtq\" (UID: \"dc4a9549-d420-4188-83ab-110e9585ad99\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.529413 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" Feb 20 10:15:00 crc kubenswrapper[5094]: W0220 10:15:00.984195 5094 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc4a9549_d420_4188_83ab_110e9585ad99.slice/crio-afc298955973815edba6a67f548b7f02f0a2e45b8ac971fc904bd9d29c57d05b WatchSource:0}: Error finding container afc298955973815edba6a67f548b7f02f0a2e45b8ac971fc904bd9d29c57d05b: Status 404 returned error can't find the container with id afc298955973815edba6a67f548b7f02f0a2e45b8ac971fc904bd9d29c57d05b Feb 20 10:15:00 crc kubenswrapper[5094]: I0220 10:15:00.984928 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq"] Feb 20 10:15:01 crc kubenswrapper[5094]: I0220 10:15:01.964914 5094 generic.go:334] "Generic (PLEG): container finished" podID="dc4a9549-d420-4188-83ab-110e9585ad99" containerID="cff1e3645b1be65b4ed72cc908b8dac51b9ad71ef96dfbdb704c6d7b943c3aa3" exitCode=0 Feb 20 10:15:01 crc kubenswrapper[5094]: I0220 10:15:01.965399 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" event={"ID":"dc4a9549-d420-4188-83ab-110e9585ad99","Type":"ContainerDied","Data":"cff1e3645b1be65b4ed72cc908b8dac51b9ad71ef96dfbdb704c6d7b943c3aa3"} Feb 20 10:15:01 crc kubenswrapper[5094]: I0220 10:15:01.965751 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" event={"ID":"dc4a9549-d420-4188-83ab-110e9585ad99","Type":"ContainerStarted","Data":"afc298955973815edba6a67f548b7f02f0a2e45b8ac971fc904bd9d29c57d05b"} Feb 20 10:15:03 crc kubenswrapper[5094]: I0220 10:15:03.432250 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" Feb 20 10:15:03 crc kubenswrapper[5094]: I0220 10:15:03.531078 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lstjd\" (UniqueName: \"kubernetes.io/projected/dc4a9549-d420-4188-83ab-110e9585ad99-kube-api-access-lstjd\") pod \"dc4a9549-d420-4188-83ab-110e9585ad99\" (UID: \"dc4a9549-d420-4188-83ab-110e9585ad99\") " Feb 20 10:15:03 crc kubenswrapper[5094]: I0220 10:15:03.531126 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc4a9549-d420-4188-83ab-110e9585ad99-secret-volume\") pod \"dc4a9549-d420-4188-83ab-110e9585ad99\" (UID: \"dc4a9549-d420-4188-83ab-110e9585ad99\") " Feb 20 10:15:03 crc kubenswrapper[5094]: I0220 10:15:03.531295 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc4a9549-d420-4188-83ab-110e9585ad99-config-volume\") pod \"dc4a9549-d420-4188-83ab-110e9585ad99\" (UID: \"dc4a9549-d420-4188-83ab-110e9585ad99\") " Feb 20 10:15:03 crc kubenswrapper[5094]: I0220 10:15:03.532039 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/dc4a9549-d420-4188-83ab-110e9585ad99-config-volume" (OuterVolumeSpecName: "config-volume") pod "dc4a9549-d420-4188-83ab-110e9585ad99" (UID: "dc4a9549-d420-4188-83ab-110e9585ad99"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 20 10:15:03 crc kubenswrapper[5094]: I0220 10:15:03.536999 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc4a9549-d420-4188-83ab-110e9585ad99-kube-api-access-lstjd" (OuterVolumeSpecName: "kube-api-access-lstjd") pod "dc4a9549-d420-4188-83ab-110e9585ad99" (UID: "dc4a9549-d420-4188-83ab-110e9585ad99"). InnerVolumeSpecName "kube-api-access-lstjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:15:03 crc kubenswrapper[5094]: I0220 10:15:03.537042 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc4a9549-d420-4188-83ab-110e9585ad99-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "dc4a9549-d420-4188-83ab-110e9585ad99" (UID: "dc4a9549-d420-4188-83ab-110e9585ad99"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 20 10:15:03 crc kubenswrapper[5094]: I0220 10:15:03.633591 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lstjd\" (UniqueName: \"kubernetes.io/projected/dc4a9549-d420-4188-83ab-110e9585ad99-kube-api-access-lstjd\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:03 crc kubenswrapper[5094]: I0220 10:15:03.633855 5094 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dc4a9549-d420-4188-83ab-110e9585ad99-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:03 crc kubenswrapper[5094]: I0220 10:15:03.633865 5094 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc4a9549-d420-4188-83ab-110e9585ad99-config-volume\") on node \"crc\" DevicePath \"\"" Feb 20 10:15:03 crc kubenswrapper[5094]: I0220 10:15:03.985775 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" event={"ID":"dc4a9549-d420-4188-83ab-110e9585ad99","Type":"ContainerDied","Data":"afc298955973815edba6a67f548b7f02f0a2e45b8ac971fc904bd9d29c57d05b"} Feb 20 10:15:03 crc kubenswrapper[5094]: I0220 10:15:03.985814 5094 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afc298955973815edba6a67f548b7f02f0a2e45b8ac971fc904bd9d29c57d05b" Feb 20 10:15:03 crc kubenswrapper[5094]: I0220 10:15:03.985859 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29526375-m4wtq" Feb 20 10:15:04 crc kubenswrapper[5094]: I0220 10:15:04.107002 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:15:04 crc kubenswrapper[5094]: I0220 10:15:04.107112 5094 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:15:04 crc kubenswrapper[5094]: I0220 10:15:04.525552 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7"] Feb 20 10:15:04 crc kubenswrapper[5094]: I0220 10:15:04.540733 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29526330-4dpg7"] Feb 20 10:15:05 crc kubenswrapper[5094]: I0220 10:15:05.863855 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06092367-1969-4b35-8025-09e5a52a5855" path="/var/lib/kubelet/pods/06092367-1969-4b35-8025-09e5a52a5855/volumes" Feb 20 10:15:34 crc kubenswrapper[5094]: I0220 10:15:34.107307 5094 patch_prober.go:28] interesting pod/machine-config-daemon-56ppq container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 20 10:15:34 crc kubenswrapper[5094]: I0220 10:15:34.108017 5094 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 20 10:15:34 crc kubenswrapper[5094]: I0220 10:15:34.108080 5094 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" Feb 20 10:15:34 crc kubenswrapper[5094]: I0220 10:15:34.109210 5094 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512"} pod="openshift-machine-config-operator/machine-config-daemon-56ppq" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 20 10:15:34 crc kubenswrapper[5094]: I0220 10:15:34.109315 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" containerName="machine-config-daemon" containerID="cri-o://c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" gracePeriod=600 Feb 20 10:15:34 crc kubenswrapper[5094]: E0220 10:15:34.239424 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:15:34 crc kubenswrapper[5094]: I0220 10:15:34.389556 5094 generic.go:334] "Generic (PLEG): container finished" podID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" 
containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" exitCode=0 Feb 20 10:15:34 crc kubenswrapper[5094]: I0220 10:15:34.389603 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerDied","Data":"c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512"} Feb 20 10:15:34 crc kubenswrapper[5094]: I0220 10:15:34.389636 5094 scope.go:117] "RemoveContainer" containerID="314cf401191c15263264dcbedb42c99f095a2284b06c68e30858221c2e104a07" Feb 20 10:15:34 crc kubenswrapper[5094]: I0220 10:15:34.390428 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:15:34 crc kubenswrapper[5094]: E0220 10:15:34.390864 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:15:46 crc kubenswrapper[5094]: I0220 10:15:46.516959 5094 scope.go:117] "RemoveContainer" containerID="4458d3e89efbd0e5ea42a99c4b47f135cba67a66cbdaaf49efb55576b8dd1322" Feb 20 10:15:46 crc kubenswrapper[5094]: I0220 10:15:46.840350 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:15:46 crc kubenswrapper[5094]: E0220 10:15:46.840825 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:16:01 crc kubenswrapper[5094]: I0220 10:16:01.840759 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:16:01 crc kubenswrapper[5094]: E0220 10:16:01.841875 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:16:05 crc kubenswrapper[5094]: I0220 10:16:05.918177 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7vj9d"] Feb 20 10:16:05 crc kubenswrapper[5094]: E0220 10:16:05.920414 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc4a9549-d420-4188-83ab-110e9585ad99" containerName="collect-profiles" Feb 20 10:16:05 crc kubenswrapper[5094]: I0220 10:16:05.920527 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc4a9549-d420-4188-83ab-110e9585ad99" containerName="collect-profiles" Feb 20 10:16:05 crc kubenswrapper[5094]: I0220 10:16:05.920897 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc4a9549-d420-4188-83ab-110e9585ad99" containerName="collect-profiles" Feb 20 10:16:05 crc kubenswrapper[5094]: I0220 10:16:05.922829 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7vj9d" Feb 20 10:16:05 crc kubenswrapper[5094]: I0220 10:16:05.934378 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7vj9d"] Feb 20 10:16:06 crc kubenswrapper[5094]: I0220 10:16:06.060249 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-utilities\") pod \"certified-operators-7vj9d\" (UID: \"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b\") " pod="openshift-marketplace/certified-operators-7vj9d" Feb 20 10:16:06 crc kubenswrapper[5094]: I0220 10:16:06.060311 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-catalog-content\") pod \"certified-operators-7vj9d\" (UID: \"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b\") " pod="openshift-marketplace/certified-operators-7vj9d" Feb 20 10:16:06 crc kubenswrapper[5094]: I0220 10:16:06.060377 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rwrh\" (UniqueName: \"kubernetes.io/projected/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-kube-api-access-9rwrh\") pod \"certified-operators-7vj9d\" (UID: \"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b\") " pod="openshift-marketplace/certified-operators-7vj9d" Feb 20 10:16:06 crc kubenswrapper[5094]: I0220 10:16:06.162591 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-utilities\") pod \"certified-operators-7vj9d\" (UID: \"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b\") " pod="openshift-marketplace/certified-operators-7vj9d" Feb 20 10:16:06 crc kubenswrapper[5094]: I0220 10:16:06.162964 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-catalog-content\") pod \"certified-operators-7vj9d\" (UID: \"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b\") " pod="openshift-marketplace/certified-operators-7vj9d" Feb 20 10:16:06 crc kubenswrapper[5094]: I0220 10:16:06.163143 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rwrh\" (UniqueName: \"kubernetes.io/projected/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-kube-api-access-9rwrh\") pod \"certified-operators-7vj9d\" (UID: \"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b\") " pod="openshift-marketplace/certified-operators-7vj9d" Feb 20 10:16:06 crc kubenswrapper[5094]: I0220 10:16:06.163273 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-catalog-content\") pod \"certified-operators-7vj9d\" (UID: \"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b\") " pod="openshift-marketplace/certified-operators-7vj9d" Feb 20 10:16:06 crc kubenswrapper[5094]: I0220 10:16:06.163396 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-utilities\") pod \"certified-operators-7vj9d\" (UID: \"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b\") " pod="openshift-marketplace/certified-operators-7vj9d" Feb 20 10:16:06 crc kubenswrapper[5094]: I0220 10:16:06.198221 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rwrh\" (UniqueName: \"kubernetes.io/projected/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-kube-api-access-9rwrh\") pod \"certified-operators-7vj9d\" (UID: \"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b\") " pod="openshift-marketplace/certified-operators-7vj9d" Feb 20 10:16:06 crc kubenswrapper[5094]: I0220 10:16:06.244067 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7vj9d" Feb 20 10:16:06 crc kubenswrapper[5094]: I0220 10:16:06.833137 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7vj9d"] Feb 20 10:16:07 crc kubenswrapper[5094]: I0220 10:16:07.837296 5094 generic.go:334] "Generic (PLEG): container finished" podID="e9c6363e-df27-4cdc-bdd1-26daff7ceb4b" containerID="0ddc4eebe69944fe8ec19dc873de312c8f3a9f21c3e745d0f5a98f2fd6c0756c" exitCode=0 Feb 20 10:16:07 crc kubenswrapper[5094]: I0220 10:16:07.837431 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vj9d" event={"ID":"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b","Type":"ContainerDied","Data":"0ddc4eebe69944fe8ec19dc873de312c8f3a9f21c3e745d0f5a98f2fd6c0756c"} Feb 20 10:16:07 crc kubenswrapper[5094]: I0220 10:16:07.838041 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vj9d" event={"ID":"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b","Type":"ContainerStarted","Data":"82aca659b94fe1e7677dac1deb6868541b3c2c79dd885663bc3a1b1e828c3b58"} Feb 20 10:16:07 crc kubenswrapper[5094]: I0220 10:16:07.860460 5094 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 20 10:16:08 crc kubenswrapper[5094]: I0220 10:16:08.852952 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vj9d" event={"ID":"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b","Type":"ContainerStarted","Data":"9ab342a262a061dd56324fe9dee2bd698ad36dc46c7a79f8999c6b2e08d81cf3"} Feb 20 10:16:09 crc kubenswrapper[5094]: I0220 10:16:09.865738 5094 generic.go:334] "Generic (PLEG): container finished" podID="e9c6363e-df27-4cdc-bdd1-26daff7ceb4b" containerID="9ab342a262a061dd56324fe9dee2bd698ad36dc46c7a79f8999c6b2e08d81cf3" exitCode=0 Feb 20 10:16:09 crc kubenswrapper[5094]: I0220 10:16:09.865912 5094 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-7vj9d" event={"ID":"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b","Type":"ContainerDied","Data":"9ab342a262a061dd56324fe9dee2bd698ad36dc46c7a79f8999c6b2e08d81cf3"} Feb 20 10:16:10 crc kubenswrapper[5094]: I0220 10:16:10.884346 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vj9d" event={"ID":"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b","Type":"ContainerStarted","Data":"15a9e66cb3931668f0956a87a42fa98bb27623223549898f4dd9f6b95305dd10"} Feb 20 10:16:10 crc kubenswrapper[5094]: I0220 10:16:10.917311 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7vj9d" podStartSLOduration=3.505244887 podStartE2EDuration="5.91728777s" podCreationTimestamp="2026-02-20 10:16:05 +0000 UTC" firstStartedPulling="2026-02-20 10:16:07.853911063 +0000 UTC m=+12582.726537774" lastFinishedPulling="2026-02-20 10:16:10.265953946 +0000 UTC m=+12585.138580657" observedRunningTime="2026-02-20 10:16:10.908487968 +0000 UTC m=+12585.781114699" watchObservedRunningTime="2026-02-20 10:16:10.91728777 +0000 UTC m=+12585.789914481" Feb 20 10:16:13 crc kubenswrapper[5094]: I0220 10:16:13.841361 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:16:13 crc kubenswrapper[5094]: E0220 10:16:13.842229 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:16:16 crc kubenswrapper[5094]: I0220 10:16:16.244742 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-7vj9d" Feb 20 10:16:16 crc kubenswrapper[5094]: I0220 10:16:16.245166 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7vj9d" Feb 20 10:16:16 crc kubenswrapper[5094]: I0220 10:16:16.328094 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7vj9d" Feb 20 10:16:17 crc kubenswrapper[5094]: I0220 10:16:17.027963 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7vj9d" Feb 20 10:16:17 crc kubenswrapper[5094]: I0220 10:16:17.119265 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7vj9d"] Feb 20 10:16:18 crc kubenswrapper[5094]: I0220 10:16:18.983108 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7vj9d" podUID="e9c6363e-df27-4cdc-bdd1-26daff7ceb4b" containerName="registry-server" containerID="cri-o://15a9e66cb3931668f0956a87a42fa98bb27623223549898f4dd9f6b95305dd10" gracePeriod=2 Feb 20 10:16:19 crc kubenswrapper[5094]: I0220 10:16:19.574634 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7vj9d" Feb 20 10:16:19 crc kubenswrapper[5094]: I0220 10:16:19.578224 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-utilities\") pod \"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b\" (UID: \"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b\") " Feb 20 10:16:19 crc kubenswrapper[5094]: I0220 10:16:19.579246 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-utilities" (OuterVolumeSpecName: "utilities") pod "e9c6363e-df27-4cdc-bdd1-26daff7ceb4b" (UID: "e9c6363e-df27-4cdc-bdd1-26daff7ceb4b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:16:19 crc kubenswrapper[5094]: I0220 10:16:19.580880 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rwrh\" (UniqueName: \"kubernetes.io/projected/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-kube-api-access-9rwrh\") pod \"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b\" (UID: \"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b\") " Feb 20 10:16:19 crc kubenswrapper[5094]: I0220 10:16:19.581052 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-catalog-content\") pod \"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b\" (UID: \"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b\") " Feb 20 10:16:19 crc kubenswrapper[5094]: I0220 10:16:19.581944 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:19 crc kubenswrapper[5094]: I0220 10:16:19.590691 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-kube-api-access-9rwrh" (OuterVolumeSpecName: "kube-api-access-9rwrh") pod "e9c6363e-df27-4cdc-bdd1-26daff7ceb4b" (UID: "e9c6363e-df27-4cdc-bdd1-26daff7ceb4b"). InnerVolumeSpecName "kube-api-access-9rwrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:16:19 crc kubenswrapper[5094]: I0220 10:16:19.646382 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e9c6363e-df27-4cdc-bdd1-26daff7ceb4b" (UID: "e9c6363e-df27-4cdc-bdd1-26daff7ceb4b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:16:19 crc kubenswrapper[5094]: I0220 10:16:19.684424 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rwrh\" (UniqueName: \"kubernetes.io/projected/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-kube-api-access-9rwrh\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:19 crc kubenswrapper[5094]: I0220 10:16:19.684453 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:16:19 crc kubenswrapper[5094]: I0220 10:16:19.995473 5094 generic.go:334] "Generic (PLEG): container finished" podID="e9c6363e-df27-4cdc-bdd1-26daff7ceb4b" containerID="15a9e66cb3931668f0956a87a42fa98bb27623223549898f4dd9f6b95305dd10" exitCode=0 Feb 20 10:16:19 crc kubenswrapper[5094]: I0220 10:16:19.995512 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vj9d" event={"ID":"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b","Type":"ContainerDied","Data":"15a9e66cb3931668f0956a87a42fa98bb27623223549898f4dd9f6b95305dd10"} Feb 20 10:16:19 crc kubenswrapper[5094]: I0220 10:16:19.995554 5094 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7vj9d" Feb 20 10:16:19 crc kubenswrapper[5094]: I0220 10:16:19.995579 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7vj9d" event={"ID":"e9c6363e-df27-4cdc-bdd1-26daff7ceb4b","Type":"ContainerDied","Data":"82aca659b94fe1e7677dac1deb6868541b3c2c79dd885663bc3a1b1e828c3b58"} Feb 20 10:16:19 crc kubenswrapper[5094]: I0220 10:16:19.995614 5094 scope.go:117] "RemoveContainer" containerID="15a9e66cb3931668f0956a87a42fa98bb27623223549898f4dd9f6b95305dd10" Feb 20 10:16:20 crc kubenswrapper[5094]: I0220 10:16:20.028370 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7vj9d"] Feb 20 10:16:20 crc kubenswrapper[5094]: I0220 10:16:20.037874 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7vj9d"] Feb 20 10:16:20 crc kubenswrapper[5094]: I0220 10:16:20.038442 5094 scope.go:117] "RemoveContainer" containerID="9ab342a262a061dd56324fe9dee2bd698ad36dc46c7a79f8999c6b2e08d81cf3" Feb 20 10:16:20 crc kubenswrapper[5094]: I0220 10:16:20.075748 5094 scope.go:117] "RemoveContainer" containerID="0ddc4eebe69944fe8ec19dc873de312c8f3a9f21c3e745d0f5a98f2fd6c0756c" Feb 20 10:16:20 crc kubenswrapper[5094]: I0220 10:16:20.102205 5094 scope.go:117] "RemoveContainer" containerID="15a9e66cb3931668f0956a87a42fa98bb27623223549898f4dd9f6b95305dd10" Feb 20 10:16:20 crc kubenswrapper[5094]: E0220 10:16:20.102883 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15a9e66cb3931668f0956a87a42fa98bb27623223549898f4dd9f6b95305dd10\": container with ID starting with 15a9e66cb3931668f0956a87a42fa98bb27623223549898f4dd9f6b95305dd10 not found: ID does not exist" containerID="15a9e66cb3931668f0956a87a42fa98bb27623223549898f4dd9f6b95305dd10" Feb 20 10:16:20 crc kubenswrapper[5094]: I0220 
10:16:20.102926 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15a9e66cb3931668f0956a87a42fa98bb27623223549898f4dd9f6b95305dd10"} err="failed to get container status \"15a9e66cb3931668f0956a87a42fa98bb27623223549898f4dd9f6b95305dd10\": rpc error: code = NotFound desc = could not find container \"15a9e66cb3931668f0956a87a42fa98bb27623223549898f4dd9f6b95305dd10\": container with ID starting with 15a9e66cb3931668f0956a87a42fa98bb27623223549898f4dd9f6b95305dd10 not found: ID does not exist" Feb 20 10:16:20 crc kubenswrapper[5094]: I0220 10:16:20.102950 5094 scope.go:117] "RemoveContainer" containerID="9ab342a262a061dd56324fe9dee2bd698ad36dc46c7a79f8999c6b2e08d81cf3" Feb 20 10:16:20 crc kubenswrapper[5094]: E0220 10:16:20.103288 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ab342a262a061dd56324fe9dee2bd698ad36dc46c7a79f8999c6b2e08d81cf3\": container with ID starting with 9ab342a262a061dd56324fe9dee2bd698ad36dc46c7a79f8999c6b2e08d81cf3 not found: ID does not exist" containerID="9ab342a262a061dd56324fe9dee2bd698ad36dc46c7a79f8999c6b2e08d81cf3" Feb 20 10:16:20 crc kubenswrapper[5094]: I0220 10:16:20.103316 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ab342a262a061dd56324fe9dee2bd698ad36dc46c7a79f8999c6b2e08d81cf3"} err="failed to get container status \"9ab342a262a061dd56324fe9dee2bd698ad36dc46c7a79f8999c6b2e08d81cf3\": rpc error: code = NotFound desc = could not find container \"9ab342a262a061dd56324fe9dee2bd698ad36dc46c7a79f8999c6b2e08d81cf3\": container with ID starting with 9ab342a262a061dd56324fe9dee2bd698ad36dc46c7a79f8999c6b2e08d81cf3 not found: ID does not exist" Feb 20 10:16:20 crc kubenswrapper[5094]: I0220 10:16:20.103332 5094 scope.go:117] "RemoveContainer" containerID="0ddc4eebe69944fe8ec19dc873de312c8f3a9f21c3e745d0f5a98f2fd6c0756c" Feb 20 10:16:20 crc 
kubenswrapper[5094]: E0220 10:16:20.103534 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ddc4eebe69944fe8ec19dc873de312c8f3a9f21c3e745d0f5a98f2fd6c0756c\": container with ID starting with 0ddc4eebe69944fe8ec19dc873de312c8f3a9f21c3e745d0f5a98f2fd6c0756c not found: ID does not exist" containerID="0ddc4eebe69944fe8ec19dc873de312c8f3a9f21c3e745d0f5a98f2fd6c0756c" Feb 20 10:16:20 crc kubenswrapper[5094]: I0220 10:16:20.103563 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ddc4eebe69944fe8ec19dc873de312c8f3a9f21c3e745d0f5a98f2fd6c0756c"} err="failed to get container status \"0ddc4eebe69944fe8ec19dc873de312c8f3a9f21c3e745d0f5a98f2fd6c0756c\": rpc error: code = NotFound desc = could not find container \"0ddc4eebe69944fe8ec19dc873de312c8f3a9f21c3e745d0f5a98f2fd6c0756c\": container with ID starting with 0ddc4eebe69944fe8ec19dc873de312c8f3a9f21c3e745d0f5a98f2fd6c0756c not found: ID does not exist" Feb 20 10:16:21 crc kubenswrapper[5094]: I0220 10:16:21.870520 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9c6363e-df27-4cdc-bdd1-26daff7ceb4b" path="/var/lib/kubelet/pods/e9c6363e-df27-4cdc-bdd1-26daff7ceb4b/volumes" Feb 20 10:16:26 crc kubenswrapper[5094]: I0220 10:16:26.840476 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:16:26 crc kubenswrapper[5094]: E0220 10:16:26.841540 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:16:38 crc 
kubenswrapper[5094]: I0220 10:16:38.841303 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:16:38 crc kubenswrapper[5094]: E0220 10:16:38.842547 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:16:51 crc kubenswrapper[5094]: I0220 10:16:51.841377 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:16:51 crc kubenswrapper[5094]: E0220 10:16:51.842273 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:17:02 crc kubenswrapper[5094]: I0220 10:17:02.840449 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:17:02 crc kubenswrapper[5094]: E0220 10:17:02.841285 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 
20 10:17:13 crc kubenswrapper[5094]: I0220 10:17:13.841337 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:17:13 crc kubenswrapper[5094]: E0220 10:17:13.842358 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:17:27 crc kubenswrapper[5094]: I0220 10:17:27.846951 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:17:27 crc kubenswrapper[5094]: E0220 10:17:27.849743 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:17:32 crc kubenswrapper[5094]: I0220 10:17:32.885075 5094 generic.go:334] "Generic (PLEG): container finished" podID="2578360e-4830-4223-b07f-031c6c2df11e" containerID="bf28c01ded4fd08d2344ae69fe0783b98b33fb64650c719c32f78d2a847e5899" exitCode=0 Feb 20 10:17:32 crc kubenswrapper[5094]: I0220 10:17:32.885136 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pc5cx/must-gather-n6jcq" event={"ID":"2578360e-4830-4223-b07f-031c6c2df11e","Type":"ContainerDied","Data":"bf28c01ded4fd08d2344ae69fe0783b98b33fb64650c719c32f78d2a847e5899"} Feb 20 10:17:32 crc kubenswrapper[5094]: I0220 10:17:32.886252 5094 scope.go:117] 
"RemoveContainer" containerID="bf28c01ded4fd08d2344ae69fe0783b98b33fb64650c719c32f78d2a847e5899" Feb 20 10:17:33 crc kubenswrapper[5094]: I0220 10:17:33.422272 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pc5cx_must-gather-n6jcq_2578360e-4830-4223-b07f-031c6c2df11e/gather/0.log" Feb 20 10:17:39 crc kubenswrapper[5094]: I0220 10:17:39.847656 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:17:39 crc kubenswrapper[5094]: E0220 10:17:39.849047 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:17:44 crc kubenswrapper[5094]: I0220 10:17:44.943175 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-pc5cx/must-gather-n6jcq"] Feb 20 10:17:44 crc kubenswrapper[5094]: I0220 10:17:44.944376 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-pc5cx/must-gather-n6jcq" podUID="2578360e-4830-4223-b07f-031c6c2df11e" containerName="copy" containerID="cri-o://ef7151eae2ca0ca37c08fec6a465f8f62ef97648b0bec7c8a5a1cfa8e04d3147" gracePeriod=2 Feb 20 10:17:44 crc kubenswrapper[5094]: I0220 10:17:44.957568 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-pc5cx/must-gather-n6jcq"] Feb 20 10:17:45 crc kubenswrapper[5094]: I0220 10:17:45.390095 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pc5cx_must-gather-n6jcq_2578360e-4830-4223-b07f-031c6c2df11e/copy/0.log" Feb 20 10:17:45 crc kubenswrapper[5094]: I0220 10:17:45.390875 5094 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pc5cx/must-gather-n6jcq" Feb 20 10:17:45 crc kubenswrapper[5094]: I0220 10:17:45.468780 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2578360e-4830-4223-b07f-031c6c2df11e-must-gather-output\") pod \"2578360e-4830-4223-b07f-031c6c2df11e\" (UID: \"2578360e-4830-4223-b07f-031c6c2df11e\") " Feb 20 10:17:45 crc kubenswrapper[5094]: I0220 10:17:45.469556 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7jgv\" (UniqueName: \"kubernetes.io/projected/2578360e-4830-4223-b07f-031c6c2df11e-kube-api-access-c7jgv\") pod \"2578360e-4830-4223-b07f-031c6c2df11e\" (UID: \"2578360e-4830-4223-b07f-031c6c2df11e\") " Feb 20 10:17:45 crc kubenswrapper[5094]: I0220 10:17:45.477348 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2578360e-4830-4223-b07f-031c6c2df11e-kube-api-access-c7jgv" (OuterVolumeSpecName: "kube-api-access-c7jgv") pod "2578360e-4830-4223-b07f-031c6c2df11e" (UID: "2578360e-4830-4223-b07f-031c6c2df11e"). InnerVolumeSpecName "kube-api-access-c7jgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:17:45 crc kubenswrapper[5094]: I0220 10:17:45.574917 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7jgv\" (UniqueName: \"kubernetes.io/projected/2578360e-4830-4223-b07f-031c6c2df11e-kube-api-access-c7jgv\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:45 crc kubenswrapper[5094]: I0220 10:17:45.815794 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2578360e-4830-4223-b07f-031c6c2df11e-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "2578360e-4830-4223-b07f-031c6c2df11e" (UID: "2578360e-4830-4223-b07f-031c6c2df11e"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:17:45 crc kubenswrapper[5094]: I0220 10:17:45.850360 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2578360e-4830-4223-b07f-031c6c2df11e" path="/var/lib/kubelet/pods/2578360e-4830-4223-b07f-031c6c2df11e/volumes" Feb 20 10:17:45 crc kubenswrapper[5094]: I0220 10:17:45.881021 5094 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2578360e-4830-4223-b07f-031c6c2df11e-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 20 10:17:46 crc kubenswrapper[5094]: I0220 10:17:46.040473 5094 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-pc5cx_must-gather-n6jcq_2578360e-4830-4223-b07f-031c6c2df11e/copy/0.log" Feb 20 10:17:46 crc kubenswrapper[5094]: I0220 10:17:46.040821 5094 generic.go:334] "Generic (PLEG): container finished" podID="2578360e-4830-4223-b07f-031c6c2df11e" containerID="ef7151eae2ca0ca37c08fec6a465f8f62ef97648b0bec7c8a5a1cfa8e04d3147" exitCode=143 Feb 20 10:17:46 crc kubenswrapper[5094]: I0220 10:17:46.040864 5094 scope.go:117] "RemoveContainer" containerID="ef7151eae2ca0ca37c08fec6a465f8f62ef97648b0bec7c8a5a1cfa8e04d3147" Feb 20 10:17:46 crc kubenswrapper[5094]: I0220 10:17:46.040992 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-pc5cx/must-gather-n6jcq" Feb 20 10:17:46 crc kubenswrapper[5094]: I0220 10:17:46.063997 5094 scope.go:117] "RemoveContainer" containerID="bf28c01ded4fd08d2344ae69fe0783b98b33fb64650c719c32f78d2a847e5899" Feb 20 10:17:46 crc kubenswrapper[5094]: I0220 10:17:46.145565 5094 scope.go:117] "RemoveContainer" containerID="ef7151eae2ca0ca37c08fec6a465f8f62ef97648b0bec7c8a5a1cfa8e04d3147" Feb 20 10:17:46 crc kubenswrapper[5094]: E0220 10:17:46.146003 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef7151eae2ca0ca37c08fec6a465f8f62ef97648b0bec7c8a5a1cfa8e04d3147\": container with ID starting with ef7151eae2ca0ca37c08fec6a465f8f62ef97648b0bec7c8a5a1cfa8e04d3147 not found: ID does not exist" containerID="ef7151eae2ca0ca37c08fec6a465f8f62ef97648b0bec7c8a5a1cfa8e04d3147" Feb 20 10:17:46 crc kubenswrapper[5094]: I0220 10:17:46.146239 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef7151eae2ca0ca37c08fec6a465f8f62ef97648b0bec7c8a5a1cfa8e04d3147"} err="failed to get container status \"ef7151eae2ca0ca37c08fec6a465f8f62ef97648b0bec7c8a5a1cfa8e04d3147\": rpc error: code = NotFound desc = could not find container \"ef7151eae2ca0ca37c08fec6a465f8f62ef97648b0bec7c8a5a1cfa8e04d3147\": container with ID starting with ef7151eae2ca0ca37c08fec6a465f8f62ef97648b0bec7c8a5a1cfa8e04d3147 not found: ID does not exist" Feb 20 10:17:46 crc kubenswrapper[5094]: I0220 10:17:46.146443 5094 scope.go:117] "RemoveContainer" containerID="bf28c01ded4fd08d2344ae69fe0783b98b33fb64650c719c32f78d2a847e5899" Feb 20 10:17:46 crc kubenswrapper[5094]: E0220 10:17:46.146983 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf28c01ded4fd08d2344ae69fe0783b98b33fb64650c719c32f78d2a847e5899\": container with ID starting with 
bf28c01ded4fd08d2344ae69fe0783b98b33fb64650c719c32f78d2a847e5899 not found: ID does not exist" containerID="bf28c01ded4fd08d2344ae69fe0783b98b33fb64650c719c32f78d2a847e5899" Feb 20 10:17:46 crc kubenswrapper[5094]: I0220 10:17:46.147038 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf28c01ded4fd08d2344ae69fe0783b98b33fb64650c719c32f78d2a847e5899"} err="failed to get container status \"bf28c01ded4fd08d2344ae69fe0783b98b33fb64650c719c32f78d2a847e5899\": rpc error: code = NotFound desc = could not find container \"bf28c01ded4fd08d2344ae69fe0783b98b33fb64650c719c32f78d2a847e5899\": container with ID starting with bf28c01ded4fd08d2344ae69fe0783b98b33fb64650c719c32f78d2a847e5899 not found: ID does not exist" Feb 20 10:17:52 crc kubenswrapper[5094]: I0220 10:17:52.840539 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:17:52 crc kubenswrapper[5094]: E0220 10:17:52.842050 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.564326 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bh6s7"] Feb 20 10:17:56 crc kubenswrapper[5094]: E0220 10:17:56.565697 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2578360e-4830-4223-b07f-031c6c2df11e" containerName="copy" Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.565769 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="2578360e-4830-4223-b07f-031c6c2df11e" containerName="copy" Feb 20 10:17:56 
crc kubenswrapper[5094]: E0220 10:17:56.565799 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9c6363e-df27-4cdc-bdd1-26daff7ceb4b" containerName="registry-server" Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.565907 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c6363e-df27-4cdc-bdd1-26daff7ceb4b" containerName="registry-server" Feb 20 10:17:56 crc kubenswrapper[5094]: E0220 10:17:56.565929 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9c6363e-df27-4cdc-bdd1-26daff7ceb4b" containerName="extract-utilities" Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.565941 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c6363e-df27-4cdc-bdd1-26daff7ceb4b" containerName="extract-utilities" Feb 20 10:17:56 crc kubenswrapper[5094]: E0220 10:17:56.565957 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2578360e-4830-4223-b07f-031c6c2df11e" containerName="gather" Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.565965 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="2578360e-4830-4223-b07f-031c6c2df11e" containerName="gather" Feb 20 10:17:56 crc kubenswrapper[5094]: E0220 10:17:56.565988 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9c6363e-df27-4cdc-bdd1-26daff7ceb4b" containerName="extract-content" Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.565996 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9c6363e-df27-4cdc-bdd1-26daff7ceb4b" containerName="extract-content" Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.566292 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="2578360e-4830-4223-b07f-031c6c2df11e" containerName="gather" Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.566312 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9c6363e-df27-4cdc-bdd1-26daff7ceb4b" containerName="registry-server" Feb 20 10:17:56 crc kubenswrapper[5094]: 
I0220 10:17:56.566325 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="2578360e-4830-4223-b07f-031c6c2df11e" containerName="copy" Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.568496 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bh6s7" Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.580514 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bh6s7"] Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.768063 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75023528-13e3-4ab0-927d-6edfa21f1627-utilities\") pod \"community-operators-bh6s7\" (UID: \"75023528-13e3-4ab0-927d-6edfa21f1627\") " pod="openshift-marketplace/community-operators-bh6s7" Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.768171 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzxhc\" (UniqueName: \"kubernetes.io/projected/75023528-13e3-4ab0-927d-6edfa21f1627-kube-api-access-gzxhc\") pod \"community-operators-bh6s7\" (UID: \"75023528-13e3-4ab0-927d-6edfa21f1627\") " pod="openshift-marketplace/community-operators-bh6s7" Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.768224 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75023528-13e3-4ab0-927d-6edfa21f1627-catalog-content\") pod \"community-operators-bh6s7\" (UID: \"75023528-13e3-4ab0-927d-6edfa21f1627\") " pod="openshift-marketplace/community-operators-bh6s7" Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.870315 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzxhc\" (UniqueName: 
\"kubernetes.io/projected/75023528-13e3-4ab0-927d-6edfa21f1627-kube-api-access-gzxhc\") pod \"community-operators-bh6s7\" (UID: \"75023528-13e3-4ab0-927d-6edfa21f1627\") " pod="openshift-marketplace/community-operators-bh6s7" Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.870406 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75023528-13e3-4ab0-927d-6edfa21f1627-catalog-content\") pod \"community-operators-bh6s7\" (UID: \"75023528-13e3-4ab0-927d-6edfa21f1627\") " pod="openshift-marketplace/community-operators-bh6s7" Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.870516 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75023528-13e3-4ab0-927d-6edfa21f1627-utilities\") pod \"community-operators-bh6s7\" (UID: \"75023528-13e3-4ab0-927d-6edfa21f1627\") " pod="openshift-marketplace/community-operators-bh6s7" Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.870939 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75023528-13e3-4ab0-927d-6edfa21f1627-catalog-content\") pod \"community-operators-bh6s7\" (UID: \"75023528-13e3-4ab0-927d-6edfa21f1627\") " pod="openshift-marketplace/community-operators-bh6s7" Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.870982 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75023528-13e3-4ab0-927d-6edfa21f1627-utilities\") pod \"community-operators-bh6s7\" (UID: \"75023528-13e3-4ab0-927d-6edfa21f1627\") " pod="openshift-marketplace/community-operators-bh6s7" Feb 20 10:17:56 crc kubenswrapper[5094]: I0220 10:17:56.891591 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzxhc\" (UniqueName: 
\"kubernetes.io/projected/75023528-13e3-4ab0-927d-6edfa21f1627-kube-api-access-gzxhc\") pod \"community-operators-bh6s7\" (UID: \"75023528-13e3-4ab0-927d-6edfa21f1627\") " pod="openshift-marketplace/community-operators-bh6s7" Feb 20 10:17:57 crc kubenswrapper[5094]: I0220 10:17:57.187057 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bh6s7" Feb 20 10:17:57 crc kubenswrapper[5094]: I0220 10:17:57.633749 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bh6s7"] Feb 20 10:17:58 crc kubenswrapper[5094]: I0220 10:17:58.174348 5094 generic.go:334] "Generic (PLEG): container finished" podID="75023528-13e3-4ab0-927d-6edfa21f1627" containerID="c5c24f4523a103b0e90866a15beab09054af9f680716c5ab49ddc347baafab03" exitCode=0 Feb 20 10:17:58 crc kubenswrapper[5094]: I0220 10:17:58.174417 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bh6s7" event={"ID":"75023528-13e3-4ab0-927d-6edfa21f1627","Type":"ContainerDied","Data":"c5c24f4523a103b0e90866a15beab09054af9f680716c5ab49ddc347baafab03"} Feb 20 10:17:58 crc kubenswrapper[5094]: I0220 10:17:58.174811 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bh6s7" event={"ID":"75023528-13e3-4ab0-927d-6edfa21f1627","Type":"ContainerStarted","Data":"3f1c87259bd06d4e00943cc87d16195aae2ed1cfd874fe2cad4388dd9b173469"} Feb 20 10:17:59 crc kubenswrapper[5094]: I0220 10:17:59.187992 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bh6s7" event={"ID":"75023528-13e3-4ab0-927d-6edfa21f1627","Type":"ContainerStarted","Data":"836a50f935878b83d764a7d3383ae88515e36fc1e7f27766ef2da4df4873e1fc"} Feb 20 10:18:01 crc kubenswrapper[5094]: I0220 10:18:01.209839 5094 generic.go:334] "Generic (PLEG): container finished" podID="75023528-13e3-4ab0-927d-6edfa21f1627" 
containerID="836a50f935878b83d764a7d3383ae88515e36fc1e7f27766ef2da4df4873e1fc" exitCode=0 Feb 20 10:18:01 crc kubenswrapper[5094]: I0220 10:18:01.209939 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bh6s7" event={"ID":"75023528-13e3-4ab0-927d-6edfa21f1627","Type":"ContainerDied","Data":"836a50f935878b83d764a7d3383ae88515e36fc1e7f27766ef2da4df4873e1fc"} Feb 20 10:18:02 crc kubenswrapper[5094]: I0220 10:18:02.220450 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bh6s7" event={"ID":"75023528-13e3-4ab0-927d-6edfa21f1627","Type":"ContainerStarted","Data":"a302024900b4d1fdf70f6801e5bd17191f9277e41d06cf81d7245ca1f75ede67"} Feb 20 10:18:02 crc kubenswrapper[5094]: I0220 10:18:02.251845 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bh6s7" podStartSLOduration=2.819304675 podStartE2EDuration="6.251819088s" podCreationTimestamp="2026-02-20 10:17:56 +0000 UTC" firstStartedPulling="2026-02-20 10:17:58.176925777 +0000 UTC m=+12693.049552498" lastFinishedPulling="2026-02-20 10:18:01.60944021 +0000 UTC m=+12696.482066911" observedRunningTime="2026-02-20 10:18:02.24115547 +0000 UTC m=+12697.113782191" watchObservedRunningTime="2026-02-20 10:18:02.251819088 +0000 UTC m=+12697.124445829" Feb 20 10:18:03 crc kubenswrapper[5094]: I0220 10:18:03.842483 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:18:03 crc kubenswrapper[5094]: E0220 10:18:03.843084 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:18:07 crc kubenswrapper[5094]: I0220 10:18:07.187946 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bh6s7" Feb 20 10:18:07 crc kubenswrapper[5094]: I0220 10:18:07.188773 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bh6s7" Feb 20 10:18:07 crc kubenswrapper[5094]: I0220 10:18:07.255146 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bh6s7" Feb 20 10:18:07 crc kubenswrapper[5094]: I0220 10:18:07.357756 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bh6s7" Feb 20 10:18:07 crc kubenswrapper[5094]: I0220 10:18:07.506016 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bh6s7"] Feb 20 10:18:09 crc kubenswrapper[5094]: I0220 10:18:09.331448 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bh6s7" podUID="75023528-13e3-4ab0-927d-6edfa21f1627" containerName="registry-server" containerID="cri-o://a302024900b4d1fdf70f6801e5bd17191f9277e41d06cf81d7245ca1f75ede67" gracePeriod=2 Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:09.901186 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bh6s7" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.352908 5094 generic.go:334] "Generic (PLEG): container finished" podID="75023528-13e3-4ab0-927d-6edfa21f1627" containerID="a302024900b4d1fdf70f6801e5bd17191f9277e41d06cf81d7245ca1f75ede67" exitCode=0 Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.352958 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bh6s7" event={"ID":"75023528-13e3-4ab0-927d-6edfa21f1627","Type":"ContainerDied","Data":"a302024900b4d1fdf70f6801e5bd17191f9277e41d06cf81d7245ca1f75ede67"} Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.352990 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bh6s7" event={"ID":"75023528-13e3-4ab0-927d-6edfa21f1627","Type":"ContainerDied","Data":"3f1c87259bd06d4e00943cc87d16195aae2ed1cfd874fe2cad4388dd9b173469"} Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.353014 5094 scope.go:117] "RemoveContainer" containerID="a302024900b4d1fdf70f6801e5bd17191f9277e41d06cf81d7245ca1f75ede67" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.353204 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bh6s7" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.367161 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75023528-13e3-4ab0-927d-6edfa21f1627-utilities\") pod \"75023528-13e3-4ab0-927d-6edfa21f1627\" (UID: \"75023528-13e3-4ab0-927d-6edfa21f1627\") " Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.367206 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzxhc\" (UniqueName: \"kubernetes.io/projected/75023528-13e3-4ab0-927d-6edfa21f1627-kube-api-access-gzxhc\") pod \"75023528-13e3-4ab0-927d-6edfa21f1627\" (UID: \"75023528-13e3-4ab0-927d-6edfa21f1627\") " Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.367348 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75023528-13e3-4ab0-927d-6edfa21f1627-catalog-content\") pod \"75023528-13e3-4ab0-927d-6edfa21f1627\" (UID: \"75023528-13e3-4ab0-927d-6edfa21f1627\") " Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.368995 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75023528-13e3-4ab0-927d-6edfa21f1627-utilities" (OuterVolumeSpecName: "utilities") pod "75023528-13e3-4ab0-927d-6edfa21f1627" (UID: "75023528-13e3-4ab0-927d-6edfa21f1627"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.375902 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75023528-13e3-4ab0-927d-6edfa21f1627-kube-api-access-gzxhc" (OuterVolumeSpecName: "kube-api-access-gzxhc") pod "75023528-13e3-4ab0-927d-6edfa21f1627" (UID: "75023528-13e3-4ab0-927d-6edfa21f1627"). InnerVolumeSpecName "kube-api-access-gzxhc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.385857 5094 scope.go:117] "RemoveContainer" containerID="836a50f935878b83d764a7d3383ae88515e36fc1e7f27766ef2da4df4873e1fc" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.431229 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75023528-13e3-4ab0-927d-6edfa21f1627-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75023528-13e3-4ab0-927d-6edfa21f1627" (UID: "75023528-13e3-4ab0-927d-6edfa21f1627"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.434522 5094 scope.go:117] "RemoveContainer" containerID="c5c24f4523a103b0e90866a15beab09054af9f680716c5ab49ddc347baafab03" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.470385 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75023528-13e3-4ab0-927d-6edfa21f1627-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.470429 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzxhc\" (UniqueName: \"kubernetes.io/projected/75023528-13e3-4ab0-927d-6edfa21f1627-kube-api-access-gzxhc\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.470447 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75023528-13e3-4ab0-927d-6edfa21f1627-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.477868 5094 scope.go:117] "RemoveContainer" containerID="a302024900b4d1fdf70f6801e5bd17191f9277e41d06cf81d7245ca1f75ede67" Feb 20 10:18:10 crc kubenswrapper[5094]: E0220 10:18:10.478917 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"a302024900b4d1fdf70f6801e5bd17191f9277e41d06cf81d7245ca1f75ede67\": container with ID starting with a302024900b4d1fdf70f6801e5bd17191f9277e41d06cf81d7245ca1f75ede67 not found: ID does not exist" containerID="a302024900b4d1fdf70f6801e5bd17191f9277e41d06cf81d7245ca1f75ede67" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.478961 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a302024900b4d1fdf70f6801e5bd17191f9277e41d06cf81d7245ca1f75ede67"} err="failed to get container status \"a302024900b4d1fdf70f6801e5bd17191f9277e41d06cf81d7245ca1f75ede67\": rpc error: code = NotFound desc = could not find container \"a302024900b4d1fdf70f6801e5bd17191f9277e41d06cf81d7245ca1f75ede67\": container with ID starting with a302024900b4d1fdf70f6801e5bd17191f9277e41d06cf81d7245ca1f75ede67 not found: ID does not exist" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.478989 5094 scope.go:117] "RemoveContainer" containerID="836a50f935878b83d764a7d3383ae88515e36fc1e7f27766ef2da4df4873e1fc" Feb 20 10:18:10 crc kubenswrapper[5094]: E0220 10:18:10.479482 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"836a50f935878b83d764a7d3383ae88515e36fc1e7f27766ef2da4df4873e1fc\": container with ID starting with 836a50f935878b83d764a7d3383ae88515e36fc1e7f27766ef2da4df4873e1fc not found: ID does not exist" containerID="836a50f935878b83d764a7d3383ae88515e36fc1e7f27766ef2da4df4873e1fc" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.479507 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"836a50f935878b83d764a7d3383ae88515e36fc1e7f27766ef2da4df4873e1fc"} err="failed to get container status \"836a50f935878b83d764a7d3383ae88515e36fc1e7f27766ef2da4df4873e1fc\": rpc error: code = NotFound desc = could not find container 
\"836a50f935878b83d764a7d3383ae88515e36fc1e7f27766ef2da4df4873e1fc\": container with ID starting with 836a50f935878b83d764a7d3383ae88515e36fc1e7f27766ef2da4df4873e1fc not found: ID does not exist" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.479523 5094 scope.go:117] "RemoveContainer" containerID="c5c24f4523a103b0e90866a15beab09054af9f680716c5ab49ddc347baafab03" Feb 20 10:18:10 crc kubenswrapper[5094]: E0220 10:18:10.480029 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5c24f4523a103b0e90866a15beab09054af9f680716c5ab49ddc347baafab03\": container with ID starting with c5c24f4523a103b0e90866a15beab09054af9f680716c5ab49ddc347baafab03 not found: ID does not exist" containerID="c5c24f4523a103b0e90866a15beab09054af9f680716c5ab49ddc347baafab03" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.480054 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5c24f4523a103b0e90866a15beab09054af9f680716c5ab49ddc347baafab03"} err="failed to get container status \"c5c24f4523a103b0e90866a15beab09054af9f680716c5ab49ddc347baafab03\": rpc error: code = NotFound desc = could not find container \"c5c24f4523a103b0e90866a15beab09054af9f680716c5ab49ddc347baafab03\": container with ID starting with c5c24f4523a103b0e90866a15beab09054af9f680716c5ab49ddc347baafab03 not found: ID does not exist" Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.714486 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bh6s7"] Feb 20 10:18:10 crc kubenswrapper[5094]: I0220 10:18:10.731617 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bh6s7"] Feb 20 10:18:11 crc kubenswrapper[5094]: I0220 10:18:11.856574 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75023528-13e3-4ab0-927d-6edfa21f1627" 
path="/var/lib/kubelet/pods/75023528-13e3-4ab0-927d-6edfa21f1627/volumes" Feb 20 10:18:18 crc kubenswrapper[5094]: I0220 10:18:18.840218 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:18:18 crc kubenswrapper[5094]: E0220 10:18:18.840788 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.226460 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-559qt"] Feb 20 10:18:30 crc kubenswrapper[5094]: E0220 10:18:30.227855 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75023528-13e3-4ab0-927d-6edfa21f1627" containerName="extract-utilities" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.227878 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="75023528-13e3-4ab0-927d-6edfa21f1627" containerName="extract-utilities" Feb 20 10:18:30 crc kubenswrapper[5094]: E0220 10:18:30.227903 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75023528-13e3-4ab0-927d-6edfa21f1627" containerName="extract-content" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.227916 5094 state_mem.go:107] "Deleted CPUSet assignment" podUID="75023528-13e3-4ab0-927d-6edfa21f1627" containerName="extract-content" Feb 20 10:18:30 crc kubenswrapper[5094]: E0220 10:18:30.227945 5094 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75023528-13e3-4ab0-927d-6edfa21f1627" containerName="registry-server" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.227959 5094 
state_mem.go:107] "Deleted CPUSet assignment" podUID="75023528-13e3-4ab0-927d-6edfa21f1627" containerName="registry-server" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.228324 5094 memory_manager.go:354] "RemoveStaleState removing state" podUID="75023528-13e3-4ab0-927d-6edfa21f1627" containerName="registry-server" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.230817 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.270344 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-559qt"] Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.351127 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dvld\" (UniqueName: \"kubernetes.io/projected/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-kube-api-access-6dvld\") pod \"redhat-operators-559qt\" (UID: \"c2fb1f9f-aa57-4dff-9a98-23600dabc73c\") " pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.351283 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-utilities\") pod \"redhat-operators-559qt\" (UID: \"c2fb1f9f-aa57-4dff-9a98-23600dabc73c\") " pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.351369 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-catalog-content\") pod \"redhat-operators-559qt\" (UID: \"c2fb1f9f-aa57-4dff-9a98-23600dabc73c\") " pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.453001 5094 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-utilities\") pod \"redhat-operators-559qt\" (UID: \"c2fb1f9f-aa57-4dff-9a98-23600dabc73c\") " pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.453087 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-catalog-content\") pod \"redhat-operators-559qt\" (UID: \"c2fb1f9f-aa57-4dff-9a98-23600dabc73c\") " pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.453153 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dvld\" (UniqueName: \"kubernetes.io/projected/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-kube-api-access-6dvld\") pod \"redhat-operators-559qt\" (UID: \"c2fb1f9f-aa57-4dff-9a98-23600dabc73c\") " pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.453849 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-utilities\") pod \"redhat-operators-559qt\" (UID: \"c2fb1f9f-aa57-4dff-9a98-23600dabc73c\") " pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.454068 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-catalog-content\") pod \"redhat-operators-559qt\" (UID: \"c2fb1f9f-aa57-4dff-9a98-23600dabc73c\") " pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.490021 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6dvld\" (UniqueName: \"kubernetes.io/projected/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-kube-api-access-6dvld\") pod \"redhat-operators-559qt\" (UID: \"c2fb1f9f-aa57-4dff-9a98-23600dabc73c\") " pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:30 crc kubenswrapper[5094]: I0220 10:18:30.567832 5094 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:31 crc kubenswrapper[5094]: I0220 10:18:31.093037 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-559qt"] Feb 20 10:18:31 crc kubenswrapper[5094]: I0220 10:18:31.625205 5094 generic.go:334] "Generic (PLEG): container finished" podID="c2fb1f9f-aa57-4dff-9a98-23600dabc73c" containerID="2d504c558c63939b83a5299e910ff68b89b7a4bc3934c694efd127d7dd6239b1" exitCode=0 Feb 20 10:18:31 crc kubenswrapper[5094]: I0220 10:18:31.625316 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-559qt" event={"ID":"c2fb1f9f-aa57-4dff-9a98-23600dabc73c","Type":"ContainerDied","Data":"2d504c558c63939b83a5299e910ff68b89b7a4bc3934c694efd127d7dd6239b1"} Feb 20 10:18:31 crc kubenswrapper[5094]: I0220 10:18:31.625459 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-559qt" event={"ID":"c2fb1f9f-aa57-4dff-9a98-23600dabc73c","Type":"ContainerStarted","Data":"792549d34da7adbc68a3fe7ee378715e0ab4d58104bfcdc9915532a19f015019"} Feb 20 10:18:32 crc kubenswrapper[5094]: I0220 10:18:32.638212 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-559qt" event={"ID":"c2fb1f9f-aa57-4dff-9a98-23600dabc73c","Type":"ContainerStarted","Data":"163f70b4ec0cb4c95815a1eb6a48e16ab9102a581b8d64b2ea287805b8b1604c"} Feb 20 10:18:32 crc kubenswrapper[5094]: I0220 10:18:32.840682 5094 scope.go:117] "RemoveContainer" 
containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:18:32 crc kubenswrapper[5094]: E0220 10:18:32.853913 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:18:35 crc kubenswrapper[5094]: I0220 10:18:35.685516 5094 generic.go:334] "Generic (PLEG): container finished" podID="c2fb1f9f-aa57-4dff-9a98-23600dabc73c" containerID="163f70b4ec0cb4c95815a1eb6a48e16ab9102a581b8d64b2ea287805b8b1604c" exitCode=0 Feb 20 10:18:35 crc kubenswrapper[5094]: I0220 10:18:35.685651 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-559qt" event={"ID":"c2fb1f9f-aa57-4dff-9a98-23600dabc73c","Type":"ContainerDied","Data":"163f70b4ec0cb4c95815a1eb6a48e16ab9102a581b8d64b2ea287805b8b1604c"} Feb 20 10:18:36 crc kubenswrapper[5094]: I0220 10:18:36.697309 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-559qt" event={"ID":"c2fb1f9f-aa57-4dff-9a98-23600dabc73c","Type":"ContainerStarted","Data":"bfc7749518aecf9fd3e7369ac9bfaa846cc4bd61d323f9bdeaa1bc47fca643bb"} Feb 20 10:18:36 crc kubenswrapper[5094]: I0220 10:18:36.723117 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-559qt" podStartSLOduration=2.268313618 podStartE2EDuration="6.723099445s" podCreationTimestamp="2026-02-20 10:18:30 +0000 UTC" firstStartedPulling="2026-02-20 10:18:31.627392465 +0000 UTC m=+12726.500019176" lastFinishedPulling="2026-02-20 10:18:36.082178292 +0000 UTC m=+12730.954805003" observedRunningTime="2026-02-20 10:18:36.714555599 +0000 UTC 
m=+12731.587182310" watchObservedRunningTime="2026-02-20 10:18:36.723099445 +0000 UTC m=+12731.595726156" Feb 20 10:18:40 crc kubenswrapper[5094]: I0220 10:18:40.569518 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:40 crc kubenswrapper[5094]: I0220 10:18:40.570117 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:41 crc kubenswrapper[5094]: I0220 10:18:41.630119 5094 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-559qt" podUID="c2fb1f9f-aa57-4dff-9a98-23600dabc73c" containerName="registry-server" probeResult="failure" output=< Feb 20 10:18:41 crc kubenswrapper[5094]: timeout: failed to connect service ":50051" within 1s Feb 20 10:18:41 crc kubenswrapper[5094]: > Feb 20 10:18:44 crc kubenswrapper[5094]: I0220 10:18:44.840270 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:18:44 crc kubenswrapper[5094]: E0220 10:18:44.840979 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:18:46 crc kubenswrapper[5094]: I0220 10:18:46.193432 5094 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rq6hb"] Feb 20 10:18:46 crc kubenswrapper[5094]: I0220 10:18:46.198988 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:46 crc kubenswrapper[5094]: I0220 10:18:46.208281 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rq6hb"] Feb 20 10:18:46 crc kubenswrapper[5094]: I0220 10:18:46.326838 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sprq\" (UniqueName: \"kubernetes.io/projected/defa329d-6728-4dbb-aa04-ffe8a237e397-kube-api-access-5sprq\") pod \"redhat-marketplace-rq6hb\" (UID: \"defa329d-6728-4dbb-aa04-ffe8a237e397\") " pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:46 crc kubenswrapper[5094]: I0220 10:18:46.326915 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/defa329d-6728-4dbb-aa04-ffe8a237e397-catalog-content\") pod \"redhat-marketplace-rq6hb\" (UID: \"defa329d-6728-4dbb-aa04-ffe8a237e397\") " pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:46 crc kubenswrapper[5094]: I0220 10:18:46.326956 5094 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/defa329d-6728-4dbb-aa04-ffe8a237e397-utilities\") pod \"redhat-marketplace-rq6hb\" (UID: \"defa329d-6728-4dbb-aa04-ffe8a237e397\") " pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:46 crc kubenswrapper[5094]: I0220 10:18:46.429673 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sprq\" (UniqueName: \"kubernetes.io/projected/defa329d-6728-4dbb-aa04-ffe8a237e397-kube-api-access-5sprq\") pod \"redhat-marketplace-rq6hb\" (UID: \"defa329d-6728-4dbb-aa04-ffe8a237e397\") " pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:46 crc kubenswrapper[5094]: I0220 10:18:46.429829 5094 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/defa329d-6728-4dbb-aa04-ffe8a237e397-catalog-content\") pod \"redhat-marketplace-rq6hb\" (UID: \"defa329d-6728-4dbb-aa04-ffe8a237e397\") " pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:46 crc kubenswrapper[5094]: I0220 10:18:46.429910 5094 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/defa329d-6728-4dbb-aa04-ffe8a237e397-utilities\") pod \"redhat-marketplace-rq6hb\" (UID: \"defa329d-6728-4dbb-aa04-ffe8a237e397\") " pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:46 crc kubenswrapper[5094]: I0220 10:18:46.430555 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/defa329d-6728-4dbb-aa04-ffe8a237e397-catalog-content\") pod \"redhat-marketplace-rq6hb\" (UID: \"defa329d-6728-4dbb-aa04-ffe8a237e397\") " pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:46 crc kubenswrapper[5094]: I0220 10:18:46.431535 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/defa329d-6728-4dbb-aa04-ffe8a237e397-utilities\") pod \"redhat-marketplace-rq6hb\" (UID: \"defa329d-6728-4dbb-aa04-ffe8a237e397\") " pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:46 crc kubenswrapper[5094]: I0220 10:18:46.476159 5094 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sprq\" (UniqueName: \"kubernetes.io/projected/defa329d-6728-4dbb-aa04-ffe8a237e397-kube-api-access-5sprq\") pod \"redhat-marketplace-rq6hb\" (UID: \"defa329d-6728-4dbb-aa04-ffe8a237e397\") " pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:46 crc kubenswrapper[5094]: I0220 10:18:46.537926 5094 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:47 crc kubenswrapper[5094]: I0220 10:18:47.066230 5094 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rq6hb"] Feb 20 10:18:47 crc kubenswrapper[5094]: E0220 10:18:47.510918 5094 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddefa329d_6728_4dbb_aa04_ffe8a237e397.slice/crio-conmon-19d66568e0a06c42b894d221af96c193ff21978e60895d76054875aac01d8ade.scope\": RecentStats: unable to find data in memory cache]" Feb 20 10:18:47 crc kubenswrapper[5094]: I0220 10:18:47.825833 5094 generic.go:334] "Generic (PLEG): container finished" podID="defa329d-6728-4dbb-aa04-ffe8a237e397" containerID="19d66568e0a06c42b894d221af96c193ff21978e60895d76054875aac01d8ade" exitCode=0 Feb 20 10:18:47 crc kubenswrapper[5094]: I0220 10:18:47.825890 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rq6hb" event={"ID":"defa329d-6728-4dbb-aa04-ffe8a237e397","Type":"ContainerDied","Data":"19d66568e0a06c42b894d221af96c193ff21978e60895d76054875aac01d8ade"} Feb 20 10:18:47 crc kubenswrapper[5094]: I0220 10:18:47.826065 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rq6hb" event={"ID":"defa329d-6728-4dbb-aa04-ffe8a237e397","Type":"ContainerStarted","Data":"ffe9c58e2491aee9243375a1ad768e3748148129ed1b476a7a951f6ce3a1544a"} Feb 20 10:18:48 crc kubenswrapper[5094]: I0220 10:18:48.835954 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rq6hb" event={"ID":"defa329d-6728-4dbb-aa04-ffe8a237e397","Type":"ContainerStarted","Data":"168ea3d168f84a5ed368c3173ac95ee8c5db2344dfbc556a9733949fed085a8c"} Feb 20 10:18:49 crc kubenswrapper[5094]: I0220 10:18:49.855023 5094 generic.go:334] "Generic (PLEG): 
container finished" podID="defa329d-6728-4dbb-aa04-ffe8a237e397" containerID="168ea3d168f84a5ed368c3173ac95ee8c5db2344dfbc556a9733949fed085a8c" exitCode=0 Feb 20 10:18:49 crc kubenswrapper[5094]: I0220 10:18:49.860452 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rq6hb" event={"ID":"defa329d-6728-4dbb-aa04-ffe8a237e397","Type":"ContainerDied","Data":"168ea3d168f84a5ed368c3173ac95ee8c5db2344dfbc556a9733949fed085a8c"} Feb 20 10:18:50 crc kubenswrapper[5094]: I0220 10:18:50.650780 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:50 crc kubenswrapper[5094]: I0220 10:18:50.711357 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:50 crc kubenswrapper[5094]: I0220 10:18:50.867430 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rq6hb" event={"ID":"defa329d-6728-4dbb-aa04-ffe8a237e397","Type":"ContainerStarted","Data":"73e74d894b863d09abbd8119e2eec5b41d5af92d6f206548e14a84510e46ad29"} Feb 20 10:18:50 crc kubenswrapper[5094]: I0220 10:18:50.896385 5094 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rq6hb" podStartSLOduration=2.25101946 podStartE2EDuration="4.896361592s" podCreationTimestamp="2026-02-20 10:18:46 +0000 UTC" firstStartedPulling="2026-02-20 10:18:47.828585638 +0000 UTC m=+12742.701212389" lastFinishedPulling="2026-02-20 10:18:50.47392777 +0000 UTC m=+12745.346554521" observedRunningTime="2026-02-20 10:18:50.884825203 +0000 UTC m=+12745.757451944" watchObservedRunningTime="2026-02-20 10:18:50.896361592 +0000 UTC m=+12745.768988323" Feb 20 10:18:52 crc kubenswrapper[5094]: I0220 10:18:52.966474 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-559qt"] Feb 20 10:18:52 crc 
kubenswrapper[5094]: I0220 10:18:52.967075 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-559qt" podUID="c2fb1f9f-aa57-4dff-9a98-23600dabc73c" containerName="registry-server" containerID="cri-o://bfc7749518aecf9fd3e7369ac9bfaa846cc4bd61d323f9bdeaa1bc47fca643bb" gracePeriod=2 Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.463908 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.524485 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-utilities\") pod \"c2fb1f9f-aa57-4dff-9a98-23600dabc73c\" (UID: \"c2fb1f9f-aa57-4dff-9a98-23600dabc73c\") " Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.524539 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dvld\" (UniqueName: \"kubernetes.io/projected/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-kube-api-access-6dvld\") pod \"c2fb1f9f-aa57-4dff-9a98-23600dabc73c\" (UID: \"c2fb1f9f-aa57-4dff-9a98-23600dabc73c\") " Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.524572 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-catalog-content\") pod \"c2fb1f9f-aa57-4dff-9a98-23600dabc73c\" (UID: \"c2fb1f9f-aa57-4dff-9a98-23600dabc73c\") " Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.526210 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-utilities" (OuterVolumeSpecName: "utilities") pod "c2fb1f9f-aa57-4dff-9a98-23600dabc73c" (UID: "c2fb1f9f-aa57-4dff-9a98-23600dabc73c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.531033 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-kube-api-access-6dvld" (OuterVolumeSpecName: "kube-api-access-6dvld") pod "c2fb1f9f-aa57-4dff-9a98-23600dabc73c" (UID: "c2fb1f9f-aa57-4dff-9a98-23600dabc73c"). InnerVolumeSpecName "kube-api-access-6dvld". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.626876 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.626943 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dvld\" (UniqueName: \"kubernetes.io/projected/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-kube-api-access-6dvld\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.655314 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c2fb1f9f-aa57-4dff-9a98-23600dabc73c" (UID: "c2fb1f9f-aa57-4dff-9a98-23600dabc73c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.728508 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2fb1f9f-aa57-4dff-9a98-23600dabc73c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.902817 5094 generic.go:334] "Generic (PLEG): container finished" podID="c2fb1f9f-aa57-4dff-9a98-23600dabc73c" containerID="bfc7749518aecf9fd3e7369ac9bfaa846cc4bd61d323f9bdeaa1bc47fca643bb" exitCode=0 Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.902863 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-559qt" event={"ID":"c2fb1f9f-aa57-4dff-9a98-23600dabc73c","Type":"ContainerDied","Data":"bfc7749518aecf9fd3e7369ac9bfaa846cc4bd61d323f9bdeaa1bc47fca643bb"} Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.902894 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-559qt" event={"ID":"c2fb1f9f-aa57-4dff-9a98-23600dabc73c","Type":"ContainerDied","Data":"792549d34da7adbc68a3fe7ee378715e0ab4d58104bfcdc9915532a19f015019"} Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.902915 5094 scope.go:117] "RemoveContainer" containerID="bfc7749518aecf9fd3e7369ac9bfaa846cc4bd61d323f9bdeaa1bc47fca643bb" Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.902932 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-559qt" Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.933238 5094 scope.go:117] "RemoveContainer" containerID="163f70b4ec0cb4c95815a1eb6a48e16ab9102a581b8d64b2ea287805b8b1604c" Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.933405 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-559qt"] Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.952087 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-559qt"] Feb 20 10:18:53 crc kubenswrapper[5094]: I0220 10:18:53.958124 5094 scope.go:117] "RemoveContainer" containerID="2d504c558c63939b83a5299e910ff68b89b7a4bc3934c694efd127d7dd6239b1" Feb 20 10:18:54 crc kubenswrapper[5094]: I0220 10:18:54.015012 5094 scope.go:117] "RemoveContainer" containerID="bfc7749518aecf9fd3e7369ac9bfaa846cc4bd61d323f9bdeaa1bc47fca643bb" Feb 20 10:18:54 crc kubenswrapper[5094]: E0220 10:18:54.019039 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfc7749518aecf9fd3e7369ac9bfaa846cc4bd61d323f9bdeaa1bc47fca643bb\": container with ID starting with bfc7749518aecf9fd3e7369ac9bfaa846cc4bd61d323f9bdeaa1bc47fca643bb not found: ID does not exist" containerID="bfc7749518aecf9fd3e7369ac9bfaa846cc4bd61d323f9bdeaa1bc47fca643bb" Feb 20 10:18:54 crc kubenswrapper[5094]: I0220 10:18:54.019076 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfc7749518aecf9fd3e7369ac9bfaa846cc4bd61d323f9bdeaa1bc47fca643bb"} err="failed to get container status \"bfc7749518aecf9fd3e7369ac9bfaa846cc4bd61d323f9bdeaa1bc47fca643bb\": rpc error: code = NotFound desc = could not find container \"bfc7749518aecf9fd3e7369ac9bfaa846cc4bd61d323f9bdeaa1bc47fca643bb\": container with ID starting with bfc7749518aecf9fd3e7369ac9bfaa846cc4bd61d323f9bdeaa1bc47fca643bb not found: ID does 
not exist" Feb 20 10:18:54 crc kubenswrapper[5094]: I0220 10:18:54.019099 5094 scope.go:117] "RemoveContainer" containerID="163f70b4ec0cb4c95815a1eb6a48e16ab9102a581b8d64b2ea287805b8b1604c" Feb 20 10:18:54 crc kubenswrapper[5094]: E0220 10:18:54.019352 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"163f70b4ec0cb4c95815a1eb6a48e16ab9102a581b8d64b2ea287805b8b1604c\": container with ID starting with 163f70b4ec0cb4c95815a1eb6a48e16ab9102a581b8d64b2ea287805b8b1604c not found: ID does not exist" containerID="163f70b4ec0cb4c95815a1eb6a48e16ab9102a581b8d64b2ea287805b8b1604c" Feb 20 10:18:54 crc kubenswrapper[5094]: I0220 10:18:54.019376 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"163f70b4ec0cb4c95815a1eb6a48e16ab9102a581b8d64b2ea287805b8b1604c"} err="failed to get container status \"163f70b4ec0cb4c95815a1eb6a48e16ab9102a581b8d64b2ea287805b8b1604c\": rpc error: code = NotFound desc = could not find container \"163f70b4ec0cb4c95815a1eb6a48e16ab9102a581b8d64b2ea287805b8b1604c\": container with ID starting with 163f70b4ec0cb4c95815a1eb6a48e16ab9102a581b8d64b2ea287805b8b1604c not found: ID does not exist" Feb 20 10:18:54 crc kubenswrapper[5094]: I0220 10:18:54.019389 5094 scope.go:117] "RemoveContainer" containerID="2d504c558c63939b83a5299e910ff68b89b7a4bc3934c694efd127d7dd6239b1" Feb 20 10:18:54 crc kubenswrapper[5094]: E0220 10:18:54.020033 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d504c558c63939b83a5299e910ff68b89b7a4bc3934c694efd127d7dd6239b1\": container with ID starting with 2d504c558c63939b83a5299e910ff68b89b7a4bc3934c694efd127d7dd6239b1 not found: ID does not exist" containerID="2d504c558c63939b83a5299e910ff68b89b7a4bc3934c694efd127d7dd6239b1" Feb 20 10:18:54 crc kubenswrapper[5094]: I0220 10:18:54.020103 5094 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d504c558c63939b83a5299e910ff68b89b7a4bc3934c694efd127d7dd6239b1"} err="failed to get container status \"2d504c558c63939b83a5299e910ff68b89b7a4bc3934c694efd127d7dd6239b1\": rpc error: code = NotFound desc = could not find container \"2d504c558c63939b83a5299e910ff68b89b7a4bc3934c694efd127d7dd6239b1\": container with ID starting with 2d504c558c63939b83a5299e910ff68b89b7a4bc3934c694efd127d7dd6239b1 not found: ID does not exist" Feb 20 10:18:55 crc kubenswrapper[5094]: I0220 10:18:55.857581 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2fb1f9f-aa57-4dff-9a98-23600dabc73c" path="/var/lib/kubelet/pods/c2fb1f9f-aa57-4dff-9a98-23600dabc73c/volumes" Feb 20 10:18:56 crc kubenswrapper[5094]: I0220 10:18:56.538422 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:56 crc kubenswrapper[5094]: I0220 10:18:56.538806 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:56 crc kubenswrapper[5094]: I0220 10:18:56.600201 5094 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:57 crc kubenswrapper[5094]: I0220 10:18:57.009140 5094 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:57 crc kubenswrapper[5094]: I0220 10:18:57.766914 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rq6hb"] Feb 20 10:18:58 crc kubenswrapper[5094]: I0220 10:18:58.965397 5094 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rq6hb" podUID="defa329d-6728-4dbb-aa04-ffe8a237e397" containerName="registry-server" 
containerID="cri-o://73e74d894b863d09abbd8119e2eec5b41d5af92d6f206548e14a84510e46ad29" gracePeriod=2 Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.440688 5094 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.581635 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/defa329d-6728-4dbb-aa04-ffe8a237e397-catalog-content\") pod \"defa329d-6728-4dbb-aa04-ffe8a237e397\" (UID: \"defa329d-6728-4dbb-aa04-ffe8a237e397\") " Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.581869 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/defa329d-6728-4dbb-aa04-ffe8a237e397-utilities\") pod \"defa329d-6728-4dbb-aa04-ffe8a237e397\" (UID: \"defa329d-6728-4dbb-aa04-ffe8a237e397\") " Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.582049 5094 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sprq\" (UniqueName: \"kubernetes.io/projected/defa329d-6728-4dbb-aa04-ffe8a237e397-kube-api-access-5sprq\") pod \"defa329d-6728-4dbb-aa04-ffe8a237e397\" (UID: \"defa329d-6728-4dbb-aa04-ffe8a237e397\") " Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.582672 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/defa329d-6728-4dbb-aa04-ffe8a237e397-utilities" (OuterVolumeSpecName: "utilities") pod "defa329d-6728-4dbb-aa04-ffe8a237e397" (UID: "defa329d-6728-4dbb-aa04-ffe8a237e397"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.587888 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/defa329d-6728-4dbb-aa04-ffe8a237e397-kube-api-access-5sprq" (OuterVolumeSpecName: "kube-api-access-5sprq") pod "defa329d-6728-4dbb-aa04-ffe8a237e397" (UID: "defa329d-6728-4dbb-aa04-ffe8a237e397"). InnerVolumeSpecName "kube-api-access-5sprq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.629069 5094 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/defa329d-6728-4dbb-aa04-ffe8a237e397-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "defa329d-6728-4dbb-aa04-ffe8a237e397" (UID: "defa329d-6728-4dbb-aa04-ffe8a237e397"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.684484 5094 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/defa329d-6728-4dbb-aa04-ffe8a237e397-utilities\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.684526 5094 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sprq\" (UniqueName: \"kubernetes.io/projected/defa329d-6728-4dbb-aa04-ffe8a237e397-kube-api-access-5sprq\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.684537 5094 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/defa329d-6728-4dbb-aa04-ffe8a237e397-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.840369 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:18:59 crc kubenswrapper[5094]: E0220 
10:18:59.840695 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.982766 5094 generic.go:334] "Generic (PLEG): container finished" podID="defa329d-6728-4dbb-aa04-ffe8a237e397" containerID="73e74d894b863d09abbd8119e2eec5b41d5af92d6f206548e14a84510e46ad29" exitCode=0 Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.982808 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rq6hb" event={"ID":"defa329d-6728-4dbb-aa04-ffe8a237e397","Type":"ContainerDied","Data":"73e74d894b863d09abbd8119e2eec5b41d5af92d6f206548e14a84510e46ad29"} Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.982838 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rq6hb" event={"ID":"defa329d-6728-4dbb-aa04-ffe8a237e397","Type":"ContainerDied","Data":"ffe9c58e2491aee9243375a1ad768e3748148129ed1b476a7a951f6ce3a1544a"} Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.982865 5094 scope.go:117] "RemoveContainer" containerID="73e74d894b863d09abbd8119e2eec5b41d5af92d6f206548e14a84510e46ad29" Feb 20 10:18:59 crc kubenswrapper[5094]: I0220 10:18:59.982874 5094 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rq6hb" Feb 20 10:19:00 crc kubenswrapper[5094]: I0220 10:19:00.007295 5094 scope.go:117] "RemoveContainer" containerID="168ea3d168f84a5ed368c3173ac95ee8c5db2344dfbc556a9733949fed085a8c" Feb 20 10:19:00 crc kubenswrapper[5094]: I0220 10:19:00.041581 5094 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rq6hb"] Feb 20 10:19:00 crc kubenswrapper[5094]: I0220 10:19:00.043030 5094 scope.go:117] "RemoveContainer" containerID="19d66568e0a06c42b894d221af96c193ff21978e60895d76054875aac01d8ade" Feb 20 10:19:00 crc kubenswrapper[5094]: I0220 10:19:00.053244 5094 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rq6hb"] Feb 20 10:19:00 crc kubenswrapper[5094]: I0220 10:19:00.090626 5094 scope.go:117] "RemoveContainer" containerID="73e74d894b863d09abbd8119e2eec5b41d5af92d6f206548e14a84510e46ad29" Feb 20 10:19:00 crc kubenswrapper[5094]: E0220 10:19:00.091321 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73e74d894b863d09abbd8119e2eec5b41d5af92d6f206548e14a84510e46ad29\": container with ID starting with 73e74d894b863d09abbd8119e2eec5b41d5af92d6f206548e14a84510e46ad29 not found: ID does not exist" containerID="73e74d894b863d09abbd8119e2eec5b41d5af92d6f206548e14a84510e46ad29" Feb 20 10:19:00 crc kubenswrapper[5094]: I0220 10:19:00.091376 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73e74d894b863d09abbd8119e2eec5b41d5af92d6f206548e14a84510e46ad29"} err="failed to get container status \"73e74d894b863d09abbd8119e2eec5b41d5af92d6f206548e14a84510e46ad29\": rpc error: code = NotFound desc = could not find container \"73e74d894b863d09abbd8119e2eec5b41d5af92d6f206548e14a84510e46ad29\": container with ID starting with 73e74d894b863d09abbd8119e2eec5b41d5af92d6f206548e14a84510e46ad29 not found: 
ID does not exist" Feb 20 10:19:00 crc kubenswrapper[5094]: I0220 10:19:00.091409 5094 scope.go:117] "RemoveContainer" containerID="168ea3d168f84a5ed368c3173ac95ee8c5db2344dfbc556a9733949fed085a8c" Feb 20 10:19:00 crc kubenswrapper[5094]: E0220 10:19:00.092002 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"168ea3d168f84a5ed368c3173ac95ee8c5db2344dfbc556a9733949fed085a8c\": container with ID starting with 168ea3d168f84a5ed368c3173ac95ee8c5db2344dfbc556a9733949fed085a8c not found: ID does not exist" containerID="168ea3d168f84a5ed368c3173ac95ee8c5db2344dfbc556a9733949fed085a8c" Feb 20 10:19:00 crc kubenswrapper[5094]: I0220 10:19:00.092121 5094 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"168ea3d168f84a5ed368c3173ac95ee8c5db2344dfbc556a9733949fed085a8c"} err="failed to get container status \"168ea3d168f84a5ed368c3173ac95ee8c5db2344dfbc556a9733949fed085a8c\": rpc error: code = NotFound desc = could not find container \"168ea3d168f84a5ed368c3173ac95ee8c5db2344dfbc556a9733949fed085a8c\": container with ID starting with 168ea3d168f84a5ed368c3173ac95ee8c5db2344dfbc556a9733949fed085a8c not found: ID does not exist" Feb 20 10:19:00 crc kubenswrapper[5094]: I0220 10:19:00.092205 5094 scope.go:117] "RemoveContainer" containerID="19d66568e0a06c42b894d221af96c193ff21978e60895d76054875aac01d8ade" Feb 20 10:19:00 crc kubenswrapper[5094]: E0220 10:19:00.092544 5094 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19d66568e0a06c42b894d221af96c193ff21978e60895d76054875aac01d8ade\": container with ID starting with 19d66568e0a06c42b894d221af96c193ff21978e60895d76054875aac01d8ade not found: ID does not exist" containerID="19d66568e0a06c42b894d221af96c193ff21978e60895d76054875aac01d8ade" Feb 20 10:19:00 crc kubenswrapper[5094]: I0220 10:19:00.092584 5094 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19d66568e0a06c42b894d221af96c193ff21978e60895d76054875aac01d8ade"} err="failed to get container status \"19d66568e0a06c42b894d221af96c193ff21978e60895d76054875aac01d8ade\": rpc error: code = NotFound desc = could not find container \"19d66568e0a06c42b894d221af96c193ff21978e60895d76054875aac01d8ade\": container with ID starting with 19d66568e0a06c42b894d221af96c193ff21978e60895d76054875aac01d8ade not found: ID does not exist" Feb 20 10:19:01 crc kubenswrapper[5094]: I0220 10:19:01.855021 5094 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="defa329d-6728-4dbb-aa04-ffe8a237e397" path="/var/lib/kubelet/pods/defa329d-6728-4dbb-aa04-ffe8a237e397/volumes" Feb 20 10:19:11 crc kubenswrapper[5094]: I0220 10:19:11.840092 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:19:11 crc kubenswrapper[5094]: E0220 10:19:11.841252 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:19:22 crc kubenswrapper[5094]: I0220 10:19:22.841191 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:19:22 crc kubenswrapper[5094]: E0220 10:19:22.842427 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:19:34 crc kubenswrapper[5094]: I0220 10:19:34.840194 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:19:34 crc kubenswrapper[5094]: E0220 10:19:34.841267 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:19:47 crc kubenswrapper[5094]: I0220 10:19:47.841661 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:19:47 crc kubenswrapper[5094]: E0220 10:19:47.843017 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:19:59 crc kubenswrapper[5094]: I0220 10:19:59.840226 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:19:59 crc kubenswrapper[5094]: E0220 10:19:59.841037 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:20:10 crc kubenswrapper[5094]: I0220 10:20:10.841209 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:20:10 crc kubenswrapper[5094]: E0220 10:20:10.842758 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:20:21 crc kubenswrapper[5094]: I0220 10:20:21.841664 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:20:21 crc kubenswrapper[5094]: E0220 10:20:21.842973 5094 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-56ppq_openshift-machine-config-operator(0810cb2f-5b29-4c97-8b16-e1bb2d455a0d)\"" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" podUID="0810cb2f-5b29-4c97-8b16-e1bb2d455a0d" Feb 20 10:20:35 crc kubenswrapper[5094]: I0220 10:20:35.857601 5094 scope.go:117] "RemoveContainer" containerID="c927037245d316055767cceaf6984c10d64e9a925b37958198ebdc69149ce512" Feb 20 10:20:36 crc kubenswrapper[5094]: I0220 10:20:36.173777 5094 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-56ppq" 
event={"ID":"0810cb2f-5b29-4c97-8b16-e1bb2d455a0d","Type":"ContainerStarted","Data":"e494f0f6155c48fe39960089a5835d5140b6d9af1d65647b4a7985a7770671bf"}